WorldWideScience

Sample records for analytical tool development

  1. Developing a learning analytics tool

    DEFF Research Database (Denmark)

    Wahl, Christian; Belle, Gianna; Clemmensen, Anita Lykke

    This poster describes how learning analytics and collective intelligence can be combined in order to develop a tool for providing support and feedback to learners and teachers regarding students' self-initiated learning activities...

  2. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
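
    The practice described above is not published as code in this abstract; as a rough illustration of the general pattern (a notebook pipeline turned into an interactive APP), the sketch below uses the open ipywidgets library, with a made-up toy dataset and analysis function:

```python
# Minimal sketch, assuming the ipywidgets/Jupyter pattern the abstract
# alludes to; the dataset and the analyze() pipeline are hypothetical.
import pandas as pd
from ipywidgets import interact, IntSlider

df = pd.DataFrame({"age": [34, 51, 29, 63], "stay_days": [3, 8, 2, 12]})

def analyze(min_age=30):
    # Re-runs the analytics pipeline each time the slider moves.
    subset = df[df["age"] >= min_age]
    print(subset.describe())

# One line turns the pipeline into an interactive control in the notebook.
interact(analyze, min_age=IntSlider(min=0, max=90, step=5, value=30))
```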

  3. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure that temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its capability to directly access the real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front-end user interface was implemented to make PFSAT conveniently portable among a wide variety of potential users and to provide a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.

  4. Development of computer-based analytical tool for assessing physical protection system

    Energy Technology Data Exchange (ETDEWEB)

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several currently available tools, such as EASI and SAPE, can be used immediately; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
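
    The abstract does not publish the underlying algorithm; the sketch below shows one simplified, EASI-like way such a network path evaluation can work, assuming each path element carries a detection probability and a delay, and that a path is interrupted only if detection occurs while the remaining delay still exceeds a fixed response-force time (all names and numbers hypothetical):

```python
# Simplified adversary-path model (an assumption, not the authors' code).
def interruption_probability(elements, response_time):
    """elements: list of (p_detect, delay_seconds) along one path."""
    delays = [d for _, d in elements]
    p_miss, p_interrupt = 1.0, 0.0
    for i, (p_d, _) in enumerate(elements):
        remaining = sum(delays[i + 1:])   # delay left after this element
        if remaining >= response_time:    # detection here is still timely
            p_interrupt += p_miss * p_d   # first timely detection occurs here
        p_miss *= 1.0 - p_d
    return p_interrupt

paths = {
    "fence-door-vault": [(0.5, 60), (0.7, 120), (0.9, 300)],
    "roof-hatch-vault": [(0.2, 30), (0.4, 90), (0.9, 300)],
}
# The most critical path is the one the system is least likely to stop.
worst = min(paths, key=lambda k: interruption_probability(paths[k], 240))
print(worst, interruption_probability(paths[worst], 240))
```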

  5. Development of computer-based analytical tool for assessing physical protection system

    International Nuclear Information System (INIS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several currently available tools, such as EASI and SAPE, can be used immediately; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  6. Development of computer-based analytical tool for assessing physical protection system

    Science.gov (United States)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several currently available tools, such as EASI and SAPE, can be used immediately; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  7. Development of the oil spill response cost-effectiveness analytical tool

    International Nuclear Information System (INIS)

    Etkin, D.S.; Welch, J.

    2005-01-01

    Decision-making during oil spill response operations or contingency planning requires balancing the need to remove as much oil as possible from the environment with the desire to minimize the impact of response operations on the environment they are intended to protect. This paper discusses the creation of a computer tool developed to help in planning and decision-making during response operations. The Oil Spill Response Cost-Effectiveness Analytical Tool (OSRCEAT) was developed to compare the costs of response with the benefits of response in both hypothetical and actual oil spills. The computer-based analytical tool can assist responders and contingency planners in decision-making processes as well as serve as a basis of discussion in the evaluation of response options. Using inputs on spill parameters, location and response options, OSRCEAT can calculate the response cost and the costs of the environmental and socioeconomic impacts of both the oil spill and the response. Oil damages without any response are contrasted with oil damages with response, which are expected to be lower. Response damages are then subtracted from the difference in damages with and without response in order to derive a more accurate response benefit. An OSRCEAT user can test various response options and compare their potential benefits in order to maximize the response benefit. OSRCEAT is best used to compare and contrast the relative benefits and costs of various response options. 50 refs., 19 tabs., 2 figs
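
    The benefit arithmetic described in the abstract is simple enough to state as code; the sketch below reproduces it with hypothetical dollar figures (OSRCEAT's actual cost models are far more detailed):

```python
# Worked sketch of the OSRCEAT-style benefit calculation; all figures
# are hypothetical, not outputs of the tool.
def net_response_benefit(damages_no_response,
                         damages_with_response,
                         response_impact_damages):
    # Benefit of responding, net of the harm the response itself causes.
    gross_benefit = damages_no_response - damages_with_response
    return gross_benefit - response_impact_damages

options = {
    "mechanical recovery": net_response_benefit(10e6, 6e6, 0.5e6),
    "dispersants":         net_response_benefit(10e6, 7e6, 1.5e6),
}
best = max(options, key=options.get)
print(best, options[best])   # -> mechanical recovery 3500000.0
```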

  8. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess the design, performance, and user satisfaction with the tool. A prototype, open-source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities, providing subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drug and adverse event terms, the interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
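
    The abstract does not name the specific disproportionality statistic used; the proportional reporting ratio (PRR) is one standard score in pharmacovigilance, sketched below on a hypothetical 2x2 table of literature citation counts:

```python
# Proportional reporting ratio (PRR) from a 2x2 contingency table of
# citation counts (assumed example; not necessarily the FDA tool's score).
def prr(a, b, c, d):
    """a: citations with drug and event, b: drug without event,
       c: event without drug,            d: neither."""
    rate_drug = a / (a + b)       # event rate among citations for the drug
    rate_other = c / (c + d)      # event rate among all other citations
    return rate_drug / rate_other

score = prr(a=40, b=960, c=200, d=98800)
print(f"PRR = {score:.1f}")       # PRR > 2 is a commonly used signal threshold
```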

  9. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, colored from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    Science.gov (United States)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  11. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting the Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of, and social media conversations about, organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work...

  12. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools on measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities, and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first takes a qualitative focus, reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts to obtain useful and relevant insights into market dynamics; thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data from a study of perceived user satisfaction with web analytics tools. The empirical study was carried out on employees of 200 Croatian firms, from either the IT or the marketing branch. The paper contributes by highlighting the support that the web analytics and web metrics tools available on the market can offer to management, based on the growing need to understand and predict global market trends.

  13. FIGURED WORLDS AS AN ANALYTIC AND METHODOLOGICAL TOOL IN PROFESSIONAL TEACHER DEVELOPMENT

    DEFF Research Database (Denmark)

    Møller, Hanne; Brok, Lene Storgaard

    Hasse (2015) and Holland (1998) have inspired our study; i.e., learning is conceptualized as a social phenomenon, implying that contexts of learning are decisive for learner identity. The concept of Figured Worlds is used to understand the development and the social constitution of emergent interactions (Holland et al., 1998, p. 52) and gives a framework for understanding meaning-making in particular pedagogical settings. We exemplify our use of the term Figured Worlds both as an analytic and as a methodological tool for empirical studies in kindergarten and school, based on data sources such as field notes...

  14. Guidance for the Design and Adoption of Analytic Tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned, and existing research describing what is currently known about what analysts want and about how to better understand which tools they do and do not need.

  15. Analytical framework and tool kit for SEA follow-up

    International Nuclear Information System (INIS)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran; Jonsson, Daniel K.; Lundberg, Kristina; Tyskeng, Sara; Wallgren, Oskar

    2009-01-01

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature, although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up (scoping, analysis and learning), identifies the key functions, and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  16. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    Science.gov (United States)

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  17. Method for effective usage of Google Analytics tools

    Directory of Open Access Journals (Sweden)

    Ирина Николаевна Егорова

    2016-01-01

    Full Text Available Modern Google Analytics tools have been investigated with respect to finding effective channels for attracting users and detecting bottlenecks. The investigation conducted made it possible to propose a method for the effective use of Google Analytics tools. The method is based on the analysis of the main traffic indicators, as well as on a deep analysis of goals and their consecutive tweaking. The method makes it possible to increase website conversion and might be useful for SEO and web analytics specialists.

  18. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    Section 385.33, Navigation and Navigable Waters, CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY... Incorporating New Information Into the Plan. § 385.33 Revisions to models and analytical tools. (a) In carrying out... determine on a case-by-case basis what documentation is appropriate for revisions to models and analytic tools...

  19. Evaluation of Linked Data tools for Learning Analytics

    NARCIS (Netherlands)

    Drachsler, Hendrik; Herder, Eelco; d'Aquin, Mathieu; Dietze, Stefan

    2013-01-01

    Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013, 8-12 April). Evaluation of Linked Data tools for Learning Analytics. Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  20. Using complaints to enhance quality improvement: developing an analytical tool.

    Science.gov (United States)

    Hsieh, Sophie Yahui

    2012-01-01

    This study aims to construct an instrument for identifying the attributes or capabilities that might enable healthcare staff to use complaints to improve service quality. PubMed and ProQuest were searched, which in turn expanded access to other literature. Three paramount dimensions emerged for healthcare quality management systems: managerial, operational, and technical (MOT). The paper reveals that the managerial dimension relates to the quality improvement program infrastructure; it contains strategy, structure, leadership, people and culture. The operational dimension relates to implementation processes: organizational changes and barriers when using complaints to enhance quality. The technical dimension emphasizes the skills, techniques or information systems required to successfully achieve continuous quality improvement. The MOT model was developed by drawing on the relevant literature. However, individuals have different training, interests and experiences, and there will therefore be variance between researchers when generating the MOT model. The MOT components can serve as guidelines for examining whether patient complaints are used to improve service quality; however, the model needs testing and validating through further research before becoming a theory. Empirical studies on patient complaints had not identified an analytical tool that could be used to explore how complaints can drive quality improvement. This study developed an instrument for identifying the attributes or capabilities that might enable healthcare professionals to use complaints and improve service quality.

  1. Analytical Web Tool for CERES Products

    Science.gov (United States)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

    The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year-long data set proves invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health and environmental companies, has shown interest in CERES-derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies can offer to both quality control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions that address both. Within an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field to identify outliers, which is useful for QC purposes; ii) an "Anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of the data, together with the on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, execution was done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open source applications, the multitude of CERES products and the seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the Ordering Tool.
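
    The "Anomaly" function described above is a straightforward computation; a minimal sketch, assuming monthly gridded fluxes held in a NumPy array (synthetic data, not CERES code):

```python
# Sketch of an anomaly map: current month minus the climatological mean
# of the same calendar month. Array shapes are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
flux = rng.normal(240, 15, size=(132, 36, 72))   # 11 years x 12 months, coarse grid

months = np.arange(132) % 12
climatology = np.stack([flux[months == m].mean(axis=0) for m in range(12)])

current = flux[-1]                     # most recent month (a December here)
anomaly = current - climatology[11]    # regional difference from its monthly mean
print(anomaly.shape, float(anomaly.mean()))
```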

  2. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    Energy Technology Data Exchange (ETDEWEB)

    Irsyad, M. Indra al [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Ministry of Energy and Mineral Resources, Jakarta (Indonesia); Halog, Anthony Basco, E-mail: a.halog@uq.edu.au [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Nepal, Rabindra [Massey Business School, Massey University, Palmerston North (New Zealand); Koesrindartoto, Deddy P. [School of Business and Management, Institut Teknologi Bandung, Bandung (Indonesia)

    2017-12-20

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and informal economies are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of systems thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance for selecting the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries, and suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  3. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    International Nuclear Information System (INIS)

    Irsyad, M. Indra al; Halog, Anthony Basco; Nepal, Rabindra; Koesrindartoto, Deddy P.

    2017-01-01

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and informal economies are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of systems thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance for selecting the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries, and suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  4. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: for example, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  5. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    Science.gov (United States)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal determining the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the geometrical generating error, as a component of the total error. Modelling the generation process allows highlighting the potential errors of the generating tool, in order to correct its profile prior to using the tool in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundation is presented, as well as some applications to known models of rack-gear type tools used on Maag teething machines.

  6. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Full Text Available Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of the delivery geometry. In this paper, we review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.

  7. Process analytical technology (PAT) tools for the cultivation step in biopharmaceutical production

    NARCIS (Netherlands)

    Streefland, M.; Martens, D.E.; Beuvery, E.C.; Wijffels, R.H.

    2013-01-01

    The process analytical technology (PAT) initiative is now 10 years old. This has resulted in the development of many tools and software packages dedicated to PAT application on pharmaceutical processes. However, most applications are restricted to small molecule drugs, mainly for the relatively

  8. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results, owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).

  9. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  10. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    VHTR, one of the Generation IV reactor concepts, has a relatively high operating temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process; in other words, tritium is a crucial safety issue in the fission reactor system, and a tool that enables analysis of its behavior is needed. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, is developed using a chemical process code called gPROMS. BOTANIC was then verified against analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-diluted assumption, flexible applications, and the adoption of a distributed permeation model. Due to these features, BOTANIC can analyze a wide range of tritium-level systems and achieves higher accuracy, as it can solve distributed models. BOTANIC was successfully developed and verified: the results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will focus on total system verification.
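
    BOTANIC's equations are not given in this abstract; as a rough indication of the kind of wall-permeation term such a code must model, the sketch below evaluates classical diffusion-limited hydrogen-isotope permeation (Richardson equation with a Sieverts'-law boundary), using purely illustrative parameter values:

```python
# Diffusion-limited permeation flux through a metal wall (illustrative
# physics sketch; parameter values are hypothetical, not BOTANIC's).
import math

def permeation_flux(phi0, e_act, temp, thickness, p_up, p_down):
    """phi0 [mol/(m.s.Pa^0.5)], e_act [J/mol], temp [K], thickness [m],
       pressures [Pa]; returns flux in mol/(m^2.s)."""
    gas_const = 8.314
    phi = phi0 * math.exp(-e_act / (gas_const * temp))  # Arrhenius permeability
    return phi * (math.sqrt(p_up) - math.sqrt(p_down)) / thickness

# Hypothetical alloy wall at a VHTR-like temperature:
print(permeation_flux(phi0=1e-10, e_act=65e3, temp=1123,
                      thickness=2e-3, p_up=1.0, p_down=1e-4))
```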

  11. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  12. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase 1.

  13. Analytical Aerodynamic Simulation Tools for Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Deglaire, Paul

    2010-01-01

    Wind power is a renewable energy source that is today the fastest growing means to reduce CO2 emissions in the electric energy mix. The upwind horizontal axis wind turbine with three blades has been the preferred technical choice for more than two decades, and this horizontal axis concept is today widely leading the market. The current PhD thesis covers an alternative type of wind turbine, with straight blades rotating along the vertical axis. A brief overview of the main differences between the horizontal and vertical axis concepts is given. The main focus of this thesis, however, is the aerodynamics of the wind turbine blades. Making aerodynamically efficient turbines starts with efficient blades, and making efficient blades requires a good understanding of the physical phenomena and effective simulation tools to model them. The specific aerodynamics of straight-bladed vertical axis turbine flow is reviewed together with the standard aerodynamic simulation tools that have been used in the past by blade and rotor designers. A reasonably fast (in terms of computer power) and accurate (in terms of comparison with experimental results) simulation method was still lacking in the field prior to the current work; this thesis aims at designing such a method. Analytical methods can be used to model complex flow if the geometry is simple. Therefore, a conformal mapping method is derived to transform any set of sections into a set of standard circles. Analytical procedures are then generalized to simulate moving multibody sections in the complex vertical-axis flows and the forces experienced by the blades. Finally, the fast semi-analytical aerodynamic algorithm, boosted by fast multipole methods to handle high numbers of vortices, is coupled with a simple structural model of the rotor to investigate potential aeroelastic instabilities. Together with these advanced simulation tools, a standard double multiple streamtube model has been developed and used to design several straight-bladed vertical axis turbines.
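
    The thesis derives its own, more general mapping, but the classical Joukowski transform conveys the circle-to-section idea in a few lines (a textbook illustration, not the method developed in the thesis):

```python
# Joukowski map: an offset circle in the zeta-plane becomes an
# aerofoil-like section in the z-plane (classical illustration only).
import numpy as np

a = 1.0                                          # map parameter
theta = np.linspace(0.0, 2.0 * np.pi, 200)
zeta = -0.1 + 0.1j + 1.12 * np.exp(1j * theta)   # circle offset from the origin
z = zeta + a**2 / zeta                           # conformal transform
print(z.real.min(), z.real.max())                # chordwise extent of the section
```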

  14. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    Science.gov (United States)

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time involved have caused some resistance to QbD implementation. To show a possible solution, this work proposes a rapid process development method that uses direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly; the adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.
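
    The reported agreement between the at-line and reference methods is an ordinary correlation check; a minimal sketch with hypothetical paired measurements (not the study's data):

```python
# Correlating at-line DART-MS responses against reference HPLC
# concentrations (synthetic example values).
import numpy as np
from scipy.stats import pearsonr

hplc    = np.array([0.12, 0.25, 0.40, 0.61, 0.83])   # mg/mL by HPLC
dart_ms = np.array([0.10, 0.27, 0.38, 0.65, 0.80])   # normalized MS response

r, p = pearsonr(hplc, dart_ms)
print(f"r = {r:.4f}, p = {p:.3g}")
```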

  15. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians used in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
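
    A minimal sketch of the scheme, assuming a stream of daily patient results and the common Fraser-style desirable bias specification of 0.25·sqrt(CVi² + CVg²) from biological variation (synthetic data; the CV and target values are illustrative):

```python
# Monthly medians of patient results checked against an allowable-bias
# specification (synthetic data; CVs and target are illustrative).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "month": pd.period_range("2024-01", periods=6, freq="M").repeat(500),
    "result": rng.normal(140.0, 4.0, 3000),   # e.g., sodium, mmol/L
})

target = 140.0
cv_i, cv_g = 0.6, 0.7                          # within-/between-subject CV, %
allowable_bias_pct = 0.25 * np.hypot(cv_i, cv_g)

medians = df.groupby("month")["result"].median()
bias_pct = (medians - target) / target * 100.0
print(bias_pct[bias_pct.abs() > allowable_bias_pct])   # months to investigate
```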

  16. Development of CAD implementing the algorithm of boundary elements’ numerical analytical method

    Directory of Open Access Journals (Sweden)

    Yulia V. Korniyenko

    2015-03-01

    Full Text Available Until recently, the algorithms for the numerical-analytical boundary elements method had been implemented as programs written in the MATLAB environment language. Each program had a local character, i.e., it was used to solve one particular problem: calculation of a beam, frame, arch, etc. Constructing the matrices in these programs was carried out "manually" and was therefore time-consuming. The research was aimed at a reasoned choice of programming language for developing a new CAD system that implements the algorithm of the numerical-analytical boundary elements method and at creating visualization tools for the initial objects and the calculation results. The research conducted shows that, among the wide variety of programming languages, the most efficient one for developing a CAD system employing the numerical-analytical boundary elements method algorithm is Java. This language provides tools not only for developing the calculating part of the CAD system, but also for building the graphic interface for constructing geometrical models and interpreting the calculated results.

  17. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    Directory of Open Access Journals (Sweden)

    Šebjan Urban

    2016-02-01

    Full Text Available Background and Purpose: Analytical customer relationship management (CRM) information solutions (aCRM IS) that include the use of analytical tools are becoming increasingly important, due to organizations' need for knowledge of their customers and the ability to manage big data. The objective of the research is, therefore, to determine how organizations' orientations (process, innovation, and technology), as critical organizational factors, affect attitudes towards the use of the analytical tools of aCRM IS.

  18. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    Science.gov (United States)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analyses and to establish system parameters, ASDL has developed a software package for the analytical simulation of sensor systems. This package, called Analytical Tools for Thermal InfraRed Engineering (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as the Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD), etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
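
    One of the relationships such a package evaluates can be sketched directly: NETD is the noise-equivalent radiance divided by the thermal derivative of the Planck spectral radiance (a single-wavelength simplification with illustrative numbers, not ATTIRE's internals):

```python
# NETD ~ NER / (dL/dT) at one wavelength (band integration and optics
# ignored; values are illustrative).
import math

H, C, K = 6.626e-34, 2.998e8, 1.381e-23   # Planck, light speed, Boltzmann

def dL_dT(wavelength, temp):
    """d(Planck spectral radiance)/dT, in W / (m^2 sr m K)."""
    x = H * C / (wavelength * K * temp)
    coeff = 2.0 * H * C**2 / wavelength**5
    return coeff * math.exp(x) * x / (temp * (math.exp(x) - 1.0) ** 2)

ner = 5e3                                  # noise-equivalent radiance, W/(m^2 sr m)
netd = ner / dL_dT(10e-6, 300.0)           # 10 um band, 300 K scene
print(f"NETD ~ {netd * 1e3:.1f} mK")
```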

  19. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bonvini, Marco [Whisker Labs, Oakland, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Page, Janie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lin, Guanjing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hu, R. Lilly [Univ. of California, Berkeley, CA (United States)

    2017-08-11

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings, often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models, to characterize system and building behavior. In contrast, physics-based modeling uses first principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; the advantages and disadvantages with respect to purely data-driven approaches; and the practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.

  20. SRL online Analytical Development

    International Nuclear Information System (INIS)

    Jenkins, C.W.

    1991-01-01

    The Savannah River Site is operated by the Westinghouse Savannah River Co. for the Department of Energy to produce special nuclear materials for defense. R&D support for site programs is provided by the Savannah River Laboratory, which I represent. The site is known primarily for its nuclear reactors, but in fact three-fourths of the efforts at the site are devoted to fuel/target fabrication, fuel/target reprocessing, and waste management. All of these operations rely heavily on chemical processes; the site is therefore a large chemical plant, and there are many potential applications for process analytical chemistry at SRS. The Savannah River Laboratory (SRL) has an Analytical Development Section of roughly 65 personnel that performs analyses for R&D efforts at the lab, acts as backup to the site Analytical Laboratories Department, and develops analytical methods and instruments. I manage a subgroup of the Analytical Development Section called the Process Control & Analyzer Development Group. The prime mission of this group is to develop online/at-line analytical systems for site applications

  1. Developing automated analytical methods for scientific environments using LabVIEW.

    Science.gov (United States)

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program are demonstrated by using it to control both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
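
    The paper's program is written in LabVIEW, but the script-command pattern it describes translates to a few lines in any language; below is a sketch of the dispatch idea with hypothetical commands and device classes:

```python
# Script-driven instrument control (pattern sketch; commands and device
# classes are hypothetical, not the paper's LabVIEW implementation).
class Pump:
    def dispense(self, volume_ul):
        print(f"pump: dispense {volume_ul} uL")

class Valve:
    def position(self, port):
        print(f"valve: switch to port {port}")

DEVICES = {"pump": Pump(), "valve": Valve()}
COMMANDS = {
    "DISPENSE": lambda args: DEVICES["pump"].dispense(float(args[0])),
    "VALVE":    lambda args: DEVICES["valve"].position(int(args[0])),
}

script = """VALVE 3
DISPENSE 250
VALVE 1"""

for line in script.splitlines():
    op, *args = line.split()
    COMMANDS[op](args)    # the same interpreter serves any analytical task
```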

  2. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics...... (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers’ dynamic diagnostic decision...

  3. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with the data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface, and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human diseases database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
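
    The proprietary GeneAnalytics scoring algorithm is not published in this record; a common baseline for gene set analysis, shown below as a hedged sketch, is a hypergeometric enrichment test, with all gene counts invented for illustration.

```python
# Hedged sketch of the scoring idea behind gene set enrichment tools (the
# actual GeneAnalytics algorithm is more elaborate): a hypergeometric test
# asks whether a query gene list overlaps a pathway more than chance predicts.
from scipy.stats import hypergeom

def enrichment_p(universe_size, pathway_size, query_size, overlap):
    # P(X >= overlap) for X ~ Hypergeom(universe, pathway, query draws)
    return hypergeom.sf(overlap - 1, universe_size, pathway_size, query_size)

# Hypothetical numbers: 20,000 genes, 150-gene pathway, 300-gene query, 12 shared
print(f"p = {enrichment_p(20000, 150, 300, 12):.2e}")
```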

  4. Current research relevant to the improvement of γ-ray spectroscopy as an analytical tool

    International Nuclear Information System (INIS)

    Meyer, R.A.; Tirsell, K.G.; Armantrout, G.A.

    1976-01-01

    Four areas of research that will have significant impact on the further development of γ-ray spectroscopy as an accurate analytical tool are considered. The areas considered are: (1) automation; (2) accurate multigamma ray sources; (3) accuracy of the current and future γ-ray energy scale, and (4) new solid state X and γ-ray detectors

  5. VisualUrText: A Text Analytics Tool for Unstructured Textual Data

    Science.gov (United States)

    Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.

    2018-05-01

    The growing amount of unstructured text on the Internet is tremendous. Text repositories come from Web 2.0, business intelligence and social networking applications. It is also believed that 80-90% of future data growth will be available in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text Mining is a well-known technique for discovering interesting patterns and trends, that is, non-trivial knowledge, in massive unstructured text data. Text Mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing and analyzing unstructured text data and visualizing the cleaned text data in multiple forms such as a Document Term Matrix (DTM), Frequency Graph, Network Analysis Graph, Word Cloud and Dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analyses.
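
    As a minimal illustration of one VisualUrText output, the sketch below builds a Document Term Matrix with scikit-learn; the tool's own cleaning and visualization pipeline is not shown, and the two documents are invented.

```python
# Minimal sketch of a Document Term Matrix (DTM) using scikit-learn.
from sklearn.feature_extraction.text import CountVectorizer

docs = ["text mining finds patterns in text",
        "unstructured text grows on the web"]
vec = CountVectorizer(stop_words="english")
dtm = vec.fit_transform(docs)                   # rows: documents, cols: terms
print(vec.get_feature_names_out())
print(dtm.toarray())
```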

  6. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    Science.gov (United States)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  7. Analytical Tools for the Routine Use of the UMAE

    International Nuclear Information System (INIS)

    D'Auria, F.; Eramo, A.; Gianotti, W.

    1998-01-01

    UMAE (Uncertainty Methodology based on Accuracy Extrapolation) is a methodology developed to calculate the uncertainties related to thermal-hydraulic code results in Nuclear Power Plant transient analysis. Use of the methodology has shown the need for specific analytical tools to simplify some steps of its application and to make clearer the individual procedures adopted in its development. Three of these tools have recently been completed and are illustrated in this paper. The first makes it possible to attribute "weight factors" to the experimental Integral Test Facilities; results are also shown. The second deals with the calculation of the accuracy of code results: the computer program concerned compares experimental and calculated trends of any quantity and gives as output the accuracy of the calculation. The third consists of a computer program suitable for deriving continuous uncertainty bands from single-valued points. (author)
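
    The record does not spell out the accuracy measure; one figure commonly adopted in this methodology family is the FFT-based average amplitude, sketched below under that assumption with synthetic trends.

```python
# Hedged sketch of the kind of calculation the second tool performs: compare
# an experimental and a calculated trend and return a single accuracy figure.
# Here, the FFT-based average amplitude (AA); the actual program may differ.
import numpy as np

def average_amplitude(exp, calc):
    err = np.fft.rfft(np.asarray(calc) - np.asarray(exp))
    ref = np.fft.rfft(np.asarray(exp))
    return np.abs(err).sum() / np.abs(ref).sum()   # 0 = perfect agreement

t = np.linspace(0, 10, 512)
experiment = np.exp(-t / 4.0)        # synthetic measured trend
calculation = np.exp(-t / 3.6)       # synthetic code prediction
print(f"AA = {average_amplitude(experiment, calculation):.3f}")
```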

  8. Developing new chemical tools for solvent extraction

    International Nuclear Information System (INIS)

    Moyer, B.A.; Baes, C.F.; Burns, J.H.; Case, G.N.; Sachleben, R.A.; Bryan, S.A.; Lumetta, G.J.; McDowell, W.J.; Sachleben, R.A.

    1993-01-01

    Prospects for innovation and for greater technological impact in the field of solvent extraction (SX) seem as bright as ever, despite the maturation of SX as an economically significant separation method and as an important technique in the laboratory. New industrial, environmental, and analytical problems provide compelling motivation for diversifying the application of SX, developing new solvent systems, and seeking improved properties. Toward this end, basic research must be dedicated to enhancing the tools of SX: physical tools for probing the basis of extraction and molecular tools for developing new SX chemistries. In this paper, the authors describe their progress in developing and applying the general tools of equilibrium analysis and of ion recognition in SX. Nearly half a century after the field of SX began in earnest, coordination chemistry continues to provide the impetus for important advancements in understanding SX systems and in controlling SX chemistry. In particular, the physical tools of equilibrium analysis, X-ray crystallography, and spectroscopy are elucidating the molecular basis of SX in unprecedented detail. Moreover, the principles of ion recognition are providing the molecular tools with which to achieve new selectivities and new applications

  9. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD......: Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable...... analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes, run on two COBAS INTEGRA 800 instruments, performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance
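
    A minimal sketch of the paper's idea, with synthetic data and an invented bias limit: monthly medians of patient results are compared against an allowable-bias band.

```python
# Sketch (our own minimal rendering): monthly medians of patient results are
# checked against an allowable-bias band derived from biological variation.
import numpy as np
import pandas as pd

dates = pd.date_range("2014-01-01", periods=365, freq="D")
results = pd.Series(np.random.normal(140, 4, size=365), index=dates)

target = results.median()
allowable_bias = 0.9        # hypothetical desirable specification, same units
monthly = results.groupby(results.index.to_period("M")).median()
flagged = monthly[(monthly - target).abs() > allowable_bias]
print(monthly.round(2).head())
print("months outside limits:", list(flagged.index.astype(str)))
```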

  10. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling, and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements was identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  11. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which allows the simulation of a wide range of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools

  12. Analytics for smart energy management tools and applications for sustainable manufacturing

    CERN Document Server

    Oh, Seog-Chan

    2016-01-01

    This book introduces the issues and problems that arise when implementing smart energy management for sustainable manufacturing in the automotive manufacturing industry, together with the analytical tools and applications needed to deal with them. It uses a number of illustrative examples to explain energy management in automotive manufacturing, which involves most types of manufacturing technology and various levels of energy consumption. It demonstrates how analytical tools can help improve energy management processes, including forecasting, consumption and performance analysis, identification of emerging new technologies, and investment decisions for establishing smart energy consumption practices. It also details practical energy management systems, making it a valuable resource for professionals involved in real energy management processes, and allowing readers to implement the procedures and applications presented.

  13. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

    Full Text Available This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data to the creation of map charts in SAS Enterprise Guide. Extended features of map charts are exemplified by producing charts via SAS programming procedures.

  14. Learning Analytics: drivers, developments and challenges

    Directory of Open Access Journals (Sweden)

    Rebecca Ferguson

    2014-12-01

    Full Text Available Learning analytics is a significant area of Technology-Enhanced Learning (TEL that has emerged during the last decade. This review of the field begins with an examination of the technological, educational and political factors that have driven the development of analytics in educational settings. It goes on to chart the emergence of learning analytics, including their origins in the 20th century, the development of data-driven analytics, the rise of learning-focused perspectives and the influence of national economic concerns. It next focuses on the relationships between learning analytics, educational data mining and academic analytics. Finally, it examines developing areas of learning analytics research, and identifies a series of future challenges.

  15. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    Energy Technology Data Exchange (ETDEWEB)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z. [and others]

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  16. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analyses of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  17. Developments in analytical instrumentation

    Science.gov (United States)

    Petrie, G.

    The situation regarding photogrammetric instrumentation has changed quite dramatically over the last 2 or 3 years with the withdrawal of most analogue stereo-plotting machines from the market place and their replacement by analytically based instrumentation. While there have been few new developments in the field of comparators, there has been an explosive development in the area of small, relatively inexpensive analytical stereo-plotters based on the use of microcomputers. In particular, a number of new instruments have been introduced by manufacturers who mostly have not been associated previously with photogrammetry. Several innovative concepts have been introduced in these small but capable instruments, many of which are aimed at specialised applications, e.g. in close-range photogrammetry (using small-format cameras); for thematic mapping (by organisations engaged in environmental monitoring or resources exploitation); for map revision, etc. Another innovative and possibly significant development has been the production of conversion kits to convert suitable analogue stereo-plotting machines such as the Topocart, PG-2 and B-8 into fully fledged analytical plotters. The larger and more sophisticated analytical stereo-plotters are mostly being produced by the traditional mainstream photogrammetric systems suppliers with several new instruments and developments being introduced at the top end of the market. These include the use of enlarged photo stages to handle images up to 25 × 50 cm format; the complete integration of graphics workstations into the analytical plotter design; the introduction of graphics superimposition and stereo-superimposition; the addition of correlators for the automatic measurement of height, etc. The software associated with this new analytical instrumentation is now undergoing extensive re-development with the need to supply photogrammetric data as input to the more sophisticated G.I.S. systems now being installed by clients, instead

  18. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    Science.gov (United States)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged that indicates different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of that particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  19. An analytic framework for developing inherently-manufacturable pop-up laminate devices

    International Nuclear Information System (INIS)

    Aukes, Daniel M; Goldberg, Benjamin; Wood, Robert J; Cutkosky, Mark R

    2014-01-01

    Spurred by advances in layered manufacturing technologies such as PC-MEMS, SCM, and printable robotics, we propose a new analytic framework for capturing the geometry of folded composite laminate devices and the mechanical processes used to manufacture them. These processes can be represented by combining a small set of geometric operations which are general enough to encompass many different manufacturing paradigms. Furthermore, such a formulation permits one to construct a variety of geometric tools which can be used to analyze common manufacturability concepts, such as tool access, part removability, and device support. In order to increase the speed of development, reduce the occurrence of manufacturing problems inherent in current design methods, and reduce the level of expertise required to develop new devices, the framework has been implemented in a new design tool called popupCAD, which is suited for the design and development of complex folded laminate devices. We conclude with a demonstration of the utility of the tools by creating a folded leg mechanism. (paper)

  20. Development of simulation tools for improvement of measurement accuracy and efficiency in ultrasonic testing. Part 2. Development of fast simulator based on analytical approach

    International Nuclear Information System (INIS)

    Yamada, Hisao; Fukutomi, Hiroyuki; Lin, Shan; Ogata, Takashi

    2008-01-01

    CRIEPI developed a high-speed simulation method to predict B-scope images of crack-like defects in ultrasonic testing. The method follows ultrasonic waves transmitted from the angle probe using the geometrical theory of diffraction (GTD) and, with the aid of reciprocity relations, derives analytical equations for the echoes received by the probe. The tip and mirror echoes from a slit of arbitrary angle through the thickness of the test article and of arbitrary depth can be calculated by this method. The main objective of the study was to develop a high-speed simulation tool that generates B-scope displays of crack-like defects. This was achieved for simple slits in geometry-change regions by the prototype software based on the method. Fairly complete B-scope images of slits could be obtained in about a minute on a current personal computer. The numerical predictions for the surface-opening slits were in excellent agreement with the corresponding experimental measurements. (author)
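
    A greatly simplified sketch of the geometry underlying such a simulator (not CRIEPI's code, and ignoring probe angle and refraction): the slit-tip echo arrival time at each scan position already traces the hyperbola seen in a B-scope display.

```python
# For each probe position, the tip-echo arrival time is twice the
# probe-to-tip distance over the wave speed (normal-incidence assumption).
import numpy as np

c = 3.2e6                                       # shear wave speed, mm/s (approx.)
tip = np.array([50.0, 20.0])                    # slit tip: x = 50 mm, depth 20 mm
probe_x = np.linspace(0.0, 100.0, 201)          # scan positions along the surface

dist = np.hypot(probe_x - tip[0], tip[1])       # straight ray probe -> tip
arrival_us = 2.0 * dist / c * 1e6               # pulse-echo time of flight, us
print(f"earliest tip echo: {arrival_us.min():.2f} us at x = "
      f"{probe_x[arrival_us.argmin()]:.1f} mm")
```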

  1. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    Science.gov (United States)

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool for selecting the most appropriate analytical procedures according to their greenness or their analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can serve as a good decision-support tool for choosing the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
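
    The core HDT relation can be illustrated in a few lines (our own sketch, with invented "greenness" scores where lower is better): procedure a dominates procedure b if it is at least as good on every criterion and strictly better on one.

```python
# Minimal sketch of the Hasse dominance relation over multi-criteria scores.
# Criteria values are hypothetical (e.g., solvent volume, energy, waste).
def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

procedures = {"GC-MS": (3, 2, 4), "HPLC-FLD": (2, 3, 3), "SPME-GC": (1, 2, 2)}

for name_a, va in procedures.items():
    for name_b, vb in procedures.items():
        if dominates(va, vb):
            print(f"{name_a} dominates {name_b}")
```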

  2. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    International Nuclear Information System (INIS)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  3. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for the storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and this analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  4. The development of a visualization tool for displaying analysis and test results

    International Nuclear Information System (INIS)

    Uncapher, W.L.; Ammerman, D.J.; Ludwigsen, J.S.; Wix, S.D.

    1995-01-01

    The evaluation and certification of packages for the transportation of radioactive materials is performed by analysis, testing, or a combination of both. Within the last few years, many transport packages have been certified using a combination of analysis and testing. The ability to combine and display both kinds of data with interactive graphical tools allows a faster and more complete understanding of the response of the package to these environments. Sandia National Laboratories has developed an initial version of a visualization tool that allows the comparison and display of test and analytical data as part of a Department of Energy-sponsored program to support advanced analytical techniques and test methodologies. The capability of the tool extends to both mechanical (structural) and thermal data

  5. Analytical sensitivity analysis of geometric errors in a three axis machine tool

    International Nuclear Information System (INIS)

    Park, Sung Ryung; Yang, Seung Han

    2012-01-01

    In this paper, an analytical method is used to perform a sensitivity analysis of geometric errors in a three axis machine tool. First, an error synthesis model is constructed for evaluating the position volumetric error due to the geometric errors, and then an output variable is defined, such as the magnitude of the position volumetric error. Next, the global sensitivity analysis is executed using an analytical method. Finally, the sensitivity indices are calculated using the quantitative values of the geometric errors
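
    A toy version of the analytic route the paper takes, with an invented linear error-synthesis model: the sensitivity of the volumetric error magnitude to each geometric error follows from exact partial derivatives rather than sampling.

```python
# Hypothetical error-synthesis model at one workspace point: E = J @ g, where
# g holds individual geometric errors and J holds invented influence terms.
import numpy as np

J = np.array([[1.0, 0.0, 0.12],
              [0.0, 1.0, 0.00],
              [0.0, 0.3, 1.00]])     # invented influence coefficients
g = np.array([5e-3, 8e-3, 2e-3])     # geometric errors (invented magnitudes)

E = J @ g
mag = np.linalg.norm(E)              # magnitude of the position volumetric error
# d|E|/dg_i = (E/|E|) . J[:, i]  -- exact analytic sensitivity per error source
sens = (E / mag) @ J
for name, s in zip(["EXX", "SQxy", "PITy"], sens):
    print(f"{name}: {s:+.3f}")
```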

  6. Kalisphera: an analytical tool to reproduce the partial volume effect of spheres imaged in 3D

    International Nuclear Information System (INIS)

    Tengattini, Alessandro; Andò, Edward

    2015-01-01

    In experimental mechanics, where 3D imaging is having a profound effect, spheres are commonly adopted for their simplicity and for the ease of their modeling. In this contribution we develop an analytical tool, ‘kalisphera’, to produce 3D raster images of spheres including their partial volume effect. This allows us to evaluate the metrological performance of existing image-based measurement techniques (knowing a priori the ground truth). An advanced application of ‘kalisphera’ is developed here to identify and accurately characterize spheres in real 3D x-ray tomography images with the objective of improving trinarization and contact detection. The effect of the common experimental imperfections is assessed and the overall performance of the tool tested on real images. (paper)
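
    Kalisphera evaluates the voxel partial-volume fractions analytically; the sketch below approximates the same quantity by sub-voxel sampling, the slow numerical route the analytical tool avoids.

```python
# Approximate each voxel's occupied fraction by supersampling a sphere.
import numpy as np

def sphere_image(shape, centre, radius, sub=4):
    img = np.zeros(shape)
    offs = (np.arange(sub) + 0.5) / sub            # sub-voxel sample offsets
    for idx in np.ndindex(*shape):
        axes = [offs + i for i in idx]             # sample coords per axis
        pts = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, 3)
        inside = np.linalg.norm(pts - centre, axis=1) <= radius
        img[idx] = inside.mean()                   # fraction of voxel inside
    return img

vol = sphere_image((16, 16, 16), centre=np.array([8.0, 8.0, 8.0]), radius=5.0)
print(f"voxelised volume: {vol.sum():.1f}  analytic: {4/3*np.pi*5**3:.1f}")
```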

  7. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    Science.gov (United States)

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.

  8. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

    Selecting the appropriate analytical methods for characterizing the assembly and morphology of polymer-based vesicles, or polymersomes, is required for them to reach their full potential in biotechnology. This work presents and compares 17 different techniques for their ability to adequately report size....../purification. Of the analytical methods tested, Cryo-transmission electron microscopy and atomic force microscopy (AFM) turned out to be advantageous for polymersomes with diameters smaller than 200 nm, whereas confocal microscopy is ideal for diameters >400 nm. Polymersomes in the intermediate diameter range can be characterized...... using freeze fracture Cryo-scanning electron microscopy (FF-Cryo-SEM) and nanoparticle tracking analysis (NTA). Small angle X-ray scattering (SAXS) provides reliable data on bilayer thickness and internal structure, Cryo-TEM on multilamellarity. Taken together, these tools are valuable...

  9. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis, and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies by resorting to tools such as continuous flow analysis and related techniques is hitherto very limited, even though it would allow one to take advantage of particular features of the nanocrystals: the versatile surface chemistry and ligand-binding ability, the aptitude to generate reactive species, and the possibility of encapsulation in different materials while retaining native luminescence, which provides the means for implementing renewable chemosensors or even for using more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their utilisation in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  10. A Review on the Design Structure Matrix as an Analytical Tool for Product Development Management

    OpenAIRE

    Mokudai, Takefumi

    2006-01-01

    This article reviews the fundamental concepts and analytical techniques of the design structure matrix (DSM), as well as recent developments in DSM studies. The DSM is a matrix representation of relationships between components of a complex system, such as products, development organizations and processes. Depending on the target of analysis, there are four basic types of DSM: Component-based DSM, Team-based DSM, Task-based DSM, and Parameter-based DSM. There are two streams of recent DSM studies: 1) ...

  11. Network analytical tool for monitoring global food safety highlights China.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    Full Text Available BACKGROUND: The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. METHODOLOGY/PRINCIPAL FINDINGS: We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to (i) capture complexity, (ii) analyze trends, and (iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries, which are ranked using (i) Google's PageRank algorithm and (ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. CONCLUSIONS/SIGNIFICANCE: This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios.
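
    In the spirit of the paper's method (synthetic alerts, not RASFF data): alert reports form a directed graph from detecting to transgressing country, and PageRank and HITS scores rank the actors.

```python
# Detector -> transgressor edges scored with PageRank and HITS (networkx).
import networkx as nx

alerts = [("DE", "CN"), ("DE", "IR"), ("UK", "CN"),
          ("IT", "TR"), ("FR", "CN"), ("UK", "IR")]
G = nx.DiGraph()
G.add_edges_from(alerts)                      # detecting -> transgressing

pagerank = nx.pagerank(G)
hubs, authorities = nx.hits(G)                # hubs ~ detectors, auth ~ transgressors
print("pagerank   :", {k: round(v, 3) for k, v in pagerank.items()})
print("authorities:", {k: round(v, 3) for k, v in authorities.items()})
print("highest-impact transgressor:", max(authorities, key=authorities.get))
```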

  12. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    Science.gov (United States)

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  13. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    Science.gov (United States)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Given the lack of published material on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available and still needed to support ESDA.

  14. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    Science.gov (United States)

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products in order to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools to pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in the biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
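
    As a hedged illustration of the chemometrics step behind many PAT tools, the sketch below regresses a quality attribute on synthetic NIR-like spectra with partial least squares; real PAT models require validated reference data.

```python
# Partial least squares (PLS) on synthetic spectra: concentration -> spectrum
# with one analyte band plus noise, then a 3-component PLS calibration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
conc = rng.uniform(0.1, 1.0, 60)                       # reference assay values
wavelengths = np.linspace(0, 1, 200)
peak = np.exp(-((wavelengths - 0.5) / 0.05) ** 2)      # analyte band shape
X = conc[:, None] * peak + rng.normal(0, 0.01, (60, 200))

pls = PLSRegression(n_components=3).fit(X, conc)
print(f"R^2 on training spectra: {pls.score(X, conc):.3f}")
```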

  15. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim of identifying how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) how rapid, objective data analysis protocols can be developed. This study attempts to further develop these areas and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn-surface invasiveness, complicating the ability to investigate duration-related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.

  16. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    Science.gov (United States)

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems to perform bi-directional data processing-to-visualization with declarative querying capabilities is needed. We developed ProteoLens as a JAVA-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Languages (DDL) and Data Manipulation Languages (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network represented data in standard Graph Modeling Language (GML) formats, and this enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions, etc.; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges
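
    A loose illustration of the "data association rule" idea (not ProteoLens itself): attributes fetched from a data store are mapped onto graph nodes by ID before visual annotation; the gene names and values are invented.

```python
# Map an attribute table onto graph nodes by ID prior to visual annotation.
import networkx as nx

G = nx.Graph([("TP53", "MDM2"), ("TP53", "EP300")])
expression = {"TP53": 2.4, "MDM2": -1.1, "EP300": 0.3}   # node ID -> attribute

nx.set_node_attributes(G, expression, name="log2fc")     # the association rule
for node, data in G.nodes(data=True):
    print(node, data["log2fc"])
```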

  17. Identity as an Analytical Tool to Explore Students’ Mathematical Writing

    DEFF Research Database (Denmark)

    Iversen, Steffen Møllegaard

    Learning to communicate in, with and about mathematics is a key part of learning mathematics (Niss & Højgaard, 2011). Therefore, understanding how students’ mathematical writing is shaped is important to mathematics education research. In this paper the notion of ‘writer identity’ (Ivanič, 1998; ......; Burgess & Ivanič, 2010) is introduced and operationalised in relation to students’ mathematical writing, and through a case study it is illustrated how this analytic tool can facilitate valuable insights when exploring students’ mathematical writing.

  18. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2016-01-01

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
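
    In miniature, the kind of analytic benchmark such files enable (this is not MCNP): with one one-group total cross section and pure absorption, uncollided transmission through a slab is exactly exp(-Σt·d), which a few lines of Monte Carlo reproduce.

```python
# One-group, purely absorbing slab: Monte Carlo transmission vs. exp(-Sigma*d).
import numpy as np

sigma_t, depth, n = 0.5, 4.0, 200_000       # cm^-1, cm, histories
rng = np.random.default_rng(1)
flights = -np.log(rng.random(n)) / sigma_t  # sampled free-path lengths
mc = (flights > depth).mean()               # fraction crossing the slab

print(f"Monte Carlo : {mc:.5f}")
print(f"Analytic    : {np.exp(-sigma_t * depth):.5f}")
```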

  19. Micromechanical String Resonators: Analytical Tool for Thermal Characterization of Polymers

    DEFF Research Database (Denmark)

    Bose, Sanjukta; Schmid, Silvan; Larsen, Tom

    2014-01-01

    Resonant microstrings show promise as a new analytical tool for thermal characterization of polymers with only a few nanograms of sample. The detection of the glass transition temperature (Tg) of an amorphous poly(d,l-lactide) (PDLLA) and a semicrystalline poly(l-lactide) (PLLA) is investigated....... The polymers are spray coated on one side of the resonating microstrings. The resonance frequency and quality factor (Q) are measured simultaneously as a function of temperature. A change in the resonance frequency reflects a change in static tensile stress, which yields information about the Young’s modulus...... of the polymer, and a change in Q reflects the change in damping of the polymer-coated string. The frequency response of the microstring is validated with an analytical model. From the frequency-independent tensile stress change, static Tg values of 40.6 and 57.6 °C were measured for PDLLA and PLLA, respectively...

  20. Artist - analytical RT inspection simulation tool

    International Nuclear Information System (INIS)

    Bellon, C.; Jaenisch, G.R.

    2007-01-01

    The computer simulation of radiography is applicable to different purposes in NDT, such as the qualification of NDT systems, the prediction of their reliability, the optimization of system parameters, feasibility analysis, model-based data interpretation, and the education and training of NDT/NDE personnel. Within the framework of the integrated project FilmFree, the radiographic testing (RT) simulation software developed by BAM is being further developed to meet practical requirements for inspection planning in digital industrial radiology. It combines analytical modelling of the RT inspection process with a CAD-orientated object description applicable to various industrial sectors such as power generation, railways and others. (authors)

  1. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

    Selecting the appropriate analytical methods for characterizing the assembly and morphology of polymer-based vesicles, or polymersomes, is required for them to reach their full potential in biotechnology. This work presents and compares 17 different techniques for their ability to adequately report size....../purification. Of the analytical methods tested, Cryo-transmission electron microscopy and atomic force microscopy (AFM) turned out to be advantageous for polymersomes with diameters smaller than 200 nm, whereas confocal microscopy is ideal for diameters >400 nm. Polymersomes in the intermediate diameter range can be characterized...... using freeze fracture Cryo-scanning electron microscopy (FF-Cryo-SEM) and nanoparticle tracking analysis (NTA). Small angle X-ray scattering (SAXS) provides reliable data on bilayer thickness and internal structure, Cryo-TEM on multilamellarity. Taken together, these tools are valuable...

  2. Development of a Framework for Sustainable Outsourcing: Analytic Balanced Scorecard Method (A-BSC)

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2015-06-01

    Full Text Available Nowadays, many enterprises choose to outsource their non-core business to other enterprises to reduce costs and increase efficiency. Many enterprises choose to outsource their supply chain management (SCM) and leave it to a third-party organization in order to improve their services. The paper proposes an integrated, multicriteria tool useful for monitoring and improving performance in an outsourced supply chain. The Analytic Balanced Scorecard method (A-BSC) is proposed as an effective method for analyzing strategic performance within an outsourced supply chain. The aim of the paper is to present the integration of two methodologies: the Balanced Scorecard, a multiple-perspective framework for performance assessment, and the Analytic Hierarchy Process, a decision-making tool used to prioritize multiple performance perspectives and to generate a unified metric. The framework developed is aimed at providing a performance analysis to achieve better sustainability performance of the supply chain. A real case study concerning a typical value chain is presented.
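
    The AHP step that A-BSC builds on can be sketched directly (judgments below are invented): the priority vector for the four Balanced Scorecard perspectives is the principal eigenvector of a pairwise comparison matrix.

```python
# AHP priority weights from a pairwise comparison matrix (invented judgments).
import numpy as np

# Perspectives: financial, customer, internal process, learning & growth
A = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])

vals, vecs = np.linalg.eig(A)
principal = np.real(vecs[:, np.argmax(np.real(vals))])
weights = principal / principal.sum()          # normalize the eigenvector
for name, w in zip(["financial", "customer", "internal", "learning"], weights):
    print(f"{name}: {w:.3f}")
```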

  3. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    Directory of Open Access Journals (Sweden)

    Alonso Wladimir J

    2012-11-01

    Full Text Available Background: There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working in public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods: EPIPOI is freely available software developed in Matlab (The MathWorks Inc.) that runs on both PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results: EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts, from didactic use in public health workshops to serving as the main analytical tool in published research. Conclusions: EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics.
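
    EPIPOI itself is a Matlab GUI; the kind of parameter it extracts can be sketched as a least-squares fit of a linear trend plus an annual harmonic, whose amplitude and phase summarize seasonality (synthetic weekly counts below).

```python
# Least-squares fit of trend + annual harmonic to a synthetic weekly series.
import numpy as np

weeks = np.arange(0, 52 * 5)
series = (100 + 0.05 * weeks + 20 * np.cos(2 * np.pi * weeks / 52 - 1.0)
          + np.random.normal(0, 4, weeks.size))

w = 2 * np.pi * weeks / 52
A = np.column_stack([np.ones_like(weeks), weeks, np.cos(w), np.sin(w)])
coef, *_ = np.linalg.lstsq(A, series, rcond=None)
amplitude = np.hypot(coef[2], coef[3])
phase = np.arctan2(coef[3], coef[2])        # timing of the annual peak
print(f"trend {coef[1]:.3f}/week, amplitude {amplitude:.1f}, phase {phase:.2f} rad")
```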

  4. Electrochemical sensors: a powerful tool in analytical chemistry

    Directory of Open Access Journals (Sweden)

    Stradiotto Nelson R.

    2003-01-01

    Full Text Available Potentiometric, amperometric and conductometric electrochemical sensors have found a number of interesting applications in the areas of environmental, industrial, and clinical analysis. This review presents a general overview of the three main types of electrochemical sensors, describing fundamental aspects, developments and their contribution to the area of analytical chemistry, and relates relevant aspects of the development of electrochemical sensors in Brazil.

  5. Studying Behaviors Among Neurosurgery Residents Using Web 2.0 Analytic Tools.

    Science.gov (United States)

    Davidson, Benjamin; Alotaibi, Naif M; Guha, Daipayan; Amaral, Sandi; Kulkarni, Abhaya V; Lozano, Andres M

    Web 2.0 technologies (e.g., blogs, social networks, and wikis) are increasingly being used by medical schools and postgraduate training programs as tools for information dissemination. These technologies offer the unique opportunity to track metrics of user engagement and interaction. Here, we employ Web 2.0 tools to assess academic behaviors among neurosurgery residents. We performed a retrospective review of all educational lectures, part of the core Neurosurgery Residency curriculum at the University of Toronto, posted on our teaching website (www.TheBrainSchool.net). Our website was developed using publicly available Web 2.0 platforms. Lecture usage was assessed by the number of clicks, and associations were explored with lecturer academic position, timing of examinations, and lecture/subspecialty topic. The overall number of clicks on 77 lectures was 1079. Most of these clicks occurred during the in-training examination month (43%). Click numbers were significantly higher for lectures presented by faculty (mean = 18.6, standard deviation ± 4.1) than for those delivered by residents (mean = 8.4, standard deviation ± 2.1) (p = 0.031). Lectures covering topics in functional neurosurgery received the most clicks (47%), followed by pediatric neurosurgery (22%). This study demonstrates the value of Web 2.0 analytic tools in examining resident study behavior. Residents tend to "cram" by downloading lectures in the same month as training examinations and display a preference for faculty-delivered lectures. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  6. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities for assessing distributed control architectures before committing to hardware.

  7. Quantitative high throughput analytics to support polysaccharide production process development.

    Science.gov (United States)

    Noyes, Aaron; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Mukhopadhyay, Tarit

    2014-05-19

    The rapid development of purification processes for polysaccharide vaccines is constrained by a lack of analytical tools: current technologies for the measurement of polysaccharide recovery and process-related impurity clearance are complex, time-consuming, and generally not amenable to high throughput process development (HTPD). HTPD is envisioned to be central to the improvement of existing polysaccharide manufacturing processes through the identification of critical process parameters that potentially impact the quality attributes of the vaccine, and to the development of de novo processes for clinical candidates, across the spectrum of downstream processing. The availability of a fast and automated analytics platform will expand the scope, robustness, and evolution of Design of Experiments (DOE) studies. This paper details recent advances in improving the speed, throughput, and success of in-process analytics at the micro-scale. Two methods, based on modifications of existing procedures, are described for the rapid measurement of polysaccharide titre in microplates without the need for heating steps. A simplification of a commercial endotoxin assay is also described that features a single measurement at room temperature. These assays, along with existing assays for protein and nucleic acids, are qualified for deployment in the high throughput screening of polysaccharide feedstreams. Assay accuracy, precision, robustness, interference, and ease of use are assessed and described. In combination, these assays are capable of measuring the product concentration and impurity profile of a microplate of 96 samples in less than one day. This body of work relies on the evaluation of a combination of commercially available and clinically relevant polysaccharides to ensure maximum versatility and reactivity of the final assay suite. Together, these advancements reduce overall process time by up to 30-fold and significantly reduce sample volume over current practices.

  8. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  9. Comparison of Left Ventricular Hypertrophy by Electrocardiography and Echocardiography in Children Using Analytics Tool.

    Science.gov (United States)

    Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig

    2018-05-17

    Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG), leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. A SQL Server 2012 data warehouse incorporated the ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met the inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%) and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (for which AROC = 0.50). Of the subjects with LVH on ECG, 83% had normal left ventricular (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between the R-wave amplitude in lead V6 on ECG and the echo-derived Z-score of LV diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG with LV measurements and qualitative findings by echo, and identified an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.
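
    The inclusion-criteria query described above is essentially a record-linkage step: pair each ECG flagged for LVH with an echo for the same patient within 24 hours. A minimal pandas sketch of that logic follows; all column names and values are invented for illustration (the study used SQL Server queries, not pandas):

    ```python
    # Hedged sketch: link ECGs with LVH to the nearest echo within 24 h per patient.
    import pandas as pd

    # Toy stand-ins for the ECG and echo report extracts (columns are assumptions)
    ecg = pd.DataFrame({
        "patient_id": [1, 1, 2],
        "ecg_time": pd.to_datetime(["2015-03-01 08:00", "2015-06-01 09:00", "2015-03-02 10:00"]),
        "lvh_on_ecg": [True, False, True],
    })
    echo = pd.DataFrame({
        "patient_id": [1, 2],
        "echo_time": pd.to_datetime(["2015-03-01 15:00", "2015-03-04 11:00"]),
        "lv_mass_index_z": [0.8, 2.3],
    })

    lvh = ecg[ecg["lvh_on_ecg"]].sort_values("ecg_time")   # merge_asof needs sorted keys
    echo = echo.sort_values("echo_time")

    # Nearest echo per ECG for the same patient, restricted to a 24-hour window
    pairs = pd.merge_asof(
        lvh, echo,
        left_on="ecg_time", right_on="echo_time",
        by="patient_id",
        tolerance=pd.Timedelta("24h"),
        direction="nearest",
    ).dropna(subset=["echo_time"])                          # drop ECGs with no qualifying echo

    print(pairs[["patient_id", "ecg_time", "echo_time", "lv_mass_index_z"]])
    ```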

  10. Developing an analytical tool for evaluating EMS system design changes and their impact on cardiac arrest outcomes: combining geographic information systems with register data on survival rates

    Directory of Open Access Journals (Sweden)

    Sund Björn

    2013-02-01

    Background: Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm by developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study is also an attempt to validate the proposed model used to generate the outcome measures for the study. Methods and results: This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent, and reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. Conclusion: We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g., (1) static changes such as trimming the emergency call handling time, or (2) dynamic changes such as the location of emergency resources or which resources should carry a defibrillator.
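
    To make the model logic concrete, the Python sketch below combines simulated driving times with an assumed survival-versus-response-time curve to produce an aggregate survival rate. The log-normal driving-time distribution and the curve parameters are illustrative assumptions, not the study's calibrated values:

    ```python
    # Hedged sketch: aggregate OHCA survival from simulated response times.
    import math
    import random

    random.seed(1)

    def survival_given_response(minutes):
        # Assumed exponential decline of survival probability with response time
        return 0.10 * math.exp(-0.12 * minutes)

    def predicted_rate(mean_response_min, n=20_000):
        # Driving times drawn from an assumed log-normal distribution,
        # standing in for the GIS driving-time simulation
        total = 0.0
        for _ in range(n):
            t = random.lognormvariate(math.log(mean_response_min), 0.4)
            total += survival_given_response(t)
        return total / n

    baseline = predicted_rate(8.0)
    one_minute_faster = predicted_rate(7.0)   # the study's one-minute-reduction scenario
    print(f"baseline {baseline:.1%}, one minute faster {one_minute_faster:.1%}")
    ```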

  11. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  12. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    International Nuclear Information System (INIS)

    Björklund, Anna

    2012-01-01

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation, and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with a focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. Research highlights: (1) LCA was explored as an analytical tool in an SEA process for municipal energy planning. (2) The process also integrated LCA with scenario planning and public participation. (3) Benefits of using LCA were a systematic framework and a wider systems perspective. (4) Integration of the tools required some methodological challenges to be solved. (5) This proved an innovative approach to defining alternatives and the scope of assessment.

  13. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    Science.gov (United States)

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development, culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches, including the various techniques, detection systems, and automation tools that are available for effective separation and for enhanced selectivity and sensitivity in the quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of the drug discovery research and development process. The key areas covered in this article, with relevant case studies, include: (a) simultaneous assay for the parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for the determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in pharmacokinetics, metabolism, and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for the universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations: simultaneous determination of prodrug, parent, and metabolites; (j) quantitative determination of the parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in the analysis of multiple compounds in select…

  14. NC CATCH: Advancing Public Health Analytics.

    Science.gov (United States)

    Studnicki, James; Fisher, John W; Eichelberger, Christopher; Bridger, Colleen; Angelon-Gaetz, Kim; Nelson, Debi

    2010-01-01

    The North Carolina Comprehensive Assessment for Tracking Community Health (NC CATCH) is a Web-based analytical system deployed to local public health units and their community partners. The system has the following characteristics: a flexible, powerful online analytic processing (OLAP) interface; multiple sources of multidimensional, event-level data fully conformed to common definitions in a data warehouse structure; enabled utilization of available decision support software tools; analytic capabilities distributed and optimized locally with a centralized technical infrastructure; two levels of access differentiated by user (anonymous versus registered) and by analytical flexibility (Community Profile versus Design Phase); and an emphasis on user training and feedback. The ability of local public health units to engage in outcomes-based performance measurement will be influenced by continuing access to event-level data, developments in evidence-based practice for improving population health, and the application of information technology-based analytic tools and methods.

  15. Analytics to Literacies: The Development of a Learning Analytics Framework for Multiliteracies Assessment

    Directory of Open Access Journals (Sweden)

    Shane Dawson

    2014-09-01

    The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have resulted in interest among researchers and academics in revising educational practice to move beyond traditional ‘literacy’ skills towards an enhanced set of “multiliteracies” or “new media literacies”. Measuring the literacy of a population, in light of its linkage to individual and community wealth and wellbeing, is essential to determining the impact of compulsory education. The opportunity now is to develop tools to assess individual and societal attainment of these new literacies. Drawing on the work of Jenkins and colleagues (2006) and notions of a participatory culture, this paper proposes a conceptual framework for how learning analytics can assist in measuring individual achievement of multiliteracies and how this evaluative process can be scaled to provide an institutional perspective of the educational progress in fostering these fundamental skills.

  16. Program to develop analytical tools for environmental and safety assessment of nuclear material shipping container systems

    International Nuclear Information System (INIS)

    Butler, T.A.

    1978-11-01

    This paper describes a program for developing analytical techniques to evaluate the response of nuclear material shipping containers to severe accidents. Both lumped-mass and finite element techniques are employed to predict shipping container and shipping container-carrier response to impact. The general impact problem is computationally expensive because of its nonlinear, three-dimensional nature. This expense is minimized by using approximate models to parametrically identify critical cases before more exact analyses are performed. The computer codes developed for solving the problem are being experimentally substantiated with test data from full-scale and scale-model container drop tests. 6 figures, 1 table
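
    As an illustration of the lumped-mass screening idea mentioned above, a single-degree-of-freedom sketch can estimate peak deceleration for a container idealized as a mass on a linear contact spring; such closed-form screening helps flag critical cases before expensive nonlinear finite element runs. All parameter values below are illustrative assumptions, not program data:

    ```python
    # Hedged sketch: single-mass, linear-spring idealization of a container drop test.
    import math

    m = 5.0e4        # container mass, kg (assumed)
    k = 2.0e8        # effective contact stiffness, N/m (assumed)
    h = 9.0          # free-drop height, m (9 m is a typical regulatory drop test)

    g = 9.81
    v = math.sqrt(2 * g * h)       # impact velocity from free fall
    omega = math.sqrt(k / m)       # natural frequency of the contact spring
    peak_decel = v * omega         # peak deceleration of a linear spring-mass impact
    peak_force = m * peak_decel    # corresponding quasi-static contact force, N

    print(f"impact velocity {v:.1f} m/s, peak deceleration {peak_decel / g:.0f} g, "
          f"peak force {peak_force / 1e6:.1f} MN")
    ```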

  17. Development and Application of Predictive Tools for MHD Stability Limits in Tokamaks

    Energy Technology Data Exchange (ETDEWEB)

    Brennan, Dylan [Princeton Univ., NJ (United States); Miller, G. P. [Univ. of Tulsa, Tulsa, OK (United States)

    2016-10-03

    This is a project to develop and apply analytic and computational tools to answer physics questions relevant to the onset of non-ideal magnetohydrodynamic (MHD) instabilities in toroidal magnetic confinement plasmas. The focused goal of the research is to develop predictive tools for these instabilities, including an inner layer solution algorithm, a resistive wall with control coils, and energetic particle effects. The production phase compares studies of instabilities in such systems using analytic techniques, PEST-III, and NIMROD. Two important physics puzzles are targeted as guiding thrusts for the analyses. The first is to form an accurate description of the physics determining whether the resistive wall mode or a tearing mode will appear first as β is increased at low rotation and low error fields in DIII-D. The second is to understand the physical mechanism behind recent NIMROD results indicating strong damping and stabilization from energetic particle effects on linear resistive modes. The work seeks to develop a highly relevant predictive tool for ITER, advance the theoretical description of this physics in general, and analyze these instabilities in experiments such as ASDEX Upgrade, DIII-D, JET, JT-60U, and NSTX. The awardee on this grant is the University of Tulsa. The research efforts are supervised principally by Dr. Brennan. Support is included for two graduate students, and a strong collaboration with Dr. John M. Finn of LANL. The work includes several ongoing collaborations with General Atomics, PPPL, and the NIMROD team, among others.

  18. Development and Application of Predictive Tools for MHD Stability Limits in Tokamaks

    International Nuclear Information System (INIS)

    Brennan, Dylan; Miller, G. P.

    2016-01-01

    This is a project to develop and apply analytic and computational tools to answer physics questions relevant to the onset of non-ideal magnetohydrodynamic (MHD) instabilities in toroidal magnetic confinement plasmas. The focused goal of the research is to develop predictive tools for these instabilities, including an inner layer solution algorithm, a resistive wall with control coils, and energetic particle effects. The production phase compares studies of instabilities in such systems using analytic techniques, PEST-III, and NIMROD. Two important physics puzzles are targeted as guiding thrusts for the analyses. The first is to form an accurate description of the physics determining whether the resistive wall mode or a tearing mode will appear first as β is increased at low rotation and low error fields in DIII-D. The second is to understand the physical mechanism behind recent NIMROD results indicating strong damping and stabilization from energetic particle effects on linear resistive modes. The work seeks to develop a highly relevant predictive tool for ITER, advance the theoretical description of this physics in general, and analyze these instabilities in experiments such as ASDEX Upgrade, DIII-D, JET, JT-60U, and NSTX. The awardee on this grant is the University of Tulsa. The research efforts are supervised principally by Dr. Brennan. Support is included for two graduate students, and a strong collaboration with Dr. John M. Finn of LANL. The work includes several ongoing collaborations with General Atomics, PPPL, and the NIMROD team, among others.

  19. Development of a new analytical tool for assessing the mutagen 2-methyl-1,4-dinitro-pyrrole in meat products by LC-ESI-MS/MS.

    Science.gov (United States)

    Molognoni, Luciano; Daguer, Heitor; de Sá Ploêncio, Leandro Antunes; Yotsuyanagi, Suzana Eri; da Silva Correa Lemos, Ana Lucia; Joussef, Antonio Carlos; De Dea Lindner, Juliano

    2018-08-01

    The use of sorbate and nitrite in meat processing may lead to the formation of 2-methyl-1,4-dinitro-pyrrole (DNMP), a mutagenic compound. This work was aimed at developing and validating an analytical method for the quantitation of DNMP by liquid chromatography-tandem mass spectrometry. Full validation was performed in accordance with Commission Decision 2002/657/EC, and the method's applicability was checked in several samples of meat products. A simple procedure, with low-temperature partitioning solid-liquid extraction, was developed. Nitrosation during the extraction was monitored by the N-nitroso-DL-pipecolic acid content. Chromatographic separation was achieved in 8 min with di-isopropyl-3-aminopropyl silane bound to hydroxylated silica as the stationary phase. Samples of bacon and cooked sausage yielded the highest concentrations of DNMP (68 ± 3 and 50 ± 3 μg/kg, respectively). The developed method proved to be a reliable, selective, and sensitive tool for DNMP measurements in meat products. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. A graph algebra for scalable visual analytics.

    Science.gov (United States)

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increasing data demands on computing require redesigning VA tools to consider performance and reliability in the context of analyzing exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
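
    A toy sketch of the two atomic operators named above, selection and aggregation, on a plain weighted edge list; the API is invented for illustration and is not the authors' framework:

    ```python
    # Hedged sketch: selection and aggregation operators over a weighted edge list.
    from collections import defaultdict

    edges = [("a", "b", 3), ("a", "c", 1), ("b", "c", 5), ("c", "d", 2)]

    def select(edges, predicate):
        """Selection: keep only the edges satisfying a predicate."""
        return [e for e in edges if predicate(e)]

    def aggregate(edges, key):
        """Aggregation: merge nodes into groups and sum the weights of merged edges."""
        grouped = defaultdict(int)
        for u, v, w in edges:
            gu, gv = key(u), key(v)
            if gu != gv:                      # drop within-group self-loops
                grouped[(gu, gv)] += w
        return [(u, v, w) for (u, v), w in grouped.items()]

    heavy = select(edges, lambda e: e[2] >= 2)   # weight-based selection
    coarse = aggregate(edges, key=lambda n: "left" if n in {"a", "b"} else "right")
    print(heavy)    # [('a', 'b', 3), ('b', 'c', 5), ('c', 'd', 2)]
    print(coarse)   # [('left', 'right', 6)]
    ```

    Composing such operators is what gives the algebra its value: a coarse aggregated view can be refined by re-applying selection at a finer grouping, which is one route to scalable exploration of very large graphs.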

  1. Using fuzzy analytical hierarchy process (AHP) to evaluate web development platform

    Directory of Open Access Journals (Sweden)

    Ahmad Sarfaraz

    2012-01-01

    Web development plays an important role in business plans and people's lives. One of the key decisions on which both the short-term and long-term success of the project depends is choosing the right development platform. Its criticality can be judged by the fact that once a platform is chosen, one has to live with it throughout the software development life cycle. The entire shape of the project depends on the language, operating system, tools, frameworks, and so on; in short, on the web development platform chosen. In addition, choosing the right platform is a multi-criteria decision making (MCDM) problem. We propose a fuzzy analytical hierarchy process model to solve the MCDM problem. We tap the real-life modeling potential of fuzzy logic and combine it with the commonly used, powerful AHP modeling method.
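
    A hedged sketch of one common fuzzy-AHP step follows: triangular fuzzy pairwise judgments, fuzzy geometric means, and centroid defuzzification. The judgment values and the three-alternative setup are invented, and the paper's exact procedure may differ:

    ```python
    # Hedged sketch: fuzzy AHP priority weights for three hypothetical platforms.
    import math

    # Fuzzy pairwise comparison matrix; each entry is a triangular number (l, m, u)
    M = [
        [(1, 1, 1),         (2, 3, 4),       (4, 5, 6)],
        [(1/4, 1/3, 1/2),   (1, 1, 1),       (1, 2, 3)],
        [(1/6, 1/5, 1/4),   (1/3, 1/2, 1),   (1, 1, 1)],
    ]

    def fuzzy_geo_mean(row):
        """Component-wise geometric mean of a row of triangular fuzzy numbers."""
        n = len(row)
        return tuple(math.prod(t[i] for t in row) ** (1 / n) for i in range(3))

    g = [fuzzy_geo_mean(row) for row in M]
    crisp = [(l + m + u) / 3 for (l, m, u) in g]   # centroid defuzzification
    total = sum(crisp)
    weights = [c / total for c in crisp]           # normalized priority weights
    print([round(w, 3) for w in weights])
    ```

    The fuzzy triangular entries capture the vagueness of expert judgments ("between 2 and 4, most likely 3") that a crisp AHP matrix cannot, which is the motivation the abstract gives for conjugating fuzzy logic with AHP.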

  2. Developing a Code of Practice for Learning Analytics

    Science.gov (United States)

    Sclater, Niall

    2016-01-01

    Ethical and legal objections to learning analytics are barriers to development of the field, thus potentially denying students the benefits of predictive analytics and adaptive learning. Jisc, a charitable organization that champions the use of digital technologies in UK education and research, has attempted to address this with the development of…

  3. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    Science.gov (United States)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring and collaborating, through statistical indicators, on economic, social, and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use GeoAnalytics collaborative tools to compare situations for two or more countries, regions, or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet: for example, dynamic web-enabled animation that enables statisticians to explore temporal, spatial, and multivariate demographic data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues, and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of expert domains and are precious in a creative analytics reasoning process. In this context, we introduce a demonstrator, “OECD eXplorer”, a customized tool for interactively analyzing and sharing gained insights and discoveries, based on a novel story mechanism that captures, re-uses, and shares task-related explorative events.

  4. Analytical Techniques in the Pharmaceutical Sciences

    DEFF Research Database (Denmark)

    Leurs, Ulrike; Mistarz, Ulrik Hvid; Rand, Kasper Dyrberg

    2016-01-01

    Mass spectrometry (MS) offers the capability to identify, characterize and quantify a target molecule in a complex sample matrix and has developed into a premier analytical tool in drug development science. Through specific MS-based workflows including customized sample preparation, coupling…

  5. Prioritizing of effective factors on development of medicinal plants cultivation using analytic network process

    Directory of Open Access Journals (Sweden)

    Ghorbanali Rassam

    2014-07-01

    For the overall development of medicinal plant cultivation in Iran, there is a need to identify the various factors affecting medicinal plant cultivation, and a proper method for identifying the most effective factors is essential. This research was conducted to prioritize the effective criteria for the development of medicinal plant cultivation in North Khorasan province, Iran, using the Analytic Network Process (ANP) method. Multi-criteria decision making (MCDM) is suggested as a viable approach for factor selection, and the ANP has been used as a tool for MCDM. For this purpose, a list of effective factors was offered to an expert group. Pairwise comparison questionnaires were then distributed among relevant researchers and local producer experts of the province to get their opinions about the priority of the criteria and sub-criteria. The questionnaires were analyzed using the Super Decisions software. We illustrate the use of the ANP by ranking main effective factors such as economic factors, educational-extension services, cultural-social factors, and supportive policies for the development of medicinal plants. The main objective of the present study was to develop the ANP as a decision-making tool for prioritizing factors affecting the development of medicinal plant cultivation. Results showed that the ANP methodology was well suited to tackling the complex interrelations involved in factor selection in this case. The results also revealed that, among the factors, supporting the cultivation of medicinal plants, building the infrastructure for marketing support, having educated farmers, and easy access to production inputs have the most impact on the development of medicinal plant cultivation.

  6. Laser-induced plasma spectrometry: truly a surface analytical tool

    International Nuclear Information System (INIS)

    Vadillo, Jose M.; Laserna, J.

    2004-01-01

    For a long period, analytical applications of laser-induced plasma spectrometry (LIPS) were mainly restricted to the overall quantitative determination of elemental composition in bulk solid samples. However, the introduction of new compact and reliable solid-state lasers and technological developments in multidimensional intensified detectors have made it possible to seek new analytical niches for LIPS where its analytical advantages (direct sampling from any material, irrespective of its conductive status, without sample preparation and with sensitivity adequate for many elements in different matrices) can be fully exploited. In this sense, the field of surface analysis can take advantage of these strengths, considering in addition the capability of LIPS for spot analysis, line scans, depth profiling, area analysis, and compositional mapping with a single instrument in air at atmospheric pressure. This review paper outlines the fundamental principles of laser-induced plasma emission relevant to sample surface studies, discusses the experimental parameters governing the spatial (lateral and in-depth) resolution in LIPS analysis, and presents applications concerning surface examination.

  7. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, including four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of process or plant detail: (1) plant level, (2) process-group level, and (3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The dairy products include cheese, fluid milk, butter, milk powder, etc. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases that were established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we also carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that use of the BEST-Dairy tool will advance understanding of energy and water use in the dairy industry.

  8. IBM’s Health Analytics and Clinical Decision Support

    Science.gov (United States)

    Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.

    2014-01-01

    Summary. Objectives: This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods: Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described, and examples of the role of each of the resources are given. Results: There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion: Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources themselves do not drive, but support, healthcare transformation. PMID:25123736

  9. Medicaid Analytic eXtract (MAX) Chartbooks

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract Chartbooks are research tools and reference guides on Medicaid enrollees and their Medicaid experience in 2002 and 2004. Developed for...

  10. ANALYTICAL MODEL FOR LATHE TOOL DISPLACEMENTS CALCULUS IN THE MANUFACTURING PROCESS

    Directory of Open Access Journals (Sweden)

    Catălin ROŞU

    2014-01-01

    In this paper, we present an analytical model for calculating lathe tool displacements during the manufacturing process. We present the displacement calculation methodology step by step, implement the resulting relations in a program for automatic calculation, and draw conclusions. Only the effects of the bending moments are taken into account (because these produce the largest displacements). The simplifying assumptions and the calculation relations for the displacements (linear and angular ones) are presented in an original way.
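
    For orientation, if the tool holder is idealized as a cantilever of overhang length L loaded by a tip cutting-force component F, Euler-Bernoulli bending theory gives the familiar linear and angular tip displacements (an illustrative textbook case, not necessarily the paper's exact relations):

    ```latex
    \delta = \frac{F L^{3}}{3 E I}, \qquad
    \theta = \frac{F L^{2}}{2 E I}
    ```

    Here E is the Young's modulus of the tool material and I the second moment of area of the shank cross-section. Because the linear displacement grows with the cube of L, doubling the overhang increases deflection roughly eightfold, which is why bending dominates the total displacement, as the abstract notes.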

  11. Prioritizing pesticide compounds for analytical methods development

    Science.gov (United States)

    Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included in research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1…

  12. Ootw Tool Requirements in Relation to JWARS

    Energy Technology Data Exchange (ETDEWEB)

    Hartley III, D.S.; Packard, S.L.

    1998-01-01

    This document reports the results of the Office of the Secretary of Defense/Program Analysis & Evaluation (OSD/PA&E) sponsored project to identify how Operations Other Than War (OOTW) tool requirements relate to the Joint Warfare Simulation (JWARS) and, more generally, to joint analytical modeling and simulation (M&S) requirements. It includes recommendations about which OOTW tools (and functionality within tools) should be included in JWARS, which should be managed as joint analytical M&S tools, and which should be left for independent development.

  13. Development of analytical techniques for safeguards environmental samples at JAEA

    International Nuclear Information System (INIS)

    Sakurai, Satoshi; Magara, Masaaki; Usuda, Shigekazu; Watanabe, Kazuo; Esaka, Fumitaka; Hirayama, Fumio; Lee, Chi-Gyu; Yasuda, Kenichiro; Inagawa, Jun; Suzuki, Daisuke; Iguchi, Kazunari; Kokubu, Yoko S.; Miyamoto, Yutaka; Ohzu, Akira

    2007-01-01

    JAEA has been developing, under the auspices of the Ministry of Education, Culture, Sports, Science and Technology of Japan, analytical techniques for ultra-trace amounts of nuclear materials in environmental samples in order to contribute to the strengthened safeguards system. Essential techniques for bulk and particle analysis, as well as screening, of environmental swipe samples have been established as ultra-trace analytical methods for uranium and plutonium. In January 2003, JAEA was qualified, including its quality control system, as a member of the IAEA network analytical laboratories for environmental samples. Since 2004, JAEA has conducted the analysis of domestic and IAEA samples, through which JAEA's analytical capability has been verified and improved. In parallel, advanced techniques have been developed in order to expand applicability to samples of various elemental compositions and impurities and to improve analytical accuracy and efficiency. This paper summarizes the course of technical development in environmental sample analysis at JAEA and refers to recent trends in research and development in this field. (author)

  14. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning with cubic boron nitride tools have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of the steady-state flank and crater wear models, Usui's wear model, forces from oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear is depicted in this review paper. An effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be selected according to user requirements in hard turning.
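
    For reference, one common statement of Usui's wear-rate equation mentioned above relates the wear rate at the tool face to the local normal stress, sliding velocity, and absolute temperature:

    ```latex
    \frac{dW}{dt} = A \, \sigma_{t} \, V_{s} \, \exp\!\left(-\frac{B}{\theta}\right)
    ```

    Here W is the wear volume per unit contact area, σt the normal stress on the tool face, Vs the sliding velocity, θ the absolute contact temperature, and A and B empirical constants fitted from calibration cuts. The Arrhenius-type exponential is what makes predicted wear so sensitive to cutting temperature in hard turning.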

  15. Biosensors: Future Analytical Tools

    Directory of Open Access Journals (Sweden)

    Vikas

    2007-02-01

    Biosensors offer considerable promise for obtaining analytical information in a faster, simpler, and cheaper manner than conventional assays. Biosensing approaches are rapidly advancing, and applications ranging from metabolite, biological/chemical warfare agent, food pathogen, and adulterant detection to genetic screening and programmed drug delivery have been demonstrated. Innovative efforts coupling micromachining and nanofabrication may lead to even more powerful devices that would accelerate the realization of large-scale and routine screening. With a gradual increase in commercialization, a wide range of new biosensors is thus expected to reach the market in the coming years.

  16. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    Science.gov (United States)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse, complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform is being developed: the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF). This platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).

  17. Potential of Near-Infrared Chemical Imaging as Process Analytical Technology Tool for Continuous Freeze-Drying.

    Science.gov (United States)

    Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas

    2018-04-03

    Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times to induce changes in water content. By analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a highly promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
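
    The pixel-wise quantification step described above can be sketched with an off-the-shelf PLS regression. A minimal Python example on synthetic stand-in spectra follows; the data shapes, values, and the five-component choice are assumptions, not the study's calibration:

    ```python
    # Hedged sketch: PLS regression mapping NIR spectra to water content, per pixel.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n_pixels, n_wavelengths = 500, 120
    X = rng.normal(size=(n_pixels, n_wavelengths))       # synthetic NIR spectra
    true_coef = rng.normal(size=n_wavelengths)
    y = X @ true_coef * 0.01 + 2.0 + rng.normal(scale=0.05, size=n_pixels)  # % water

    pls = PLSRegression(n_components=5)                  # component count is an assumption
    pls.fit(X, y)
    water_map = pls.predict(X).reshape(-1)               # predicted water content per pixel
    print(f"mean predicted water content: {water_map.mean():.2f} %")
    ```

    In the imaging setting, reshaping `water_map` back to the camera's pixel grid yields the spatial water-content map used to judge within-vial homogeneity.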

  18. Analytical characterization using surface-enhanced Raman scattering (SERS) and microfluidic sampling

    International Nuclear Information System (INIS)

    Wang, Chao; Yu, Chenxu

    2015-01-01

    With the rapid development of analytical techniques, it has become much easier to detect chemical and biological analytes, even at very low detection limits. In recent years, techniques based on vibrational spectroscopy, such as surface-enhanced Raman spectroscopy (SERS), have been developed for the non-destructive detection of pathogenic microorganisms. SERS is a highly sensitive analytical tool that can be used to characterize chemical and biological analytes interacting with SERS-active substrates. However, it has always been a challenge to obtain consistent and reproducible SERS spectroscopic results under complicated experimental conditions. Microfluidics, a tool for highly precise manipulation of small-volume liquid samples, can be used to overcome the major drawbacks of SERS-based techniques. High reproducibility of SERS measurements can be obtained in the continuous flow generated inside microfluidic devices. This article provides a thorough review of the principles, concepts, and methods of SERS-microfluidic platforms, and of the applications of such platforms in the trace analysis of chemical and biological analytes. (topical review)

  19. Nuclear analytical methods: Past, present and future

    International Nuclear Information System (INIS)

    Becker, D.A.

    1996-01-01

    The development of nuclear analytical methods as an analytical tool began in 1936 with the publication of the first paper on neutron activation analysis (NAA). This year, 1996, marks the 60th anniversary of that event. This paper attempts to look back at the nuclear analytical methods of the past, to look around and to see where the technology is right now, and finally, to look ahead to try and see where nuclear methods as an analytical technique (or as a group of analytical techniques) will be going in the future. The general areas which the author focuses on are: neutron activation analysis; prompt gamma neutron activation analysis (PGNAA); photon activation analysis (PAA); charged-particle activation analysis (CPAA)

  20. Analytical studies related to Indian PHWR containment system performance

    International Nuclear Information System (INIS)

    Haware, S.K.; Markandeya, S.G.; Ghosh, A.K.; Kushwaha, H.S.; Venkat Raj, V.

    1998-01-01

    The build-up of pressure in a multi-compartment containment after a postulated accident, and the growth, transport, and removal of aerosols in the containment, are complex processes of vital importance in determining the source term. The release of hydrogen and its combustion increase the overpressure. In order to analyze these complex processes and to enable proper estimation of the source term, well-tested analytical tools are necessary. This paper gives a detailed account of the analytical tools developed/adapted for PSA level 2 studies. (author)

  1. LitPathExplorer: a confidence-based visual text analytics tool for exploring literature-enriched pathway models.

    Science.gov (United States)

    Soto, Axel J; Zerva, Chrysoula; Batista-Navarro, Riza; Ananiadou, Sophia

    2018-04-15

    Pathway models are valuable resources that help us understand the various mechanisms underpinning complex biological processes. Their curation is typically carried out through manual inspection of published scientific literature to find information relevant to a model, which is a laborious and knowledge-intensive task. Furthermore, models curated manually cannot be easily updated and maintained with new evidence extracted from the literature without automated support. We have developed LitPathExplorer, a visual text analytics tool that integrates advanced text mining, semi-supervised learning and interactive visualization, to facilitate the exploration and analysis of pathway models using statements (i.e. events) extracted automatically from the literature and organized according to levels of confidence. LitPathExplorer supports pathway modellers and curators alike by: (i) extracting events from the literature that corroborate existing models with evidence; (ii) discovering new events which can update models; and (iii) providing a confidence value for each event that is automatically computed based on linguistic features and article metadata. Our evaluation of event extraction showed a precision of 89% and a recall of 71%. Evaluation of our confidence measure, when used for ranking sampled events, showed an average precision ranging between 61 and 73%, which can be improved to 95% when the user is involved in the semi-supervised learning process. Qualitative evaluation using pair analytics based on the feedback of three domain experts confirmed the utility of our tool within the context of pathway model exploration. LitPathExplorer is available at http://nactem.ac.uk/LitPathExplorer_BI/. Contact: sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.

  2. Analytic plane wave solutions for the quaternionic potential step

    International Nuclear Information System (INIS)

    De Leo, Stefano; Ducati, Gisele C.; Madureira, Tiago M.

    2006-01-01

    By using the recent mathematical tools developed in quaternionic differential operator theory, we solve the Schroedinger equation in the presence of a quaternionic step potential. The analytic solution for the stationary states allows one to explicitly show the qualitative and quantitative differences between this quaternionic quantum dynamical system and its complex counterpart. A brief discussion of reflected and transmitted times, performed using the stationary phase method, and of its implications for the experimental evidence for deviations from standard quantum mechanics is also presented. The analytic solution given in this paper represents a fundamental mathematical tool for finding an analytic approximation to the quaternionic barrier problem (up to now solved by numerical methods).

  3. Proceedings of the 11. ENQA: Brazilian meeting on analytical chemistry. Challenges for analytical chemistry in the 21st century. Book of Abstracts

    International Nuclear Information System (INIS)

    2001-01-01

    The 11th National Meeting on Analytical Chemistry (ENQA) was held from 18 to 21 September 2001 at the Convention Center of UNICAMP, with the theme "Challenges for Analytical Chemistry in the 21st Century". The meeting discussed the development of the new methods and analytical tools needed to solve new challenges. The papers presented topics related to the different sub-areas of analytical chemistry, such as Environmental Chemistry, Chemometric Techniques, X-ray Fluorescence Analysis, Spectroscopy, Separation Processes, Electroanalytical Chemistry, and others. Lectures on the Past and Future of Analytical Chemistry and on Ethics in Science were also included.

  4. Environmental tools in product development

    DEFF Research Database (Denmark)

    Wenzel, Henrik; Hauschild, Michael Zwicky; Jørgensen, Jørgen

    1994-01-01

    A precondition for the design of environmentally friendly products is that the design team has access to methods and tools supporting the introduction of environmental criteria in product development. A large Danish program, EDIP, is being carried out by the Institute for Product Development, Technical University of Denmark, in cooperation with 5 major Danish companies, aiming at the development and testing of such tools. These tools are presented in this paper...

  5. PIXE as an analytical/educational tool

    International Nuclear Information System (INIS)

    Williams, E.T.

    1991-01-01

    The advantages and disadvantages of PIXE as an analytical method will be summarized. The authors will also discuss the advantages of PIXE as a means of providing interesting and feasible research projects for undergraduate science students

  6. Helicase-dependent isothermal amplification: a novel tool in the development of molecular-based analytical systems for rapid pathogen detection.

    Science.gov (United States)

    Barreda-García, Susana; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Lobo-Castañón, María Jesús

    2018-01-01

    Highly sensitive testing of nucleic acids is essential to improve the detection of pathogens, which pose a major threat to public health worldwide. Currently available molecular assays, mainly based on PCR, have limited utility in point-of-need control or resource-limited settings. Consequently, there is a strong interest in developing cost-effective, robust, and portable platforms for early detection of these harmful microorganisms. Since its description in 2004, isothermal helicase-dependent amplification (HDA) has been successfully applied in the development of novel molecular-based technologies for rapid, sensitive, and selective detection of viruses and bacteria. In this review, we highlight relevant analytical systems using this simple nucleic acid amplification methodology, which takes place at a constant temperature and is readily compatible with microfluidic technologies. Different strategies for monitoring HDA amplification products are described. In addition, we present technological advances for integrating sample preparation, HDA amplification, and detection. Future perspectives and challenges toward point-of-need use, not only for clinical diagnosis but also in food safety testing and environmental monitoring, are also discussed. Graphical abstract: Expanding the analytical toolbox for the detection of DNA sequences specific to pathogens with isothermal helicase-dependent amplification (HDA).

  7. Ecotoxicity on a stick: A novel analytical tool for predicting the ecotoxicity of petroleum contaminated samples

    International Nuclear Information System (INIS)

    Parkerton, T.F.; Stone, M.A.

    1995-01-01

    Hydrocarbons generally elicit toxicity via a nonpolar narcotic mechanism. Recent research suggests that chemicals acting by this mode invoke ecotoxicity when the molar concentration in organism lipids exceeds a critical threshold. Since the ecotoxicity of nonpolar narcotic mixtures appears to be additive, the ecotoxicity of hydrocarbon mixtures depends upon: (1) the partitioning of the individual hydrocarbons comprising the mixture from the environment to lipids, and (2) the total molar sum of the constituent hydrocarbons in lipids. These insights have led previous investigators to advance the concept of biomimetic extraction as a novel tool for assessing potential narcosis-type or baseline ecotoxicity in aqueous samples. Drawing from this earlier work, the authors have developed a method to quantify Bioavailable Petroleum Hydrocarbons (BPHs) in hydrocarbon-contaminated aqueous and soil/sediment samples. A sample is equilibrated with a solid-phase microextraction (SPME) fiber that serves as a surrogate for organism lipids. The total moles of hydrocarbons that partition to the SPME fiber are then quantified using a simple GC/FID procedure. Research conducted to support the development and initial validation of this method will be presented. Results suggest that BPH analyses provide a promising, cost-effective approach for predicting the ecotoxicity of environmental samples contaminated with hydrocarbon mixtures. Consequently, BPH analyses may provide a valuable analytical screening tool for ecotoxicity assessment in product and effluent testing, environmental monitoring, and site remediation applications.

  8. Big data analytics in immunology: a knowledge-based approach.

    Science.gov (United States)

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  9. Big Data Analytics in Immunology: A Knowledge-Based Approach

    Directory of Open Access Journals (Sweden)

    Guang Lan Zhang

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  10. Heat as a groundwater tracer in shallow and deep heterogeneous media: Analytical solution, spreadsheet tool, and field applications

    Science.gov (United States)

    Kurylyk, Barret L.; Irvine, Dylan J.; Carey, Sean K.; Briggs, Martin A.; Werkema, Dale D.; Bonham, Mariah

    2017-01-01

    Groundwater flow advects heat, and thus, the deviation of subsurface temperatures from an expected conduction-dominated regime can be analysed to estimate vertical water fluxes. A number of analytical approaches have been proposed for using heat as a groundwater tracer, and these have typically assumed a homogeneous medium. However, heterogeneous thermal properties are ubiquitous in subsurface environments, both at the scale of geologic strata and at finer scales in streambeds. Herein, we apply the analytical solution of Shan and Bodvarsson (2004), developed for estimating vertical water fluxes in layered systems, in 2 new environments distinct from previous vadose zone applications. The utility of the solution for studying groundwater-surface water exchange is demonstrated using temperature data collected from an upwelling streambed with sediment layers, and a simple sensitivity analysis using these data indicates the solution is relatively robust. Also, a deeper temperature profile recorded in a borehole in South Australia is analysed to estimate deeper water fluxes. The analytical solution is able to match observed thermal gradients, including the change in slope at sediment interfaces. Results indicate that not accounting for layering can yield errors in the magnitude and even direction of the inferred Darcy fluxes. A simple automated spreadsheet tool (Flux-LM) is presented to allow users to input temperature and layer data and solve the inverse problem to estimate groundwater flux rates from shallow (e.g., streambed) to deep (e.g., borehole) regimes.
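
    The layered Shan and Bodvarsson (2004) solution underlying Flux-LM is involved, but the principle can be illustrated with the classic single-layer, steady-state solution of Bredehoeft and Papadopulos (1965), in which the curvature of the temperature profile encodes the vertical Darcy flux. The sketch below fits that homogeneous solution to a synthetic profile; all parameter values are assumptions for illustration, and this is not the Flux-LM implementation.

```python
# Invert a vertical temperature profile for Darcy flux, single homogeneous
# layer (Bredehoeft & Papadopulos, 1965). Flux-LM generalizes this to layers.
import numpy as np
from scipy.optimize import curve_fit

RHO_C_WATER = 4.18e6  # volumetric heat capacity of water (J m^-3 K^-1)
LAMBDA = 1.4          # assumed bulk thermal conductivity (W m^-1 K^-1)
L = 1.0               # profile depth (m)

def temp_profile(z, q, T0, TL):
    """Steady temperature at depth z for vertical Darcy flux q (m/s)."""
    pe = q * RHO_C_WATER * L / LAMBDA  # thermal Peclet number
    return T0 + (TL - T0) * np.expm1(pe * z / L) / np.expm1(pe)

# synthetic "observed" profile (depths in m, temperatures in deg C)
z_obs = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
T_obs = np.array([14.9, 14.6, 14.2, 13.6, 12.8])

(q_fit, T0_fit, TL_fit), _ = curve_fit(temp_profile, z_obs, T_obs,
                                       p0=(1e-7, 15.0, 12.5))
print(f"Inferred Darcy flux: {q_fit:.2e} m/s")
```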

  11. A free software tool for the development of decision support systems

    Directory of Open Access Journals (Sweden)

    COLONESE, G

    2008-06-01

    This article describes PostGeoOlap, a free and open source software tool for decision support that integrates OLAP (On-Line Analytical Processing) and GIS (Geographical Information Systems). Besides describing the tool, we show how it can be used to achieve effective and low-cost decision support that is adequate for small and medium companies and for small public offices.

  12. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of the mass of information to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts, safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be:
    · scalable: supporting business needs as well as operational and end user environments
    · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics
    · interoperable: integrating with existing environments and easing information sharing across partner agencies
    · extendable: providing an open source developer essential toolkit, examples, and documentation for custom requirements
    i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on-demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  13. Windows Developer Power Tools Turbocharge Windows development with more than 170 free and open source tools

    CERN Document Server

    Avery, James

    2007-01-01

    Software developers need to work harder and harder to bring value to their development process in order to build high quality applications and remain competitive. Developers can accomplish this by improving their productivity, quickly solving problems, and writing better code. A wealth of open source and free software tools are available for developers who want to improve the way they create, build, deploy, and use software. Tools, components, and frameworks exist to help developers at every point in the development process. Windows Developer Power Tools offers an encyclopedic guide to more than 170 free and open source tools for Windows developers.

  14. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing executed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  15. A Web-Based Geovisual Analytical System for Climate Studies

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2012-12-01

    Climate studies involve petabytes of spatiotemporal datasets that are produced and archived at distributed computing resources. Scientists need an intuitive and convenient tool to explore the distributed spatiotemporal data. Geovisual analytical tools have the potential to provide such an intuitive and convenient method for scientists to access climate data, discover the relationships between various climate parameters, and communicate the results across different research communities. However, implementing a geovisual analytical tool for complex climate data in a distributed environment poses several challenges. This paper reports our research and development of a web-based geovisual analytical system to support the analysis of climate data generated by climate models. Using ModelE, developed by the NASA Goddard Institute for Space Studies (GISS), as an example, we demonstrate that the system is able to (1) manage large volume datasets over the Internet; (2) visualize 2D/3D/4D spatiotemporal data; (3) broker various spatiotemporal statistical analyses for climate research; and (4) support interactive data analysis and knowledge discovery. This research also provides an example for managing, disseminating, and analyzing Big Data in the 21st century.

  16. Developing a business analytics methodology: a case study in the foodbank sector

    OpenAIRE

    Hindle, Giles; Vidgen, Richard

    2017-01-01

    The current research seeks to address the following question: how can organizations align their business analytics development projects with their business goals? To pursue this research agenda we adopt an action research framework to develop and apply a business analytics methodology (BAM). The four-stage BAM (problem situation structuring, business model mapping, analytics leverage analysis, and analytics implementation) is not a prescription. Rather, it provides a logical structure and log...

  17. Effect of Micro Electrical Discharge Machining Process Conditions on Tool Wear Characteristics: Results of an Analytic Study

    DEFF Research Database (Denmark)

    Puthumana, Govindan; P., Rajeev

    2016-01-01

    Micro electrical discharge machining is one of the established techniques for manufacturing high aspect ratio features on electrically conductive materials. This paper presents the results and inferences of an analytical study estimating the effect of process conditions on tool electrode wear characteristics in the micro-EDM process. A new approach is proposed, with two novel factors anticipated to directly control the material removal mechanism from the tool electrode: the discharge energy factor (DEf) and the dielectric flushing factor (DFf). The results showed that the correlation between the tool wear rate (TWR) and these factors is poor. Thus, the individual effects of each factor on TWR are analyzed. The factors selected for the study of individual effects are pulse on-time, discharge peak current, gap voltage and gap flushing pressure. The tool wear rate decreases linearly with an increase in the pulse on-time.

  18. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    International Nuclear Information System (INIS)

    Hervas, Miriam; Lopez, Miguel Angel; Escarpa, Alberto

    2009-01-01

    In this work, an electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme is based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as the enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. The analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 μg L⁻¹ and an EC₅₀ of 0.079 μg L⁻¹ were obtained, allowing the detection of the zearalenone mycotoxin to be assessed. In addition, excellent accuracy, with high recovery yields ranging between 95 and 108%, has been obtained. The analytical features show the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.
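
    A minimal sketch (not the authors' code) of how the EC50 of such a competitive immunoassay is typically read out: fitting a four-parameter logistic (4PL) calibration curve to the normalized amperometric signal. The concentrations and signals below are illustrative.

```python
# Fit a competitive-immunoassay calibration curve with a 4PL model; in a
# competitive format the signal decreases as analyte concentration rises.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(c, top, bottom, ec50, hill):
    """4PL response: 'top' at zero analyte, 'bottom' at saturation."""
    return bottom + (top - bottom) / (1.0 + (c / ec50) ** hill)

conc = np.array([0.001, 0.01, 0.05, 0.1, 0.5, 1.0])      # zearalenone, ug/L
signal = np.array([1.00, 0.95, 0.70, 0.45, 0.15, 0.08])  # normalized current

params, _ = curve_fit(four_pl, conc, signal, p0=(1.0, 0.05, 0.08, 1.0))
print(f"Estimated EC50 = {params[2]:.3f} ug/L")
```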

  19. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: an anticipated analytical tool for food safety.

    Science.gov (United States)

    Hervás, Miriam; López, Miguel Angel; Escarpa, Alberto

    2009-10-27

    In this work, electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. Analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 microg L(-1) and EC(50) 0.079 microg L(-1) were obtained allowing the assessment of the detection of zearalenone mycotoxin. In addition, an excellent accuracy with a high recovery yield ranging between 95 and 108% has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.

  20. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    Energy Technology Data Exchange (ETDEWEB)

    Hervas, Miriam; Lopez, Miguel Angel [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain); Escarpa, Alberto, E-mail: alberto.escarpa@uah.es [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain)

    2009-10-27

    In this work, electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. Analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 μg L⁻¹ and EC₅₀ 0.079 μg L⁻¹ were obtained allowing the assessment of the detection of zearalenone mycotoxin. In addition, an excellent accuracy with a high recovery yield ranging between 95 and 108% has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.

  1. Locally analytic vectors in representations of locally p-adic analytic groups

    CERN Document Server

    Emerton, Matthew J

    2017-01-01

    The goal of this memoir is to provide the foundations for the locally analytic representation theory that is required in three of the author's other papers on this topic. In the course of writing those papers the author found it useful to adopt a particular point of view on locally analytic representation theory: namely, regarding a locally analytic representation as being the inductive limit of its subspaces of analytic vectors (of various "radii of analyticity"). The author uses the analysis of these subspaces as one of the basic tools in his study of such representations. Thus in this memoir he presents a development of locally analytic representation theory built around this point of view. The author has made a deliberate effort to keep the exposition reasonably self-contained and hopes that this will be of some benefit to the reader.

  2. Constraint-Referenced Analytics of Algebra Learning

    Science.gov (United States)

    Sutherland, Scot M.; White, Tobin F.

    2016-01-01

    The development of the constraint-referenced analytics tool for monitoring algebra learning activities presented here came from the desire, firstly, to take a more quantitative look at student responses in collaborative algebra activities and, secondly, to situate those activities in a more traditional introductory algebra setting focusing on…

  3. Capturing student mathematical engagement through differently enacted classroom practices: applying a modification of Watson's analytical tool

    Science.gov (United States)

    Patahuddin, Sitti Maesuri; Puteri, Indira; Lowrie, Tom; Logan, Tracy; Rika, Baiq

    2018-04-01

    This study examined student mathematical engagement through the intended and enacted lessons taught by two teachers in two different middle schools in Indonesia. The intended lesson was developed using the ELPSA learning design to promote mathematical engagement. Based on the premise that students will react to mathematical tasks in the form of words and actions, the analysis focused on identifying the types of mathematical engagement promoted through the intended lesson and performed by students during the lesson. Using a modified version of Watson's (2007) analytical tool, students' engagement was captured from what the participants did or said mathematically. We found that teachers' enacted practices had an influence on student mathematical engagement. The teacher who demonstrated content in explicit ways tended to limit the richness of the engagement, whereas the teacher who presented activities in an open-ended manner fostered engagement.

  4. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    Science.gov (United States)

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  5. Time-resolved fluorescence microscopy (FLIM) as an analytical tool in skin nanomedicine.

    Science.gov (United States)

    Alexiev, Ulrike; Volz, Pierre; Boreham, Alexander; Brodwolf, Robert

    2017-07-01

    The emerging field of nanomedicine provides new approaches for the diagnosis and treatment of diseases, for symptom relief, and for monitoring of disease progression. Topical application of drug-loaded nanoparticles for the treatment of skin disorders is a promising strategy to overcome the stratum corneum, the upper layer of the skin, which represents an effective physical and biochemical barrier. Understanding drug penetration into skin, and the enhanced penetration facilitated by nanocarriers, requires analytical tools that ideally allow one to visualize the skin, its morphology, the drug carriers, the drugs, their transport across the skin and possible interactions, as well as the effects of the nanocarriers within the different skin layers. Here, we review some recent developments in the field of fluorescence microscopy, namely Fluorescence Lifetime Imaging Microscopy (FLIM), for improved characterization of nanocarriers, their interactions and penetration into skin. In particular, FLIM allows for the discrimination of target molecules, e.g. fluorescently tagged nanocarriers, against the autofluorescent tissue background and, due to the environmental sensitivity of the fluorescence lifetime, also offers insights into the local environment of the nanoparticle and its interactions with other biomolecules. Thus, FLIM shows the potential to overcome several limits of intensity-based microscopy. Copyright © 2017 Elsevier B.V. All rights reserved.
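
    The core numerical step behind FLIM is recovering the fluorescence lifetime from a measured decay. Below is a minimal sketch assuming a mono-exponential decay and synthetic data; production FLIM analysis fits every pixel, often with multi-exponential models.

```python
# Recover the fluorescence lifetime tau from a (synthetic) decay histogram.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amplitude, tau, offset):
    """Mono-exponential fluorescence decay plus a constant background."""
    return amplitude * np.exp(-t / tau) + offset

t = np.linspace(0, 10, 200)  # time after excitation pulse (ns)
rng = np.random.default_rng(0)
counts = decay(t, 1000.0, 2.5, 20.0) + rng.normal(0, 10, t.size)  # noisy data

(amp, tau, off), _ = curve_fit(decay, t, counts, p0=(800.0, 2.0, 0.0))
print(f"Fitted lifetime tau = {tau:.2f} ns")
```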

  6. Predictive analytics tools to adjust and monitor performance metrics for the ATLAS Production System

    CERN Document Server

    Barreiro Megino, Fernando Harald; The ATLAS collaboration

    2017-01-01

    Information such as an estimate of the processing time or of the possibility of a system outage (abnormal behaviour) helps in monitoring system performance and predicting its next state. The current cyber-infrastructure presents computing conditions in which contention for resources among high-priority data analyses happens routinely and might lead to significant workload and data handling interruptions. The previous inability to monitor and predict the behaviour of the analysis process (its duration) and the state of the system itself motivated the design of the built-in situational awareness analytic tools.

  7. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...

  8. Analytical developments in reprocessing at the CEA

    International Nuclear Information System (INIS)

    Buffereau, M.

    1989-01-01

    Analytical developments in reprocessing, which are based on extensive basic research, are aimed at fulfilling the current requirements of R and D laboratories, pilot plants and industrial plants. They are also intended to propose and provide new opportunities. On-line measurements are a long-term goal, and one must be confident of their outcome. New equipment and procedures must be tested and their specifications determined, first at the laboratory level and then in a pilot plant. In this respect we are considering equipment which will be in operation in the ATALANTE laboratories. The APM is also both a necessary and useful resource. However, many measurements must still be done, and will continue to have to be done, in analytical laboratories. Along with improving accuracy, the main developments aim at reducing manpower requirements and effluent and waste releases.

  9. An analytical simulation technique for cone-beam CT and pinhole SPECT

    International Nuclear Information System (INIS)

    Zhang Xuezhu; Qi Yujin

    2011-01-01

    This study aimed at developing an efficient simulation technique using an ordinary PC. The work involved deriving mathematical operators, generating analytical phantoms, and developing effective analytical projectors for cone-beam CT and pinhole SPECT imaging. Computer simulations based on the analytical projectors were developed using a ray-tracing method for cone-beam CT and a voxel-driven method with degrading blurring for pinhole SPECT. The 3D Shepp-Logan, Jaszczak and Defrise phantoms were used for simulation evaluations and image reconstructions. The reconstructed images agreed well with the phantoms. The results showed that the analytical simulation technique is an efficient tool for studying cone-beam CT and pinhole SPECT imaging. (authors)

  10. WetDATA Hub: Democratizing Access to Water Data to Accelerate Innovation through Data Visualization, Predictive Analytics and Artificial Intelligence Applications

    Science.gov (United States)

    Sarni, W.

    2017-12-01

    Water scarcity and poor quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. The public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity - to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3), global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. Roadmap: (1) a portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks; (2) initial activities providing education, awareness and tools to stakeholders to support the implementation of the Colorado State Water Plan; (3) leveraging the Western States Water Council Water Data Exchange database; and (4) development of visualization, predictive analytics and AI tools to engage with stakeholders and provide actionable data and information. Tools: Education - providing information on water issues and risks at the local, state, national and global scale. Visualizations - data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan. Predictive Analytics - accessing publicly available water databases and using machine learning to develop water availability forecasting tools, and time-lapse images to support city/urban planning.

  11. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background: Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results: In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions: Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893
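
    The comparison task XCluSim visualizes can also be quantified with standard agreement indices. Below is a minimal sketch, not taken from XCluSim itself, comparing three clustering results pairwise with the adjusted Rand index on synthetic data.

```python
# Run several clusterings on the same data and score pairwise agreement.
from itertools import combinations
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import adjusted_rand_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

results = {
    "kmeans_k3": KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X),
    "kmeans_k4": KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X),
    "agglo_k4": AgglomerativeClustering(n_clusters=4).fit_predict(X),
}

# ARI = 1 means identical partitions; values near 0 mean chance agreement.
for (name_a, la), (name_b, lb) in combinations(results.items(), 2):
    print(f"{name_a} vs {name_b}: ARI = {adjusted_rand_score(la, lb):.2f}")
```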

  12. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    Science.gov (United States)

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  13. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    Science.gov (United States)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software used, and derived products.
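
    The provenance these libraries publish conforms to ProvONE, an extension of W3C PROV. Below is a minimal sketch of the underlying PROV concepts using the Python prov package; this is not the recordr or matlab-dataone API, and the identifiers are made up for illustration.

```python
# Build a tiny PROV document: one run that reads data and a script and
# generates a figure, then serialize it in PROV-N notation.
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace("ex", "http://example.org/")

data_in = doc.entity("ex:input.csv")
script = doc.entity("ex:analysis.R")
run = doc.activity("ex:run-2015-08-01")
figure = doc.entity("ex:figure1.png")

doc.used(run, data_in)               # the run read the input data
doc.used(run, script)                # ...and executed the script
doc.wasGeneratedBy(figure, run)      # the figure is a derived product
doc.wasDerivedFrom(figure, data_in)  # lineage link for search and citation

print(doc.get_provn())
```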

  14. Development of integrated analytical data management system

    International Nuclear Information System (INIS)

    Onishi, Koichi; Wachi, Isamu; Hiroki, Toshio

    1986-01-01

    The Analysis Subsection of the Technical Service Section, Tokai Reprocessing Plant, Tokai Works, is engaged in the analysis activities required for the management of processes and measurements in the plant. Currently, it is desired to increase the reliability of analytical data and to perform analyses more rapidly to cope with the increasing number of analysis tasks. To this end, on-line data processing has been promoted and advanced analytical equipment has been introduced to enhance automation. In the present study, an integrated analytical data management system is developed which serves to improve the reliability of analytical data as well as to enable rapid retrieval and automatic compilation of these data. Fabrication of a basic model of the system has been nearly completed and test operation has already started. In selecting the hardware to be used, examinations were made of ease of system extension, Japanese language processing for improving the man-machine interface, a large-capacity auxiliary memory system, and database processing. The existing analysis tasks were reviewed in establishing the basic design of the system. According to this basic design, the system can perform such tasks as the analysis of application slips received from clients as well as the recording, sending, filing and retrieval of analysis results. (Nogami, K.)

  15. An analytical approach to managing complex process problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramstad, Kari; Andersen, Espen; Rohde, Hans Christian; Tydal, Trine

    2006-03-15

    The oil companies are continuously investing time and money to ensure optimum regularity on their production facilities. High regularity increases profitability, reduces workload on the offshore organisation and most important; - reduces discharge to air and sea. There are a number of mechanisms and tools available in order to achieve high regularity. Most of these are related to maintenance, system integrity, well operations and process conditions. However, for all of these tools, they will only be effective if quick and proper analysis of fluids and deposits are carried out. In fact, analytical backup is a powerful tool used to maintain optimised oil production, and should as such be given high priority. The present Operator (Hydro Oil and Energy) and the Chemical Supplier (MI Production Chemicals) have developed a cooperation to ensure that analytical backup is provided efficiently to the offshore installations. The Operator's Research and Development (R and D) departments and the Chemical Supplier have complementary specialties in both personnel and equipment, and this is utilized to give the best possible service when required from production technologists or operations. In order for the Operator's Research departments, Health, Safety and Environment (HSE) departments and Operations to approve analytical work performed by the Chemical Supplier, a number of analytical tests are carried out following procedures agreed by both companies. In the present paper, three field case examples of analytical cooperation for managing process problems will be presented. 1) Deposition in a Complex Platform Processing System. 2) Contaminated Production Chemicals. 3) Improved Monitoring of Scale Inhibitor, Suspended Solids and Ions. In each case the Research Centre, Operations and the Chemical Supplier have worked closely together to achieve fast solutions and Best Practice. (author) (tk)

  16. Analytical tools for thermal infrared engineering: a thermal sensor simulation package

    Science.gov (United States)

    Jaggi, Sandeep

    1992-09-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration. To perform system design trade-offs and analysis and to establish system parameters, ASDL has developed a software package for the analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as SNR, NER, NETD etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters. In addition, ATTIRE can be used as a tutorial for understanding the distribution of thermal flux or solar irradiance over selected bandwidths of the spectrum. This spectrally distributed incident flux can then be analyzed as it propagates through the subsystems that constitute the entire sensor. ATTIRE provides a variety of functions ranging from plotting black-body curves for varying bandwidths and computing the integral flux, to performing transfer function analysis of the sensor system. The package runs from a menu-driven interface in a PC-DOS environment. Each subsystem of the sensor is represented by windows and icons. A user-friendly mouse-controlled point-and-click interface allows the user to simulate various aspects of a sensor. The package can simulate a theoretical sensor system. Trade-off studies can be done easily by changing the appropriate parameters and monitoring the effect on system performance. The package can provide plots of system performance versus any system parameter. A parameter (such as the entrance aperture of the optics) could be varied and its effect on another parameter (e.g., NETD) can be plotted. A third parameter (e.g., the
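
    One representative ATTIRE-style computation is integrating Planck's law over a sensor's bandwidth to obtain the in-band blackbody exitance. Below is a minimal sketch using standard physical constants; the band edges and scene temperature are illustrative, and this is not ATTIRE's own code.

```python
# Integrate Planck's spectral exitance over an 8-12 micron LWIR band.
import math
from scipy.integrate import quad

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck_exitance(wavelength_m, temp_k):
    """Blackbody spectral exitance (W m^-2 per metre of wavelength)."""
    a = 2.0 * math.pi * H * C**2 / wavelength_m**5
    return a / math.expm1(H * C / (wavelength_m * KB * temp_k))

band_lo, band_hi = 8e-6, 12e-6  # band edges (m); illustrative LWIR band
temp = 300.0                    # scene temperature (K)

in_band, _ = quad(planck_exitance, band_lo, band_hi, args=(temp,))
print(f"In-band exitance, 8-12 um at {temp:.0f} K: {in_band:.1f} W/m^2")
```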

  17. Development of balanced key performance indicators for emergency departments strategic dashboards following analytic hierarchical process.

    Science.gov (United States)

    Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh

    2014-01-01

    Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective, balanced set of performance measures and key performance indicators (KPIs) is a major challenge in accomplishing this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following an analytic hierarchical process. The study was carried out in two phases: constructing ED performance measures based on balanced scorecard perspectives, and incorporating them into an analytic hierarchical process framework to select the final KPIs. The respondents placed most importance on the ED internal processes perspective, especially on measures related to timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected as top KPIs. Measures of care effectiveness and care safety were placed as the next priorities. The respondents placed least importance on disease-/condition-specific "time to" measures. The methodology can be presented as a reference model for the development of KPIs in various performance-related areas based on a consistent and fair approach. Dashboards designed on the basis of such a balanced set of KPIs will help to establish comprehensive performance measurements and fair benchmarks and comparisons.
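
    The analytic hierarchy step described above reduces to computing the principal eigenvector of a pairwise comparison matrix and checking its consistency. Below is a minimal sketch; the matrix entries and perspective labels are illustrative assumptions, not the study's data.

```python
# Derive AHP priority weights for four balanced-scorecard perspectives.
import numpy as np

labels = ["internal processes", "customer", "financial", "learning & growth"]
A = np.array([  # A[i, j] = how much more important perspective i is than j
    [1.0, 3.0, 4.0, 5.0],
    [1/3, 1.0, 2.0, 3.0],
    [1/4, 1/2, 1.0, 2.0],
    [1/5, 1/3, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()  # principal eigenvector, normalized = priorities

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
cr = ci / 0.90                        # Saaty's random index for n = 4 is 0.90

for label, w in zip(labels, weights):
    print(f"{label:18s} weight = {w:.3f}")
print(f"Consistency ratio = {cr:.3f} (judgments acceptable if < 0.1)")
```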

  18. DEVELOPMENT OF REMOTE TOOLS TO ASSESS THE EFFECTIVENESS AND QUALITY OF CAR SERVICE ENTERPRISES WORK

    Directory of Open Access Journals (Sweden)

    Vladimir Kozlovskiy

    2017-09-01

    The paper addresses the problem of developing remote tools to assess the effectiveness and quality of work of corporate car service enterprises. It presents the author's methodology for assessing the activity of car service enterprises based on corporate information reporting the production activities of the service company. A specialized information system has been developed and implemented on the basis of the proposed methodology; it serves as an analytical tool for assessing the activities of enterprises in the branded car service network of one of the largest national carmakers. The aim of the research is the development and implementation of a monitoring system for those areas of car service work that significantly affect the quality of the maintenance service. In addition, the paper addresses urgent issues in the collection and processing of real data on warranty defects of cars.

  19. Newspaper Reading among College Students in Development of Their Analytical Ability

    Science.gov (United States)

    Kumar, Dinesh

    2009-01-01

    The study investigated newspaper reading among college students and its role in the development of their analytical ability. Newspapers are one of the few sources of information that are comprehensive, interconnected and offered in one format. The main objective of the study was to find out the development of the analytical ability among college students by…

  20. Comparison of the most used game development tools

    OpenAIRE

    Soukup, Martin

    2017-01-01

    This thesis deals with a comparison of the most used game development tools. The author places game development tools in the context of today's game industry and analyses the state of the market and the latest trends in the field of game development tools. The largest part of the thesis is aimed at comparing game development tools: five tools are selected, reviewed and compared against specified criteria. The author demonstrates several basic features of the chosen game development tool through the development of a simple Android application.

  1. Leningrad NPP full scope and analytical simulators as tools for MMI improvement and operator support systems development and testing

    International Nuclear Information System (INIS)

    Rakitin, I.D.; Malkin, S.D.; Shalia, V.V.; Fedorov, E.M.; Lebedev, N.N.; Khoudiakov, M.M.

    1999-01-01

    The Training Support Center (TSC) created at the Leningrad NPP (LNPP), Sosnovy Bor, Russia, incorporates full-scope and analytical simulators working in parallel with prototypes of expert and interactive systems, providing new scope for R and D work on MMI improvement for both the developer and the user. The paper describes the possibilities for developing, adjusting and testing any new or upgraded Operators' Support System before its installation in the reference unit's Control Room. The simulators model a wide range of accidents and transients and provide, through special software and Ethernet data communications, connections to the Operators' Support System prototypes. As an example, the paper describes the development and adjustment of two state-of-the-art Operators' Support Systems using the simulators; these systems have been developed jointly by the RRC KI and LNPP team. (author)

  2. The Earth Data Analytic Services (EDAS) Framework

    Science.gov (United States)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.

  3. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    Energy Technology Data Exchange (ETDEWEB)

    Ragan, Eric D [ORNL; Goodall, John R [ORNL

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  4. Evolutionary developments in x ray and electron energy loss microanalysis instrumentation for the analytical electron microscope

    Science.gov (United States)

    Zaluzec, Nester J.

    Developments in instrumentation for both X-ray Energy Dispersive and Electron Energy Loss Spectroscopy (XEDS/EELS) over the last ten years have given the experimentalist a greatly enhanced set of analytical tools for characterization. Microanalysts have waited for nearly two decades now in the hope of getting a true analytical microscope, and the development of 300 to 400 kV instruments should have allowed us to attain this goal. Unfortunately, this has not generally been the case. While there have been some major improvements in the techniques, there has also been some devolution in the modern AEM (Analytical Electron Microscope). In XEDS, the majority of today's instruments are still plagued by the hole count effect, which was first described in detail over fifteen years ago. The magnitude of this problem can still reach the 20 percent level for medium atomic number species in a conventional off-the-shelf intermediate voltage AEM. This is an absurd situation and the manufacturers should be severely criticized. Part of the blame, however, also rests on the AEM community for not having come up with a universally agreed upon standard test procedure. Fortunately, such a test procedure is in the early stages of refinement. The proposed test specimen consists of an evaporated Cr film approximately 500 to 1000 Å thick supported upon a 3 mm diameter molybdenum aperture with a 200 micron opening.

  5. Developing a Modeling Tool Using Eclipse

    NARCIS (Netherlands)

    Kirtley, Nick; Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Tool development using an open source platform provides autonomy to users to change, use, and develop cost-effective software with freedom from licensing requirements. However, open source tool development poses a number of challenges, such as poor documentation and continuous evolution. In this

  6. Web analytics as tool for improvement of website taxonomies

    DEFF Research Database (Denmark)

    Jonasen, Tanja Svarre; Ådland, Marit Kristine; Lykke, Marianne

    The poster examines how web analytics can be used to provide information about users and inform design and redesign of taxonomies. It uses a case study of the website Cancer.dk by the Danish Cancer Society. The society is a private organization with an overall goal to prevent the development...... provides information about e.g. subjects of interest, searching behaviour, browsing patterns in website structure as well as tag clouds, page views. The poster discusses benefits and challenges of the two web metrics, with a model of how to use search and tag data for the design of taxonomies, e.g. choice...

  7. Developing a Social Media Marketing tool

    OpenAIRE

    Valova, Olga

    2015-01-01

    The objective of the thesis is to develop a better, easier-to-use social media marketing tool that could be utilised in any business. By understanding and analysing how businesses use social media, as well as the currently available social media marketing tools, the aim is to design a tool with the maximum number of features but a simple and intuitive user interface. An agile software development life cycle was used throughout the creation of the tool. Qualitative analysis was used to analyse existing ...

  8. Tools for studying dry-cured ham processing by using computed tomography.

    Science.gov (United States)

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R² = 0.960 and RMSECV = 0.393), water content (R² = 0.912 and RMSECV = 1.751), and a(w) (R² = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w), in terms of both content and distribution, throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
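
    The quoted model metrics (R², RMSECV) are of the kind commonly produced by cross-validated latent-variable regression. Below is a minimal sketch, assuming a PLS regression on CT-derived features and using random stand-in data, since the paper's data and exact modelling method are not given here.

```python
# Cross-validated PLS regression and the R^2 / RMSECV figures of merit.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))  # e.g., mean CT attenuation per ham region
y = X @ rng.normal(size=8) + rng.normal(scale=0.3, size=120)  # e.g., salt %

model = PLSRegression(n_components=3)
y_cv = cross_val_predict(model, X, y, cv=10).ravel()  # out-of-fold predictions

rmsecv = float(np.sqrt(np.mean((y - y_cv) ** 2)))
r2 = float(1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2))
print(f"R^2 = {r2:.3f}, RMSECV = {rmsecv:.3f}")
```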

  9. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    Science.gov (United States)

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents the development of important analytical tools, namely sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application to evaluating the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanols content were investigated. The juices and wines produced using different protocols were examined. Moreover, wines aged in tanks for 1, 2 and 3 months were analysed. The high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need for exogenous antioxidants, was particularly interesting. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine tuning of pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, the evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of cysteine and glutathione conjugates was carried out and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. This supports the theory

  10. Fourier transform ion cyclotron resonance mass spectrometry: the analytical tool for heavy oil and bitumen characterization

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Thomas B.P; Brown, Melisa; Hsieh, Ben; Larter, Steve [Petroleum Reservoir Group (prg), Department of Geoscience, University of Calgary, Alberta (Canada)

    2011-07-01

    Fourier transform ion cyclotron resonance mass spectrometry (FTICRMS), developed in the 1970s by Marshall and Comisarow at the University of British Columbia, has become a commercially available tool capable of analyzing several hundred thousand components in a petroleum mixture at once. This analytical technology will probably usher in a dramatic revolution in geochemical capability, equal to or greater than the molecular revolution that occurred when GCMS technologies became cheaply available. The molecular resolution and information content given by FTICRMS petroleum analysis can be compared to the information in the human genome. With current GCMS-based petroleum geochemical protocols perhaps a few hundred components can be quantitatively determined, but with FTICRMS, 1000 times this number of components could possibly be resolved. However, fluid and other properties depend on the interaction of this multitude of hydrocarbon and non-hydrocarbon components, not on the components themselves, and access to the full potential of this new petroleomics will depend on the definition of this interactome.

  11. Tool to Prioritize Energy Efficiency Investments

    Energy Technology Data Exchange (ETDEWEB)

    Farese, Philip [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gelman, Rachel [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hendron, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2012-08-01

    To provide analytic support to the U.S. Department of Energy's Office of the Building Technology Program (BTP), NREL developed a Microsoft Excel-based tool to provide an open and objective comparison of the hundreds of investment opportunities available to BTP. This tool uses established methodologies to evaluate the energy savings and the cost of those savings.

  12. Quality Indicators for Learning Analytics

    Science.gov (United States)

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  13. Game development tool essentials

    CERN Document Server

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  14. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David; Wolverton, Michael J.; Bruce, Joseph R.; Burtner, Edwin R.; Endert, Alexander

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.
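
    ADE's task model is not specified in the abstract; the sketch below only illustrates the general semantic-interaction mechanism such systems build on: user interactions nudge an inferred weight vector, which re-ranks the remaining data. All documents, terms and the learning rate here are invented.

    ```python
    # Minimal semantic-interaction sketch: interacting with a document boosts
    # the weights of its terms, and the system re-scores everything else.
    import math

    docs = {                                  # hypothetical term vectors
        "report_a": {"fraud": 3, "bank": 2},
        "report_b": {"bank": 1, "merger": 4},
        "report_c": {"fraud": 1, "wire": 2},
    }
    weights = {"fraud": 1.0, "bank": 1.0, "merger": 1.0, "wire": 1.0}

    def score(doc: dict) -> float:
        """Weighted, length-normalized relevance of a document."""
        num = sum(weights[t] * f for t, f in doc.items())
        return num / math.sqrt(sum(f * f for f in doc.values()))

    def observe_interaction(doc_name: str, learning_rate: float = 0.2) -> None:
        """Infer interest from an interaction: boost the touched doc's terms."""
        for term, freq in docs[doc_name].items():
            weights[term] += learning_rate * freq

    observe_interaction("report_a")           # user drags report_a closer
    ranked = sorted(docs, key=lambda d: score(docs[d]), reverse=True)
    print("recommend next:", ranked)          # fraud-related docs now rank higher
    ```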

  15. History of a secondary side inspection tooling development

    International Nuclear Information System (INIS)

    Harris, W.

    2012-01-01

This presentation provides a brief history (1980 to the present day) of steam generator secondary side tooling requirements, tooling development, the tooling available today, and how and where this tooling has been implemented for steam generator secondary side inspections. The History of Tooling Development discussion covers the relatively short period from when SGSS tooling was first required, and why, through the associated development challenges to the present day; the Available Tooling discussion covers the tooling actually available today, the locations in the steam generator where the tooling is used, and how the tooling works; the Implementation discussion covers where in the world this tooling has been deployed, as well as the benefits the tooling has provided. (author)

  16. Developing a Parametric Urban Design Tool

    DEFF Research Database (Denmark)

    Steinø, Nicolai; Obeling, Esben

    2014-01-01

Parametric urban design is a potentially powerful tool for collaborative urban design processes. Rather than making one-off designs which need to be redesigned from the ground up in case of changes, parametric design tools make it possible to keep the design open while at the same time allowing...... for a level of detailing which is high enough to facilitate an understanding of the generic qualities of proposed designs. Starting from a brief overview of parametric design, this paper presents initial findings from the development of a parametric urban design tool with regard to developing a structural...... logic which is flexible and expandable. It then moves on to outline and discuss further development work. Finally, it offers a brief reflection on the potentials and shortcomings of the software – CityEngine – which is used for developing the parametric urban design tool....

  17. Recent developments in analytical detection methods for radiation processed foods

    International Nuclear Information System (INIS)

    Wu Jilan

    1993-01-01

A short summary is given of the 'ADMIT' programme (FAO/IAEA) and of developments in analytical detection methods for radiation-processed foods. It is suggested that, to promote the commercialization of radiation-processed foods and to control their quality, more attention must be paid to the study of analytical detection methods for irradiated foods

  18. Development of support tools for efficient construction of dynamic simulation program for engineering systems

    International Nuclear Information System (INIS)

    Gofuku, Akio

    1993-01-01

In this study, two support tools are developed for the construction of dynamic simulation programs for engineering systems (especially nuclear systems) by combining software modules. These are (1) a sub-system to support the selection of modules suitable for dynamic simulation and (2) a graphical user interface to support the visual construction of simulation programs. The support tools are designed to be independent of the conception of the software modules (the data communication methods between modules). In the module selection sub-system of item 1, a module is characterized beforehand by keywords for several criteria. The similarity between the characteristics of the module requested by users and those of the registered modules in the module library is estimated by a weighted average of the similarity indexes for the criteria. In the module selection sub-system, the weights are flexibly extracted from users by applying the analytic hierarchy process. The graphical user interface helps users to specify both the calling order of modules and the data transfer between two modules. The availability of the support tools is evaluated on several sample problems of module selection and dynamic simulation model construction. The support tools promise to be a strong aid to the efficient use of software modules. (author)
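
    A minimal sketch of the module-selection scheme described here, assuming Jaccard similarity as the per-criterion index (the paper does not name one); the criteria, weights and modules are illustrative.

    ```python
    # Weighted-average keyword similarity for module selection, with
    # criterion weights assumed to come from an AHP elicitation step.

    def jaccard(a: set, b: set) -> float:
        """Similarity index between two keyword sets."""
        return len(a & b) / len(a | b) if a | b else 0.0

    criteria_weights = {"physics": 0.6, "numerics": 0.3, "interface": 0.1}

    request = {"physics": {"two-phase", "thermal"},
               "numerics": {"implicit"},
               "interface": {"vector"}}

    library = {                              # hypothetical registered modules
        "mod_core_th": {"physics": {"thermal", "single-phase"},
                        "numerics": {"implicit"}, "interface": {"vector"}},
        "mod_pump":    {"physics": {"two-phase", "thermal"},
                        "numerics": {"implicit"}, "interface": {"scalar"}},
    }

    def similarity(candidate: dict) -> float:
        """Weighted average of the per-criterion similarity indexes."""
        return sum(w * jaccard(request[c], candidate[c])
                   for c, w in criteria_weights.items())

    scores = {name: round(similarity(mod), 3) for name, mod in library.items()}
    print(scores, "-> best:", max(scores, key=scores.get))
    ```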

  19. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines also known...... as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent...... advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress...

  20. Development and first application of an operating events ranking tool

    International Nuclear Information System (INIS)

    Šimić, Zdenko; Zerger, Benoit; Banov, Reni

    2015-01-01

Highlights: • A method using the analytic hierarchy process for ranking operating events is developed and tested. • The method is applied to 5 years of U.S. NRC Licensee Event Reports (1453 events). • Uncertainty and sensitivity of the ranking results are evaluated. • Assessment of real events shows the potential of the method for operating experience feedback. - Abstract: Operating experience feedback is important for maintaining and improving safety and availability in nuclear power plants. Detailed investigation of all events is challenging, since it requires excessive resources, especially in the case of large event databases. This paper presents an event-group ranking method to complement the analysis of individual operating events. The basis for the method is the use of an internationally accepted event characterization scheme that allows different ways of grouping and ranking events. The ranking method itself consists of implementing the analytic hierarchy process (AHP) by means of a custom-developed tool which ranks events based on ranking indexes pre-determined by expert judgment. Following the development phase, the tool was applied to analyze a complete set of 5 years of real nuclear power plant operating events (1453 events). The paper presents the potential of this ranking method to identify possible patterns throughout the event database and therefore to give additional insights into the events, as well as quantitative input for the prioritization of further, more detailed investigation of selected event groups
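
    The sketch below shows the AHP step at the core of such a ranking tool: deriving weights from a pairwise comparison matrix via the principal eigenvector and checking judgment consistency. The 3×3 judgment matrix is illustrative, not taken from the paper.

    ```python
    # AHP priority vector via power iteration, plus Saaty's consistency ratio.
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],    # pairwise judgments: criterion i vs j
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    w = np.ones(len(A)) / len(A)
    for _ in range(100):              # power iteration -> principal eigenvector
        w = A @ w
        w /= w.sum()

    lam = (A @ w / w).mean()          # principal eigenvalue estimate
    n = len(A)
    ci = (lam - n) / (n - 1)          # consistency index
    cr = ci / 0.58                    # random index RI = 0.58 for n = 3
    print("weights:", np.round(w, 3), "CR:", round(cr, 3))  # CR < 0.1 is acceptable
    ```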

  1. The MSCA Program: Developing Analytic Unicorns

    Science.gov (United States)

    Houghton, David M.; Schertzer, Clint; Beck, Scott

    2018-01-01

    Marketing analytics students who can communicate effectively with decision makers are in high demand. These "analytic unicorns" are hard to find. The Master of Science in Customer Analytics (MSCA) degree program at Xavier University seeks to fill that need. In this paper, we discuss the process of creating the MSCA program. We outline…

  2. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

A roadmap for turning Google Analytics into a centralized marketing analysis platform. With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSense…

  3. Analytical and clinical performances of immunoradiometric assay of total and free PSA developed locally

    International Nuclear Information System (INIS)

    Boucekkine, N.; Korso, R.; Bellazoug, K.; Ferd, N.; Bouyoucef, S.E.; Boudjemai, S.; Benzaid, A.; Bouhila, Z.

    2002-01-01

A specific assay was developed for total and free PSA (PSAt, PSAf). Both assays use a two-site IRMA with polyclonal anti-PSA antibodies coated on tubes. The polyclonal antibodies were obtained by immunising rabbits with subcutaneous injections of pure PSA at multiple sites. For quantification, two monoclonal antibodies were selected, the first highly specific to free PSA and the second recognising both free and bound PSA. A correlation study was performed against two commercial kits, from CIS Bio and Immunotech. For that purpose, 464 serum samples ranging from 0.5 ng/ml to 3399 ng/ml were used to characterise the analytical performance of the new test. The analytical detection limit of the new test was 0.05 ng/ml for total PSA and 0.02 ng/ml for free PSA. The within-run and between-day coefficients of variation were to 20 ng/ml. For BPH, no significant difference was found between the three tests for the PSAf/PSAt ratio using a cut-off of 14% (all were ≥14%). For the 120 patients with PC, all PSAt values were ≥2 ng/ml. However, the mean PSAt value was higher for the commercial kits (14.74 ng/ml against 12.48 ng/ml for the new test), but all PSAf/PSAt ratios for the 120 newly diagnosed cancers were <14%. In conclusion, our locally developed immunoradiometric assay has good analytical performance and its outputs correlate well with clinical findings in prostate disease. Furthermore, a cut-off of 14% for the PSAf/PSAt ratio appears to be the most accurate tool to detect a prostate cancer

  4. Nexusing Charcoal in South Mozambique: A Proposal To Integrate the Nexus Charcoal-Food-Water Analysis With a Participatory Analytical and Systemic Tool

    Directory of Open Access Journals (Sweden)

    Ricardo Martins

    2018-06-01

Nexus analysis identifies and explores the synergies and trade-offs between energy, food and water systems, considered as interdependent systems interacting with contextual drivers (e.g., climate change, poverty). The nexus is thus a valuable analytical and policy-design support tool for addressing the widely discussed links between bioenergy, food and water. In fact, the Nexus provides a more integrative and broad approach than the single-isolated-system approach that has characterized many bioenergy analyses and policies of recent decades. In particular, for the south of Mozambique, charcoal production, food insecurity and water scarcity have been related in separate studies, and it would thus be expected that Nexus analysis has the potential to provide the basis for integrated policies and strategies focused on charcoal as a development factor. However, to date there is no Nexus analysis focused on charcoal in Mozambique, nor is there an assessment of the comprehensiveness and relevance of Nexus analysis when applied to charcoal energy systems. To address these gaps, this work applies the Nexus to the charcoal-food-water system in Mozambique, integrating national, regional and international studies analysing the isolated, or pairs of, systems. This integration results in a novel Nexus analysis graphic for the charcoal-food-water relationship. Then, to assess the comprehensiveness and depth of analysis, this Nexus analysis is critically compared with the 2MBio-A, a systems analysis and design framework based on a design tool specifically developed for bioenergy (the 2MBio). The results reveal that Nexus analysis is "blind" to specific fundamental social, ecological and socio-historical dynamics of charcoal energy systems. The critical comparison also suggests the need to integrate the high-level systems analysis of Nexus with non-deterministic, non-prescriptive participatory analysis tools, like the 2MBio-A, as a means to

  5. Biomarkers as drug development tools: discovery, validation, qualification and use.

    Science.gov (United States)

    Kraus, Virginia B

    2018-06-01

    The 21st Century Cures Act, approved in the USA in December 2016, has encouraged the establishment of the national Precision Medicine Initiative and the augmentation of efforts to address disease prevention, diagnosis and treatment on the basis of a molecular understanding of disease. The Act adopts into law the formal process, developed by the FDA, of qualification of drug development tools, including biomarkers and clinical outcome assessments, to increase the efficiency of clinical trials and encourage an era of molecular medicine. The FDA and European Medicines Agency (EMA) have developed similar processes for the qualification of biomarkers intended for use as companion diagnostics or for development and regulatory approval of a drug or therapeutic. Biomarkers that are used exclusively for the diagnosis, monitoring or stratification of patients in clinical trials are not subject to regulatory approval, although their qualification can facilitate the conduct of a trial. In this Review, the salient features of biomarker discovery, analytical validation, clinical qualification and utilization are described in order to provide an understanding of the process of biomarker development and, through this understanding, convey an appreciation of their potential advantages and limitations.

  6. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    Science.gov (United States)

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  7. Measuring the bright side of being blue: a new tool for assessing analytical rumination in depression.

    Directory of Open Access Journals (Sweden)

    Skye P Barbic

BACKGROUND: Diagnosis and management of depression occur frequently in the primary care setting. Current diagnostic and treatment practices across clinical populations focus on eliminating the signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR) is an example of an adaptive response in depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. METHODS: Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field tested with 579 young adults, 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set, and traditional psychometric analyses to compare items to existing rating scales. RESULTS: Data were of high quality (0.81), with evidence for divergent validity. Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26; df = 76, p = 0.07), with high reliability (rp = 0.86), an ordered response scale structure, and no item bias (gender, age, time). CONCLUSION: Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ) that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major depression.
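
    For readers unfamiliar with Rasch measurement, the sketch below shows the dichotomous Rasch model at the core of such scale construction; the ARQ itself uses 4-point polytomous items, and the person and item parameters here are invented.

    ```python
    # Dichotomous Rasch model: the probability of endorsing an item depends
    # only on the difference between person ability (theta) and item
    # difficulty (b), both on a common logit scale.
    import math

    def rasch_p(theta: float, b: float) -> float:
        """P(endorse item | person theta, item difficulty b)."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    for theta in (-1.0, 0.0, 1.0):            # hypothetical person locations
        print([round(rasch_p(theta, b), 2) for b in (-0.5, 0.0, 0.5)])
    # Higher theta -> higher endorsement probability on every item.
    ```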

  8. 105-KE Basin isolation barrier leak rate test analytical development. Revision 1

    International Nuclear Information System (INIS)

    Irwin, J.J.

    1995-01-01

    This document provides an analytical development in support of the proposed leak rate test of the 105-KE Basin. The analytical basis upon which the K-basin leak test results will be used to determine the basin leakage rates is developed in this report. The leakage of the K-Basin isolation barriers under postulated accident conditions will be determined from the test results. There are two fundamental flow regimes that may exist in the postulated K-Basin leakage: viscous laminar and turbulent flow. An analytical development is presented for each flow regime. The basic geometry and nomenclature of the postulated leak paths are denoted
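
    A minimal sketch of the two flow regimes named above, for a single idealized circular leak path; the geometry, head and fluid properties are illustrative and not taken from the K-Basin analysis.

    ```python
    # Viscous laminar leakage (Hagen-Poiseuille) vs turbulent leakage
    # (orifice equation), with a Reynolds-number check on the laminar result.
    import math

    def laminar_leak(d: float, L: float, dp: float, mu: float) -> float:
        """Hagen-Poiseuille volumetric flow [m^3/s] through a round path."""
        return math.pi * d**4 * dp / (128.0 * mu * L)

    def turbulent_leak(d: float, dp: float, rho: float, cd: float = 0.62) -> float:
        """Orifice-equation volumetric flow [m^3/s]."""
        area = math.pi * d**2 / 4.0
        return cd * area * math.sqrt(2.0 * dp / rho)

    d, L, dp = 1e-3, 0.1, 2e4          # 1 mm path, 0.1 m long, 20 kPa head
    mu, rho = 1e-3, 1000.0             # water at ~20 C
    q_lam = laminar_leak(d, L, dp, mu)
    re = rho * (q_lam / (math.pi * d**2 / 4)) * d / mu
    print(f"laminar Q = {q_lam:.2e} m^3/s, Re = {re:.0f}")
    if re > 2300:                      # laminar assumption fails; use orifice model
        print(f"turbulent Q = {turbulent_leak(d, dp, rho):.2e} m^3/s")
    ```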

  9. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  10. Development and testing of analytical models for the pebble bed type HTRs

    International Nuclear Information System (INIS)

    Huda, M.Q.; Obara, T.

    2008-01-01

The pebble bed type gas cooled high temperature reactor (HTR) appears to be a good candidate for the next generation of nuclear reactor technology. These reactors have unique characteristics in terms of the randomness in geometry, and require special techniques to analyze their systems. This study includes activities concerning the testing of computational tools and the qualification of models. Indeed, it is essential that validated analytical tools be available to the research community. From this viewpoint, codes like MCNP, ORIGEN and RELAP5, which have been used in the nuclear industry for many years, are selected to identify and develop new capabilities needed to support HTR analysis. The geometrical model of the full reactor is obtained by using the lattice and universe facilities provided by MCNP. The coupled MCNP-ORIGEN code is used to estimate the burnup and the refuelling scheme. Results obtained from the Monte Carlo analysis are interfaced with RELAP5 to analyze the thermal hydraulics and safety characteristics of the reactor. New models and methodologies are developed for several past and present experimental and prototypical facilities based on HTR pebble bed concepts. The calculated results are compared with available experimental data and theoretical evaluations, showing very good agreement. The ultimate goal of the validation of the computer codes for pebble bed HTR applications is to acquire and reinforce the capability of these general-purpose computer codes for performing HTR core design and optimization studies

  11. Big Data Analytics in Chemical Engineering.

    Science.gov (United States)

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-06-07

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.

  12. Deriving Earth Science Data Analytics Requirements

    Science.gov (United States)

    Kempler, Steven J.

    2015-01-01

Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  13. Model and Analytic Processes for Export License Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.; Wood, Thomas W.; Daly, Don S.; Brothers, Alan J.; Sanfilippo, Antonio P.; Cook, Diane; Holder, Larry

    2011-09-29

This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that, alone or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools in order to determine which approaches will be investigated during the final two years of the project. The report also details why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies, which can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided according to whether they could help micro-level assessments (e.g., improving individual license assessments) or macro-level assessments, which focus on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An

  14. 78 FR 68459 - Medical Device Development Tools; Draft Guidance for Industry, Tool Developers, and Food and Drug...

    Science.gov (United States)

    2013-11-14

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2013-D-1279] Medical Device Development Tools; Draft Guidance for Industry, Tool Developers, and Food and Drug Administration Staff; Availability AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food...

  15. State-of-the-Art of (Bio)Chemical Sensor Developments in Analytical Spanish Groups

    Science.gov (United States)

    Plata, María Reyes; Contento, Ana María; Ríos, Angel

    2010-01-01

(Bio)chemical sensors are one of the most exciting fields in analytical chemistry today. The development of these analytical devices simplifies and miniaturizes the whole analytical process. Although the initial expectation of the massive incorporation of sensors into routine analytical work has been truncated to some extent, in many other cases analytical methods based on sensor technology have solved important analytical problems. Many research groups are working in this field world-wide, reporting interesting results so far. Modestly, Spanish researchers have contributed to these recent developments. In this review, we summarize the most representative achievements of these groups. They cover a wide variety of sensors, including optical, electrochemical, piezoelectric or electro-mechanical devices, used for laboratory or field analyses. Their capabilities for use in different applied areas are also critically discussed. PMID:22319260

  16. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    Science.gov (United States)

    Blasch, Erik; Waltz, Ed

    2016-05-01

Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics over raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends in technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  17. The 'secureplan' bomb utility: A PC-based analytic tool for bomb defense

    International Nuclear Information System (INIS)

    Massa, R.J.

    1987-01-01

This paper illustrates a recently developed, PC-based software system for simulating the effects of an infinite variety of hypothetical bomb blasts on structures and personnel in the immediate vicinity of such blasts. The system incorporates two basic rectangular geometries in which blast assessments can be made - an external configuration (highly vented) and an internal configuration (vented and unvented). A variety of explosives can be used - each is translated to an equivalent TNT weight. Provisions in the program account for bomb cases (person, satchel, case and vehicle), mixes of explosives and shrapnel aggregates, and detonation altitudes. The software permits architects, engineers, security personnel and facility managers, without specific knowledge of explosives, to incorporate realistic construction hardening, screening programs, barriers and stand-off provisions in the design and/or operation of diverse facilities. System outputs - generally represented as peak incident or reflected overpressures or impulses - are both graphic and analytic and integrate damage-threshold data for common construction materials, including window glazing. The effects of bomb blasts on humans are estimated in terms of temporary and permanent hearing damage, lung damage (lethality) and whole-body translation injury. The software system has been used in the field in providing bomb defense services to a number of commercial clients since July 1986. In addition to the design of the siting, screening and hardening components of bomb defense programs, the software has proven very useful in post-incident analysis and repair scenarios and as a teaching tool for bomb defense training
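
    The report does not publish the utility's internals; the sketch below shows the standard cube-root scaling such tools are built on, with the Mills (1987) overpressure fit standing in for the fuller empirical curves (e.g., Kingery-Bulmash) a production tool would use. The charge and stand-off values are invented.

    ```python
    # Hopkinson-Cranz cube-root scaling: reduce the charge to an equivalent
    # TNT weight W, form Z = R / W**(1/3), and read peak side-on overpressure
    # from an empirical correlation.

    def scaled_distance(r_m: float, w_kg_tnt: float) -> float:
        return r_m / w_kg_tnt ** (1.0 / 3.0)

    def peak_overpressure_kpa(z: float) -> float:
        """Mills (1987) fit, valid roughly for 1 < Z < 10 m/kg^(1/3)."""
        return 1772.0 / z**3 - 114.0 / z**2 + 108.0 / z

    w = 20.0 * 1.1                     # 20 kg of an explosive, TNT equivalence 1.1
    for r in (5.0, 10.0, 20.0):        # stand-off distances in metres
        z = scaled_distance(r, w)
        print(f"R = {r:4.1f} m  Z = {z:4.2f}  P ~ {peak_overpressure_kpa(z):7.1f} kPa")
    ```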

  18. Sustainable Urban Development: Spatial Analyses as Novel Tools for Planning a Universally Designed City

    Directory of Open Access Journals (Sweden)

    Joanna Borowczyk

    2018-05-01

The aim of the research was to analyze the "design for all" concept as a key strategy for creating social sustainability. The paper attempts to answer the question: how can universal design contribute to the rational development of city space? The author has taken part in participatory experiments. The research took into account various criteria, including the level of the city space's adaptation to the needs and capabilities of persons with different disabilities. The analyses included qualitative studies concerning the possibilities of developing social capital as well as creating and preserving a cohesive social structure. The analytic process made it possible to determine means of raising the quality of urban planning. Finding effective and reliable analytical tools that enable the development of healthy cities compatible with the principles of sustainability could become both a great opportunity and a great challenge for urban planners. Transition from the microplanning to the macroplanning scale, and following the principles of universal design at the stage of the formation of urban concepts using spatiotemporal modelling methods, will lead to the creation of harmonious, accessible spaces adjusted to the needs of present and future users, which will generate sustainable development and lead to the healing of the city.

  19. Characterization, thermal stability studies, and analytical method development of Paromomycin for formulation development.

    Science.gov (United States)

    Khan, Wahid; Kumar, Neeraj

    2011-06-01

Paromomycin (PM) is an aminoglycoside antibiotic, first isolated in the 1950s and approved in 2006 for the treatment of visceral leishmaniasis. Although it was isolated six decades ago, information essential for the development of a pharmaceutical formulation of PM is not available. The purpose of this paper was to determine the thermal stability of PM and to develop a new analytical method for its formulation development. PM was characterized by thermoanalytical (DSC, TGA, and HSM) and spectroscopic (FTIR) techniques, and these techniques were used to establish the thermal stability of PM after heating at 100, 110, 120, and 130 °C for 24 h. The biological activity of the heated samples was also determined by microbiological assay. Subsequently, a simple, rapid and sensitive RP-HPLC method for the quantitative determination of PM was developed using pre-column derivatization with 9-fluorenylmethyl chloroformate. The developed method was applied to quantify PM in two parenteral dosage forms. PM was successfully characterized by the stated techniques, which indicated that PM is stable when heated at up to 120 °C for 24 h but liable to degradation at 130 °C. This degradation was also observed in the microbiological assay, where PM lost ∼30% of its biological activity when heated at 130 °C for 24 h. The new analytical method was developed for PM in the concentration range of 25-200 ng/ml, and the intra-day and inter-day variability and the stability of PM were determined successfully. The developed analytical method was found to be sensitive, accurate, and precise for the quantification of PM. Copyright © 2010 John Wiley & Sons, Ltd.
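
    A worked sketch of the quantification step behind any such method: fitting a linear calibration over the stated 25-200 ng/ml range and back-calculating an unknown. The peak areas are invented; only the concentration range comes from the abstract.

    ```python
    # Linear calibration (peak area vs concentration) and back-calculation,
    # the routine core of quantitative RP-HPLC determination.
    import numpy as np

    conc = np.array([25, 50, 100, 150, 200], dtype=float)          # ng/ml standards
    area = np.array([1210, 2395, 4820, 7150, 9530], dtype=float)   # detector counts

    slope, intercept = np.polyfit(conc, area, 1)
    r = np.corrcoef(conc, area)[0, 1]

    def back_calc(peak_area: float) -> float:
        """Concentration of an unknown from its measured peak area."""
        return (peak_area - intercept) / slope

    print(f"area = {slope:.2f}*conc + {intercept:.2f}, r^2 = {r**2:.4f}")
    print(f"unknown with area 5300 -> {back_calc(5300):.1f} ng/ml")
    ```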

  20. Customer Intelligence Analytics on Social Networks

    Directory of Open Access Journals (Sweden)

    Brano MARKIĆ

    2016-08-01

Discovering the needs, habits and behavior of consumers is the primary task of marketing analytics. It requires integrating marketing and analytical skills with IT skills. Such knowledge integration allows access to data (structured and unstructured), their analysis, and the discovery of information about the opinions, attitudes, needs and behavior of customers. The paper sets out the hypothesis that software tools can collect data (messages) from social networks, analyze the content of those messages, and learn the attitudes of customers towards a product, service or tourist destination, with the ultimate goal of improving customer relations. Experimental results are based on an analysis of the content of the social network Facebook using packages and functions of the R language. The language showed satisfactory application and development power in the analysis of textual data on social networks for marketing analytics.
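
    The study used R packages for its content analysis; for consistency with the other sketches in this collection, the same kind of analysis is shown below in Python, with invented posts and a toy sentiment lexicon.

    ```python
    # Minimal content analysis of social posts: tokenize, count terms, and
    # score each post against a small sentiment lexicon.
    from collections import Counter
    import re

    posts = [                                 # stand-ins for fetched messages
        "Great service and friendly staff, will come back!",
        "Terrible wait time. Disappointed with the service.",
        "Lovely destination, great food.",
    ]
    lexicon = {"great": 1, "friendly": 1, "lovely": 1,
               "terrible": -1, "disappointed": -1}

    def tokens(text: str) -> list:
        return re.findall(r"[a-z']+", text.lower())

    freq = Counter(t for p in posts for t in tokens(p))
    print("top terms:", freq.most_common(3))

    for p in posts:
        score = sum(lexicon.get(t, 0) for t in tokens(p))
        print(f"{score:+d}  {p}")
    ```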

  1. Development of a transportation planning tool

    International Nuclear Information System (INIS)

    Funkhouser, B.R.; Moyer, J.W.; Ballweg, E.L.

    1994-01-01

This paper describes the application of simulation modeling and logistics techniques to the development of a planning tool for the Department of Energy (DOE). The focus of the Transportation Planning Model (TPM) tool is to aid DOE and Sandia analysts in planning future fleet sizes, driver and support personnel numbers, base site locations, and resource balancing among the base sites. The design approach is to develop a rapid modeling environment which allows analysts to easily set up a shipment scenario and perform multiple "what if" evaluations. The TPM is being developed on personal computers using commercial off-the-shelf (COTS) software tools under the Windows® operating environment. Prototype development of the TPM has been completed

  2. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    Science.gov (United States)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
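
    The sketch below illustrates, on a toy one-variable "temperature" model (not the CEA analysis), why supplying an exact gradient to a gradient-based optimizer reduces function evaluations compared with finite differencing.

    ```python
    # Analytic vs finite-difference gradients in a gradient-based optimizer.
    from scipy.optimize import minimize

    def neg_temp(phi):
        """Toy combustion-temperature model peaking near phi ~ 1.05."""
        return -(2400.0 - 1500.0 * (phi[0] - 1.05) ** 2)

    def neg_temp_grad(phi):
        """Exact (analytic) derivative of the toy model."""
        return [3000.0 * (phi[0] - 1.05)]

    fd = minimize(neg_temp, x0=[0.5], method="BFGS")                  # FD gradient
    an = minimize(neg_temp, x0=[0.5], jac=neg_temp_grad, method="BFGS")
    print("finite diff: phi* =", fd.x[0], "nfev =", fd.nfev)
    print("analytic:    phi* =", an.x[0], "nfev =", an.nfev)
    # The analytic run reaches the same optimum with far fewer model calls.
    ```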

  3. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry

    Directory of Open Access Journals (Sweden)

    Marek Tobiszewski

    2015-06-01

The concept of green chemistry is widely recognized in chemical laboratories. To properly measure the environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of the development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluating the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale, are described. Both the well-established and the recently developed green analytical chemistry metrics, including NEMI labeling and the analytical Eco-Scale, are presented. Additionally, this paper focuses on the possibility of using multivariate statistics in the evaluation of the environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.

  4. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    Science.gov (United States)

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-06-12

The concept of green chemistry is widely recognized in chemical laboratories. To properly measure the environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of the development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluating the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale, are described. Both the well-established and the recently developed green analytical chemistry metrics, including NEMI labeling and the analytical Eco-Scale, are presented. Additionally, this paper focuses on the possibility of using multivariate statistics in the evaluation of the environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.
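
    Of the metrics listed in the two records above, the E-Factor is simple enough to show directly: kilograms of waste per kilogram of product. The batch masses below are illustrative.

    ```python
    # Sheldon's E-Factor: everything that goes in, minus the product that
    # comes out, per kilogram of product. Lower is greener. (Whether solvent
    # losses are counted varies between authors; here they are included.)

    def e_factor(total_input_kg: float, product_kg: float) -> float:
        return (total_input_kg - product_kg) / product_kg

    # hypothetical batch: 120 kg raw materials + 300 kg solvent -> 45 kg product
    print(f"E-Factor = {e_factor(120 + 300, 45):.1f} kg waste / kg product")
    ```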

  5. Manufacturing data analytics using a virtual factory representation.

    Science.gov (United States)

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise but also huge data collection and analysis efforts. This paper presents an approach, within the frameworks of the Design Science Research Methodology and prototyping, to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing by reducing the development effort. Manufacturing simulation models are presented both as data analytics applications in themselves and as support for other data analytics applications, serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. The virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, allowing analysis at varying levels of detail. A path is proposed for implementing the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.
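
    A minimal sketch of the "simulation as data generator" idea, assuming a toy single-machine queue (Lindley's recursion); none of the parameters come from the paper.

    ```python
    # A tiny machine model emits synthetic cycle-time records, which a
    # downstream analytics step then summarizes.
    import random
    import statistics

    def simulate_jobs(n: int, mean_service: float = 5.0,
                      jitter: float = 1.5, interarrival: float = 6.0):
        """Yield synthetic (job_id, cycle_time) records via Lindley's recursion."""
        wait = 0.0
        for job in range(n):
            service = max(0.1, random.gauss(mean_service, jitter))
            yield job, wait + service                       # cycle = wait + service
            wait = max(0.0, wait + service - interarrival)  # next job's wait

    random.seed(1)
    cycle_times = [ct for _, ct in simulate_jobs(500)]
    print(f"mean cycle time {statistics.mean(cycle_times):.2f} min, "
          f"p95 {sorted(cycle_times)[int(0.95 * len(cycle_times))]:.2f} min")
    ```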

  6. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  7. Teaching Theory Construction With Initial Grounded Theory Tools: A Reflection on Lessons and Learning.

    Science.gov (United States)

    Charmaz, Kathy

    2015-12-01

    This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.

  8. Surveillance of emerging drugs of abuse in Hong Kong: validation of an analytical tool.

    Science.gov (United States)

    Tang, Magdalene H Y; Ching, C K; Tse, M L; Ng, Carol; Lee, Caroline; Chong, Y K; Wong, Watson; Mak, Tony W L

    2015-04-01

    To validate a locally developed chromatography-based method to monitor emerging drugs of abuse whilst performing regular drug testing in abusers. Cross-sectional study. Eleven regional hospitals, seven social service units, and a tertiary level clinical toxicology laboratory in Hong Kong. A total of 972 drug abusers and high-risk individuals were recruited from acute, rehabilitation, and high-risk settings between 1 November 2011 and 31 July 2013. A subset of the participants was of South Asian ethnicity. In total, 2000 urine or hair specimens were collected. Proof of concept that surveillance of emerging drugs of abuse can be performed whilst conducting routine drug of abuse testing in patients. The method was successfully applied to 2000 samples with three emerging drugs of abuse detected in five samples: PMMA (paramethoxymethamphetamine), TFMPP [1-(3-trifluoromethylphenyl)piperazine], and methcathinone. The method also detected conventional drugs of abuse, with codeine, methadone, heroin, methamphetamine, and ketamine being the most frequently detected drugs. Other findings included the observation that South Asians had significantly higher rates of using opiates such as heroin, methadone, and codeine; and that ketamine and cocaine had significantly higher detection rates in acute subjects compared with the rehabilitation population. This locally developed analytical method is a valid tool for simultaneous surveillance of emerging drugs of abuse and routine drug monitoring of patients at minimal additional cost and effort. Continued, proactive surveillance and early identification of emerging drugs will facilitate prompt clinical, social, and legislative management.

  9. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    Science.gov (United States)

    2011-12-01

In this FHWA-sponsored pooled-fund study, a set of decision-making tools based on the Analytic Hierarchy Process (AHP) was developed. The tool set is intended for transportation specialists and decision-makers to determine whether ABC is more effective ...

  10. Process analytical technology (PAT) for biopharmaceuticals

    DEFF Research Database (Denmark)

    Glassey, Jarka; Gernaey, Krist; Clemens, Christoph

    2011-01-01

    Process analytical technology (PAT), the regulatory initiative for building in quality to pharmaceutical manufacturing, has a great potential for improving biopharmaceutical production. The recommended analytical tools for building in quality, multivariate data analysis, mechanistic modeling, novel...

  11. Analytical Chemistry in the Regulatory Science of Medical Devices.

    Science.gov (United States)

    Wang, Yi; Guan, Allan; Wickramasekara, Samanthi; Phillips, K Scott

    2018-06-12

    In the United States, regulatory science is the science of developing new tools, standards, and approaches to assess the safety, efficacy, quality, and performance of all Food and Drug Administration-regulated products. Good regulatory science facilitates consumer access to innovative medical devices that are safe and effective throughout the Total Product Life Cycle (TPLC). Because the need to measure things is fundamental to the regulatory science of medical devices, analytical chemistry plays an important role, contributing to medical device technology in two ways: It can be an integral part of an innovative medical device (e.g., diagnostic devices), and it can be used to support medical device development throughout the TPLC. In this review, we focus on analytical chemistry as a tool for the regulatory science of medical devices. We highlight recent progress in companion diagnostics, medical devices on chips for preclinical testing, mass spectrometry for postmarket monitoring, and detection/characterization of bacterial biofilm to prevent infections.

  12. Android development tools for Eclipse

    CERN Document Server

    Shah, Sanjay

    2013-01-01

A standard tutorial aimed at developing Android applications in a practical manner. Android Development Tools for Eclipse is aimed at beginners and existing developers who want to learn more about Android development. It is assumed that you have experience in Java programming and that you have used an IDE for development.

  13. A workflow learning model to improve geovisual analytics utility.

    Science.gov (United States)

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with developing new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and its utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) in geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on

  14. Pilot evaluation of a continuing professional development tool for developing leadership skills.

    Science.gov (United States)

    Patterson, Brandon J; Chang, Elizabeth H; Witry, Matthew J; Garza, Oscar W; Trewet, CoraLynn B

    2013-01-01

    Strategies are needed to assure essential nonclinical competencies, such as leadership, can be gained using a continuing professional development (CPD) framework. The objective of this study was to explore student pharmacists' utilization and perceived effectiveness of a CPD tool for leadership development in an elective course. Students completed 2 CPD cycles during a semester-long leadership elective using a CPD tool. A questionnaire was used to measure students' perceptions of utility, self-efficacy, and satisfaction in completing CPD cycles when using a tool to aid in this process. The CPD tool was completed twice by 7 students. On average, students spent nearly 5 hours per CPD cycle. More than half (57.1%) scored themselves as successful or very successful in achieving their learning plans, and most (71.4%) found the tool somewhat useful in developing their leadership skills. Some perceived that the tool provided a systematic way to engage in leadership development, whereas others found it difficult to use. In this pilot study, most student pharmacists successfully achieved a leadership development plan and found the CPD tool useful. Providing students with more guidance may help facilitate use and effectiveness of CPD tools. There is a need to continue to develop and refine tools that assist in the CPD of pharmacy practitioners at all levels. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Development of analytical techniques in support of waste and effluent characterization

    International Nuclear Information System (INIS)

    Reed, W.J.

    1991-01-01

    The Analytical Services Group within Sellafield Technical Department has been established for >40 yr and employs >150 analysts. The group operates >400 analytical methods across a wide range of techniques and has a yearly workload of ∼250,000 determinations. The group operates under a quality system based on statistical process control that has achieved national recognition through the accreditation of its mass spectrometry and radiochemical services to the standard of national testing laboratories. The group offers services ranging from the characterization of highly active wastes to trace elemental and radiochemical measurements in environmental, biological, and effluent streams. The group has vast experience in the management of analytical services to tight time scales and has pioneered developments not only in analytical instrumentation, but also in the adaptation of equipment to radioactive environments and the design of dedicated analytical facilities

  16. Remote tool development for nuclear dismantling operations

    International Nuclear Information System (INIS)

    Craig, G.; Ferlay, J.C.; Ieracitano, F.

    2003-01-01

Remote tool systems for nuclear dismantling operations require careful design and development, not only to perform their given duty but to perform it safely within the constraints imposed by harsh environmental conditions. Framatome ANP NUCLEAR SERVICES has long developed and qualified equipment for specific maintenance operations on nuclear reactors. The tool development methodology from this activity has since been adapted to resolve some very challenging reactor dismantling operations, which are demonstrated in this paper. Each nuclear decommissioning project is a unique case, and technical characterisation data are generally incomplete. The development of the dismantling methodology and associated equipment is by and large an iterative process combining design and simulation with feasibility and validation testing. The first stage of the development process involves feasibility testing of industrial tools and examining the adaptations necessary to control and deploy the tool remotely with respect to the chosen methodology and environmental constraints. This results in a prototype tool and deployment system to validate the basic process. The second stage involves detailed design, which integrates any remaining technical and environmental constraints. At the end of this stage, tools and deployment systems, operators and operating procedures are qualified on full-scale mock-ups. (authors)

  17. Analytical and numerical tools for vacuum systems

    CERN Document Server

    Kersevan, R

    2007-01-01

    Modern particle accelerators have reached a level of sophistication which requires a thorough analysis of all their sub-systems. Among the latter, the vacuum system is often a major contributor to the operating performance of a particle accelerator. The vacuum engineer nowadays has a large choice of computational schemes and tools for the correct analysis, design, and engineering of the vacuum system. This paper is a review of the different types of algorithms and methodologies which have been developed and employed in the field since the birth of vacuum technology. The different levels of detail between simple back-of-the-envelope calculations and more complex numerical analysis are discussed by means of comparisons. The domain of applicability of each method is discussed, together with its pros and cons.
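
    As an illustration of the back-of-the-envelope end of that spectrum, the short sketch below (not taken from the paper; the relations are the standard molecular-flow formulas for air at 20 °C, dimensions in cm, results in l/s) evaluates the conductance of a round tube and the effective pumping speed it permits:

        # Back-of-the-envelope vacuum calculation: molecular-flow conductance
        # of a round tube and the effective pumping speed seen by the chamber.
        def tube_conductance(diameter_cm: float, length_cm: float) -> float:
            """Molecular-flow conductance of a long round tube (air, 20 C), l/s."""
            return 12.1 * diameter_cm**3 / length_cm

        def effective_speed(pump_speed: float, conductance: float) -> float:
            """Conductances combine with the pump speed like series resistors."""
            return 1.0 / (1.0 / pump_speed + 1.0 / conductance)

        c = tube_conductance(diameter_cm=10.0, length_cm=100.0)   # ~121 l/s
        print(f"tube conductance: {c:.0f} l/s")
        print(f"effective speed with a 500 l/s pump: {effective_speed(500.0, c):.0f} l/s")

    The series-addition rule explains why a long, narrow duct, rather than the pump itself, often limits the pumping speed a chamber actually sees.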

  18. The use of selective adsorbents in capillary electrophoresis-mass spectrometry for analyte preconcentration and microreactions: a powerful three-dimensional tool for multiple chemical and biological applications.

    Science.gov (United States)

    Guzman, N A; Stubbs, R J

    2001-10-01

    Much attention has recently been directed to the development and application of online sample preconcentration and microreactions in capillary electrophoresis using selective adsorbents based on chemical or biological specificity. The basic principle involves two interacting chemical or biological systems with high selectivity and affinity for each other. These molecular interactions in nature usually involve noncovalent and reversible chemical processes. Properly bound to a solid support, an "affinity ligand" can selectively adsorb a "target analyte" found in a simple or complex mixture at a wide range of concentrations. As a result, the isolated analyte is enriched and highly purified. When this affinity technique, allowing noncovalent chemical interactions and biochemical reactions to occur, is coupled on-line to high-resolution capillary electrophoresis and mass spectrometry, a powerful tool for obtaining chemical and biological information is created. This paper describes the concept of biological recognition and affinity interaction on-line with high-resolution separation, the fabrication of an "analyte concentrator-microreactor", optimization conditions of adsorption and desorption, the coupling to mass spectrometry, and various applications of clinical and pharmaceutical interest.

  19. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    Science.gov (United States)

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…

  20. Capitalizing on App Development Tools and Technologies

    Science.gov (United States)

    Luterbach, Kenneth J.; Hubbell, Kenneth R.

    2015-01-01

    Instructional developers and others creating apps must choose from a wide variety of app development tools and technologies. Some app development tools have incorporated visual programming features, which enable some drag and drop coding and contextual programming. While those features help novices begin programming with greater ease, questions…

  1. Training the next generation analyst using red cell analytics

    Science.gov (United States)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security and risk and intelligence training.

  2. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    Science.gov (United States)

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool

  3. CorRECTreatment: a web-based decision support tool for rectal cancer treatment that uses the analytic hierarchy process and decision tree.

    Science.gov (United States)

    Suner, A; Karakülah, G; Dicle, O; Sökmen, S; Çelikoğlu, C C

    2015-01-01

    The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied to data from 388 patients collected retrospectively. A web-based decision support tool named corRECTreatment was then developed. The compatibility of the treatment recommendations of the expert opinion and the decision support tool was examined for consistency. Two surgeons were requested to recommend a treatment and an overall survival value for the treatment among 20 different cases that we selected and turned into scenarios from among the most common and rare treatment options in the patient data set. In the AHP analyses of the criteria, it was found that the matrices generated for both decision steps were consistent (consistency ratio < 0.1). When the recommendations of the tool were compared with the decisions of the experts, the consistency value for the most frequent cases was found to be 80% for the first decision step and 100% for the second decision step. Similarly, for rare cases consistency was 50% for the first decision step and 80% for the second decision step. The decision model and corRECTreatment, developed by applying these on real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and facilitate them in making projections about treatment options.
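
    For readers unfamiliar with the AHP machinery used here, the sketch below derives priority weights from a pairwise comparison matrix (principal eigenvector) and checks Saaty's consistency ratio against the usual 0.1 threshold; the matrix entries are hypothetical, not the clinical criteria of corRECTreatment:

        import numpy as np

        # Saaty's random consistency indices (needed for n >= 3)
        RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

        def ahp_weights(pairwise: np.ndarray):
            """Priority weights (principal eigenvector) and consistency ratio."""
            eigvals, eigvecs = np.linalg.eig(pairwise)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            n = pairwise.shape[0]
            ci = (eigvals[k].real - n) / (n - 1)     # consistency index
            return w, ci / RI[n]                     # CR < 0.1 => consistent

        # Hypothetical 3-criterion reciprocal matrix on Saaty's 1-9 scale
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        w, cr = ahp_weights(A)
        print("weights:", np.round(w, 3), "CR:", round(cr, 3))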

  4. Development of remote handling tools and equipment

    International Nuclear Information System (INIS)

    Nakahira, Masataka; Oka, Kiyoshi; Taguchi, Kou; Ito, Akira; Fukatsu, Seiichi; Oda, Yasushi; Kajiura, Soji; Yamazaki, Seiichiro; Aoyama, Kazuo.

    1997-01-01

    The remote handling (RH) tools and equipment development in ITER focuses mainly on the welding and cutting techniques, weld inspection and double-seal door which are essential factors in the replacement of in-vessel components such as the divertor and blanket. The conceptual design of these RH tools and equipment has been defined through the ITER engineering design activity (EDA). Similarly, elementary R and D of the RH tools and equipment has been extensively performed to accumulate a technological data base for process and performance qualification. Based on this data, fabrication of full-scale RH tools and equipment is in progress. A prototypical bore tool for pipe welding and cutting has already been fabricated and is currently undergoing integrated performance tests. This paper describes the design outline of the RH tools and equipment related to in-vessel component maintenance, and highlights the current status of RH tools and equipment development by the Japan Home Team as an ITER R and D program. This paper also includes an outline of insulation joint and quick-pipe connector development, which has also been conducted through the ITER R and D program in order to standardize RH operations and components. (author)

  5. An analytical model for the study of a small LFR core dynamics: development and benchmark

    International Nuclear Information System (INIS)

    Bortot, S.; Cammi, A.; Lorenzi, S.; Moisseytsev, A.

    2011-01-01

    An analytical model for the study of the control-oriented dynamics of a small Lead-cooled Fast Reactor (LFR) has been developed, aimed at providing a useful, very flexible and straightforward, though accurate, tool allowing relatively quick transient design-basis and stability analyses. A simplified lumped-parameter approach has been adopted to couple neutronics and thermal-hydraulics: the point-kinetics approximation has been employed and an average-temperature heat-exchange model has been implemented. The reactor transient responses following postulated accident initiators such as Unprotected Control Rod Withdrawal (UTOP), Loss of Heat Sink (ULOHS) and Loss of Flow (ULOF) have been studied for a MOX and a metal-fuelled core in the Beginning of Cycle (BoC) and End of Cycle (EoC) configurations. A benchmark analysis has then been performed by means of the SAS4A/SASSYS-1 Liquid Metal Reactor Code System, in which a core model based on three representative channels has been built with the purpose of providing verification for the analytical outcomes and indicating how the latter relate to more realistic one-dimensional calculations. As a general result, responses concerning the main core characteristics (namely, power, reactivity, etc.) have turned out to be mutually consistent in terms of both steady-state absolute figures and transient developments, showing discrepancies of the order of only some percents, thus confirming a very satisfactory agreement. (author)
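
    A minimal sketch of the point-kinetics approximation mentioned above, reduced to a single delayed-neutron group with illustrative parameters (not the MOX or metal-fuelled core data of the paper):

        # Point kinetics with one delayed-neutron group; a 10-cent reactivity
        # step is inserted at t = 1 s and the relative power is followed.
        from scipy.integrate import solve_ivp

        BETA, GEN_TIME, LAM = 0.0035, 4.0e-7, 0.08   # illustrative values

        def point_kinetics(t, y, rho):
            n, c = y                                  # relative power, precursors
            dn = (rho(t) - BETA) / GEN_TIME * n + LAM * c
            dc = BETA / GEN_TIME * n - LAM * c
            return [dn, dc]

        rho = lambda t: 0.1 * BETA if t > 1.0 else 0.0
        y0 = [1.0, BETA / (GEN_TIME * LAM)]           # steady-state precursor level
        sol = solve_ivp(point_kinetics, (0.0, 20.0), y0, method="Radau", args=(rho,))
        print(f"relative power at t = 20 s: {sol.y[0, -1]:.2f}")

    The stiff Radau integrator is chosen because the prompt-neutron time scale (the generation time) is orders of magnitude faster than the precursor decay.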

  6. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    Science.gov (United States)

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  7. Next generation interactive tool as a backbone for universal access to electricity

    DEFF Research Database (Denmark)

    Moner-Girona, Magda; Puig, Daniel; Mulugetta, Yacob

    2018-01-01

    Energy planning in rural areas and in developing countries most often relies on the outputs of specialised analytical tools, of which only a handful have been developed. Over the years these tools have been upgraded, and the newest among them take into consideration, to a greater or lesser extent...

  8. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  9. Updates in metabolomics tools and resources: 2014-2015.

    Science.gov (United States)

    Misra, Biswapriya B; van der Hooft, Justin J J

    2016-01-01

    Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platforms (MS or NMR spectroscopy based) used for data acquisition. Improved machinery in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources--in the form of tools, software, and databases--is currently lacking. Thus, here we provide an overview of freely available, open-source tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of the recent developments in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and resources for data processing, data annotation, and data visualization in MS and NMR-based metabolomics. Most tools described in this review are dedicated to untargeted metabolomics workflows; however, some more specialist tools are described as well. All described tools and resources, including their analytical and computational platform dependencies, are summarized in an overview table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Developing a MATLAB(registered)-Based Tool for Visualization and Transformation

    Science.gov (United States)

    Anderton, Blake J.

    2003-01-01

    An important step in the structural design and development of spacecraft is the experimental identification of a structure's modal characteristics, such as its natural frequencies and modes of vibration. These characteristics are vital to developing a representative model of any given structure or analyzing the range of input frequencies that can be handled by a particular structure. When setting up such a representative model of a structure, careful measurements using precision equipment (such as accelerometers and instrumented hammers) must be made on many individual points of the structure in question. The coordinate location of each data point is used to construct a wireframe geometric model of the structure. Response measurements obtained from the accelerometers are used to generate the modal shapes of the particular structure. Graphically, this is displayed as a combination of the ways a structure will ideally respond to a specified force input. Two types of models of the tested structure are often used in modal analysis: an analytic model showing expected behavior of the structure, and an experimental model showing measured results due to observed phenomena. To evaluate the results from the experimental model, a comparison of analytic and experimental results must be made between the two models. However, comparisons between these two models become difficult when the two coordinate orientations differ in a manner such that results are displayed in an unclear fashion. Such a problem points to the need for a tool that not only communicates a graphical image of a structure's wireframe geometry based on various measurement locations (called nodes), but also allows for a type of transformation of the image's coordinate geometry so that a model's coordinate orientation is made to match the orientation of another model. Such a tool should also be designed so that it is able to construct coordinate geometry based on many different listings of node locations and is able

  11. Develop risk-based procurement management tools for SMEs

    NARCIS (Netherlands)

    Staal, Anne; Hagelaar, Geoffrey; Walhof, Gert; Holman, Richard

    2016-01-01

    This paper provides guidance for developing risk-based management tools to improve the procurement (purchasing) performance of SMEs. Extant academic literature offers only little support for developing such tools and does not consider the wide variety of SMEs. The paper defines a procurement tool for

  12. Helios: Understanding Solar Evolution Through Text Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Randazzese, Lucien [SRI International, Menlo Park, CA (United States)

    2016-12-02

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.
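
    The bibliometric core of such a platform can be pictured as counting how often a technology term appears per publication year and flagging jumps as candidate "breakthrough" signals; the toy sketch below uses invented records and is not the Helios implementation:

        from collections import Counter

        # Invented (year, title/abstract text) records standing in for the
        # ingested papers, patents, and funding summaries.
        records = [
            (2009, "CdTe thin film efficiency gains"),
            (2010, "dye-sensitized solar cells with a new electrolyte"),
            (2010, "CdTe module reliability study"),
            (2011, "record CdTe cell efficiency breakthrough"),
        ]

        def term_trend(records, term):
            """Occurrences of a term per publication year."""
            counts = Counter(year for year, text in records if term in text.lower())
            return dict(sorted(counts.items()))

        print(term_trend(records, "cdte"))   # {2009: 1, 2010: 1, 2011: 1}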

  13. Data Analytics in CRM Processes: A Literature Review

    Directory of Open Access Journals (Sweden)

    Gončarovs Pāvels

    2017-12-01

    Full Text Available Nowadays, the data scarcity problem has been supplanted by the data deluge problem. Marketers and Customer Relationship Management (CRM) specialists have access to rich data on consumer behaviour. The current challenge is the effective utilisation of these data in CRM processes and the selection of appropriate data analytics techniques. Data analytics techniques help find hidden patterns in data. The present paper explores the characteristics of data analytics as an integrated tool in CRM for sales managers. The paper aims at analysing some of the different analytics methods and tools which can be used for continuous improvement of CRM processes. A systematic literature review has been conducted to achieve this goal. The results of the review highlight the most frequently considered CRM processes in the context of data analytics.

  14. Applying CASE Tools for On-Board Software Development

    Science.gov (United States)

    Brammer, U.; Hönle, A.

    For many space projects, software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix) featuring UML, and ISG (BSSE), which provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  15. Development and Validation of a Hypersonic Vehicle Design Tool Based On Waverider Design Technique

    Science.gov (United States)

    Dasque, Nastassja

    Methodologies for a tool capable of assisting design initiatives for practical waverider-based hypersonic vehicles were developed and validated. The design space for vehicle surfaces was formed using an algorithm that coupled directional derivatives with the conservation laws to determine a flow field defined by a set of post-shock streamlines. The design space is used to construct an ideal waverider with a sharp leading edge. A blunting method was developed to modify the ideal shapes to a more practical geometry for real-world application. Empirical and analytical relations were then systematically applied to the resulting geometries to determine local pressure, skin friction and heat flux. For the ideal portion of the geometry, flat-plate relations for compressible flow were applied. For the blunted portion of the geometry, modified Newtonian theory, Fay-Riddell theory and the modified Reynolds analogy were applied. The design and analysis methods were validated using analytical solutions as well as empirical and numerical data. The streamline solution for the flow field generation technique was compared with a Taylor-Maccoll solution and showed very good agreement. The relationship between the local Stanton number and skin friction coefficient with local Reynolds number along the ideal portion of the body showed good agreement with experimental data. In addition, an automated grid generation routine was formulated to construct a structured mesh around the resulting geometries in preparation for Computational Fluid Dynamics analysis. The overall analysis of the waverider body using the tool was then compared to CFD studies. The CFD flow field showed very good agreement with the design space. However, the distribution of the surface properties, while close to the CFD results, did not show the same level of agreement.
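
    Of the surface-property relations listed above, modified Newtonian theory is the most compact; the sketch below applies it under common assumptions (calorically perfect air, gamma = 1.4), with the stagnation pressure coefficient taken from the Rayleigh pitot formula:

        import numpy as np

        GAMMA = 1.4

        def cp_max(mach: float) -> float:
            """Stagnation-point Cp behind a normal shock (Rayleigh pitot formula)."""
            g, m2 = GAMMA, mach**2
            p02_pinf = ((g + 1)**2 * m2 / (4*g*m2 - 2*(g - 1)))**(g/(g - 1)) \
                       * (1 - g + 2*g*m2) / (g + 1)
            return 2.0 / (g * m2) * (p02_pinf - 1.0)

        def cp_modified_newtonian(mach: float, theta_rad):
            """Blunt-body surface pressure: Cp = Cp_max * sin^2(theta)."""
            return cp_max(mach) * np.sin(theta_rad)**2

        # local surface inclinations of 90, 45 and 10 degrees at Mach 10
        print(cp_modified_newtonian(10.0, np.radians([90.0, 45.0, 10.0])))
        # -> approximately [1.83, 0.92, 0.06]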

  16. Validation of designing tools as part of nuclear pump development process

    International Nuclear Information System (INIS)

    Klemm, T.; Sehr, F.; Spenner, P.; Fritz, J.

    2010-01-01

    Nuclear pumps are characterized by high safety standards and operational reliability as well as long life cycles. For the design process it is common to have a down-scaled model pump to qualify operating data and simulate exceptional operating conditions. In the case of modifications of the pump design compared to existing reactor coolant pumps, a model pump is required to develop the methods and tools used to design the full-scale pump. In the presented case, the model has a geometry scale of 1:2 relative to the full-scale pump. The experimental data from the model pump are the basis for validating the methods and tools which are applied in the design process of the full-scale pump. In this paper, the selection of qualified tools and the validation process are demonstrated exemplarily on a cooling circuit. The aim is to predict the resulting flow rate. Tools are chosen for the different components depending on the benefit-to-effort ratio. For elementary flow phenomena, such as fluid flow in straight pipes or gaps, analytic or empirical laws can be used. For more complex flow situations, numerical methods are utilized. The main focus is on the validation process of the applied numerical flow simulation. In this case, not only should integral data be compared; it is also necessary to validate the local flow structure of the numerical flow simulation to avoid systematic errors in CFD model generation. Due to the complex design, internal flow measurements are not possible. For that reason, simple comparisons with similar flow test cases are used. The results of this study show that the flow simulation data closely match measured integral pump and test-case data. With this validation it is now possible to qualify CFD simulations as a design tool for the full-scale pump in similar cooling circuits. (authors)
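
    As a hedged illustration of the simplest scaling step between a 1:2 model pump and the full-scale machine, the classical pump affinity laws are sketched below; the validated design tools described in the paper are of course far more detailed, and the numbers are placeholders:

        # Pump affinity laws: Q ~ N * D^3, H ~ N^2 * D^2
        # (N = rotational speed ratio, D = diameter ratio; same fluid assumed).
        def scale_flow(q_model: float, d_ratio: float, n_ratio: float = 1.0) -> float:
            return q_model * n_ratio * d_ratio**3

        def scale_head(h_model: float, d_ratio: float, n_ratio: float = 1.0) -> float:
            return h_model * n_ratio**2 * d_ratio**2

        q_full = scale_flow(q_model=0.5, d_ratio=2.0)    # m3/s, model -> full scale
        h_full = scale_head(h_model=20.0, d_ratio=2.0)   # m
        print(f"full-scale duty point: Q = {q_full} m3/s, H = {h_full} m")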

  17. Microfabricated tools for manipulation and analysis of magnetic microcarriers

    International Nuclear Information System (INIS)

    Tondra, Mark; Popple, Anthony; Jander, Albrecht; Millen, Rachel L.; Pekas, Nikola; Porter, Marc D.

    2005-01-01

    Tools for manipulating and detecting magnetic microcarriers are being developed with microscale features. Microfabricated giant magnetoresistive (GMR) sensors and wires are used for detection, and for creating high local field gradients. Microfluidic structures are added to control flow, and positioning of samples and microcarriers. These tools are designed for work in analytical chemistry and biology

  18. Analytical developments in ICP-MS for arsenic and selenium speciation. Application to granitic waters

    International Nuclear Information System (INIS)

    Garraud, Herve

    1999-01-01

    Nuclear waste storage in geological formations requires an understanding of the physico-chemistry of groundwater interactions with the surrounding rocks. Redox potential measurements and speciation calculated from geochemical modelling are not sufficient for determining water reactivity. We have thus chosen to carry out experimental speciation by developing analytical tools that are sensitive while respecting the chemical identity of the species. We studied two redox indicators from reference sites (thermal waters from the Pyrenees, France): arsenic and selenium. First, we determined the concentrations of the major ions (sulphide, sulphate, chloride, fluoride, carbonate, Na, K, Ca). Speciation was conducted by HPLC hyphenated to quadrupole ICP-MS and high-resolution ICP-MS. These analyses revealed the presence of two new arsenic species in solution, in addition to a high reactivity of these waters during stability studies. A sampling, storage and analysis method is described. (author) [fr]

  19. Non-commutative tools for topological insulators

    International Nuclear Information System (INIS)

    Prodan, Emil

    2010-01-01

    This paper reviews several analytic tools for the field of topological insulators, developed with the aid of non-commutative calculus and geometry. The set of tools includes bulk topological invariants defined directly in the thermodynamic limit and in the presence of disorder, whose robustness is shown to have nontrivial physical consequences for the bulk states. The set of tools also includes a general relation between the current of an observable and its edge index, a relation that can be used to investigate the robustness of the edge states against disorder. The paper focuses on the motivations behind creating such tools and on how to use them.
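
    A representative example of such a bulk invariant, quoted from the general non-commutative geometry literature rather than transcribed from this particular paper, is the first Chern number of the Fermi projection P, which remains well defined in the thermodynamic limit and with disorder:

        % Non-commutative first Chern number of the Fermi projection P
        \[
          C_1 \;=\; 2\pi i \,\mathcal{T}\!\left( P \left[ \partial_1 P,\ \partial_2 P \right] \right),
          \qquad
          \partial_j P \;\equiv\; -\,i\,[x_j,\, P],
        \]
        % \mathcal{T} = trace per unit volume, x_j = position operators

    The derivations ∂_j replace the Brillouin-zone derivatives of the clean, periodic case, which is what allows the invariant to be evaluated directly in real space for disordered samples.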

  20. Evaluation of Four Different Analytical Tools to Determine the Regional Origin of Gastrodia elata and Rehmannia glutinosa on the Basis of Metabolomics Study

    Directory of Open Access Journals (Sweden)

    Dong-Kyu Lee

    2014-05-01

    Full Text Available Chemical profiles of medicinal plants can be dissimilar depending on the cultivation environment, which may influence their therapeutic efficacy. Accordingly, the regional origin of medicinal plants should be authenticated for correct evaluation of their medicinal and market values. Metabolomics has been found very useful for discriminating the origin of many plants. Choosing an adequate analytical tool is an essential step, because different chemical profiles with different detection ranges will be produced depending on the choice. In this study, four analytical tools, Fourier transform near-infrared spectroscopy (FT-NIR), 1H-nuclear magnetic resonance spectroscopy (1H-NMR), liquid chromatography-mass spectrometry (LC-MS), and gas chromatography-mass spectrometry (GC-MS), were applied in parallel to the same samples of two popular medicinal plants (Gastrodia elata and Rehmannia glutinosa) cultivated either in Korea or China. The classification abilities of four discriminant models for each plant were evaluated based on the misclassification rate and Q2 obtained from principal component analysis (PCA) and orthogonal projection to latent structures-discriminant analysis (OPLS-DA), respectively. 1H-NMR and LC-MS, which were the best techniques for G. elata and R. glutinosa, respectively, were generally preferable for origin discrimination over the others. Integrating all the results, 1H-NMR is the most prominent technique for discriminating the origins of the two plants. Nonetheless, this study suggests that preliminary screening is essential to determine the most suitable analytical tool and statistical method, which will ensure the dependability of metabolomics-based discrimination.
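
    The evaluation loop described above (fit a discriminant model, then score it by misclassification rate and cross-validated Q2) can be sketched as follows. Scikit-learn has no OPLS-DA, so an ordinary PLS-DA stands in, and the data are synthetic placeholders rather than the G. elata or R. glutinosa profiles:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        # Synthetic spectral matrix X (samples x variables) and origin labels
        # y (0 = Korea, 1 = China), with a small injected class difference.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 200))
        y = np.repeat([0, 1], 20)
        X[y == 1, :5] += 1.0

        pls = PLSRegression(n_components=2)
        y_cv = cross_val_predict(pls, X, y.astype(float), cv=7).ravel()
        q2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
        misclass = np.mean((y_cv > 0.5).astype(int) != y)
        print(f"Q2 = {q2:.2f}, misclassification rate = {misclass:.1%}")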

  1. Review of the Development of Learning Analytics Applied in College-Level Institutes

    Directory of Open Access Journals (Sweden)

    Ken-Zen Chen

    2014-07-01

    Full Text Available This article focuses on the recent development of Learning Analytics using higher education institutional big data. It addresses the current state of Learning Analytics, creates a shared understanding, and clarifies misconceptions about the field. This article also reviews prominent examples from peer institutions that are conducting analytics, identifies their data and methodological frameworks, and comments on market vendors and not-for-profit initiatives. Finally, it suggests an implementation agenda for potential institutions and their stakeholders by drafting necessary preparations and creating iterative implementation flows.

  2. Waste Tank Organic Safety Program: Analytical methods development. Progress report, FY 1994

    International Nuclear Information System (INIS)

    Campbell, J.A.; Clauss, S.A.; Grant, K.E.

    1994-09-01

    The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and 222-S laboratory. This report is intended as an annual report, not a report of completed work

  3. Microgenetic Learning Analytics Methods: Workshop Report

    Science.gov (United States)

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  4. Visualizing the Geography of the Diseases of China: Western Disease Maps from Analytical Tools to Tools of Empire, Sovereignty, and Public Health Propaganda, 1878-1929.

    Science.gov (United States)

    Hanson, Marta

    2017-09-01

    Argument This article analyzes for the first time the earliest western maps of diseases in China spanning fifty years from the late 1870s to the end of the 1920s. The 24 featured disease maps present a visual history of the major transformations in modern medicine from medical geography to laboratory medicine wrought on Chinese soil. These medical transformations occurred within new political formations from the Qing dynasty (1644-1911) to colonialism in East Asia (Hong Kong, Taiwan, Manchuria, Korea) and hypercolonialism within China (Tianjin, Shanghai, Amoy) as well as the new Republican Chinese nation state (1912-49). As a subgenre of persuasive graphics, physicians marshaled disease maps for various rhetorical functions within these different political contexts. Disease maps in China changed from being mostly analytical tools to functioning as tools of empire, national sovereignty, and public health propaganda legitimating new medical concepts, public health interventions, and political structures governing over human and non-human populations.

  5. The Solid* toolset for software visual analytics of program structure and metrics comprehension : From research prototype to product

    NARCIS (Netherlands)

    Reniers, Dennie; Voinea, Lucian; Ersoy, Ozan; Telea, Alexandru

    2014-01-01

    Software visual analytics (SVA) tools combine static program analysis and fact extraction with information visualization to support program comprehension. However, building efficient and effective SVA tools is highly challenging, as it involves extensive software development in program analysis,

  6. Mapping healthcare systems: a policy relevant analytic tool.

    Science.gov (United States)

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L V

    2017-07-01

    In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool - the University of California, San Francisco mapping tool (the Tool) - which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories; and the relationship between these entities. We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare systems structural designs, using a common framework that fosters multi-country comparative analyses. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  7. Analytical Tools for Functional Assessment of Architectural Layouts

    Science.gov (United States)

    Bąkowski, Jarosław

    2017-10-01

    The functional layout of a building, understood as the arrangement of the facility's rooms (or groups of rooms) together with a system of internal circulation, creates an environment and a place of mutual relations between the occupants of the facility. Achieving a spatial arrangement that is optimal from the occupants' point of view is possible through activities that often go beyond the stage of architectural design. Adopted during architectural design, most often through a trial-and-error process or on the basis of previous experience (evidence-based design), the functional layout is subject to continuous evaluation and dynamic change from the beginning of its use. Such verification during the occupancy phase allows planning of possible future transformations, as well as the development of model solutions for use in other settings. In broader terms, the research hypothesis is to examine whether and how the collected datasets concerning the facility and its utilization can be used to develop methods for assessing the functional layout of buildings. In other words, whether it is possible to develop an objective method of assessing functional layouts based on a set of building parameters: technical, technological and functional ones, and whether the method allows the development of a set of tools enhancing the design methodology of complex functional objects. By linking the design with the construction phase it is possible to build parametric models of functional layouts, especially in the context of sustainable design or lean design in every aspect: ecological (by reducing the property's impact on the environment), economic (by optimizing its cost) and social (through the implementation of a high-performance work environment). Parameterization of the size and functional connections of the facility becomes part of the analyses, as well as an element of model solutions. The "lean" approach means the process of analysis of the existing scheme and, consequently, finding weak points as well as means for eliminating these

  8. Direct analysis in real time mass spectrometry, a process analytical technology tool for real-time process monitoring in botanical drug manufacturing.

    Science.gov (United States)

    Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin

    2014-03-01

    A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The resulting multivariate data were analyzed with a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R2 and Q2. The accuracy and diagnostic capability of the batch model were then validated with the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a particular account of the effective constituents and can potentially be used to improve batch quality and process consistency for samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
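
    The data arrangement behind a multi-way PLS batch model is the batch-wise unfolding of a three-way array; a minimal sketch with synthetic stand-ins for the DART-MS fingerprints (the paper's actual model and control limits are not reproduced):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # Synthetic (batches x variables x time) array and a quality indicator,
        # e.g. the assay of a marker compound, per batch.
        rng = np.random.default_rng(1)
        n_batches, n_vars, n_times = 15, 30, 50
        X3 = rng.normal(size=(n_batches, n_vars, n_times))
        quality = rng.normal(size=n_batches)

        # Batch-wise unfolding: each batch trajectory becomes one long row.
        X = X3.reshape(n_batches, n_vars * n_times)
        X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)   # autoscale

        mpls = PLSRegression(n_components=2).fit(X, quality)
        scores = mpls.x_scores_        # batch-level scores, here (15, 2)
        # New batches falling outside the score-space control limits of the
        # normal batches would be flagged as abnormal.
        print(scores.shape)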

  9. ANALYTICAL, CRITICAL AND CREATIVE THINKING DEVELOPMENT OF THE GIFTED CHILDREN IN THE USA SCHOOLS

    Directory of Open Access Journals (Sweden)

    Anna Yurievna Kuvarzina

    2013-11-01

    Full Text Available Teachers of gifted students should not only devise enrichment and acceleration programs for them but also pay attention to the development of analytical, critical and creative thinking skills. Despite great interest in this issue in recent years, the topic of analytical and creative thinking is poorly covered in textbooks for the gifted. In this article, some methods, materials and programs for developing analytical, critical and creative thinking skills that are used in the USA are described. The author analyses and systematizes the methods and also suggests some ways of applying them in the Russian educational system. Purpose: to analyze and systematize methods, materials and programs that are used in the USA for teaching gifted children analytical, critical and creative thinking, and for developing their capacities for problem-solving and decision-making. Methods and methodology of the research: analysis, comparison, the principle of the unity of the historical and logical approaches. Results: positive results of employing methods for the development of analytical, critical and creative thinking were shown in the practical experience of teaching and educating gifted children in the USA educational system. Field of application of the results: the educational system of the Russian Federation: schools, special classes and courses for gifted children. DOI: http://dx.doi.org/10.12731/2218-7405-2013-7-42

  10. Predictive analytics can support the ACO model.

    Science.gov (United States)

    Bradley, Paul

    2012-04-01

    Predictive analytics can be used to rapidly spot hard-to-identify opportunities to better manage care--a key tool in accountable care. When considering analytics models, healthcare providers should: Make value-based care a priority and act on information from analytics models. Create a road map that includes achievable steps, rather than major endeavors. Set long-term expectations and recognize that the effectiveness of an analytics program takes time, unlike revenue cycle initiatives that may show a quick return.

  11. Analytical Methodology for the Determination of Radium Isotopes in Environmental Samples

    International Nuclear Information System (INIS)

    2010-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for the production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004, the environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Measurements of radium isotopes are important for radiological and environmental protection, geochemical and geochronological investigations, hydrology, etc. The suite of radium isotopes continues to stimulate the development of new methods for the determination of radium in various media. In this publication, the four most routinely used analytical methods for radium determination in biological and environmental samples, i.e. alpha spectrometry, gamma spectrometry, liquid scintillation spectrometry and mass spectrometry, are reviewed

  12. Developing a 300C Analog Tool for EGS

    Energy Technology Data Exchange (ETDEWEB)

    Normann, Randy

    2015-03-23

    This paper covers the development of a 300°C geothermal well monitoring tool for supporting future EGS (enhanced geothermal systems) power production. It is the first of three planned tools: an analog tool designed for monitoring well pressure and temperature. The paper discusses three different circuit topologies and the development of the supporting surface electronics and software, and presents information on testing electronic circuits and components. One of the major components is the cable used to connect the analog tool to the surface.

  13. Big data analytics from strategic planning to enterprise integration with tools, techniques, NoSQL, and graph

    CERN Document Server

    Loshin, David

    2013-01-01

    Big Data Analytics will assist managers in providing an overview of the drivers for introducing big data technology into the organization and for understanding the types of business problems best suited to big data analytics solutions, understanding the value drivers and benefits, strategic planning, developing a pilot, and eventually planning to integrate back into production within the enterprise. Guides the reader in assessing the opportunities and value proposition; provides an overview of big data hardware and software architectures; presents a variety of techniques

  14. Stable isotope ratio analysis: A potential analytical tool for the authentication of South African lamb meat.

    Science.gov (United States)

    Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan

    2016-02-01

    Stable isotope ratios (¹³C/¹²C and ¹⁵N/¹⁴N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS), to evaluate it as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ¹³C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ≤ 0.05) δ¹⁵N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ¹⁵N values. Discriminant analysis of the δ¹³C and δ¹⁵N differences showed promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider regarding region-of-origin classification for South African lamb. Copyright © 2015 Elsevier Ltd. All rights reserved.
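
    The discriminant-analysis step amounts to a two-feature classification on (δ¹³C, δ¹⁵N); the sketch below uses invented delta values, not the paper's measurements:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Invented placeholder data; columns are d13C and d15N (per mil).
        X = np.array([[-24.1, 6.2], [-23.8, 6.5], [-18.9, 9.8],
                      [-19.3, 10.1], [-24.5, 5.9], [-19.0, 9.5]])
        y = np.array(["Rûens", "Rûens", "Hantam Karoo",
                      "Hantam Karoo", "Rûens", "Hantam Karoo"])

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print(lda.predict([[-19.1, 9.9]]))   # -> ['Hantam Karoo']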

  15. Development of Simulator Configuration Tool

    International Nuclear Information System (INIS)

    Nedrelid, Olav; Pettersen, Geir

    1996-01-01

    The main objective of the development of a Simulator Configuration Tool (SCT) is to achieve faster and more efficient production of dynamic simulators. Through the application of versatile graphical interfaces, the simulator builder should be able to configure different types of simulators, including full-scope process simulators. The SCT should be able to serve different simulator environments. The configuration tool communicates with simulator execution environments through a TCP/IP-based interface. Communication with a Model Server System developed at Institutt for energiteknikk has been established and used as a test case. The system consists of OSF/Motif dialogues for operations requiring textual input, list selections etc., and uses the Picasso-3 User Interface Management System to handle presentation of static and dynamic graphical information. (author)

  16. Advanced web metrics with Google Analytics

    CERN Document Server

    Clifton, Brian

    2012-01-01

    Get the latest information about using the #1 web analytics tool from this fully updated guide Google Analytics is the free tool used by millions of web site owners to assess the effectiveness of their efforts. Its revised interface and new features will offer even more ways to increase the value of your web site, and this book will teach you how to use each one to best advantage. Featuring new content based on reader and client requests, the book helps you implement new methods and concepts, track social and mobile visitors, use the new multichannel funnel reporting features, understand which

  17. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    International Nuclear Information System (INIS)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A.

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety

  18. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    Science.gov (United States)

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  19. Coordinated experimental/analytical program for investigating margins to failure of Category I reinforced concrete structures

    International Nuclear Information System (INIS)

    Endebrock, E.; Dove, R.; Anderson, C.A.

    1981-01-01

    The material presented in this paper deals with a coordinated experimental/analytical program designed to provide information needed for making margins to failure assessments of seismic Category I reinforced concrete structures. The experimental program is emphasized and background information that lead to this particular experimental approach is presented. Analytical tools being developed to supplement the experimental program are discussed. 16 figures

  20. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    Science.gov (United States)

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and further we do not necessarily know which coordinates are the interesting ones. Big data in our biology, analytical chemistry or physical chemistry labs is a future that might be closer than any of us supposes. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets considering different experimental conditions (with high noise level, with/without spectral preprocessing, with wavelength shift, with different spectral resolution, with missing data). Copyright © 2016 Elsevier B.V. All rights reserved.
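
    A minimal persistent-homology example in the spirit of TDA, using the open-source ripser package (one library choice among several, an assumption rather than the authors' software): points on a noisy circle stand in for spectra, which would enter the same way as rows of a point cloud:

        import numpy as np
        from ripser import ripser          # pip install ripser

        # Points sampled from a noisy circle should yield one long-lived
        # 1-dimensional feature (the loop) in the persistence diagram.
        rng = np.random.default_rng(2)
        theta = rng.uniform(0, 2 * np.pi, 100)
        points = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.normal(size=(100, 2))

        diagrams = ripser(points, maxdim=1)["dgms"]
        h1 = diagrams[1]                                  # birth/death pairs in H1
        persistence = h1[:, 1] - h1[:, 0]
        print(f"most persistent H1 feature lives for {persistence.max():.2f}")

    Long-lived features survive across scales, which is why persistence is robust to noise, preprocessing choices and missing data of the kind listed above.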

  1. Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making - Proceedings of a Workshop

    Science.gov (United States)

    Hogan, Dianna; Arthaud, Greg; Pattison, Malka; Sayre, Roger G.; Shapiro, Carl

    2010-01-01

    The analytical framework for understanding ecosystem services in conservation, resource management, and development decisions is multidisciplinary, encompassing a combination of the natural and social sciences. This report summarizes a workshop on 'Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making,' which focused on the analytical process and on identifying research priorities for assessing ecosystem services, their production and use, their spatial and temporal characteristics, their relationship with natural systems, and their interdependencies. Attendees discussed research directions and solutions to key challenges in developing the analytical framework. The discussion was divided into two sessions: (1) the measurement framework: quantities and values, and (2) the spatial framework: mapping and spatial relationships. This workshop was the second of three preconference workshops associated with ACES 2008 (A Conference on Ecosystem Services): Using Science for Decision Making in Dynamic Systems. These three workshops were designed to explore the ACES 2008 theme on decision making and how the concept of ecosystem services can be more effectively incorporated into conservation, restoration, resource management, and development decisions. Preconference workshop 1, 'Developing a Vision: Incorporating Ecosystem Services into Decision Making,' was held on April 15, 2008, in Cambridge, MA. In preconference workshop 1, participants addressed what would have to happen to make ecosystem services be used more routinely and effectively in conservation, restoration, resource management, and development decisions, and they identified some key challenges in developing the analytical framework. Preconference workshop 3, 'Developing an Institutional Framework: Incorporating Ecosystem Services into Decision Making,' was held on October 30, 2008, in Albuquerque, NM; participants examined the relationship between the institutional framework and

  2. Artificial intelligence tool development and applications to nuclear power

    International Nuclear Information System (INIS)

    Naser, J.A.

    1987-01-01

    Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of expert system technology. The first effort is the development of expert system building tools tailored to electric utility industry applications. The second effort is the development of expert system applications. These two efforts complement each other. The application development tests the tools and identifies additional tool capabilities that are required. The tool development helps define the applications that can be successfully developed. Artificial intelligence, as demonstrated by the developments described here, is being established as a credible technological tool for the electric utility industry. The challenge in transferring artificial intelligence technology and an understanding of its potential to the electric utility industry is to gain an understanding of the problems that reduce power plant performance and to identify which can be successfully addressed using artificial intelligence

  3. Selenium isotope studies in plants. Development and validation of a novel geochemical tool and its application to organic samples

    Energy Technology Data Exchange (ETDEWEB)

    Banning, Helena

    2016-03-12

    Selenium (Se), being both an essential nutrient and a toxin, enters the food chain mainly via plants. Selenium isotope signatures have proved to be an excellent redox tracer, making them a promising tool for exploring the Se cycle in plants. The analytical method is sensitive to organic sample matrices and requires dedicated preparation methods, which were developed and validated in this study. Plant cultivation setups demonstrated the applicability of these methods for tracing plant-internal processes.

  4. Development of the major trauma case review tool.

    Science.gov (United States)

    Curtis, Kate; Mitchell, Rebecca; McCarthy, Amy; Wilson, Kellie; Van, Connie; Kennedy, Belinda; Tall, Gary; Holland, Andrew; Foster, Kim; Dickinson, Stuart; Stelfox, Henry T

    2017-02-28

    As many as half of all patients with major traumatic injuries do not receive the recommended care, with variance in preventable mortality reported across the globe. This variance highlights the need for a comprehensive process for monitoring and reviewing patient care, central to which is a consistent peer-review process that includes trauma system safety and human factors. There is no published, evidence-informed standardised tool that considers these factors for use in adult or paediatric trauma case peer-review. The aim of this research was to develop and validate a trauma case review tool to facilitate clinical review of paediatric trauma patient care, extracting information to support monitoring, inform change and enable loop closure. Development of the trauma case review tool was multi-faceted, beginning with a review of the trauma audit tool literature. Data were extracted from the literature to inform iterative tool development using a consensus approach. Inter-rater agreement was assessed for both the pilot and finalised versions of the tool. The final trauma case review tool contained ten sections, including patient factors (such as pre-existing conditions), presenting problem, a timeline of events, factors contributing to the care delivery problem (including equipment, work environment, staff action, organisational factors), positive aspects of care and the outcome of panel discussion. After refinement, the inter-rater reliability of the human factors and outcome components of the tool improved, with an average 86% agreement between raters. This research developed an evidence-informed tool for use in paediatric trauma case review that considers both system safety and human factors to facilitate clinical review of trauma patient care. This tool can be used to identify opportunities for improvement in trauma care and guide quality assurance activities. Validation is required in the adult population.
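
    The record reports agreement as a raw percentage. As standard background (not taken from the paper), such studies often also report chance-corrected agreement; a minimal sketch of both statistics, using hypothetical panel ratings:

    ```python
    from collections import Counter

    def percent_agreement(r1, r2):
        """Fraction of cases where two raters assign the same category."""
        return sum(a == b for a, b in zip(r1, r2)) / len(r1)

    def cohens_kappa(r1, r2):
        """Chance-corrected agreement between two raters."""
        n = len(r1)
        po = percent_agreement(r1, r2)
        c1, c2 = Counter(r1), Counter(r2)
        pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
        return (po - pe) / (1 - pe)

    # Hypothetical panel ratings of 10 cases ("CDP" = care delivery problem)
    rater_a = ["CDP", "ok", "CDP", "ok", "ok", "CDP", "ok", "ok", "CDP", "ok"]
    rater_b = ["CDP", "ok", "ok",  "ok", "ok", "CDP", "ok", "CDP", "CDP", "ok"]
    print(percent_agreement(rater_a, rater_b))          # 0.8
    print(round(cohens_kappa(rater_a, rater_b), 2))     # 0.58
    ```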

  5. Automation of analytical processes. A tool for higher efficiency and safety

    International Nuclear Information System (INIS)

    Groll, P.

    1976-01-01

    The analytical laboratory of a radiochemical facility is usually faced with the fact that numerous analyses of a similar type must be routinely carried out. Automation of such routine analytical procedures helps increase the efficiency and safety of the work. A review of the requirements for automation and its advantages is given and demonstrated with three examples. (author)

  6. Development of a core management tool for MYRRHA

    International Nuclear Information System (INIS)

    Jalůvka, David; Van den Eynde, Gert; Vandewalle, Stefan

    2013-01-01

    Highlights: • An in-core fuel management tool is being developed for the flexible irradiation machine MYRRHA. • Specific issues of the MYRRHA in-core fuel management are briefly discussed. • The tool addresses the loading pattern optimization problem. • Illustrative in-core fuel management optimization problems are solved using the tool. - Abstract: MYRRHA is an advanced multi-purpose irradiation facility under development at SCK•CEN in Mol, Belgium. In order to ensure economical and safe operation of the reactor, an in-core fuel management tool is being developed within the project to address the loading pattern optimization problem. In the paper, the current version of the tool – its architecture and design, unique features, and field of application – is presented. In the second part of the paper, the tool’s capabilities are demonstrated on simple MYRRHA in-core fuel management optimization problems
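
    The record does not describe the tool's actual optimization algorithm, so the following is only a generic illustration of the loading pattern problem it addresses: permute fuel assemblies among core positions to minimize a power-peaking figure of merit. All numbers and the peaking proxy are hypothetical.

    ```python
    import random

    # Hypothetical data: relative worth of each fuel assembly and a
    # positional weight for each core location (central positions weigh more).
    assemblies = [1.02, 0.97, 1.10, 0.97, 1.05, 0.97, 1.08, 0.99]
    position_weight = [1.00, 0.95, 0.80, 0.95, 0.80, 0.60, 0.60, 0.50]

    def peaking_proxy(pattern):
        """Crude stand-in for a power-peaking figure of merit: the maximum
        product of assembly worth and position weight."""
        return max(w * position_weight[i] for i, w in enumerate(pattern))

    def random_search(n_iter=10_000, seed=0):
        """Improve the loading pattern by repeated pairwise swaps."""
        rng = random.Random(seed)
        best, best_score = assemblies[:], peaking_proxy(assemblies)
        for _ in range(n_iter):
            cand = best[:]
            i, j = rng.sample(range(len(cand)), 2)   # swap two positions
            cand[i], cand[j] = cand[j], cand[i]
            score = peaking_proxy(cand)
            if score < best_score:
                best, best_score = cand, score
        return best, best_score

    print(random_search())
    ```

    Production tools replace the random swap search with stronger heuristics (e.g. simulated annealing or genetic algorithms) and a real core simulator in place of the proxy.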

  7. Development of computerized risk management tool

    International Nuclear Information System (INIS)

    Kil Yoo Kim; Mee Jung Hwang; Seung Cheol Jang; Sang Hoon Han; Tae Woon Kim

    1997-01-01

    The authors describe several efforts toward the development of a computerized risk management tool: (1) development of a risk monitor, Risk Monster, (2) improvement of McFarm (Missing Cutsets Finding Algorithm for Risk Monitor) and finally (3) development of a reliability database management system, KwDBMan. Risk Monster supports plant operators and maintenance schedulers in monitoring plant risk and avoiding high peak risk by rearranging the maintenance work schedule. The improved McFarm significantly increased the calculation speed of Risk Monster for cases of support system OOS (Out Of Service). KwDBMan manages event data, generic data and CCF (Common Cause Failure) data to support Risk Monster as well as the PSA tool KIRAP (KAERI Integrated Reliability Analysis Package)
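
    The record does not give Risk Monster's internals; the sketch below shows the standard rare-event approximation that cutset-based risk monitors rely on, with configuration changes (components out of service) modeled by setting the affected basic-event probabilities to one. The mini-model is hypothetical.

    ```python
    def core_damage_frequency(cutsets, p, out_of_service=()):
        """Rare-event approximation: risk is the sum over minimal cutsets of
        the product of basic-event probabilities. Components taken out of
        service are treated as failed (probability 1)."""
        oos = set(out_of_service)
        total = 0.0
        for cutset in cutsets:
            prod = 1.0
            for event in cutset:
                prod *= 1.0 if event in oos else p[event]
            total += prod
        return total

    # Hypothetical mini-model: two minimal cutsets over three basic events.
    p = {"pump_A": 1e-3, "pump_B": 1e-3, "dg_1": 5e-2}
    cutsets = [("pump_A", "pump_B"), ("pump_A", "dg_1")]
    print(core_damage_frequency(cutsets, p))                             # baseline
    print(core_damage_frequency(cutsets, p, out_of_service=["pump_B"]))  # maintenance peak
    ```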

  8. Software development tools using GPGPU potentialities

    International Nuclear Information System (INIS)

    Dudnik, V.A.; Kudryavtsev, V.I.; Sereda, T.M.; Us, S.A.; Shestakov, M.V.

    2011-01-01

    The paper deals with the potentialities of various up-to-date software development tools for making use of graphic processor (GPU) parallel computing resources. Examples are given to illustrate the use of present-day software tools for the development of applications and the realization of algorithms for scientific-technical calculations performed by GPGPU. The paper presents some classes of hard mathematical problems of scientific-technical calculations for which the GPGPU can be efficiently used. To reduce the development time of calculation programs that use GPGPU capabilities, various dedicated programming systems and problem-oriented subroutine libraries are recommended. Performance parameters when solving problems with and without the use of GPGPU potentialities are compared.

  9. Three-dimensional analytical field calculation of pyramidal-frustum shaped permanent magnets

    NARCIS (Netherlands)

    Janssen, J.L.G.; Paulides, J.J.H.; Lomonova, E.

    2009-01-01

    This paper presents a novel method to obtain fully analytical expressions of the magnetic field created by a pyramidal-frustum shaped permanent magnet. Conventional analytical tools only provide expressions for cuboidal permanent magnets and this paper extends these tools to more complex shapes. A

  10. Reducing Post-Decision Dissonance in International Decisions: The Analytic Hierarchy Process Approach.

    Science.gov (United States)

    DuBois, Frank L.

    1999-01-01

    Describes use of the analytic hierarchy process (AHP) as a teaching tool to illustrate the complexities of decision making in an international environment. The AHP approach uses managerial input to develop pairwise comparisons of relevant decision criteria to efficiently generate an appropriate solution. (DB)
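
    As a concrete illustration of the pairwise comparison step (the example matrix is invented, and the geometric-mean weighting shown is a common approximation to Saaty's principal-eigenvector method):

    ```python
    import math

    def ahp_weights(M):
        """Priority weights from a pairwise comparison matrix via the
        geometric-mean (logarithmic least squares) approximation."""
        n = len(M)
        gm = [math.prod(row) ** (1.0 / n) for row in M]
        s = sum(gm)
        return [g / s for g in gm]

    # Hypothetical example: three market-entry criteria compared pairwise
    # on Saaty's 1-9 scale (M[i][j] = importance of i relative to j).
    M = [[1,   3,   5],
         [1/3, 1,   2],
         [1/5, 1/2, 1]]
    print([round(w, 3) for w in ahp_weights(M)])  # ~[0.648, 0.230, 0.122]
    ```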

  11. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  12. Analytic number theory

    CERN Document Server

    Iwaniec, Henryk

    2004-01-01

    Analytic Number Theory distinguishes itself by the variety of tools it uses to establish results, many of which belong to the mainstream of arithmetic. One of the main attractions of analytic number theory is the vast diversity of concepts and methods it includes. The main goal of the book is to show the scope of the theory, both in classical and modern directions, and to exhibit its wealth and prospects, its beautiful theorems and powerful techniques. The book is written with graduate students in mind, and the authors tried to balance between clarity, completeness, and generality. The exercis

  13. From corruption to state capture: A new analytical framework with empirical applications from Hungary

    OpenAIRE

    Fazekas, Mihaly; Tóth, István János

    2016-01-01

    State capture and corruption are widespread phenomena across the globe, but their empirical study still lacks sufficient analytical tools. This paper develops a new conceptual and analytical framework for gauging state capture based on micro-level contractual networks in public procurement. To this end, it establishes a novel measure of corruption risk in government contracting focusing on the behaviour of individual organisations. Then, it identifies clusters of high corruption risk organisa...
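
    The paper's indicator construction is richer than can be shown here; as a hedged illustration, one widely used red flag of restricted competition is the share of a supplier's contracts that attracted only a single bid. The contract records below are invented.

    ```python
    from collections import defaultdict

    def single_bid_share(contracts):
        """Share of each supplier's contracts awarded with only one bidder --
        a common red flag for restricted competition."""
        won = defaultdict(int)
        single = defaultdict(int)
        for c in contracts:
            won[c["supplier"]] += 1
            if c["n_bidders"] == 1:
                single[c["supplier"]] += 1
        return {s: single[s] / won[s] for s in won}

    # Hypothetical contract records
    contracts = [
        {"supplier": "Alpha Ltd", "n_bidders": 1},
        {"supplier": "Alpha Ltd", "n_bidders": 1},
        {"supplier": "Beta Kft",  "n_bidders": 4},
    ]
    print(single_bid_share(contracts))  # {'Alpha Ltd': 1.0, 'Beta Kft': 0.0}
    ```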

  14. Coastal On-line Assessment and Synthesis Tool 2.0

    Science.gov (United States)

    Brown, Richard; Navard, Andrew; Nguyen, Beth

    2011-01-01

    COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.

  15. Developing an Evaluation Framework of Quality Indicators for Learning Analytics

    NARCIS (Netherlands)

    Scheffel, Maren; Drachsler, Hendrik; Specht, Marcus

    2017-01-01

    This paper presents results from the continuous process of developing an evaluation framework of quality indicators for learning analytics (LA). Building on a previous study, a group concept mapping approach that uses multidimensional scaling and hierarchical clustering, the study presented here
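
    Although the record is truncated, the pipeline it names — multidimensional scaling followed by hierarchical clustering of co-sorting data — is standard group concept mapping. A minimal sketch, assuming a hypothetical statement-similarity matrix:

    ```python
    import numpy as np
    from sklearn.manifold import MDS
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical co-sorting matrix: entry (i, j) = fraction of participants
    # who sorted quality-indicator statements i and j into the same pile.
    similarity = np.array([
        [1.0, 0.8, 0.1, 0.2],
        [0.8, 1.0, 0.2, 0.1],
        [0.1, 0.2, 1.0, 0.7],
        [0.2, 0.1, 0.7, 1.0],
    ])
    distance = 1.0 - similarity

    # Step 1: multidimensional scaling places statements on a 2-D point map.
    coords = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(distance)

    # Step 2: hierarchical clustering of the map coordinates yields groups
    # that are interpreted as candidate quality-indicator clusters.
    clusters = fcluster(linkage(coords, method="ward"), t=2, criterion="maxclust")
    print(clusters)  # e.g. [1 1 2 2]
    ```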

  16. Description of JNC's analytical method and its performance for FBR cores

    International Nuclear Information System (INIS)

    Ishikawa, M.

    2000-01-01

    The description of JNC's analytical method and its performance for FBR cores includes: an outline of JNC's analytical system compared with ERANOS; a standard database for FBR nuclear design in JNC; the JUPITER critical experiment; details of the analytical method and its effects on JUPITER; performance of the JNC analytical system (effective multiplication factor k-eff, control rod worth, and sodium void reactivity); design accuracy of a 600 MWe-class FBR core. JNC developed a consistent analytical system for FBR core evaluation, based on the JENDL library, the f-table method, and three-dimensional diffusion/transport theory, which includes comprehensive sensitivity tools to improve the prediction accuracy of core parameters. The JNC system was verified by analysis of the JUPITER critical experiment and other facilities. Its performance can be judged quite satisfactory for FBR core design work, though there is room for further improvement, such as more detailed treatment of cross-section resonance regions

  17. Exploring Higher Education Governance: Analytical Models and Heuristic Frameworks

    Directory of Open Access Journals (Sweden)

    Burhan FINDIKLI

    2017-08-01

    Governance in higher education, both at institutional and systemic levels, has experienced substantial changes within recent decades because of a range of world-historical processes such as massification, growth, globalization, marketization, public sector reforms, and the emergence of the knowledge economy and society. These developments have made governance arrangements and decision-making processes in higher education more complex and multidimensional than ever, and have forced scholars to build new analytical and heuristic tools and strategies to grasp the intricacy and diversity of higher education governance dynamics. This article provides a systematic discussion of how and through which tools prominent scholars of higher education have analyzed governance in this sector by examining certain heuristic frameworks and analytical models. Additionally, the article shows how social scientific analysis of governance in higher education has proceeded in a cumulative way with certain revisions and syntheses rather than radical conceptual and theoretical ruptures, from Burton R. Clark’s seminal work to the present, revealing conceptual and empirical junctures between them.

  18. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    Science.gov (United States)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an Engine Condition Monitoring System. Tricia Erhardt and I studied the problem domain for developing an Engine Condition Monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic algorithm (GA) based search programs, which were written in C++ and used to demonstrate the capability of the GA in searching for an optimal solution in noisy datasets. From the study and discussions with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search.
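
    The original search programs were written in C++ and are not reproduced in the record; the following is a minimal Python sketch of the idea they demonstrate — a genetic algorithm locating an optimum despite noisy fitness evaluations:

    ```python
    import random

    def fitness(x):
        """Noisy objective: the true optimum is at x = 3."""
        return -(x - 3.0) ** 2 + random.gauss(0.0, 0.1)

    def evolve(pop_size=30, generations=60):
        pop = [random.uniform(-10, 10) for _ in range(pop_size)]
        for _ in range(generations):
            ranked = sorted(pop, key=fitness, reverse=True)
            parents = ranked[: pop_size // 2]                  # selection
            children = []
            while len(parents) + len(children) < pop_size:
                a, b = random.sample(parents, 2)
                child = (a + b) / 2 + random.gauss(0.0, 0.2)   # crossover + mutation
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    random.seed(1)
    print(round(evolve(), 2))  # converges near 3 despite the noise
    ```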

  19. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    We developed VisTool – a user interface and visualization development system – to simplify user interface development. VisTool allows user interface development without real programming: with VisTool, a designer assembles visual objects (e.g. textboxes, ellipses) to visualize database contents, and visual properties (e.g. color, position) can be formulas. In many cases it is desirable to use graphical presentations, because a graphical presentation gives a better overview than text forms and can improve task efficiency and user satisfaction. However, it is more difficult to follow the classical usability approach for graphical presentation development, and in Software Engineering, software engineers who develop user interfaces often do not follow it. These difficulties result from the fact that designers cannot implement user interfaces with interactions and real data.

  20. Software Development Methods and Tools: a New Zealand study

    Directory of Open Access Journals (Sweden)

    Chris Phillips

    2005-05-01

    This study is a more detailed follow-up to a preliminary investigation of the practices of software engineers in New Zealand. The focus of this study is on the methods and tools used by software developers in their current organisation. The project involved detailed questionnaires being piloted and sent out to several hundred software developers. A central part of the research involved the identification of factors affecting the use and take-up of existing software development tools in the workplace. The full spectrum of tools, from fully integrated I-CASE tools to individual software applications such as drawing tools, was investigated. This paper describes the project and presents the findings.

  1. Healthcare predictive analytics: An overview with a focus on Saudi Arabia.

    Science.gov (United States)

    Alharthi, Hana

    2018-03-08

    Despite a newfound wealth of data and information, the healthcare sector is lacking in actionable knowledge. This is largely because healthcare data, though plentiful, tend to be inherently complex and fragmented. Health data analytics, with an emphasis on predictive analytics, is emerging as a transformative tool that can enable more proactive and preventative treatment options. This review considers the ways in which predictive analytics has been applied in the for-profit business sector to generate well-timed and accurate predictions of key outcomes, with a focus on key features that may be applicable to healthcare-specific applications. Published medical research presenting assessments of predictive analytics technology in medical applications is reviewed, with particular emphasis on how hospitals have integrated predictive analytics into their day-to-day healthcare services to improve quality of care. This review also highlights the numerous challenges of implementing predictive analytics in healthcare settings and concludes with a discussion of current efforts to implement healthcare data analytics in a developing country, Saudi Arabia.

  2. Recent developments in analytical toxicology : for better or for worse

    NARCIS (Netherlands)

    de Zeeuw, RA

    1998-01-01

    When considering the state of the art in toxicology from an analytical perspective, the key developments relate to three major areas. (1) Forensic horizon: Today forensic analysis has broadened its scope dramatically, to include workplace toxicology, drug abuse testing, drugs and driving, doping,

  3. Complete equation of state for shocked liquid nitrogen: Analytical developments

    International Nuclear Information System (INIS)

    Winey, J. M.; Gupta, Y. M.

    2016-01-01

    The thermodynamic response of liquid nitrogen has been studied extensively, in part, due to the long-standing interest in the high pressure and high temperature dissociation of shocked molecular nitrogen. Previous equation of state (EOS) developments regarding shocked liquid nitrogen have focused mainly on the use of intermolecular pair potentials in atomistic calculations. Here, we present EOS developments for liquid nitrogen, incorporating analytical models, for use in continuum calculations of the shock compression response. The analytical models, together with available Hugoniot data, were used to extrapolate a low pressure reference EOS for molecular nitrogen [Span, et al., J. Phys. Chem. Ref. Data 29, 1361 (2000)] to high pressures and high temperatures. Using the EOS presented here, the calculated pressures and temperatures for single shock, double shock, and multiple shock compression of liquid nitrogen provide a good match to the measured results over a broad range of P-T space. Our calculations provide the first comparison of EOS developments with recently-measured P-T states under multiple shock compression. The present EOS developments are general and are expected to be useful for other liquids that have low pressure reference EOS information available.
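
    As standard shock-physics background (not derived in the record), the Hugoniot states used to constrain such an EOS satisfy the Rankine–Hugoniot jump conditions, which relate the initial state (rho_0, P_0, E_0, V_0) to the shocked state through the shock velocity U_s and particle velocity u_p:

    ```latex
    \begin{align}
      \rho_0 U_s &= \rho\,(U_s - u_p)                  && \text{(mass)} \\
      P - P_0    &= \rho_0\, U_s\, u_p                 && \text{(momentum)} \\
      E - E_0    &= \tfrac{1}{2}\,(P + P_0)\,(V_0 - V) && \text{(energy)}
    \end{align}
    ```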

  4. Development of bore tools for pipe inspection

    Energy Technology Data Exchange (ETDEWEB)

    Oka, Kiyoshi; Nakahira, Masataka; Taguchi, Kou; Ito, Akira [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-04-01

    In the International Thermonuclear Experimental Reactor (ITER), replacement and maintenance of in-vessel components requires that all connected cooling pipes be cut and removed, that a new component be installed, and that all cooling pipes be rewelded. After welding is completed, the welded area must be inspected for soundness. These tasks require a new work concept for securing the shielded area and for access from narrow ports. Tools had to be developed for nondestructive inspection and leak testing to evaluate pipe welding soundness by accessing areas from inside the pipes using autonomous locomotion, as was done for the welding and cutting tools. A system was proposed for nondestructive inspection of branch pipes and of the main pipe after passing through pipe curves, the same as for the welding and cutting tool development. Nondestructive inspection and leak testing sensors were developed and the basic parameters were obtained. In addition, inspection systems which can move inside pipes and conduct the nondestructive inspection and leak testing were developed. In this paper, an introduction is given to the current situation concerning the development of nondestructive inspection and leak testing machines for the branch pipes. (author)

  5. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Otoshi, Tsunehiko

    2004-01-01

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology and geochemistry. Given the advantages of NAA, samples available only in small amounts, or precious samples, are the most suitable for NAA, because NAA is capable of trace analysis and non-destructive determination. In this paper, NAA of atmospheric particulate matter (PM) samples is discussed, emphasizing the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a large amount of sample even using a high-volume air sampling device. Therefore, highly sensitive NAA is suitable for determining elements in PM samples. The main components of PM are crust-derived silicates and the like in rural/remote areas, whereas carbonaceous materials and heavy metals are concentrated in urban PM because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site. Trends in air pollution can be traced by periodic monitoring of PM by NAA. Elemental concentrations in air change with the seasons: for example, crustal elements increase in the dry season, and sea-salt components increase when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM and increase in concentration during winter, when emissions from heating systems are high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of environmental samples and source apportionment techniques are useful. (author)
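
    One standard way such elemental ratios support source identification (a textbook technique, not detailed in the record) is the crustal enrichment factor; the aerosol concentrations and crustal abundances below are merely illustrative:

    ```python
    def enrichment_factor(c_x, c_ref, crust_x, crust_ref):
        """Crustal enrichment factor: EF = (X/Ref)_sample / (X/Ref)_crust.
        EF near 1 suggests a crustal origin; EF >> 1 suggests an
        anthropogenic source."""
        return (c_x / c_ref) / (crust_x / crust_ref)

    # Hypothetical aerosol data (ng/m^3), with Al as the crustal reference.
    ef_pb = enrichment_factor(c_x=15.0, c_ref=800.0,            # Pb, Al in sample
                              crust_x=17.0, crust_ref=81_500.0)  # crustal abundances
    print(round(ef_pb, 1))  # ~90 -> strongly enriched, i.e. anthropogenic
    ```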

  6. Specialized case tools for the development of the accounting ...

    African Journals Online (AJOL)

    The paper presents an approach to building specialized CASE tools for the development of accounting applications. These tools form an integrated development environment allowing the computer aided development of the different applications in this field. This development environment consists of a formula interpreter, ...

  7. Jet substructure with analytical methods

    Energy Technology Data Exchange (ETDEWEB)

    Dasgupta, Mrinal [University of Manchester, Consortium for Fundamental Physics, School of Physics and Astronomy, Manchester (United Kingdom); Fregoso, Alessandro; Powling, Alexander [University of Manchester, School of Physics and Astronomy, Manchester (United Kingdom); Marzani, Simone [Durham University, Institute for Particle Physics Phenomenology, Durham (United Kingdom)

    2013-11-15

    We consider the mass distribution of QCD jets after the application of jet-substructure methods, specifically the mass-drop tagger, pruning, trimming and their variants. In contrast to most current studies employing Monte Carlo methods, we carry out analytical calculations at the next-to-leading order level, which are sufficient to extract the dominant logarithmic behaviour for each technique, and compare our findings to exact fixed-order results. Our results should ultimately lead to a better understanding of these jet-substructure methods which in turn will influence the development of future substructure tools for LHC phenomenology. (orig.)

  8. Mass spectrometry as a quantitative tool in plant metabolomics

    Science.gov (United States)

    Jorge, Tiago F.; Mata, Ana T.

    2016-01-01

    Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967

  9. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    Science.gov (United States)

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed on previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples like fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarities ranging from estetrol to progesterone, derived from environmental samples (lake water, untreated and treated sewage waters), were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania region in northern Poland can easily be observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that a micro-TLC based analytical approach can be applied as an effective method for internal standard (IS) substance searches. Generally, the described methodology can be applied for fast fractionation or screening of the
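
    For readers outside chromatography, the retention quantities behind such measurements are textbook material rather than results of this paper: the retardation factor R_F and the R_M value derived from it, which is commonly used to compare retention across systems:

    ```latex
    \begin{align}
      R_F &= \frac{\text{migration distance of analyte}}
                  {\text{migration distance of solvent front}}, \\[4pt]
      R_M &= \log_{10}\!\left(\frac{1}{R_F} - 1\right).
    \end{align}
    ```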

  10. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    International Nuclear Information System (INIS)

    Etmektzoglou, A; Mishra, P; Svatos, M

    2015-01-01

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly
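
    The Developer Mode XML schema is proprietary to Varian and is not reproduced in the record, so the sketch below stops short of emitting XML; it only computes the kind of (MU fraction, axis position) control-point samples such a spreadsheet would generate for a parameterized ellipse. All parameter names and values are hypothetical.

    ```python
    import math

    def ellipse_trajectory(n_points=36, gantry_amp=30.0, couch_amp=10.0,
                           gantry_mid=0.0, couch_mid=0.0):
        """Sample control points (MU fraction, gantry angle, couch angle)
        along a parameterized ellipse, one row per control point."""
        points = []
        for k in range(n_points + 1):
            t = 2 * math.pi * k / n_points
            mu = k / n_points                        # monotone MU fraction
            gantry = gantry_mid + gantry_amp * math.cos(t)
            couch = couch_mid + couch_amp * math.sin(t)
            points.append((round(mu, 3), round(gantry, 2), round(couch, 2)))
        return points

    for row in ellipse_trajectory(n_points=4):
        print(row)   # (0.0, 30.0, 0.0), (0.25, 0.0, 10.0), ...
    ```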

  11. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Etmektzoglou, A; Mishra, P; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)

    2015-06-15

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly

  12. Case study: IBM Watson Analytics cloud platform as Analytics-as-a-Service system for heart failure early detection

    OpenAIRE

    Guidi, Gabriele; Miniati, Roberto; Mazzola, Matteo; Iadanza, Ernesto

    2016-01-01

    In recent years, progress in technology and the increasing availability of fast connections have produced a migration of functionalities in information technology services, from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. A use case of IBM Watson Analytics, a cloud system for data analytics, is also described, applied to the following research scope: detect...

  13. Nuclear analytical chemistry: recent developments and applications

    International Nuclear Information System (INIS)

    Acharya, R.

    2013-01-01

    Recent R and D studies on nuclear analytical chemistry utilizing techniques like Neutron Activation Analysis (NAA), Prompt Gamma-ray NAA (PGNAA), and Particle Induced Gamma-ray and X-ray Emission (PIGE/PIXE) for compositional analysis of materials are summarized. The work includes developments and applications of (i) single-comparator NAA, called k0-NAA, (ii) k0-based internal monostandard NAA (IM-NAA), (iii) k0-based prompt gamma-ray NAA (PGNAA), (iv) instrumental NAA using thermal and epithermal neutrons, and (v) PIGE and PIXE methods using proton beams for low-Z and medium-Z elements, respectively. (author)

  14. Sustainability Tools Inventory - Initial Gaps Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation, including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4

  15. Development of automated analytical systems for large throughput

    International Nuclear Information System (INIS)

    Ernst, P.C.; Hoffman, E.L.

    1982-01-01

    The need to be able to handle a large throughput of samples for neutron activation analysis has led to the development of automated counting and sample handling systems. These are coupled with available computer-assisted INAA techniques to perform a wide range of analytical services on a commercial basis. A fully automated delayed neutron counting system and a computer controlled pneumatic transfer for INAA use are described, as is a multi-detector gamma-spectroscopy system. (author)

  16. Competing on analytics.

    Science.gov (United States)

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  17. License to evaluate: Preparing learning analytics dashboards for educational practice

    NARCIS (Netherlands)

    Jivet, Ioana; Scheffel, Maren; Specht, Marcus; Drachsler, Hendrik

    2018-01-01

    Learning analytics can bridge the gap between learning sciences and data analytics, leveraging the expertise of both fields in exploring the vast amount of data generated in online learning environments. A typical learning analytics intervention is the learning dashboard, a visualisation tool built

  18. Human Functions, Machine Tools, and the Role of the Analyst

    Directory of Open Access Journals (Sweden)

    Gordon R. Middleton

    2015-09-01

    In an era of rapidly increasing technical capability, the intelligence focus is often on the modes of collection and tools of analysis rather than on the analysts themselves. Data are proliferating, and so are tools to help analysts deal with the flood of data and the increasingly demanding timeline for intelligence production, but the role of the analyst in such a data-driven environment needs to be understood in order to support key management decisions (e.g., training and investment priorities). This paper describes a model of the analytic process and analyzes the roles played by humans and machine tools in each process element. It concludes that human analytic functions are as critical in the intelligence process as they have ever been, and perhaps even more so due to the advance of technology in the intelligence business. Human functions performed by analysts are critical in nearly every step in the process, particularly at the front end of the analytic process, in defining and refining the problem statement, and at the end of the process, in generating knowledge, presenting the story in understandable terms, tailoring the presentation of the results of the analysis to various audiences, as well as in determining when to initiate iterative loops in the process. The paper concludes with observations on the necessity of enabling expert analysts with tools to deal with big data, of developing analysts in advanced analytic methods and in techniques for the optimal use of advanced tools, and with suggestions for further quantitative research.

  19. 100-B/C Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    R.W. Ovink

    2010-03-18

    This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.

  20. The development, qualification and availability of AECL analytical, scientific and design codes

    International Nuclear Information System (INIS)

    Kupferschmidt, W.C.H.; Fehrenbach, P.J.; Wolgemuth, G.A.; McDonald, B.H.; Snell, V.G.

    2001-01-01

    Over the past several years, AECL has embarked on a comprehensive program to develop, qualify and support its key safety and licensing codes, and to make executable versions of these codes available to the international nuclear community. To this end, we have instituted a company-wide Software Quality Assurance (SQA) Program for Analytical, Scientific and Design Computer Programs to ensure that the design, development, maintenance, modification, procurement and use of computer codes within AECL are consistent with today's quality assurance standards. In addition, we have established a comprehensive Code Validation Project (CVP) with the goal of qualifying AECL's 'front-line' safety and licensing codes by 2001 December. The outcome of this initiative will be qualified codes, which are properly verified and validated for the expected range of applications, with associated statements of accuracy and uncertainty for each application. The code qualification program, based on the CSA N286.7 standard, is intended to ensure (1) that errors are not introduced into safety analyses because of deficiencies in the software, (2) that an auditable documentation base is assembled that demonstrates to the regulator that the codes are of acceptable quality, and (3) that these codes are formally qualified for their intended applications. Because AECL and the Canadian nuclear utilities (i.e., Ontario Power Generation, Bruce Power, Hydro Quebec and New Brunswick Power) generally use the same safety and licensing codes, the nuclear industry in Canada has agreed to work cooperatively towards the development, qualification and maintenance of a common set of analysis tools, referred to as the Industry Standard Toolset (IST). This paper provides an overview of the AECL Software Quality Assurance Program and the Code Validation Project, and their associated linkages to the Canadian nuclear community's Industry Standard Toolset initiative to cooperatively qualify and support commonly

  1. Large Data Visualization with Open-Source Tools

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Visualization and post-processing of large data have become increasingly challenging and require more and more tools to support the diversity of data to process. In this seminar, we will present a suite of open-source tools supported and developed by Kitware to perform large-scale data visualization and analysis. In particular, we will present ParaView, an open-source tool for parallel visualization of massive datasets, the Visualization Toolkit (VTK), an open-source toolkit for scientific visualization, and Tangelohub, a suite of tools for large data analytics. About the speaker Julien Jomier is directing Kitware's European subsidiary in Lyon, France, where he focuses on European business development. Julien works on a variety of projects in the areas of parallel and distributed computing, mobile computing, image processing, and visualization. He is one of the developers of the Insight Toolkit (ITK), the Visualization Toolkit (VTK), and ParaView. Julien is also leading the CDash project, an open-source co...

  2. Design and development of progressive tool for manufacturing washer

    Science.gov (United States)

    Annigeri, Ulhas K.; Raghavendra Ravi Kiran, K.; Deepthi, Y. P.

    2017-07-01

    In a progressive tool, the raw material is worked at different stations to finally fabricate the component. A progressive tool is a cost-effective tool for the mass production of components, and many automobile and other transport industries develop progressive tools for component production. The design of the tool involves a lot of planning, and an equal amount of process-planning skill is required in the fabrication of the tool. The design also involves the use of thumb rules and standard elements as per experience gained in practice. Manufacturing the press tool is a laborious task, as special jigs and fixtures have to be designed for the purpose. Assembly of all the press tool elements is another task where the use of accurate measuring instruments for alignment of the various tool elements is important. In the present study, a progressive press tool for the production of washers has been designed and fabricated, and the press tool has been tried out on a mechanical press. The components produced conform to the required dimensions.

  3. Flammable gas safety program. Analytical methods development: FY 1994 progress report

    International Nuclear Information System (INIS)

    Campbell, J.A.; Clauss, S.; Grant, K.; Hoopes, V.; Lerner, B.; Lucke, R.; Mong, G.; Rau, J.; Wahl, K.; Steele, R.

    1994-09-01

    This report describes the status of developing analytical methods to account for the organic components in Hanford waste tanks, with particular focus on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-101 (Tank 101-SY)

  4. Development and validation of analytical methods for dietary supplements

    International Nuclear Information System (INIS)

    Sullivan, Darryl; Crowley, Richard

    2006-01-01

    The expanding use of innovative botanical ingredients in dietary supplements and foods has resulted in a flurry of research aimed at the development and validation of analytical methods for accurate measurement of active ingredients. The pressing need for these methods is being met through an expansive collaborative initiative involving industry, government, and analytical organizations. This effort has resulted in the validation of several important assays, as well as important advances in the method engineering procedures which have improved the efficiency of the process. The initiative has also allowed researchers to hurdle many of the barricades that have hindered accurate analysis, such as the lack of reference standards and comparative data. As the availability of nutraceutical products continues to increase, these methods will provide consumers and regulators with the scientific information needed to assure safety and dependable labeling

  5. Development of RCM methodology and tools for EDF nuclear power plants

    International Nuclear Information System (INIS)

    Jacquot, J.P.; Bouchet, J.L.; Despujols, A.; Dewailly, J.; Martin-Mattei, C.

    1995-01-01

    In 1990, EDF launched a Reliability-Centered Maintenance project for its nuclear plants. This 'OMF' project aims at developing methods and tools for analysis and, in the first phase, applying these to one initial system (the pilot study). The results of the pilot study have confirmed the advantages of the 'OMF' analytical method: the prospects for the approach on an industrial scale are extremely promising. It should be noted that the precision of our 'OMF' analysis is no doubt superior to that common in other industrial domains (MSG/RCM analysis). The particular approach implies analysis of systems and components and, most importantly, integration of operation feedback, with a view to developing a rigorous maintenance program which can constantly be updated. In addition to defining and implementing the method, work on designing software aids has begun. The pilot study clearly pointed up the need for such aids in handling the necessary volume of information and assisting experts in their analysis. The EDF 'OMF' workstation (and its environment) will be used not only in preparing the 'initial' maintenance program but also in updating it during the 'living' program phase. (author)

  6. Development of RCM methodology and tools for EDF nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jacquot, J.P.; Bouchet, J.L.; Despujols, A.; Dewailly, J.; Martin-Mattei, C. [Electricite de France, 78 - Chatou (France)

    1995-12-31

    In 1990, EDF launched a Reliability-Centered Maintenance project for its nuclear plants. This 'OMF' project aims at developing methods and tools for analysis and, in the first phase, applying these to one initial system (the pilot study). The results of the pilot study have confirmed the advantages of the 'OMF' analytical method: the prospects for the approach on an industrial scale are extremely promising. It should be noted that the precision of our 'OMF' analysis is no doubt superior to that common in other industrial domains (MSG/RCM analysis). The particular approach implies analysis of systems and components and, most importantly, integration of operation feedback, with a view to developing a rigorous maintenance program which can constantly be updated. In addition to defining and implementing the method, work on designing software aids has begun. The pilot study clearly pointed up the need for such aids in handling the necessary volume of information and assisting experts in their analysis. The EDF 'OMF' workstation (and its environment) will be used not only in preparing the 'initial' maintenance program but also in updating it during the 'living' program phase. (author) 4 refs.

  7. Demonstrating Success: Web Analytics and Continuous Improvement

    Science.gov (United States)

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  8. An Exploratory Study to Assess Analytical and Logical Thinking Skills of the Software Practitioners using a Gamification Perspective

    Directory of Open Access Journals (Sweden)

    Şahin KAYALI

    2016-12-01

    The link between analytical and logical thinking skills and the success of software practitioners has attracted increasing attention in the last decade. Several studies report that the ability to think logically, which demands strong reasoning, is a requirement for improving software development skills. Additionally, analytical thinking is a vital part of software development, for example when dividing a task into elemental parts with respect to basic rules and principles. Using the basic essence of gamification, this study proposes a mobile testing platform for assessing the analytical and logical thinking skills of software practitioners as well as computer engineering students. The assessment questions were taken from the literature and transformed into a gamified tool based on the software requirements. A focus group study was conducted to capture the requirements. Using the Delphi method, these requirements were discussed by a group of experts to reach a multidisciplinary understanding, where a level of moderate agreement was achieved. In light of these, an assessment tool was developed, which was tested on both software practitioners from industry and senior computer engineering students. Our results suggest that individuals who exhibit skills in analytical and logical thinking are also more inclined to be successful in software development.

  9. Development of a Boundary Layer Property Interpolation Tool in Support of Orbiter Return To Flight

    Science.gov (United States)

    Greene, Francis A.; Hamilton, H. Harris

    2006-01-01

    A new tool was developed to predict the boundary layer quantities required by several physics-based predictive/analytic methods that assess damaged Orbiter tile. This new tool, the Boundary Layer Property Prediction (BLPROP) tool, supplies boundary layer values used in correlations that determine boundary layer transition onset and surface heating-rate augmentation/attenuation factors inside tile gouges (i.e. cavities). BLPROP interpolates through a database of computed solutions and provides boundary layer and wall data (delta, theta, Re_theta, Re_theta/M_e, P_w, and q_w) based on user-input surface location and free-stream conditions. Surface locations are limited to the Orbiter's windward surface. Constructed using predictions from an inviscid/boundary-layer method and benchmark viscous CFD, the computed database covers the hypersonic continuum flight regime based on two reference flight trajectories. First-order one-dimensional Lagrange interpolation accounts for Mach number and angle-of-attack variations, whereas non-dimensional normalization accounts for differences between the reference and input Reynolds number. Employing the same computational methods used to construct the database, solutions at other trajectory points taken from previous STS flights were computed; these results validate the BLPROP algorithm. Percentage differences between interpolated and computed values are presented and are used to establish the level of uncertainty of the new tool.
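
    A minimal sketch of the interpolation scheme the abstract names — first-order one-dimensional Lagrange interpolation applied successively in angle of attack and Mach number — assuming a hypothetical rectangular database of precomputed values and a query inside the grid:

    ```python
    def lerp(x, x0, x1, f0, f1):
        """First-order Lagrange (linear) interpolation between two nodes."""
        if x0 == x1:
            return f0
        return f0 * (x - x1) / (x0 - x1) + f1 * (x - x0) / (x1 - x0)

    def bl_quantity(db, mach, alpha):
        """Interpolate a boundary-layer quantity from a rectangular
        (Mach, alpha) grid of precomputed solutions: linearly in alpha
        at each bracketing Mach, then linearly in Mach. Assumes the
        query point lies inside the grid."""
        machs = sorted({m for m, _ in db})
        alphas = sorted({a for _, a in db})
        m0 = max(m for m in machs if m <= mach)
        m1 = min(m for m in machs if m >= mach)
        a0 = max(a for a in alphas if a <= alpha)
        a1 = min(a for a in alphas if a >= alpha)
        f_m0 = lerp(alpha, a0, a1, db[(m0, a0)], db[(m0, a1)])
        f_m1 = lerp(alpha, a0, a1, db[(m1, a0)], db[(m1, a1)])
        return lerp(mach, m0, m1, f_m0, f_m1)

    # Hypothetical database: momentum thickness theta (m) at grid points.
    db = {(18, 30): 0.011, (18, 40): 0.013, (22, 30): 0.009, (22, 40): 0.010}
    print(round(bl_quantity(db, mach=20, alpha=35), 5))  # ~0.01075
    ```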

  10. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Directory of Open Access Journals (Sweden)

    Christoph Heesen

    BACKGROUND: Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. OBJECTIVE: The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-term prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. RESULTS: Patients rated the OLAP tool as understandable and acceptable, but of only moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. CONCLUSION: While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic

  11. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Science.gov (United States)

    Heesen, Christoph; Gaissmaier, Wolfgang; Nguyen, Franziska; Stellmann, Jan-Patrick; Kasper, Jürgen; Köpke, Sascha; Lederer, Christian; Neuhaus, Anneke; Daumer, Martin

    2013-01-01

    Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-term prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic estimates and clarify its usefulness for patients and physicians.

  12. Information technology tools for curriculum development

    NARCIS (Netherlands)

    McKenney, Susan; Nieveen, N.M.; Strijker, A.; Voogt, Joke; Knezek, Gerald

    2008-01-01

    The widespread introduction and use of computers in the workplace began in the early 1990s. Since then, computer-based tools have been developed to support a myriad of task types, including the complex process of curriculum development. This chapter begins by briefly introducing two concepts that

  13. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    Science.gov (United States)

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  14. Physics-Based Probabilistic Design Tool with System-Level Reliability Constraint, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The work proposed herein would develop a set of analytic methodologies and a computer tool suite enabling aerospace hardware designers to rapidly determine optimum...

  15. Development of a tool for evaluating multimedia for surgical education.

    Science.gov (United States)

    Coughlan, Jane; Morar, Sonali S

    2008-09-01

    Educational multimedia has been designed to provide surgical trainees with expert operative information outside of the operating theater. The effectiveness of multimedia (e.g., CD-ROMs) for learning has been a common research topic since the 1990s. To date, however, little discussion has taken place on the mechanisms to evaluate the quality of multimedia-driven teaching. This may be because of a lack of research into the development of appropriate tools for evaluating multimedia, especially for surgical education. This paper reports on a small-scale pilot and exploratory study (n = 12) that developed a tool for surgical multimedia evaluation. The validity of the developed tool was established through adaptation of an existing tool, which was reviewed by experts in surgery, usability, and education. The reliability of the developed tool was tested with surgical trainees who used it to assess a multimedia CD-ROM created for teaching basic surgical skills. The findings contribute to an understanding of surgical trainees' experience of using educational multimedia, in terms of the characteristics of the learning material for interface design and content, and to the process of developing evaluation tools, in terms of the inclusion of appropriate assessment criteria. The increasing use of multimedia in medical education necessitates the development of standardized tools for determining the quality of teaching and learning. Little research exists into the development of such tools, and so the present work stimulates discussion on how to evaluate surgical training.

  16. Water Loss Management: Tools and Methods for Developing Countries

    OpenAIRE

    Mutikanga, H.E.

    2012-01-01

    Water utilities in developing countries are struggling to provide customers with a reliable level of service due to their peculiar water distribution characteristics, including poorly zoned networks with irregular supply operating under restricted budgets. These unique conditions demand unique tools and methods for water loss control. Water Loss Management: Tools and Methods for Developing Countries provides a decision support toolbox (appropriate tools and methodologies) for assessing, quantif...

  17. ICT tools for curriculum development

    NARCIS (Netherlands)

    McKenney, Susan; Nieveen, N.M.; van den Akker, J.J.H.; Kuiper, W.J.A.M.; Hameyer, U.

    2003-01-01

    Along with others in this book, this chapter examines a recent trend in curriculum development, namely, employing the computer to support this complex process. Not to be confused with the vast majority of ICT tools for education, which support the teachers and learners more directly, this discussion

  18. Long range manipulator development and experiments with dismantling tools

    International Nuclear Information System (INIS)

    Mueller, K.

    1993-01-01

    An existing handling system (EMIR) was used as a carrier system for various tools for concrete dismantling and radiation protection monitoring. It combined the advantages of long reach and high payload with highly dexterous kinematics. This system was enhanced mechanically to allow the use of different tools. Tool attachment devices for automatic tool exchange were investigated, as well as interfaces (electric, hydraulic, compressed air, cooling water and signals). The control system was improved with regard to accuracy and sensor data processing. Programmable logic controller functions for tool control were incorporated. A free-field mockup of the EMIR was built that allowed close simulation of dismantling scenarios without radioactive inventory. Aged concrete was provided for the integration tests. The development schedule included the basic concept investigation; the development of tools and sensors; the EMIR hardware enhancement including a tool exchange; the adaptation of tools and mockup; and the final evaluation of the system during experiments

  19. The analytic hierarchy process as a support for decision making

    Directory of Open Access Journals (Sweden)

    Filipović Milanka

    2007-01-01

    The first part of this text deals with convention site selection as one of the most lucrative areas in the tourism industry. The second part gives a further description of a method for decision making - the analytic hierarchy process. The basic characteristics - hierarchy construction and pairwise comparison at a given level of the hierarchy - are outlined. The third part offers an example of application. This example is solved using the Super Decisions software, which was developed as computer support for the analytic hierarchy process. This indicates that the AHP approach is a useful tool to help support a decision of convention site selection.
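
    The core AHP computation the record relies on can be sketched in a few lines: derive priority weights as the principal eigenvector of a reciprocal pairwise comparison matrix and check consistency. The matrix entries below are invented; the Super Decisions software automates this and much more.

```python
# Minimal AHP sketch: criteria weights from a pairwise comparison matrix
# via the principal eigenvector (Saaty's method). Values are illustrative.
import numpy as np

# A[i, j] = how much criterion i is preferred over j (Saaty 1-9 scale);
# reciprocal by construction: A[j, i] = 1 / A[i, j].
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # normalized priority weights

# Consistency check: CI = (lambda_max - n) / (n - 1), compared against the
# random index RI (0.58 for n = 3); CR < 0.1 is conventionally acceptable.
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.58
print(w, CR)
```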

  20. Explanatory analytics in OLAP

    NARCIS (Netherlands)

    Caron, E.A.M.; Daniëls, H.A.M.

    2013-01-01

    In this paper the authors describe a method to integrate explanatory business analytics in OLAP information systems. This method supports the discovery of exceptional values in OLAP data and the explanation of such values by giving their underlying causes. OLAP applications offer a support tool for

  1. The Journal of Learning Analytics: Supporting and Promoting Learning Analytics Research

    OpenAIRE

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the Journal of Learning Analytics is identified. Analytics is the most significant new initiative of SoLAR.

  2. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    Science.gov (United States)

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. Analytics is the most significant new initiative of SoLAR.

  3. Quantitative tools for addressing hospital readmissions

    Directory of Open Access Journals (Sweden)

    Lagoe Ronald J

    2012-11-01

    Background: Increased interest in health care cost containment is focusing attention on reduction of hospital readmissions. Major payors have already developed financial penalties for providers that generate excess readmissions. This subject has benefitted from the development of resources such as the Potentially Preventable Readmissions software, which has encouraged hospitals to renew efforts to improve these outcomes. The aim of this study was to describe quantitative tools such as definitions, risk estimation, and tracking of patients for reducing hospital readmissions. Findings: This study employed the Potentially Preventable Readmissions software to develop quantitative tools for addressing hospital readmissions. These tools included two definitions of readmissions that support identification and management of patients, analytical approaches for estimation of the risk of readmission for individual patients by age, discharge status of the initial admission, and severity of illness, and patient-specific spreadsheets for tracking of target populations and for evaluation of the impact of interventions. Conclusions: The study demonstrated that quantitative tools, including the development of definitions of readmissions, estimation of the risk of readmission, and patient-specific spreadsheets, could contribute to the improvement of patient outcomes in hospitals.
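
    A minimal illustration of the kind of stratified risk estimation and patient tracking described (the strata, data, and column names are invented; the Potentially Preventable Readmissions software itself is proprietary):

```python
# Illustrative sketch (not the Potentially Preventable Readmissions software):
# estimate 30-day readmission risk per stratum and track individual patients.
import pandas as pd

df = pd.DataFrame({
    "patient_id":  [1, 2, 3, 4, 5, 6],
    "age_band":    ["65+", "65+", "18-64", "18-64", "65+", "18-64"],
    "severity":    [3, 2, 2, 1, 3, 3],          # severity-of-illness class
    "readmit_30d": [1, 0, 0, 0, 1, 1],          # readmitted within 30 days?
})

# Risk estimate: observed readmission rate by age band and severity.
risk = (df.groupby(["age_band", "severity"])["readmit_30d"]
          .mean().rename("risk").reset_index())

# Patient-specific tracking sheet: attach each patient's stratum risk.
tracking = df.merge(risk, on=["age_band", "severity"])
print(tracking.sort_values("risk", ascending=False))
```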

  4. Learn Xcode Tools for Mac OS X and iPhone Development

    CERN Document Server

    Piper, I

    2010-01-01

    This book will give you a thorough grounding in the principal and supporting tools and technologies that make up the Xcode Developer Tools suite. Apple has provided a comprehensive collection of developer tools, and this is the first book to examine the complete Apple programming environment for both Mac OS X and iPhone. * Comprehensive coverage of all the Xcode developer tools * Additional coverage of useful third-party development tools * Not just a survey of features, but a serious examination of the complete development process for Mac OS X and iPhone applications. What you'll learn: * The boo

  5. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    Science.gov (United States)

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.

  6. Real-time particle size analysis using focused beam reflectance measurement as a process analytical technology tool for a continuous granulation-drying-milling process.

    Science.gov (United States)

    Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C

    2013-06-01

    Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between D20, D50, and D80 length-weighted chord length and sieve particle size was observed (p < 0.05).
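
    For readers unfamiliar with the quantities being correlated, here is a sketch of how length-weighted chord-length percentiles such as D20/D50/D80 can be computed from an FBRM histogram; the channel midpoints and counts are invented, and instrument software normally reports these values directly.

```python
# Hypothetical sketch: length-weighted D20/D50/D80 from an FBRM histogram.
import numpy as np

chord_um = np.array([10, 20, 50, 100, 200, 400])    # channel midpoints, um
counts   = np.array([500, 800, 900, 600, 250, 50])  # counts per channel

weights = counts * chord_um                 # length weighting of the counts
cdf = np.cumsum(weights) / weights.sum()    # length-weighted cumulative dist.

def d_percentile(p):
    """Chord length below which fraction p of the weighted CDF falls."""
    return np.interp(p, cdf, chord_um)

d20, d50, d80 = d_percentile(0.2), d_percentile(0.5), d_percentile(0.8)
print(d20, d50, d80)
```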

  7. Employability Skills Assessment Tool Development

    Science.gov (United States)

    Rasul, Mohamad Sattar; Rauf, Rose Amnah Abd; Mansor, Azlin Norhaini; Puvanasvaran, A. P.

    2012-01-01

    Research nationally and internationally found that technical graduates are lacking in employability skills. As employability skills are crucial in outcome-based education, the main goal of this research is to develop an Employability Skill Assessment Tool to help students and lecturers produce competent graduates in employability skills needed by…

  8. DIDACTIC TOOLS FOR THE STUDENTS’ ALGORITHMIC THINKING DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    T. P. Pushkaryeva

    2017-01-01

    Introduction. Modern engineers must possess a high potential of cognitive abilities, in particular, algorithmic thinking (AT). In this regard, the training of future experts (university graduates of technical specialities) has to provide the knowledge of principles and ways of designing various algorithms, abilities to analyze them, and to choose the most optimal variants for engineering activity implementation. For full formation of AT skills it is necessary to consider all channels of psychological perception and cogitative processing of educational information: visual, auditory, and kinesthetic. The aim of the present research is the theoretical basis of the design, development and use of resources for successful development of AT during the educational process of training in programming. Methodology and research methods. The methodology of the research involves the basic theses of cognitive psychology and the information approach to organizing the educational process. The research used the following methods: analysis; modeling of cognitive processes; designing training tools that take into account the mentality and peculiarities of information perception; and diagnostics of the efficiency of the didactic tools. Results. A three-level model for training future engineers in programming aimed at development of AT skills was developed. The model includes three components: aesthetic, simulative, and conceptual. Stages of mastering a new discipline are allocated. It is proved that for development of AT skills when training in programming it is necessary to use kinesthetic tools at the stage of mental algorithmic maps formation, and algorithmic animation and algorithmic mental maps at the stage of algorithmic model and conceptual images formation. Kinesthetic tools for development of students’ AT skills when training in algorithmization and programming were designed. The use of kinesthetic training simulators in the educational process provides the effective development of an algorithmic style of

  9. The Navier-Stokes equations an elementary functional analytic approach

    CERN Document Server

    Sohr, Hermann

    2001-01-01

    The primary objective of this monograph is to develop an elementary and self-contained approach to the mathematical theory of a viscous incompressible fluid in a domain Ω of the Euclidean space R^n, described by the equations of Navier-Stokes. The book is mainly directed to students familiar with basic functional analytic tools in Hilbert and Banach spaces. However, for readers' convenience, in the first two chapters we collect without proof some fundamental properties of Sobolev spaces, distributions, operators, etc. Another important objective is to formulate the theory for a completely general domain Ω. In particular, the theory applies to arbitrary unbounded, non-smooth domains. For this reason, in the nonlinear case, we have to restrict ourselves to space dimensions n = 2, 3 that are also most significant from the physical point of view. For mathematical generality, we will develop the linearized theory for all n ≥ 2. Although the functional-analytic approach developed here is, in principle, known ...
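
    For reference, the system the monograph studies is the incompressible Navier-Stokes equations, which in standard strong form (with velocity u, pressure p, viscosity ν, and external force f) read:

```latex
\begin{aligned}
  \partial_t u - \nu \Delta u + (u \cdot \nabla)u + \nabla p &= f
      && \text{in } \Omega \times (0, T),\\
  \nabla \cdot u &= 0 && \text{in } \Omega \times (0, T),\\
  u &= 0 \ \text{on } \partial\Omega, \qquad u(\cdot, 0) = u_0 .
\end{aligned}
```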

  10. Electronic tongue: An analytical gustatory tool

    Directory of Open Access Journals (Sweden)

    Rewanthwar Swathi Latha

    2012-01-01

    Taste is an important organoleptic property governing acceptance of products for administration through the mouth. But the majority of drugs available are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents. Thus, taste assessment is one important quality control parameter for evaluating taste-masked formulations. The primary method for the taste measurement of drug substances and formulations is by human panelists. The use of sensory panelists is very difficult and problematic in industry, due to the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists, maintaining motivation, and panel maintenance are significantly difficult when working with unpleasant products. Furthermore, Food and Drug Administration (FDA)-unapproved molecules cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system called the electronic tongue (e-tongue or artificial tongue), which can assess taste, has been replacing the sensory panelists. Thus, the e-tongue includes benefits like reducing reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.

  11. Risk assessment using Analytical Hierarchy Process - Development and evaluation of a new computer-based tool; Riskvaerdering med Analytical Hierarchy Process - Utveckling och utprovning av ett nytt datorbaserat verktyg

    Energy Technology Data Exchange (ETDEWEB)

    Ritchey, Tom (Swedish Defence Research Agency, Stockholm (Sweden))

    2008-11-15

    Risk analysis concerning the management of contaminated areas involves comparing and evaluating the relationship between ecological, technical, economic and other factors, in order to determine a reasonable level of remediation. Risk analysis of this kind is a relatively new phenomenon. In order to develop methodology in this area, the Sustainable Remediation program contributes both to comprehensive risk analysis projects and to projects concentrating on specific aspects of remediation risk analysis. In the project described in this report, the Swedish Defence Research Agency (FOI) was given a grant by the Sustainable Remediation program to apply the Analytic Hierarchy Process (AHP) in order to develop a computer-aided instrument to support remediation risk analysis. AHP is one of several so-called multi-criteria decision support methods. These methods are applied in order to systematically compare and evaluate different solutions or measures when there are many different goal criteria involved. Such criteria can be both quantitative and qualitative. The project has resulted in the development of a computer-aided instrument which can be employed to give better structure, consistency and traceability to risk analyses for the remediation of contaminated areas. The project was carried out in two phases with two different working groups. The first phase involved the development of a generic base model for remediation risk analysis, performed by a 'development group'. The second phase entailed the testing of the generic model in a specific, ongoing remediation project, performed by a 'test group'. The remediation project in question concerned the decontamination of a closed-down sawmill in Vaeckelsaang, in the Swedish municipality of Tingsryd.

  12. European Institutional and Organisational Tools for Maritime Human Resources Development

    OpenAIRE

    Dragomir Cristina

    2012-01-01

    Seafarers need to continuously develop their careers, at all stages of their professional life. This paper presents some tools for institutional and organisational career development. At the institutional level, vocational education and training tools provided by European Union institutions are presented, while at the organisational level some tools used by private crewing companies for maritime human resources assessment and development are exemplified.

  13. The development of a practical tool for risk assessment of manual work – the HAT-tool

    NARCIS (Netherlands)

    Kraker, H. de; Douwes, M.

    2008-01-01

    For the Dutch Ministry of Social Affairs and Employment we developed a tool to assess the risks of developing complaints of the arm, neck or shoulders during manual work. The tool was developed for every type of organization and is easy to use, does not require measurements other than time and can

  14. Accept or Decline? An Analytics-Based Decision Tool for Kidney Offer Evaluation.

    Science.gov (United States)

    Bertsimas, Dimitris; Kung, Jerry; Trichakis, Nikolaos; Wojciechowski, David; Vagefi, Parsia A

    2017-12-01

    When a deceased-donor kidney is offered to a waitlisted candidate, the decision to accept or decline the organ relies primarily upon a practitioner's experience and intuition. Such decisions must achieve a delicate balance between estimating the immediate benefit of transplantation and the potential for future higher-quality offers. However, the current experience-based paradigm lacks scientific rigor and is subject to the inaccuracies that plague anecdotal decision-making. A data-driven analytics-based model was developed to predict whether a patient will receive an offer for a deceased-donor kidney at Kidney Donor Profile Index thresholds of 0.2, 0.4, and 0.6, and at timeframes of 3, 6, and 12 months. The model accounted for Organ Procurement Organization, blood group, wait time, DR antigens, and prior offer history to provide accurate and personalized predictions. Performance was evaluated on data sets spanning various lengths of time to understand the adaptability of the method. Using United Network for Organ Sharing match-run data from March 2007 to June 2013, out-of-sample area under the receiver operating characteristic curve was approximately 0.87 for all Kidney Donor Profile Index thresholds and timeframes considered for the 10 most populous Organ Procurement Organizations. As more data becomes available, area under the receiver operating characteristic curve values increase and subsequently level off. The development of a data-driven analytics-based model may assist transplant practitioners and candidates during the complex decision of whether to accept or forgo a current kidney offer in anticipation of a future high-quality offer. Such a model holds promise to facilitate timely transplantation and optimize the efficiency of allocation.
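
    A toy sketch of the prediction task as described (features, labels, and coefficients are fabricated placeholders; the authors' actual model, feature encoding, and UNOS data are not reproduced here):

```python
# Illustrative sketch: predict whether a candidate receives a qualifying
# deceased-donor kidney offer within 6 months, evaluated by out-of-sample AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.integers(0, 4, n),        # blood group (integer-encoded)
    rng.exponential(2.0, n),      # waiting time, years
    rng.integers(0, 2, n),        # prior-offer history flag
])
y = rng.binomial(1, 0.3, n)       # offer within 6 months (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"out-of-sample AUC: {auc:.2f}")   # the paper reports ~0.87 on real data
```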

  15. Embedded Systems Development Tools: A MODUS-oriented Market Overview

    Directory of Open Access Journals (Sweden)

    Loupis Michalis

    2014-03-01

    Background: The embedded systems technology has perhaps been the most dominating technology in high-tech industries in the past decade. The industry has correctly identified the potential of this technology and has put its efforts into exploring its full potential. Objectives: The goal of the paper is to explore the versatility of the application in embedded system development based on one FP7-SME project. Methods/Approach: Embedded applications normally demand high resilience and quality, as well as conformity to quality standards and rigid performance. As a result, embedded system developers have adopted software methods that yield high quality. A qualitative approach to examining embedded systems development tools has been applied in this work. Results: This paper presents a MODUS-oriented market analysis in the domains of formal verification tools, HW/SW co-simulation tools, software performance optimization tools and code generation tools. Conclusions: The versatility of applications this technology serves is remarkable. With all this performance potential, the technology has carried with itself a large number of issues which the industry essentially needs to resolve to be able to harness its full potential. The MODUS project toolset addressed four discrete domains of the ESD software market, in which corresponding open tools were developed.

  16. PAVA: Physiological and Anatomical Visual Analytics for Mapping of Tissue-Specific Concentration and Time-Course Data

    Science.gov (United States)

    We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...

  17. EnergiTools(R) - a power plant performance monitoring and diagnosis tool

    International Nuclear Information System (INIS)

    Ancion, P.V.; Bastien, R.; Ringdahl, K.

    2000-01-01

    Westinghouse EnergiTools(R) is a performance diagnostic tool for power generation plants that combines the power of on-line process data acquisition with advanced diagnostics methodologies. The system uses analytical models based on thermodynamic principles combined with the knowledge of component diagnostic experts. An issue in modeling expert knowledge is to have a framework that can represent and process uncertainty in complex systems. In such systems, it is nearly impossible to build deterministic models for the effects of faults on symptoms. A methodology based on causal probabilistic graphs, more specifically on Bayesian belief networks, has been implemented in EnergiTools(R) to capture the fault-symptom relationships. The methodology estimates the likelihood of the various component failures using the fault-symptom relationships. The system also has the ability to use neural networks for processes that are difficult to model analytically. An application is the estimation of the reactor power in a nuclear power plant by interpreting several plant indicators. EnergiTools(R) is used for on-line performance monitoring and diagnostics at the Vattenfall Ringhals nuclear power plants in Sweden. It has led to the diagnosis of various performance issues with plant components. Two case studies are presented. In the first case, an overestimate of the thermal power due to a faulty instrument was found, which led to plant operation below its optimal power. The paper shows how the problem was discovered using the analytical thermodynamic calculations. The second case shows an application of EnergiTools(R) for the diagnosis of a condenser failure using causal probabilistic graphs.
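
    The fault-symptom reasoning can be illustrated with a toy Bayesian calculation; the probabilities below are invented, and the real EnergiTools(R) networks are far richer than this single-symptom, single-fault simplification.

```python
# Toy fault-symptom network in the spirit described above (numbers invented):
# prior fault probabilities plus P(symptom | fault), combined by Bayes' rule.

p_fault = {"fouled_condenser": 0.02, "faulty_sensor": 0.05}

# P(symptom "low_vacuum" | fault state), including the no-fault case:
p_symptom_given = {"fouled_condenser": 0.90, "faulty_sensor": 0.30, None: 0.01}

def posterior():
    """P(fault | symptom observed), assuming at most one fault at a time."""
    p_none = 1.0 - sum(p_fault.values())
    joint = {f: p * p_symptom_given[f] for f, p in p_fault.items()}
    joint[None] = p_none * p_symptom_given[None]
    z = sum(joint.values())                  # normalizing constant
    return {f: p / z for f, p in joint.items()}

print(posterior())   # ranking of fault hypotheses given the symptom
```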

  18. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  19. Microplasmas for chemical analysis: analytical tools or research toys?

    International Nuclear Information System (INIS)

    Karanassios, Vassili

    2004-01-01

    An overview of the activities of the research groups that have been involved in the fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of an MPD and of a mini in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. An overall assessment of the state-of-the-art of analytical microplasma research is provided.

  20. Seductive Atmospheres: Using tools to effectuate spaces for Leadership Development

    DEFF Research Database (Denmark)

    Elmholdt, Kasper Trolle; Clausen, Rune Thorbjørn; Madsen, Mona T

    2018-01-01

    This study applies an affordance lens to understand the use of management tools and how atmospheres for change and development are created and exploited. Drawing on an ethnographic case study of a consultant-facilitated change intervention among a group of research leaders at a Danish Public Hospital, this study investigates how a business game is used as a tool to effectuate episodic spaces for leadership development. The study reveals three tool affordances and discusses how they enable and constrain episodic spaces for development, and further develops the notion of seductive atmospheres as an important mechanism. The article suggests that a broader understanding of the use of tools and the role of atmospheres is essential for understanding how episodic spaces for development come to work in relation to organizational change and development.

  1. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    Science.gov (United States)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    advance science point of view: On the continuum of ever evolving data management systems, we need to understand and develop ways that allow for the variety of data relationships to be examined, and information to be manipulated, such that knowledge can be enhanced, to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply the needed skills to Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the science eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature (barring a few excellent sources).

  2. Automated Deployment of Advanced Controls and Analytics in Buildings

    Science.gov (United States)

    Pritoni, Marco

    Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems" I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase: "Application Deployment and Commissioning", models automatically learn system parameters, used for advanced controls and analytics. In the third phase: "Continuous Monitoring and Verification" I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.

  3. ANALYTICAL ANARCHISM: THE PROBLEM OF DEFINITION AND DEMARCATION

    OpenAIRE

    Konstantinov M.S.

    2012-01-01

    In this paper, for the first time in our country's scholarship, a new trend of anarchist thought is considered: analytical anarchism. Critical analysis of the key propositions of the basic versions of this trend - the anarcho-capitalist and the egalitarian - is used as a methodological tool. The study proposes a classification of discernible trends within analytical anarchism on the basis of value criteria, and identifies conceptual and methodological problems of defining analytical anarchism and its ...

  4. THz spectroscopy: An emerging technology for pharmaceutical development and pharmaceutical Process Analytical Technology (PAT) applications

    Science.gov (United States)

    Wu, Huiquan; Khan, Mansoor

    2012-08-01

    As an emerging technology, THz spectroscopy has gained increasing attention in the pharmaceutical area during the last decade. This attention is due to the fact that (1) it provides a promising alternative approach for in-depth understanding of both intermolecular interactions among pharmaceutical molecules and pharmaceutical product quality attributes; (2) it provides a promising alternative approach for enhanced process understanding of certain pharmaceutical manufacturing processes; and (3) it aligns with the FDA pharmaceutical quality initiatives, most notably the Process Analytical Technology (PAT) initiative. In this work, the current status and progress made so far on using THz spectroscopy for pharmaceutical development and pharmaceutical PAT applications are reviewed. In the spirit of demonstrating the utility of a first-principles modeling approach for addressing the model validation challenge and reducing unnecessary model validation "burden" to facilitate THz pharmaceutical PAT applications, two scientific case studies based on published THz spectroscopy measurement results are created and discussed. Furthermore, other technical challenges and opportunities associated with adapting THz spectroscopy as a pharmaceutical PAT tool are highlighted.

  5. Analytic functions of several complex variables

    CERN Document Server

    Gunning, Robert C

    2009-01-01

    The theory of analytic functions of several complex variables enjoyed a period of remarkable development in the middle part of the twentieth century. After initial successes by Poincaré and others in the late 19th and early 20th centuries, the theory encountered obstacles that prevented it from growing quickly into an analogue of the theory for functions of one complex variable. Beginning in the 1930s, initially through the work of Oka, then H. Cartan, and continuing with the work of Grauert, Remmert, and others, new tools were introduced into the theory of several complex variables that resol

  6. Parallel Aircraft Trajectory Optimization with Analytic Derivatives

    Science.gov (United States)

    Falck, Robert D.; Gray, Justin S.; Naylor, Bret

    2016-01-01

    Trajectory optimization is an integral component of the design of aerospace vehicles, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto based collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single- and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the non-linear analysis evaluations and the derivative computations themselves. The constraint aggregation results showed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation-based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.
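
    A minimal sketch of the collocation idea: on a 3-node Legendre-Gauss-Lobatto segment (nodes -1, 0, 1 on the reference interval), defect constraints are formed from a differentiation matrix, and their sparse, structured Jacobian is what makes analytic derivatives and parallelization pay off. The dynamics and state values below are toy placeholders, not the paper's aircraft model.

```python
# Collocation defect constraints on one 3-point LGL segment (nodes -1, 0, 1).
# An optimizer would drive the defects D @ X - f(X) to zero; we only form them.
import numpy as np

# Differentiation matrix for the quadratic interpolant through (-1, 0, 1):
D = np.array([[-1.5,  2.0, -0.5],
              [-0.5,  0.0,  0.5],
              [ 0.5, -2.0,  1.5]])

def f(x):
    return -x                       # toy dynamics: xdot = -x

X = np.array([1.0, 0.6, 0.37])      # candidate state values at the nodes
defects = D @ X - f(X)              # residuals of xdot = f(x) at each node
# D is small and structured, so the defect Jacobian is sparse; this is the
# property exploited for parallel analytic derivatives in the paper.
print(defects)
```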

  7. Dcode.org anthology of comparative genomic tools.

    Science.gov (United States)

    Loots, Gabriela G; Ovcharenko, Ivan

    2005-07-01

    Comparative genomics provides the means to demarcate functional regions in anonymous DNA sequences. The successful application of this method to identifying novel genes is currently shifting to deciphering the non-coding encryption of gene regulation across genomes. To facilitate the practical application of comparative sequence analysis to genetics and genomics, we have developed several analytical and visualization tools for the analysis of arbitrary sequences and whole genomes. These tools include two alignment tools, zPicture and Mulan; a phylogenetic shadowing tool, eShadow for identifying lineage- and species-specific functional elements; two evolutionary conserved transcription factor analysis tools, rVista and multiTF; a tool for extracting cis-regulatory modules governing the expression of co-regulated genes, Creme 2.0; and a dynamic portal to multiple vertebrate and invertebrate genome alignments, the ECR Browser. Here, we briefly describe each one of these tools and provide specific examples on their practical applications. All the tools are publicly available at the http://www.dcode.org/ website.

  8. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay.

    Science.gov (United States)

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results' relevance.
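
    The colorimetric readout step can be sketched as follows; the file name, region of interest, and calibration coefficients are assumptions for illustration, not the paper's published values.

```python
# Sketch of the RGB readout: average the color channels over the reacted
# region of the photographed strip and map color to activity via a
# pre-fitted (here invented) calibration against Ellman-assay references.
from PIL import Image
import numpy as np

img = np.asarray(Image.open("strip_photo.jpg").convert("RGB"), dtype=float)
roi = img[100:200, 150:250]               # region covering the test spot
r, g, b = roi.reshape(-1, 3).mean(axis=0)

# Indigo blue formation lowers the red channel; a simple linear calibration
# (hypothetical coefficients) converts mean red intensity to BChE activity.
activity = 12.4 - 0.043 * r
print(f"mean RGB = ({r:.0f}, {g:.0f}, {b:.0f}), est. activity = {activity:.2f}")
```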

  9. The development of a post occupancy evaluation tool for primary schools: learner comfort assessment tool (LCAT)

    CSIR Research Space (South Africa)

    Motsatsi, L

    2015-12-01

    ... in order to facilitate teaching and learning. The aim of this study was to develop a Post Occupancy Evaluation (POE) tool to assess learner comfort in relation to indoor environmental quality in the classroom. The development of the POE tool followed a...

  10. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    Science.gov (United States)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  11. Some key issues in the development of ergonomic intervention tools

    DEFF Research Database (Denmark)

    Edwards, Kasper; Winkel, Jørgen

    2016-01-01

    Literature reviews suggest that tools facilitating the ergonomic intervention process should be integrated into rationalization tools, particularly if such tools are participative. Such a tool has recently been developed as an add-in module to the Lean tool “Value Stream Mapping” (VSM). However...

  12. Fernald Silo Remote Retrieval Tool Development

    International Nuclear Information System (INIS)

    Varma, V.K.

    2004-01-01

    A long-reach tool was developed to remove discrete objects from the silos at the Fernald Environmental Management Project in Ohio. If they are not removed, these objects can potentially cause problems during the retrieval and transfer of waste from the silos. Most of the objects are on top of the Bentogrout cap inside the silos at or near the primary opening into the tank and will therefore require only vertical lifting. The objects are located about 20 ft from the top of the silo. Although most of the objects can be retrieved from 20 ft, the long-reach tool was designed for a reach of up to 40 ft in case objects roll towards the walls of the tank or need to be removed during heel retrieval operations. This report provides a detailed description of the tool that was developed, tested, and demonstrated at the Tanks Technology Cold Test Facility at Oak Ridge National Laboratory. Scaffolding was erected over two experimental cells to simulate the 40-ft maximum working depth anticipated in the silos at Fernald. Plastic bottles and plastic sheeting simulated the debris that could be encountered during waste retrieval operations.

  13. Open source tools for ATR development and performance evaluation

    Science.gov (United States)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools: should I buy off-the-shelf tools, or should I develop my own? Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and to maintain licenses, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.

  14. Examination of fast reactor fuels, FBR analytical quality assurance standards and methods, and analytical methods development: irradiation tests. Progress report, April 1--June 30, 1976, and FY 1976

    International Nuclear Information System (INIS)

    Baker, R.D.

    1976-08-01

    Characterization of unirradiated and irradiated LMFBR fuels by analytical chemistry methods will continue, and additional methods will be modified and mechanized for hot cell application. Macro- and microexaminations will be made on fuel and cladding using the shielded electron microprobe, emission spectrograph, radiochemistry, gamma scanner, mass spectrometers, and other analytical facilities. New capabilities will be developed in gamma scanning, analyses to assess spatial distributions of fuel and fission products, mass spectrometric measurements of burnup and fission gas constituents and other chemical analyses. Microstructural analyses of unirradiated and irradiated materials will continue using optical and electron microscopy and autoradiographic and x-ray techniques. Analytical quality assurance standards tasks are designed to assure the quality of the chemical characterizations necessary to evaluate reactor components relative to specifications. Tasks include: (1) the preparation and distribution of calibration materials and quality control samples for use in quality assurance surveillance programs, (2) the development of and the guidance in the use of quality assurance programs for sampling and analysis, (3) the development of improved methods of analysis, and (4) the preparation of continuously updated analytical method manuals. Reliable analytical methods development for the measurement of burnup, oxygen-to-metal (O/M) ratio, and various gases in irradiated fuels is described

  15. Modeling of the Global Water Cycle - Analytical Models

    Science.gov (United States)

    Yongqiang Liu; Roni Avissar

    2005-01-01

    Both numerical and analytical models of the coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions often difficult to interpret, analytical...

  16. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  17. Ongoing development of digital radiotherapy plan review tools

    International Nuclear Information System (INIS)

    Ebert, M.A.; Hatton, J.; Cornes, D.

    2011-01-01

    Full text: To describe ongoing development of software to support the review of radiotherapy treatment planning system (TPS) data. The 'SWAN' software program was conceived in 2000 and initially developed for the RADAR (TROG 03.04) prostate radiotherapy trial. Validation of the SWAN program has been occurring via implementation by TROG in support of multiple clinical trials. Development has continued and the SWAN software program is now supported by modular components which comprise the 'SWAN system'. This provides a comprehensive set of tools for the review, analysis and archive of TPS exports. The SWAN system has now been used in support of over 20 radiotherapy trials and to review the plans of over 2,000 trial participants. The use of the system for the RADAR trial is now culminating in the derivation of dose-outcome indices for prostate treatment toxicity. Newly developed SWAN tools include enhanced remote data archive/retrieval, display of dose in both relative and absolute modes, and interfacing to a Matlab-based add-on ('VAST') that allows quantitative analysis of delineated volumes, including regional overlap statistics for multi-observer studies. Efforts are continuing to develop the SWAN system in the context of international collaboration aimed at harmonising the quality-assurance activities of collaborative trials groups. Tools such as the SWAN system are essential for ensuring the collection of accurate and reliable evidence to guide future radiotherapy treatments. One of the principal challenges of developing such a tool is establishing a development path that will ensure its validity and applicability well into the future.

  18. Room temperature phosphorescence in the liquid state as a tool in analytical chemistry

    International Nuclear Information System (INIS)

    Kuijt, Jacobus; Ariese, Freek; Brinkman, Udo A.Th.; Gooijer, Cees

    2003-01-01

    A wide-ranging overview of room temperature phosphorescence in the liquid state (RTPL) is presented, with a focus on recent developments. RTPL techniques like micelle-stabilized (MS)-RTP, cyclodextrin-induced (CD)-RTP, and heavy atom-induced (HAI)-RTP are discussed. These techniques are mainly applied in the stand-alone format, but coupling with some separation techniques appears to be feasible. Applications of direct, sensitized and quenched phosphorescence are also discussed. As regards sensitized and quenched RTP, emphasis is on the coupling with liquid chromatography (LC) and capillary electrophoresis (CE), but stand-alone applications are also reported. Further, the application of RTPL in immunoassays and in RTP optosensing - the optical sensing of analytes based on RTP - is reviewed. Next to the application of RTPL in quantitative analysis, its use for the structural probing of protein conformations and for time-resolved microscopy of labelled biomolecules is discussed. Finally, an overview is presented of the various analytical techniques which are based on the closely related phenomenon of long-lived lanthanide luminescence. The paper closes with a short evaluation of the state-of-the-art in RTP and a discussion on future perspectives

  19. Cluster Analysis as an Analytical Tool of Population Policy

    Directory of Open Access Journals (Sweden)

    Oksana Mikhaylovna Shubat

    2017-12-01

    The predicted negative trends in Russian demography (falling birth rates, population decline) actualize the need to strengthen measures of family and population policy. Our research purpose is to identify groups of Russian regions with similar characteristics in the family sphere using cluster analysis. The findings should make an important contribution to the field of family policy. We used hierarchical cluster analysis based on the Ward method and the Euclidean distance for segmentation of Russian regions. Clustering is based on four variables, which allowed assessing the family institution in each region. The authors used the data of the Federal State Statistics Service from 2010 to 2015. Clustering and profiling of each segment allowed forming a model of Russian regions depending on the features of the family institution in those regions. The authors revealed four clusters grouping regions with similar problems in the family sphere. This segmentation makes it possible to develop the most relevant family policy measures in each group of regions. Thus, the analysis has shown a high degree of differentiation of the family institution across regions. This suggests that a unified approach to solving population problems is far from effective. To achieve greater results in the implementation of family policy, a differentiated approach is needed. Methods of multidimensional data classification can be successfully applied as a relevant analytical toolkit. Further research could develop the adaptation of multidimensional classification methods to the analysis of population problems in Russian regions. In particular, the algorithms of nonparametric cluster analysis may be of relevance in future studies.
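
    The clustering step the authors describe (Ward linkage with Euclidean distance on four indicators) maps directly onto standard scientific-Python tooling; the data below are random placeholders standing in for the regional family-sphere indicators.

```python
# Sketch of Ward hierarchical clustering on four standardized indicators
# per region, cut into four clusters as in the study. Data are invented.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(1)
regions = rng.random((80, 4))                    # 80 regions x 4 indicators

Z = linkage(zscore(regions), method="ward")      # Euclidean distance implied
labels = fcluster(Z, t=4, criterion="maxclust")  # cut dendrogram at 4 groups
print(np.bincount(labels)[1:])                   # cluster sizes
```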

  20. Nanometrology, Standardization and Regulation of Nanomaterials in Brazil: A Proposal for an Analytical-Prospective Model

    Directory of Open Access Journals (Sweden)

    Ana Rusmerg Giménez Ledesma

    2013-05-01

    The main objective of this paper is to propose an analytical-prospective model as a tool to support decision-making processes concerning metrology, standardization and regulation of nanomaterials in Brazil, based on international references and ongoing initiatives in the world. In the context of nanotechnology development in Brazil, the motivation for carrying out this research was to identify potential benefits of metrology, standardization and regulation of nanomaterials production, from the perspective of future adoption of the model by the main stakeholders in the development of these areas in Brazil. The main results can be summarized as follows: (i) an overview of international studies on metrology, standardization and regulation of nanomaterials, and nanoparticles in particular; (ii) the analytical-prospective model; and (iii) the survey questionnaire and the roadmapping tool for metrology, standardization and regulation of nanomaterials in Brazil, based on international references and ongoing initiatives in the world.

  1. Quality Assurance Project Plan Development Tool

    Science.gov (United States)

    This tool contains information designed to assist in developing a Quality Assurance (QA) Project Plan that meets EPA requirements for projects that involve surface or groundwater monitoring and/or the collection and analysis of water samples.

  2. Comprehension of complex biological processes by analytical methods: how far can we go using mass spectrometry?

    International Nuclear Information System (INIS)

    Gerner, C.

    2013-01-01

    Comprehensive understanding of complex biological processes is the basis for many biomedical issues of great relevance for modern society, including risk assessment, drug development, quality control of industrial products and many more. Screening methods provide means for investigating biological samples without a prior research hypothesis. However, the first boom of analytical screening efforts has passed, and we again need to ask whether and how to apply screening methods. Mass spectrometry is a modern tool with unrivalled analytical capacities. This applies to all relevant characteristics of analytical methods such as specificity, sensitivity, accuracy, multiplicity and diversity of applications. Indeed, mass spectrometry qualifies to deal with complexity. Chronic inflammation is a common feature of almost all relevant diseases challenging our modern society; these diseases are apparently highly diverse and include arteriosclerosis, cancer, back pain, neurodegenerative diseases, depression and others. The complexity of the mechanisms regulating chronic inflammation is the reason for the practical challenge of dealing with it. The presentation shall give an overview of the capabilities and limitations of the application of this analytical tool to solve critical questions with great relevance for our society. (author)

  3. Effectiveness of operation tools developed by KEKB operators

    International Nuclear Information System (INIS)

    Sugino, K.; Satoh, Y.; Kitabayashi, T.

    2004-01-01

    The main tasks of KEKB (High Energy Accelerator Research Organization B-physics) operators are beam tuning and injection, operation logging, monitoring of accelerator conditions and safety management. New beam tuning methods are frequently applied to KEKB in order to accomplish high luminosity. In such a situation, various operation tools have been developed by the operators to realize efficient operation. In this paper, we describe the effectiveness of the tools developed by the operators. (author)

  4. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Science.gov (United States)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts, often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper, we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82
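
    The project's actual Tika/Solr/DeepDive pipeline is not published with the abstract, but the flavour of the extraction step can be sketched with the Python standard library alone: scan document text for a measurement occurring near a domain keyword. The unit pattern, subject list and proximity window below are illustrative assumptions, not the mission's real configuration.

        # Toy proximity extraction: find a measurement and a subject term that
        # co-occur within a character window (stand-in for the full pipeline).
        import re

        text = ("In this study, hyperspectral images with high spatial resolution (1 m) "
                "were analyzed to detect cutleaf teasel in two areas. Classification of "
                "cutleaf teasel reached a users accuracy of 82 to 84%.")

        MEASUREMENT = re.compile(r"\d+(?:\.\d+)?\s?(?:m\b|nm\b|km\b|%)")  # crude unit pattern
        SUBJECTS = ("cutleaf teasel", "invasive species")                 # hypothetical subjects
        WINDOW = 120                                                      # proximity window, chars

        for m in MEASUREMENT.finditer(text):
            context = text[max(0, m.start() - WINDOW): m.end() + WINDOW]
            for subject in SUBJECTS:
                if subject in context:
                    print(f"{m.group()!r} near {subject!r}")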

  5. Online Analytical Processing (OLAP): A Fast and Effective Data Mining Tool for Gene Expression Databases

    Directory of Open Access Journals (Sweden)

    Alkharouf Nadim W.

    2005-01-01

    Full Text Available Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.

  6. Online analytical processing (OLAP): a fast and effective data mining tool for gene expression databases.

    Science.gov (United States)

    Alkharouf, Nadim W; Jamison, D Curtis; Matthews, Benjamin F

    2005-06-30

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.
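
    The papers built their cube with Analysis Services 2000; as a rough, library-agnostic analogue, an OLAP-style roll-up over a gene-expression fact table can be sketched with a pandas pivot table. The column names and values below are hypothetical, not SGMD's schema.

        # OLAP-style aggregation (roll-up) over a toy expression fact table.
        import pandas as pd

        facts = pd.DataFrame({
            "gene":      ["G1", "G1", "G2", "G1", "G2", "G2"],
            "timepoint": [6, 12, 6, 6, 12, 12],          # hours after infection (hypothetical)
            "treatment": ["resistant", "resistant", "resistant",
                          "susceptible", "susceptible", "susceptible"],
            "expression": [2.1, 3.4, 0.9, 1.8, 1.1, 1.0],
        })

        # Dimensions: gene x (treatment, timepoint); measure: mean expression.
        cube = facts.pivot_table(index="gene", columns=["treatment", "timepoint"],
                                 values="expression", aggfunc="mean")
        print(cube)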

  7. Process development and tooling design for intrinsic hybrid composites

    Science.gov (United States)

    Riemer, M.; Müller, R.; Drossel, W. G.; Landgrebe, D.

    2017-09-01

    Hybrid parts, which combine the advantages of different material classes, are moving into the focus of lightweight applications. This development is amplified by their high potential for use in crash-relevant structures. In the current state of the art, hybrid parts are mainly made in separate, sequential forming and joining processes. Using the concept of an intrinsic hybrid, the shaping of the part and the joining of the different materials are performed in a single process step, shortening the overall processing time and thereby reducing manufacturing costs. The investigated hybrid part is made from continuous fibre reinforced plastic (FRP), in which a metallic reinforcement structure is integrated. The connection between these layered components is realized by a combination of adhesive bonding and a geometrical form fit. The form fit elements are intrinsically generated during the forming process. This contribution covers the development of the forming process and the design of the forming tool for the single-step production of a hybrid part. To this end a forming tool which combines the thermo-forming and the metal forming process is developed. The main challenge in designing the tool is the temperature management of the tool elements for the variothermal forming process. The process parameters are determined in basic tests and finite element (FE) simulation studies. On the basis of these investigations, a control concept for steering the motion axes and the tool temperature is developed. Forming tests are carried out with the developed tool and the manufactured parts are analysed by computer assisted tomography (CT) scans.

  8. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    Science.gov (United States)

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

    A review about the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. Most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Fusion Analytics: A Data Integration System for Public Health and Medical Disaster Response Decision Support

    Science.gov (United States)

    Passman, Dina B.

    2013-01-01

    Objective The objective of this demonstration is to show conference attendees how they can integrate, analyze, and visualize diverse data types from across a variety of systems by leveraging an off-the-shelf enterprise business intelligence (EBI) solution to support decision-making in disasters. Introduction Fusion Analytics is the data integration system developed by the Fusion Cell at the U.S. Department of Health and Human Services (HHS), Office of the Assistant Secretary for Preparedness and Response (ASPR). Fusion Analytics meaningfully augments traditional public and population health surveillance reporting by providing web-based data analysis and visualization tools. Methods Fusion Analytics serves as a one-stop shop for the web-based data visualizations of multiple real-time data sources within ASPR. Its 24-7 web availability makes it an ideal analytic tool for situational awareness and response, allowing stakeholders to access the portal from any internet-enabled device without installing any software. The Fusion Analytics data integration system was built using off-the-shelf EBI software. Fusion Analytics leverages the full power of statistical analysis software and delivers reports to users in a secure web-based environment. Fusion Analytics provides an example of how public health staff can develop and deploy a robust public health informatics solution using an off-the-shelf product and with limited development funding. It also provides the unique example of a public health information system that combines patient data for traditional disease surveillance with manpower and resource data to provide overall decision support for federal public health and medical disaster response operations. Conclusions We are currently in a unique position within public health. On the one hand, we have been gaining greater and greater access to electronic data of all kinds over the last few years. On the other, we are working in a time of reduced government spending

  10. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    Science.gov (United States)

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  11. [COMETE: a tool to develop psychosocial competences in patient education].

    Science.gov (United States)

    Saugeron, Benoit; Sonnier, Pierre; Marchais, Stéphanie

    2016-01-01

    This article presents a detailed description of the development and use of the COMETE tool. The COMETE tool is designed to help medical teams identify, develop or evaluate psychosocial skills in patient education and counselling. This tool, designed in the form of a briefcase, proposes methodological activities and cards that assess psychosocial skills during a shared educational assessment, group meetings or during an individual evaluation. This tool is part of a support approach for medical teams caring for patients with chronic diseases.

  12. Google Analytics – Index of Resources

    Science.gov (United States)

    Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.

  13. A Performance Analytical Strategy for Network-on-Chip Router with Input Buffer Architecture

    Directory of Open Access Journals (Sweden)

    WANG, J.

    2012-11-01

    Full Text Available In this paper, a performance analytical strategy is proposed for a Network-on-Chip router with input buffer architecture. First, an analytical model is developed based on a semi-Markov process. For the non-work-conserving router with small buffer size, the model can be used to analyze the schedule delay and the average service time for each buffer, given the related parameters. Then, the average packet delay in the router is calculated using the model. Finally, we validate the effectiveness of our strategy by simulation. By comparing our analytical results to simulation results, we show that our strategy successfully captures the Network-on-Chip router performance and performs better than the state-of-the-art techniques. Therefore, our strategy can be used as an efficient performance analysis tool for Network-on-Chip design.
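
    The router model itself is not reproduced in the abstract, but the semi-Markov building block it rests on is standard: long-run state occupancy is given by the mean holding times weighted by the stationary distribution of the embedded Markov chain. The transition matrix and per-state holding times below are invented for illustration, not taken from the paper.

        # Mean state occupancy of a semi-Markov process: stationary distribution of
        # the embedded chain, weighted by mean holding times (numbers illustrative).
        import numpy as np

        P = np.array([[0.0, 0.7, 0.3],     # embedded-chain transition probabilities
                      [0.5, 0.0, 0.5],
                      [0.4, 0.6, 0.0]])
        tau = np.array([1.0, 2.5, 4.0])    # mean holding time per state (e.g., cycles)

        # Stationary distribution pi of the embedded chain: pi P = pi, sum(pi) = 1.
        A = np.vstack([P.T - np.eye(3), np.ones(3)])
        b = np.array([0.0, 0.0, 0.0, 1.0])
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)

        # Fraction of time spent in each state (occupancy, not visit frequency).
        time_share = pi * tau / (pi @ tau)
        print("embedded-chain pi:", pi.round(3))
        print("time share per state:", time_share.round(3))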

  14. Biofluid infrared spectro-diagnostics: pre-analytical considerations for clinical applications.

    Science.gov (United States)

    Lovergne, L; Bouzy, P; Untereiner, V; Garnotel, R; Baker, M J; Thiéfin, G; Sockalingum, G D

    2016-06-23

    Several proof-of-concept studies on the vibrational spectroscopy of biofluids have demonstrated that the methodology has promising potential as a clinical diagnostic tool. However, these studies also show that there is a lack of a standardised protocol for sample handling and preparation prior to spectroscopic analysis. One of the most important sources of analytical error is the pre-analytical phase. For the technique to be translated into the clinic, it is clear that a very strict protocol needs to be established for such biological samples. This study focuses on some aspects of the pre-analytical phase in the development of high-throughput Fourier Transform Infrared (FTIR) spectroscopy of some of the most common biofluids, such as serum, plasma and bile. Pre-analytical considerations that can impact either the samples (solvents, anti-coagulants, freeze-thaw cycles…) and/or the spectroscopic analysis (sample preparation such as drying, deposit methods, volumes, substrates, operator dependence…), and consequently the quality and reproducibility of spectral data, will be discussed in this report.

  15. Ultrasonic trap as analytical tool; Die Ultraschallfalle als analytisches Werkzeug

    Energy Technology Data Exchange (ETDEWEB)

    Leiterer, Jork

    2009-11-30

    The ultrasonic trap offers a special means of handling samples on the microlitre scale. Through acoustic levitation, the sample is positioned contact-free in a gaseous environment and thus removed from the influence of solid surfaces. In this work, the possibilities of the ultrasonic trap for use in analytics are investigated experimentally. By coupling it with typical contactless analytical methods such as spectroscopy and X-ray scattering, the advantages of this levitation technique are demonstrated on materials ranging from inorganic, organic and pharmaceutical substances to proteins, nanoparticles and microparticles. It is shown that acoustic levitation reliably enables contact-free sample handling for spectroscopic methods (LIF, Raman) and, for the first time, for methods of X-ray scattering (EDXD, SAXS, WAXS) and X-ray fluorescence (RFA, XANES). The wall-less sample mounting proved advantageous for all of these methods: the results are comparable with those obtained with conventional sample holders and partly surpass them in data quality. A particular success is the integration of the acoustic levitator into the experimental set-ups of the beamlines at the synchrotron. The use of the ultrasonic trap at BESSY was established within this work and currently forms the basis of intensive interdisciplinary research. In addition, the potential of the trap for preconcentration was recognized and applied to the study of evaporation-controlled processes. This comprises fields of research like biomineralisation, protein agglomeration, distance-dependent effects of nanocrystalline quantum dots and the in situ observation of early crystallization stages. In summary, the results of this work open a broad area of application for the ultrasonic trap as an analytical tool.

  16. Analytical dose modeling for preclinical proton irradiation of millimetric targets.

    Science.gov (United States)

    Vanstalle, Marie; Constanzo, Julie; Karakaya, Yusuf; Finck, Christian; Rousseau, Marc; Brasse, David

    2018-01-01

    Due to the considerable development of proton radiotherapy, several proton platforms have emerged to irradiate small animals in order to study the biological effectiveness of proton radiation. A dedicated analytical treatment planning tool was developed in this study to accurately calculate the delivered dose given the specific constraints imposed by the small dimensions of the irradiated areas. The treatment planning system (TPS) developed in this study is based on an analytical formulation of the Bragg peak and uses experimental range values of protons. The method was validated after comparison with experimental data from the literature and then compared to Monte Carlo simulations conducted using Geant4. Three examples of treatment planning, performed with phantoms made of water targets and bone-slab insert, were generated with the analytical formulation and Geant4. Each treatment planning was evaluated using dose-volume histograms and gamma index maps. We demonstrate the value of the analytical function for mouse irradiation, which requires a targeting accuracy of 0.1 mm. Using the appropriate database, the analytical modeling limits the errors caused by misestimating the stopping power. For example, 99% of a 1-mm tumor irradiated with a 24-MeV beam receives the prescribed dose. The analytical dose deviations from the prescribed dose remain within the dose tolerances stated by report 62 of the International Commission on Radiation Units and Measurements for all tested configurations. In addition, the gamma index maps show that the highly constrained targeting accuracy of 0.1 mm for mouse irradiation leads to a significant disagreement between Geant4 and the reference. This simulated treatment planning is nevertheless compatible with a targeting accuracy exceeding 0.2 mm, corresponding to rat and rabbit irradiations. Good dose accuracy for millimetric tumors is achieved with the analytical calculation used in this work. These volume sizes are typical in mouse
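
    The article's Bragg-peak parameterization is not given in the abstract; as a hedged stand-in, the classic power-law range-energy relation for protons in water, R = alpha * E**p (with alpha ≈ 0.0022 cm per MeV**p and p ≈ 1.77, a common textbook fit, not necessarily the values used by this TPS), illustrates the kind of analytical ingredient such a tool relies on, for example to pick the beam energy for a millimetric target depth.

        # Power-law range-energy relation for protons in water (textbook fit;
        # constants are assumptions, not the paper's TPS parameters).
        alpha, p = 0.0022, 1.77    # R in cm, E in MeV

        def proton_range_cm(energy_mev: float) -> float:
            """Approximate range in water of a proton beam of given energy."""
            return alpha * energy_mev ** p

        def energy_for_depth(depth_cm: float) -> float:
            """Invert R(E) to find the energy placing the Bragg peak at a depth."""
            return (depth_cm / alpha) ** (1.0 / p)

        print(f"24 MeV beam range: {proton_range_cm(24.0) * 10:.1f} mm")   # mouse-scale
        print(f"energy for a 1 mm-deep target: {energy_for_depth(0.1):.1f} MeV")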

  17. Market research companies and new product development tools

    NARCIS (Netherlands)

    Nijssen, E.J.; Frambach, R.T.

    1998-01-01

    This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies’ turnover, (2) MR companies’ awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers’ perceptions of the influence of client

  18. Market research companies and new product development tools

    NARCIS (Netherlands)

    Nijssen, Edwin J.; Frambach, Ruud T.

    1998-01-01

    This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies' turnover, (2) MR companies' awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers' perceptions of the influence of client

  19. Antifungal proteins from moulds: analytical tools and potential application to dry-ripened foods.

    Science.gov (United States)

    Delgado, Josué; Owens, Rebecca A; Doyle, Sean; Asensio, Miguel A; Núñez, Félix

    2016-08-01

    Moulds growing on the surface of dry-ripened foods contribute to their sensory qualities, but some of them are able to produce mycotoxins that pose a hazard to consumers. Small cysteine-rich antifungal proteins (AFPs) from moulds are highly stable to pH and proteolysis and exhibit a broad inhibition spectrum against filamentous fungi, providing new chances to control hazardous moulds in fermented foods. The analytical tools for characterizing the cellular targets and affected pathways are reviewed. Strategies currently employed to study these mechanisms of action include 'omics' approaches that have come to the forefront in recent years, developing in tandem with genome sequencing of relevant organisms. These techniques contribute to a better understanding of the response of moulds against AFPs, allowing the design of complementary strategies to maximize or overcome the limitations of using AFPs on foods. AFPs alter chitin biosynthesis, and some fungi react inducing cell wall integrity (CWI) pathway. However, moulds able to increase chitin content at the cell wall by increasing proteins in either CWI or calmodulin-calcineurin signalling pathways will resist AFPs. Similarly, AFPs increase the intracellular levels of reactive oxygen species (ROS), and moulds increasing G-protein complex β subunit CpcB and/or enzymes to efficiently produce glutathione may evade apoptosis. Unknown aspects that need to be addressed include the interaction with mycotoxin production by less sensitive toxigenic moulds. However, significant steps have been taken to encourage the use of AFPs in intermediate-moisture foods, particularly for mould-ripened cheese and meat products.

  20. Development of Visualization Tools for ZPPR-15 Analysis

    International Nuclear Information System (INIS)

    Lee, Min Jae; Kim, Sang Ji

    2014-01-01

    ZPPR-15 cores consist of various drawer masters that have great heterogeneity. In order to build a proper homogenization strategy, the geometry of the drawer masters should be carefully analyzed with a visualization. Additionally, visualization of the drawer masters and the core configuration is necessary for minimizing human error during input processing. For this purpose, visualization tools for ZPPR-15 analysis have been developed based on a Perl script. In the following section, the implementation of the visualization tools is described and various visualization samples for both drawer masters and ZPPR-15 cores are demonstrated. Visualization tools for drawer masters and the core configuration were successfully developed for ZPPR-15 analysis. The visualization tools are expected to be useful for understanding ZPPR-15 experiments and for finding deterministic models of ZPPR-15. It turned out that generating VTK files is handy, and the application of VTK files becomes powerful with the aid of the VISIT program.

  1. Nuclear imaging drug development tools

    International Nuclear Information System (INIS)

    Buchanan, L.; Jurek, P.; Redshaw, R.

    2007-01-01

    This article describes the development of nuclear imaging as an enabling technology in the pharmaceutical industry. Molecular imaging is maturing into an important tool with expanding applications from validating that a drug reaches the intended target through to market launch of a new drug. Molecular imaging includes anatomical imaging of organs or tissues, computerized tomography (CT), magnetic resonance imaging (MRI) and ultrasound.

  2. Target normal sheath acceleration analytical modeling, comparative study and developments

    International Nuclear Information System (INIS)

    Perego, C.; Batani, D.; Zani, A.; Passoni, M.

    2012-01-01

    Ultra-intense laser interaction with solid targets appears to be an extremely promising technique to accelerate ions up to several MeV, producing beams that exhibit interesting properties for many foreseen applications. Nowadays, most of the published experimental results can be theoretically explained in the framework of the target normal sheath acceleration (TNSA) mechanism proposed by Wilks et al. [Phys. Plasmas 8(2), 542 (2001)]. As an alternative to numerical simulation, various analytical or semi-analytical TNSA models have been published in recent years, each of them trying to provide predictions for some of the ion beam features, given the initial laser and target parameters. However, the problem of developing a reliable model for the TNSA process is still open, which is why the purpose of this work is to clarify the present situation of TNSA modeling and experimental results by means of a quantitative comparison between measurements and theoretical predictions of the maximum ion energy. Moreover, in the light of such an analysis, some indications for the future development of the model proposed by Passoni and Lontano [Phys. Plasmas 13(4), 042102 (2006)] are then presented.

  3. Teaching Syllogistics Using E-learning Tools

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Sandborg-Petersen, Ulrik; Thorvaldsen, Steinar

    2016-01-01

    This paper is a study of various strategies for teaching syllogistics as part of a course in basic logic. It is a continuation of earlier studies involving practical experiments with students of Communication using the Syllog system, which makes it possible to develop e-learning tools and to do learning analytics based on log-data. The aim of the present paper is to investigate whether the Syllog e-learning tools can be helpful in logic teaching in order to obtain a better understanding of logic and argumentation in general and syllogisms in particular. Four versions of a course in basic logic involving different teaching methods will be compared.

  4. Experimental evaluation of tool run-out in micro milling

    Science.gov (United States)

    Attanasio, Aldo; Ceretti, Elisabetta

    2018-05-01

    This paper deals with the micro milling cutting process, focusing on tool run-out measurement. In fact, among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting edge phase measurement is based on force signal analysis. The developed procedure has been tested on data coming from micro milling experimental tests performed on a Ti6Al4V sample. The results showed that the developed procedure can be successfully used for tool run-out estimation.
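
    The measurement idea lends itself to a small sketch: with radial run-out the envelope swept by the dominant tooth widens the machined slot, so the eccentricity can be roughly estimated from the channel width and the nominal tool diameter, and the edge imbalance from the per-tooth force peaks. The formulas and numbers below are a simplified geometric reading of the procedure, not the authors' exact model.

        # Simplified run-out estimate for a two-flute micro end mill (illustrative).
        import numpy as np

        tool_diameter_um = 200.0    # nominal tool diameter (hypothetical)
        channel_width_um = 206.0    # measured channel width (hypothetical)

        # With radial run-out e, the slot widens roughly as W ~ D + 2e.
        runout_um = (channel_width_um - tool_diameter_um) / 2.0

        # Per-tooth force peaks over one revolution (synthetic signal standing in
        # for dynamometer data); run-out makes one edge cut more than the other.
        t = np.linspace(0.0, 1.0, 1000, endpoint=False)
        force = (1.2 * np.clip(np.sin(2 * np.pi * t), 0, None)
                 + 0.8 * np.clip(np.sin(2 * np.pi * (t - 0.5)), 0, None))
        half = len(t) // 2
        peak_ratio = force[:half].max() / force[half:].max()

        print(f"estimated run-out: {runout_um:.1f} um, peak ratio: {peak_ratio:.2f}")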

  5. Advancing analytical algorithms and pipelines for billions of microbial sequences.

    Science.gov (United States)

    Gonzalez, Antonio; Knight, Rob

    2012-02-01

    The vast number of microbial sequences resulting from sequencing efforts using new technologies requires us to re-assess currently available analysis methodologies and tools. Here we describe trends in the development and distribution of software for analyzing microbial sequence data. We then focus on one widely used set of methods, dimensionality reduction techniques, which allow users to summarize and compare these vast datasets. We conclude by emphasizing the utility of formal software engineering methods for the development of computational biology tools, and the need for new algorithms for comparing microbial communities. Such large-scale comparisons will allow us to fulfill the dream of rapid integration and comparison of microbial sequence data sets, in a replicable analytical environment, in order to describe the microbial world we inhabit. Copyright © 2011 Elsevier Ltd. All rights reserved.
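
    As a minimal illustration of the dimensionality-reduction step the authors highlight, the following sketch runs PCA over a toy community abundance table with scikit-learn. The sample names, taxon counts and normalization choice are invented for illustration; real pipelines typically use ecological distances and PCoA rather than plain PCA.

        # PCA ordination of a toy microbial abundance table (samples x taxa).
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        samples = [f"sample_{i}" for i in range(6)]               # hypothetical IDs
        counts = rng.poisson(lam=20, size=(6, 50)).astype(float)  # 50 taxa (placeholder)

        # Convert to relative abundances so library size does not drive the ordination.
        rel = counts / counts.sum(axis=1, keepdims=True)

        coords = PCA(n_components=2).fit_transform(rel)
        for name, (x, y) in zip(samples, coords):
            print(f"{name}: PC1={x:+.3f}, PC2={y:+.3f}")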

  6. Indigenous youth-developed self-assessment: The Personal Balance Tool.

    Science.gov (United States)

    Barraza, Rachelle; Bartgis, Jami

    2016-01-01

    The Fresno American Indian Health Project (FAIHP) Youth Council developed and pilot tested a strength-based, holistic, and youth-friendly self-assessment tool grounded in the Medicine Wheel, a framework and theoretical orientation for teaching wellness in many tribal communities. This paper summarizes the development of the Youth Personal Balance Tool and the methods used for tool revisions through two separate pilot studies and ongoing process evaluations across 3 years. Using a community-based participatory evaluation model, FAIHP leveraged community resources to implement an annual youth Gathering of Native Americans to support youth in healing from historical and intergenerational trauma and restoring communities to balance by making them a part of the solution. This tool is one of many outcomes of their work. The Youth Council is offering the tool as a gift (in line with the cultural value of generosity) to other Indigenous communities that are searching for culturally competent self-assessment tools for youth. The authors believe this tool has the potential to progress the field in strength-based, holistic, youth-friendly assessment as a culturally competent method for Indigenous evaluation and research.

  7. Impact of population and economic growth on carbon emissions in Taiwan using an analytic tool STIRPAT

    Directory of Open Access Journals (Sweden)

    Jong-Chao Yeh

    2017-01-01

    Full Text Available Carbon emission has increasingly become an issue of global concern because of climate change. Unfortunately, Taiwan was listed among the top 20 carbon-emitting countries in 2014. In order to provide appropriate measures to control carbon emission, there is an urgent need to address how such factors as population and economic growth impact the emission of carbon dioxide in developing countries. In addition to total population, both the percentage of population living in urban areas (i.e., the urbanization percentage) and the non-dependent population may also serve as limiting factors. On the other hand, the total energy-driven gross domestic product (GDP) and the percentage of GDP generated by the manufacturing industries are assessed to see their respective degrees of impact on carbon emission. Therefore, based on national data for the period 1990–2014 in Taiwan, the analytic tool of Stochastic Impacts by Regression on Population, Affluence and Technology (STIRPAT) was employed to see how well those aforementioned factors can describe their individual potential impact on global warming, which is measured by the total amount of carbon emission into the atmosphere. Seven scenarios of the STIRPAT model were proposed and tested statistically for the significance of each proposed model. As a result, two models were suggested to predict the impact of carbon emission due to population and economic growth by the year 2025 in Taiwan.
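
    STIRPAT has a fixed functional form, I = a P^b A^c T^d e, which becomes linear after taking logarithms, so it can be fitted by ordinary least squares. The sketch below fits it on synthetic data; the variables, coefficients and values are placeholders, not the 1990–2014 Taiwanese series used in the paper.

        # STIRPAT: ln I = ln a + b ln P + c ln A + d ln T + ln e, fitted by OLS.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 25                                  # e.g., annual observations
        P = rng.uniform(20, 24, n)              # population (hypothetical units)
        A = rng.uniform(10, 25, n)              # affluence, GDP per capita (hypothetical)
        T = rng.uniform(0.2, 0.4, n)            # technology proxy (hypothetical)
        I = 0.5 * P**1.2 * A**0.8 * T**0.3 * rng.lognormal(0, 0.05, n)  # emissions

        X = np.column_stack([np.ones(n), np.log(P), np.log(A), np.log(T)])
        beta, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
        print(f"ln a = {beta[0]:.2f}, b = {beta[1]:.2f}, "
              f"c = {beta[2]:.2f}, d = {beta[3]:.2f}")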

  8. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    Science.gov (United States)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool at the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions with the common denominator of using very low amounts (only a few microliters) or even none of organic solvents. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need of using ICP-MS since this instrument can be replaced by a simple AAS spectrometer which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology, and gives examples for the determination of some particular contaminants in soil and water samples The authors are grateful to the Comunidad Autonóma de la Región de Murcia , Spain (Fundación Séneca, 19888/GERM/15) for financial support

  9. OOTW Force Design Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.

    1999-05-01

    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.

  10. Analyticity and the Global Information Field

    Directory of Open Access Journals (Sweden)

    Evgeni A. Solov'ev

    2015-03-01

    Full Text Available The relation between analyticity in mathematics and the concept of a global information field in physics is reviewed. Mathematics is complete in the complex plane only. In the complex plane, a very powerful tool appears—analyticity. According to this property, if an analytic function is known on a countable set of points having an accumulation point, then it is known everywhere. This mysterious property has profound consequences in quantum physics. Analyticity allows one to obtain asymptotic (approximate) results in terms of some singular points in the complex plane which accumulate all necessary data on a given process. As an example, slow atomic collisions are presented, where the cross-sections of inelastic transitions are determined by branch points of the adiabatic energy surface at a complex internuclear distance. Common aspects of the non-local nature of analyticity and a recently introduced interpretation of classical electrodynamics and quantum physics as theories of a global information field are discussed.
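
    The "mysterious property" the abstract appeals to is the identity theorem for analytic functions; stated compactly in a standard textbook formulation (not the paper's own wording):

        % Identity theorem for analytic functions (standard formulation).
        Let $f, g$ be analytic on a domain $D \subseteq \mathbb{C}$.
        If $f(z_n) = g(z_n)$ for a sequence of distinct points
        $\{z_n\} \subset D$ with an accumulation point in $D$,
        then $f \equiv g$ on all of $D$.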

  11. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    Science.gov (United States)

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  12. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry

    OpenAIRE

    Marek Tobiszewski; Mariusz Marć; Agnieszka Gałuszka; Jacek Namieśnik

    2015-01-01

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-establis...

  13. Developing e-marketing tools : Case company: CASTA Ltd.

    OpenAIRE

    Nguyen, Chi

    2014-01-01

    This Bachelor’s thesis develops e-marketing tools for the B2C sector of CASTA Ltd. The final outcome is a set of online marketing guidelines that can improve business activities, especially marketing effectiveness. Given the company’s status as a novice in the online marketing field, the thesis focuses on the basic level of three specific online marketing tools instead of covering the whole e-marketing subject. The theoretical framework first describes the concept of e...

  14. Development of configuration risk management tool

    International Nuclear Information System (INIS)

    Masuda, Takahiro; Doi, Eiji

    2003-01-01

    Tokyo Electric Power Company (referred to as TEPCO hereinafter), and other Japanese utilities as well, have been trying to improve the capacity factor of their Nuclear Power Plants (NPPs) through modernization of their Operation and Maintenance (O and M) strategy. TEPCO intends to apply risk information to the O and M field while maintaining, or even improving, both safety and production efficiency. Under these circumstances, TEPCO, together with some BWR utilities, started to develop a Configuration Risk Management (CRM) tool that can estimate the risk in various plant conditions due to configuration changes during outage. Moreover, we also intend to apply CRM to on-line maintenance (OLM) in the near future. This tool can calculate the Core Damage Frequency (CDF) according to a given plant condition, such as SSC availability, decay heat level and the inventory of coolant, in both the outage state and full-power operation. From a deterministic viewpoint, it can also check whether a certain configuration meets the related requirements of the Technical Specifications. A user-friendly interface is one of the important features of this tool, because it enables site engineers with little experience in PSA to quantify and utilize the risk information provided by the tool. (author)

  15. Developing a mapping tool for tablets

    Science.gov (United States)

    Vaughan, Alan; Collins, Nathan; Krus, Mike

    2014-05-01

    Digital field mapping offers significant benefits when compared with traditional paper mapping techniques in that it provides closer integration with downstream geological modelling and analysis. It also provides the mapper with the ability to rapidly integrate new data with existing databases without the potential degradation caused by repeated manual transcription of numeric, graphical and meta-data. In order to achieve these benefits, a number of PC-based digital mapping tools are available which have been developed for specific communities, eg the BGS•SIGMA project, Midland Valley's FieldMove®, and a range of solutions based on ArcGIS® software, which can be combined with either traditional or digital orientation and data collection tools. However, with the now widespread availability of inexpensive tablets and smart phones, a user led demand for a fully integrated tablet mapping tool has arisen. This poster describes the development of a tablet-based mapping environment specifically designed for geologists. The challenge was to deliver a system that would feel sufficiently close to the flexibility of paper-based geological mapping while being implemented on a consumer communication and entertainment device. The first release of a tablet-based geological mapping system from this project is illustrated and will be shown as implemented on an iPad during the poster session. Midland Valley is pioneering tablet-based mapping and, along with its industrial and academic partners, will be using the application in field based projects throughout this year and will be integrating feedback in further developments of this technology.

  16. Evaluation and selection of CASE tool for SMART OTS development

    International Nuclear Information System (INIS)

    Park, K. O; Seo, S. M.; Seo, Y. S.; Koo, I. S.; Jang, M. H.

    1999-01-01

    A CASE (Computer-Aided Software Engineering) tool is software that aids in software engineering activities such as requirement analysis, design, testing, configuration management, and project management. The evaluation and selection of commercial CASE tools for a specific software development project is not easy, because it requires technical ability on the part of the evaluator and maturity of the software development organization. In this paper, we discuss selection strategies, a characteristics survey, evaluation criteria, and the result of the CASE tool selection for the development of the SMART (System-integrated Modular Advanced ReacTor) OTS (Operator Training Simulator).

  17. A Tool for Conceptualising in PSS development

    DEFF Research Database (Denmark)

    Matzen, Detlef; McAloone, Timothy Charles

    2006-01-01

    This paper introduces a tool for conceptualising in the development of product/service-systems (PSS), based upon the modelling of service activities. Our argumentation is built on two previous articles by the same author, previously presented at the 16. Symposium “Design for X” [1] and the 9th International Design Conference [2]. In this contribution, we take the step from a fundamental understanding of the phenomenon to creating a normative exploitation of this understanding for PSS concept development. The developed modelling technique is based on the Customer Activity Cycle (CAC) model and supports the integrated consideration of the customers’ activities, possible PSS offerings and beneficial partnering options (i.e. between different supplier companies) within the delivery value chain.

  18. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay

    Directory of Open Access Journals (Sweden)

    Miroslav Pohanka

    2015-06-01

    Full Text Available Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work should demonstrate the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman’s assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone’s integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman’s assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results’ relevance.
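
    A minimal sketch of the colour read-out: average the RGB channels over a region of interest in the photographed strip and map one channel onto a linear calibration. Pillow is assumed for image handling; the file name, region coordinates and calibration constants are invented, not the paper's values.

        # Mean RGB of a region of interest in a photographed test strip.
        from PIL import Image
        import numpy as np

        img = np.asarray(Image.open("strip_photo.jpg").convert("RGB"), dtype=float)
        roi = img[100:200, 150:250]    # region covering the indicator spot (assumed)

        r_mean, g_mean, b_mean = roi.reshape(-1, 3).mean(axis=0)
        print(f"R={r_mean:.1f}  G={g_mean:.1f}  B={b_mean:.1f}")

        # Hypothetical linear calibration: indigo formation lowers the red channel
        # as BChE activity rises; slope/intercept would come from standards.
        slope, intercept = -0.85, 210.0
        activity = (r_mean - intercept) / slope
        print(f"estimated BChE activity (arbitrary units): {activity:.2f}")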

  19. The development of tool manufacture in humans: what helps young children make innovative tools?

    Science.gov (United States)

    Chappell, Jackie; Cutting, Nicola; Apperly, Ian A; Beck, Sarah R

    2013-11-19

    We know that even young children are proficient tool users, but until recently, little was known about how they make tools. Here, we will explore the concepts underlying tool making, and the kinds of information and putative cognitive abilities required for children to manufacture novel tools. We will review the evidence for novel tool manufacture from the comparative literature and present a growing body of data from children suggesting that innovation of the solution to a problem by making a tool is a much more challenging task than previously thought. Children's difficulty with these kinds of tasks does not seem to be explained by perseveration with unmodified tools, difficulty with switching to alternative strategies, task pragmatics or issues with permission. Rather, making novel tools (without having seen an example of the required tool within the context of the task) appears to be hard, because it is an example of an 'ill-structured problem'. In this type of ill-structured problem, the starting conditions and end goal are known, but the transformations and/or actions required to get from one to the other are not specified. We will discuss the implications of these findings for understanding the development of problem-solving in humans and other animals.

  20. The development of tool manufacture in humans: what helps young children make innovative tools?

    Science.gov (United States)

    Chappell, Jackie; Cutting, Nicola; Apperly, Ian A.; Beck, Sarah R.

    2013-01-01

    We know that even young children are proficient tool users, but until recently, little was known about how they make tools. Here, we will explore the concepts underlying tool making, and the kinds of information and putative cognitive abilities required for children to manufacture novel tools. We will review the evidence for novel tool manufacture from the comparative literature and present a growing body of data from children suggesting that innovation of the solution to a problem by making a tool is a much more challenging task than previously thought. Children's difficulty with these kinds of tasks does not seem to be explained by perseveration with unmodified tools, difficulty with switching to alternative strategies, task pragmatics or issues with permission. Rather, making novel tools (without having seen an example of the required tool within the context of the task) appears to be hard, because it is an example of an ‘ill-structured problem’. In this type of ill-structured problem, the starting conditions and end goal are known, but the transformations and/or actions required to get from one to the other are not specified. We will discuss the implications of these findings for understanding the development of problem-solving in humans and other animals. PMID:24101620

  1. Designing a tool for curriculum leadership development in postgraduate programs

    Directory of Open Access Journals (Sweden)

    M Avizhgan

    2016-07-01

    Full Text Available Introduction: Leadership in the area of curriculum development is increasingly important as we look for ways to improve our programmes and practices, yet in curriculum studies leadership has received little attention. Considering the lack of an evaluation tool with objective criteria for the postgraduate curriculum leadership process, this study aimed to design such a tool and determine its validity and reliability. Method: This is a methodological study. First, the domains and items of the tool were determined through expert interviews and a literature review. Then, using the Delphi technique, 54 important criteria were developed. A panel of experts was used to confirm content and face validity. Reliability was determined by a descriptive study involving 30 faculty members from two universities in Isfahan and was estimated by internal consistency. The data were analyzed with SPSS software, using the Pearson correlation coefficient and reliability analysis. Results: Based on the definition of curriculum leadership, the domains and items of the tool were determined and a primary tool was developed. Expert faculty views were used at different stages of development and psychometric testing. The tool's internal consistency, estimated by Cronbach's alpha, was 0.965, and was also determined for each domain separately. Conclusion: Applying this instrument can improve the effectiveness of curriculum leadership. Identifying the characteristics of successful and effective leaders, and utilizing this knowledge in developing and implementing curricula, might help us respond better to the changing needs of our students, teachers and schools of tomorrow.
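
    The reliability figure reported above rests on a standard formula: Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A small sketch of computing it follows; the 30 x 54 response matrix is fabricated for illustration, not the study's data.

        # Cronbach's alpha for a questionnaire (fabricated Likert responses).
        import numpy as np

        rng = np.random.default_rng(3)
        scores = rng.integers(1, 6, size=(30, 54)).astype(float)  # 30 raters, 54 items

        k = scores.shape[1]
        item_var = scores.var(axis=0, ddof=1).sum()    # sum of per-item variances
        total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
        alpha = k / (k - 1) * (1 - item_var / total_var)
        print(f"Cronbach's alpha = {alpha:.3f}")       # near 0 for random items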

  2. An integrated environment for developing object-oriented CAE tools

    Energy Technology Data Exchange (ETDEWEB)

    Hofmann, P.; Ryba, M.; Baitinger, U.G. [Integrated System Engineering, Stuttgart (Germany)]

    1996-12-31

    This paper presents how object-oriented techniques can be applied to improve the development of CAE tools. For the design of modular and reusable software systems we use predefined and well-tested building blocks. These building blocks are reusable software components based on object-oriented technology, which allows the assembling of software systems. Today's CAE tools are typically very complex and computationally intensive. Therefore we need a concept that joins the advantages of the object-oriented paradigm with the advantages of parallel and distributed programming. We present a design environment for the development of concurrent object-oriented CAE tools, called CoDO.

  3. Information and Analytic Maintenance of Nanoindustry Development

    Directory of Open Access Journals (Sweden)

    Glushchenko Aleksandra Vasilyevna

    2015-05-01

    Full Text Available The successful course of nanotechnological development depends in many respects on the volume and quality of information provided to external and internal users. The objective of the present research is to reveal the information requirements of various groups of users for effective management of the nanotech industry and to define the ways in which they can be satisfied most effectively. The authors also aim at developing a system of indicators characterizing the current state and the dynamic parameters of nanotech industry development. On the basis of the conducted research, the need for an information system for nanotech industry development is substantiated. The information interrelations between nanotech industry actors are revealed with a view to developing the communicative function of accounting, which becomes dominant in comparison with the control function. The information needs of users of financial and non-financial information are defined. The stages of introduction are described in detail, from determining the character, volume, list and required timeliness of information to the creation of a system of management reporting, analysis and control. The information and analytical system is focused on the general assessment of efficiency and the major economic indicators, the general tendencies of development of the nanotech industry, and possible reserves for increasing the efficiency of its functioning. The authors develop a system of indicators characterizing the advancement of the nanotech industry, allowing one to estimate innovative activity in the sphere of nanotech industry, to calculate the intensity of nano-innovation costs, and to define the productivity and efficiency of the nanotech industry in the branch, the region, and the national economy in general.

  4. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    Science.gov (United States)

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

    The medical curriculum is the main tool representing the entire undergraduate medical education. Due to its complexity and multilayered structure, it is of limited use to teachers in medical education for quality improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course, using teachers from an undergraduate medical program and applying visual analytics methods. We found that visual analytics can be used to positively impact analytical reasoning and decision making in medical education through the realization of variables capable of enhancing human perception and cognition of complex curriculum data. The positive results derived from our evaluation of a medical curriculum, albeit on a small scale, signify the need to expand this method to an entire medical curriculum. As our approach sustains low levels of complexity, it opens a promising new direction in medical education informatics research.

  5. Data Intensive Architecture for Scalable Cyber Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Bryan K.; Johnson, John R.; Critchlow, Terence J.

    2011-11-15

    Cyber analysts are tasked with the identification and mitigation of network exploits and threats. These compromises are difficult to identify due to the characteristics of cyber communication, the volume of traffic, and the duration of possible attack. It is necessary to have analytical tools to help analysts identify anomalies that span seconds, days, and weeks. Unfortunately, providing analytical tools effective access to the volumes of underlying data requires novel architectures, which is often overlooked in operational deployments. Our work is focused on a summary record of communication, called a flow. Flow records are intended to summarize a communication session between a source and a destination, providing a level of aggregation from the base data. Despite this aggregation, many enterprise network perimeter sensors store millions of network flow records per day. The volume of data makes analytics difficult, requiring the development of new techniques to efficiently identify temporal patterns and potential threats. The massive volume makes analytics difficult, but there are other characteristics in the data which compound the problem. Within the billions of records of communication that transact, there are millions of distinct IP addresses involved. Characterizing patterns of entity behavior is very difficult with the vast number of entities that exist in the data. Research has struggled to validate a model for typical network behavior with hopes it will enable the identification of atypical behavior. Complicating matters more, typically analysts are only able to visualize and interact with fractions of data and have the potential to miss long term trends and behaviors. Our analysis approach focuses on aggregate views and visualization techniques to enable flexible and efficient data exploration as well as the capability to view trends over long periods of time. Realizing that interactively exploring summary data allowed analysts to effectively identify
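
    The flow record described here is easy to make concrete: a hedged pandas sketch that rolls raw flows up into the kind of aggregate, time-bucketed view the authors argue analysts need for spotting long-term trends. The field names are generic NetFlow-style assumptions, not the deployment's actual schema.

        # Roll flow records up into hourly per-source aggregates (illustrative schema).
        import pandas as pd

        flows = pd.DataFrame({
            "start":  pd.to_datetime(["2011-11-15 00:05", "2011-11-15 00:40",
                                      "2011-11-15 01:10", "2011-11-15 01:15"]),
            "src_ip": ["10.0.0.5", "10.0.0.5", "10.0.0.9", "10.0.0.5"],
            "dst_ip": ["192.0.2.7", "192.0.2.7", "198.51.100.2", "192.0.2.7"],
            "bytes":  [1200, 900, 55000, 450],
        })

        # Flow count and byte volume per source IP per hour: a summary view that
        # scales to billions of records far better than raw flow inspection.
        hourly = (flows.set_index("start")
                       .groupby("src_ip")
                       .resample("1h")["bytes"]
                       .agg(["count", "sum"]))
        print(hourly)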

  6. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    Science.gov (United States)

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting and introducing the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding with quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were established as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for residual methanol. Because the variability of the sampling method and of the reference method is by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and suppression of a rather difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.
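    A minimal sketch of the modeling step named in the abstract, PLS calibration of NIR spectra against reference values, is given below using scikit-learn. The data files, the number of latent variables, and the use of cross-validated RMSE as the tuning metric are assumptions for illustration, not details taken from the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical inputs: rows of absorbance spectra and, for each sample,
# reference values from the primary method (API %w/w, methanol %w/w).
X = np.load("nir_spectra.npy")       # shape (n_samples, n_wavelengths)
Y = np.load("reference_values.npy")  # shape (n_samples, 2)

pls = PLSRegression(n_components=5)  # latent variables: to be optimized
Y_cv = cross_val_predict(pls, X, Y, cv=10)

# Root-mean-square error of cross-validation per constituent.
rmsecv = np.sqrt(((Y - Y_cv) ** 2).mean(axis=0))
print("RMSECV (API, methanol):", rmsecv)
```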

  7. Database development tool DELPHI and its application

    International Nuclear Information System (INIS)

    Ma Mei

    2000-01-01

    The authors describe the progress of software development technologies and tools, the features and performance of Borland Delphi, and a software development instance: the management information system of the tank region storage and transportation control center of Zhenhai Refining and Chemical Co., Ltd. in Zhejiang Province

  8. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    Science.gov (United States)

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects on health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added to or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of these molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and
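    As a small worked example of how MALDI-TOF-MS length profiles of inulin-type fructans are read, the sketch below computes the expected [M+Na]+ ion series for increasing degree of polymerization (DP) from standard monoisotopic residue masses. The calculation is generic background, not something taken from the review.

```python
# Monoisotopic masses: hexose residue (C6H10O5), water, sodium.
HEXOSE_RESIDUE = 162.0528
WATER = 18.0106
SODIUM = 22.9898

# An inulin chain of DP n (one glucose plus n-1 fructoses) has mass
# n * residue + water; sodiated ions dominate in positive-mode MALDI.
for dp in range(3, 11):
    m_na = dp * HEXOSE_RESIDUE + WATER + SODIUM
    print(f"DP {dp:2d}  [M+Na]+ = {m_na:9.4f}")  # DP 3 -> 527.1588
```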

  9. Development of tools for automatic generation of PLC code

    OpenAIRE

    Koutli, Maria; Chasapis, Georgios; Rochez, Jacques

    2014-01-01

    This Master's thesis was performed at CERN, specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms that are based on the CODESYS development tool into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS compatible P...

  10. Software Development Methods and Tools: a New Zealand study

    OpenAIRE

    Chris Phillips; Elizabeth Kemp; Duncan Hedderley

    2005-01-01

    This study is a more detailed follow-up to a preliminary investigation of the practices of software engineers in New Zealand. The focus of this study is on the methods and tools used by software developers in their current organisation. The project involved detailed questionnaires being piloted and sent out to several hundred software developers. A central part of the research involved the identification of factors affecting the use and take-up of existing software development tools in the wo...

  11. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The requirement for testing and calibration laboratories to quantify the uncertainty of measurement results, and the fact that the governing standard is used as a basis for the development and implementation of quality management systems in many laboratories performing nuclear analytical measurements, triggered the demand for specific guidance covering uncertainty issues of nuclear analytical methods. The demand was recognized by the IAEA, and a series of examples was worked out by a group of consultants in 1998. The diversity and complexity of the topics addressed delayed the publication of a technical guidance report, but the exchange of views among the experts was also beneficial and led to numerous improvements and additions with respect to the initial version. This publication is intended to assist scientists using nuclear analytical methods in assessing and quantifying the sources of uncertainty of their measurements. The numerous examples provide a tool for applying the principles elaborated in the GUM and EURACHEM/CITAC publications to their specific fields of interest and for complying with the requirements of current quality management standards for testing and calibration laboratories. It also provides a means for the worldwide harmonization of approaches to uncertainty quantification and thereby contributes to enhanced comparability and competitiveness of nuclear analytical measurements
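    The central relationship this guidance applies is the GUM law of propagation of uncertainty. For a result y = f(x_1, ..., x_N) with uncorrelated input estimates it reads:

```latex
% Combined standard uncertainty and expanded uncertainty (GUM):
u_c(y) = \sqrt{\sum_{i=1}^{N} \left(\frac{\partial f}{\partial x_i}\right)^{2} u^{2}(x_i)},
\qquad U = k \, u_c(y), \quad k \approx 2 \text{ for a coverage probability of about } 95\%.
```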

  12. Concurrence of big data analytics and healthcare: A systematic review.

    Science.gov (United States)

    Mehta, Nishita; Pandit, Anil

    2018-06-01

    The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of the literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges in its adoption, and to identify strategies to overcome those challenges. A systematic search of articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore, and Taylor & Francis. Articles on Big Data analytics in healthcare published in the English language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources, including government, laboratories, pharma companies, data aggregators, medical journals, etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique in healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds application in clinical decision support, optimization of clinical operations, and reduction of the cost of care; (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review study unveils that there is a paucity of information on evidence of real-world use of

  13. Development of remote handling tools for glove box

    International Nuclear Information System (INIS)

    Tomita, Yutaka; Nemoto, Takeshi; Denuma, Akio; Todokoro, Akio

    1996-01-01

    As a part of advanced nuclear fuel recycling technology development, we will separate and recover americium from MOX fuel scrap by solvent extraction. In carrying out this examination, reducing exposure from americium-241 is one of the important problems. To solve this problem fundamentally, we studied a multi-joint type of remote handling tool for glove boxes, produced a trial machine, and carried out examinations of its basic functions. As a result, we obtained good prospects for developing remote handling tools that can handle americium in a glove box. (author)

  14. Developing a Conceptual Design Engineering Toolbox and its Tools

    Directory of Open Access Journals (Sweden)

    R. W. Vroom

    2004-01-01

    Full Text Available In order to develop a successful product, a design engineer needs to pay attention to all relevant aspects of that product. Many tools are available: software, books, websites, and commercial services. To unlock these potentially useful sources of knowledge, we are developing C-DET, a toolbox for conceptual design engineering. The idea of C-DET is that designers are supported by a system that provides them with a knowledge portal on the one hand and a system to store their current work on the other. The knowledge portal is to help the designer find the most appropriate sites, experts, tools, etc. at short notice. Such a toolbox offers opportunities to incorporate extra functionalities to support design engineering work; one of these functionalities could be to help the designer reach a balanced comprehension in his work. Furthermore, C-DET enables researchers in the area of design engineering, and design engineers themselves, to find each other or each other's work earlier and more easily. Newly developed design tools that can be used by design engineers but have not yet been developed to a commercial level could be linked to by C-DET; in this way these tools can be evaluated at an early stage by design engineers who would like to use them. This paper describes the first prototypes of C-DET, an example of the development of a design tool that enables designers to forecast the use process, and an example of the future functionalities of C-DET such as balanced comprehension.

  15. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides

    International Nuclear Information System (INIS)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez

    2013-01-01

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for applying corrections for differences in chemical composition, density, and height of the samples analyzed. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained based on efficiency calibrations by Monte Carlo simulation using the DETEFF program
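    As a toy illustration of the principle (not of the DETEFF program itself), the sketch below estimates the purely geometric efficiency of a circular detector face under a point source by sampling isotropic emission directions. A real calibration code adds photon transport through the sample and detector, which is what enables the composition, density, and height corrections mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)
R, h, n = 3.0, 5.0, 1_000_000  # detector radius (cm), source height (cm), photons; assumed geometry

# Isotropic emission: cos(theta) uniform on [-1, 1].
cos_t = rng.uniform(-1.0, 1.0, n)

# Downward-going photons reach the detector plane at radius h*tan(theta').
down = cos_t < 0
r_hit = h * np.sqrt(1.0 - cos_t[down] ** 2) / (-cos_t[down])

efficiency = np.count_nonzero(r_hit <= R) / n
print(f"geometric efficiency ~ {efficiency:.4f}")  # ~ solid angle / (4*pi)
```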

  16. Implementing analytics a blueprint for design, development, and adoption

    CERN Document Server

    Sheikh, Nauman

    2013-01-01

    Implementing Analytics demystifies the concept, technology and application of analytics and breaks its implementation down to repeatable and manageable steps, making it possible for widespread adoption across all functions of an organization. Implementing Analytics simplifies and helps democratize a very specialized discipline to foster business efficiency and innovation without investing in multi-million dollar technology and manpower. A technology agnostic methodology that breaks down complex tasks like model design and tuning and emphasizes business decisions rather than the technology behi

  17. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the automated security Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and in the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI(TM)). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  18. Development of bore tools for pipe welding and cutting

    International Nuclear Information System (INIS)

    Oka, Kiyoshi; Ito, Akira; Takiguchi, Yuji

    1998-01-01

    In the International Thermonuclear Experimental Reactor (ITER), the replacement and maintenance of in-vessel components requires that the connected cooling pipes be cut and removed beforehand and that new components be installed, to which the cooling pipes must be rewelded. All welds must be inspected for soundness after completion. These tasks demand a new approach to maintaining shielding and to gaining access from narrow ports. It therefore became necessary to develop autonomously moving welding and cutting tools for branch and main pipes that work by in-pipe access; a system was proposed that cuts and welds branch and main pipes after passing through pipe bends, and the elemental technologies were developed. This paper introduces the current development of tools for welding and cutting branch pipes and of tools for welding and cutting the main pipe. (author)

  19. Development of bore tools for pipe welding and cutting

    Energy Technology Data Exchange (ETDEWEB)

    Oka, Kiyoshi; Ito, Akira; Takiguchi, Yuji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-04-01

    In the International Thermonuclear Experimental Reactor (ITER), the replacement and maintenance of in-vessel components requires that the connected cooling pipes be cut and removed beforehand and that new components be installed, to which the cooling pipes must be rewelded. All welds must be inspected for soundness after completion. These tasks demand a new approach to maintaining shielding and to gaining access from narrow ports. It therefore became necessary to develop autonomously moving welding and cutting tools for branch and main pipes that work by in-pipe access; a system was proposed that cuts and welds branch and main pipes after passing through pipe bends, and the elemental technologies were developed. This paper introduces the current development of tools for welding and cutting branch pipes and of tools for welding and cutting the main pipe. (author)

  20. Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data

    Science.gov (United States)

    Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.

    2014-12-01

    Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies that are critical components of the background of ASDC data scientists include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; as well as processing algorithms. However, as prices of data storage and processing decrease, sources of data increase, and technologies advance - giving more people access to data in real or near-real time - data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data, which is becoming exceedingly challenging with existing tools. For example, NASA's Earth Science Data and Information System (ESDIS) alone grew from just over 4 PB of data in 2009 to nearly 6 PB in 2011, and then to roughly 10 PB in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater to business data, which is predominantly unstructured; very few known analytics tools interface well with archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify

  1. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    Science.gov (United States)

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
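    A minimal sketch of the kind of risk model described, predicting medication noncompliance from claims-derived features, is shown below. The file and column names are hypothetical, and logistic regression stands in for whichever predictive-modeling, machine-learning, or data-mining technique a given project would actually choose.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("claims_features.csv")  # hypothetical feature table
X = df[["age", "n_medications", "prior_gaps", "comorbidity_index"]]
y = df["noncompliant"]  # 1 if a refill gap was observed, else 0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Risk scores let pharmacists rank patients for tailored interventions.
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```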

  2. Continuing data assessment of 16-inch Williams pipeline inspected with the recently developed ultrasonic crack detection tool

    International Nuclear Information System (INIS)

    Katz, D.C.; Gao, M.; Elboujdaini, M.; Li, J.

    2003-01-01

    The in-line inspection of Williams' Gas West Pipeline in September 2001 was successfully completed using the newly developed 16-inch UltraScan CD tool from GE PII Pipeline Solutions. The pipeline section inspected was known to be affected by stress corrosion cracking (SCC). The inspection was carried out using the liquid batching technique developed by PII Pipeline Solutions. A special launcher and receiver barrel was designed to enable the handling of a series of three batching pigs in front of the inspection tool and two behind it. A manifold of 'kicker lines' was mounted on the barrel to launch the batching pigs and the inspection tool. The main benefits of this new design were minimizing operational downtime, ensuring complete air/natural gas displacement from the launcher, and providing a smoother launch procedure. Due to the large elevation changes within the pipeline section, a key concern was maintaining the pig velocity within 1 m/s for adequate data resolution. Rather than rely on a general 'rule of thumb', a transient analysis was performed to define a range of possible batch sizes and to better understand the expected pressure gradients while pumping the water slug. Based on data collected during this successful run, the transient model will be refined to better handle friction effects between the sealing cups and disks in future batch inspection runs. The pig data were successfully acquired, processed, and verified, and excavations were performed in 2002. Results from the twenty digs will be presented, as well as a discussion of the ongoing fracture mechanics assessments that are being used to develop an overall integrity management plan for the continued, safe operation of the pipeline. To better understand the mechanism of SCC and enhance the integrity management plan, key metallurgical and environmental elements are being investigated with advanced analytical tools, including high-resolution SEM and EDS. An in-situ crack growth monitoring system is

  3. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    Science.gov (United States)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry, and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address the scientific community's needs in utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to a variety of algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the Earth sciences. The lessons learned and experiences given are based on a survey of use cases, and insights into a few use cases are provided in detail.
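    As a concrete instance of the machine-learning building block the abstract names (support vector machines for classification), here is a minimal scikit-learn sketch on synthetic stand-in data; a real earth science use case would substitute domain features such as spectral bands or sensor channels.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic two-class data standing in for labeled observations.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```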

  4. Development of special analytical system for determination of free acid

    International Nuclear Information System (INIS)

    Zhang Lihua; Wu Jizong; Liu Huanliang; Liu Quanwei; Fan Dejun; Su Tao

    2008-01-01

    The determination of free acid plays an important role in spent fuel reprocessing analysis, accounting for about 30% of all analytical work in reprocessing. It is therefore necessary to study and develop a special fast analytical system for the determination of free acid. The special analytical system, composed of an optical fiber spectrophotometer and an automatic sample-in device, is particularly applicable to the determination of free acid in high-level radioactive environments. Its advantages are the small sample volume needed, a fast procedure, easy operation, and physical protection. All performance parameters satisfy the requirements of spent fuel reprocessing control analysis. For long-distance determination, the optical fiber spectrophotometer is connected to a 4.5 m long optical fiber. To resolve a change of 0.1 mol/L in acidity, the measuring optical path is 2 cm. A bundle of optical fibers 10-20 μm in diameter is assembled, and the optical fiber probe is composed of a reflecting mirror and a concave mirror at the tip of the fibers. To eliminate interference from external light, a stainless steel measuring chamber is used. The automatic sample-in device is composed of a state valve, a quantifying pump, and piping. The sample-in precision of the 15 μl and 35 μl quantifying loops is better than 0.5%. The special analytical system takes less than 7 minutes to complete one measurement. The linear range is 0.5-3.5 mol/L, and the relative standard deviation is better than 2.0% at a free acid concentration of about 2.0 mol/L. For samples in different media, the results are comparable with those of the pH titration method used to determine free acid in reprocessing. (authors)
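    The photometric determination rests on the Beer-Lambert law, A = epsilon * l * c, so a straight-line calibration against acid standards inverts absorbance to concentration. The sketch below shows only that calibration arithmetic over the stated 0.5-3.5 mol/L range; the absorbance values are invented, and the abstract does not specify the reagent chemistry or wavelength actually used.

```python
import numpy as np

c_std = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5])        # mol/L
a_std = np.array([0.11, 0.21, 0.32, 0.42, 0.53, 0.63, 0.74])  # assumed readings

slope, intercept = np.polyfit(c_std, a_std, 1)  # least-squares line

def free_acid(absorbance: float) -> float:
    """Invert the calibration line: absorbance -> mol/L."""
    return (absorbance - intercept) / slope

print(f"{free_acid(0.44):.2f} mol/L")
```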

  5. Applying Pragmatics Principles for Interaction with Visual Analytics.

    Science.gov (United States)

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  6. Born analytical or adopted over time? a study investigating if new analytical tools can ensure the survival of market oriented startups.

    OpenAIRE

    Skogen, Hege Janson; De la Cruz, Kai

    2017-01-01

    Master's thesis (MSc) in Strategic Marketing Management - Handelshøyskolen BI, 2017. This study investigates whether the prevalence of technological advances within quantitative analytics moderates the effect market orientation has on firm performance, and whether startups can take advantage of the potential opportunities to ensure their own survival. For this purpose, the authors review previous literature on market orientation, startups, marketing analytics, an...

  7. Development and implementation of an analytical quality assurance plan at the Hanford site

    International Nuclear Information System (INIS)

    Kuhl-Klinger, K.J.; Taylor, C.D.; Kawabata, K.K.

    1995-08-01

    The Hanford Analytical Services Quality Assurance Plan (HASQAP) provides a uniform standard for onsite and offsite laboratories performing analytical work in support of Hanford Site environmental cleanup initiatives. The Hanford Site is a nuclear site that originated during World War II and has a legacy of environmental cleanup issues. In early 1993, the need for and feasibility of developing a quality assurance plan to direct all analytical activities performed in support of the environmental cleanup initiatives set forth in the Hanford Federal Facility Agreement and Consent Order were discussed. Several group discussions were held, and from them came the HASQAP. This document will become the quality assurance guidance document in a Federal Facility Agreement and Consent Order. This paper presents the mechanics involved in developing a quality assurance plan for this scope of activity, including the approach taken to resolve the variability of quality control requirements driven by numerous regulations. It further describes the consensus-building process and how the goal of uniting onsite and offsite laboratories, as well as the inorganic, organic, and radioanalytical disciplines, under a common understanding of basic quality control concepts was achieved

  8. A risk assessment tool for contaminated sites in low-permeability fractured media

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia; Binning, Philip John; Jørgensen, Peter R.

    2011-01-01

    A risk assessment tool for contaminated sites in low-permeability fractured media is developed, based on simple transient and steady-state analytical solutions. The discrete fracture (DF) tool, which explicitly accounts for the transport along fractures, covers different source geometries ... and history (including secondary sources) and can be applied to a wide range of compounds. The tool successfully simulates published data from short duration column and field experiments. The use for risk assessment is illustrated by three typical risk assessment case studies, involving pesticides
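    Closed-form solutions of the advection-dispersion type are the building blocks of such tools and can be evaluated directly. Below is the classical Ogata-Banks solution for a constant-concentration source in one dimension; it is a generic example, not necessarily one of the solutions implemented in the DF tool, and all parameter values are assumed.

```python
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0=1.0):
    """Relative concentration c/c0 at distance x (m) after time t (s),
    for pore velocity v (m/s) and dispersion coefficient D (m2/s)."""
    a = erfc((x - v * t) / (2.0 * np.sqrt(D * t)))
    b = np.exp(v * x / D) * erfc((x + v * t) / (2.0 * np.sqrt(D * t)))
    return 0.5 * c0 * (a + b)

# Example: 2 m from the source after 10 years (assumed parameters).
print(ogata_banks(x=2.0, t=3.15e8, v=1e-8, D=1e-8))
```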

  9. A Multi-Level Middle-Out Cross-Zooming Approach for Large Graph Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Wong, Pak C.; Mackey, Patrick S.; Cook, Kristin A.; Rohrer, Randall M.; Foote, Harlan P.; Whiting, Mark A.

    2009-10-11

    This paper presents a working graph analytics model that embraces the strengths of the traditional top-down and bottom-up approaches with a resilient crossover concept to exploit the vast middle-ground information overlooked by the two extreme analytical approaches. Our graph analytics model is developed in collaboration with researchers and users, who carefully studied the functional requirements that reflect the critical thinking and interaction pattern of a real-life intelligence analyst. To evaluate the model, we implement a system prototype, known as GreenHornet, which allows our analysts to test the theory in practice, identify the technological and usage-related gaps in the model, and then adapt the new technology in their work space. The paper describes the implementation of GreenHornet and compares its strengths and weaknesses against the other prevailing models and tools.
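    A minimal sketch of the middle-out idea, materializing the neighborhood around an entity of interest instead of rendering the whole graph top-down or single nodes bottom-up, is shown below with networkx; the seed node and radius are arbitrary, and GreenHornet itself is of course far richer than this.

```python
import networkx as nx

G = nx.karate_club_graph()  # small stand-in for a large analytic graph
seed = 0                    # hypothetical entity under scrutiny

# "Cross-zoom": pull out everything within 2 hops of the seed.
middle = nx.ego_graph(G, seed, radius=2)
print(middle.number_of_nodes(), "nodes,", middle.number_of_edges(), "edges")
```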

  10. Sunfall: a collaborative visual analytics system for astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, Cecilia R.; Aragon, Cecilia R.; Bailey, Stephen J.; Poon, Sarah; Runge, Karl; Thomas, Rollin C.

    2008-07-07

    Computational and experimental sciences produce and collect ever-larger and complex datasets, often in large-scale, multi-institution projects. The inability to gain insight into complex scientific phenomena using current software tools is a bottleneck facing virtually all endeavors of science. In this paper, we introduce Sunfall, a collaborative visual analytics system developed for the Nearby Supernova Factory, an international astrophysics experiment and the largest data volume supernova search currently in operation. Sunfall utilizes novel interactive visualization and analysis techniques to facilitate deeper scientific insight into complex, noisy, high-dimensional, high-volume, time-critical data. The system combines novel image processing algorithms, statistical analysis, and machine learning with highly interactive visual interfaces to enable collaborative, user-driven scientific exploration of supernova image and spectral data. Sunfall is currently in operation at the Nearby Supernova Factory; it is the first visual analytics system in production use at a major astrophysics project.

  11. Sunfall: a collaborative visual analytics system for astrophysics

    International Nuclear Information System (INIS)

    Aragon, C R; Bailey, S J; Poon, S; Runge, K; Thomas, R C

    2008-01-01

    Computational and experimental sciences produce and collect ever-larger and complex datasets, often in large-scale, multi-institution projects. The inability to gain insight into complex scientific phenomena using current software tools is a bottleneck facing virtually all endeavors of science. In this paper, we introduce Sunfall, a collaborative visual analytics system developed for the Nearby Supernova Factory, an international astrophysics experiment and the largest data volume supernova search currently in operation. Sunfall utilizes novel interactive visualization and analysis techniques to facilitate deeper scientific insight into complex, noisy, high-dimensional, high-volume, time-critical data. The system combines novel image processing algorithms, statistical analysis, and machine learning with highly interactive visual interfaces to enable collaborative, user-driven scientific exploration of supernova image and spectral data. Sunfall is currently in operation at the Nearby Supernova Factory; it is the first visual analytics system in production use at a major astrophysics project

  12. Sunfall: a collaborative visual analytics system for astrophysics

    Energy Technology Data Exchange (ETDEWEB)

    Aragon, C R; Bailey, S J; Poon, S; Runge, K; Thomas, R C [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)], E-mail: CRAragon@lbl.gov

    2008-07-15

    Computational and experimental sciences produce and collect ever-larger and complex datasets, often in large-scale, multi-institution projects. The inability to gain insight into complex scientific phenomena using current software tools is a bottleneck facing virtually all endeavors of science. In this paper, we introduce Sunfall, a collaborative visual analytics system developed for the Nearby Supernova Factory, an international astrophysics experiment and the largest data volume supernova search currently in operation. Sunfall utilizes novel interactive visualization and analysis techniques to facilitate deeper scientific insight into complex, noisy, high-dimensional, high-volume, time-critical data. The system combines novel image processing algorithms, statistical analysis, and machine learning with highly interactive visual interfaces to enable collaborative, user-driven scientific exploration of supernova image and spectral data. Sunfall is currently in operation at the Nearby Supernova Factory; it is the first visual analytics system in production use at a major astrophysics project.

  13. The Bristol Radiology Report Assessment Tool (BRRAT): Developing a workplace-based assessment tool for radiology reporting skills

    International Nuclear Information System (INIS)

    Wallis, A.; Edey, A.; Prothero, D.; McCoubrie, P.

    2013-01-01

    Aim: To review the development of a workplace-based assessment tool for assessing the quality of written radiology reports, and to assess its reliability, feasibility, and validity. Materials and methods: A comprehensive literature review and rigorous Delphi study enabled the development of the Bristol Radiology Report Assessment Tool (BRRAT), which consists of 19 questions and a global assessment score. Three assessors applied the assessment tool to 240 radiology reports provided by 24 radiology trainees. Results: The reliability coefficient for the 19 questions was 0.79 and the equivalent coefficient for the global assessment scores was 0.67. Generalizability coefficients demonstrate that higher numbers of assessors and assessments are needed to reach acceptable levels of reliability for summative assessments due to assessor subjectivity. Conclusion: The study methodology gives good validity and a strong foundation in best practice. The assessment tool developed for radiology reporting is reliable and most suited to formative assessments
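    Assuming the reported reliability coefficient is of the Cronbach's alpha type, the arithmetic behind it is short; the sketch below computes it on randomly generated stand-in ratings (real ratings correlate across items, which is what pushes the coefficient toward values like the 0.79 reported).

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal consistency of an (n_reports x n_items) rating matrix."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(240, 19)).astype(float)  # stand-in data
print(f"alpha = {cronbach_alpha(ratings):.2f}")  # near 0 for random data
```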

  14. Pupils, Tools and the Zone of Proximal Development

    Science.gov (United States)

    Abtahi, Yasmine

    2018-01-01

    In this article, I use the Vygotskian concept of the Zone of Proximal Development (ZPD) to examine the learning experience of two grade seven pupils as they attempted to solve an addition of fractions problem using fraction strips. The aim is to highlight how tools can facilitate the enactment of a ZPD, within which the tool provides the guidance.…

  15. Rethinking Visual Analytics for Streaming Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    2017-01-01

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are necessary components of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead of dividing the task among a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is massive

  16. Visual programming for next-generation sequencing data analytics.

    Science.gov (United States)

    Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia

    2016-01-01

    High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.

  17. National Energy Audit Tool for Multifamily Buildings Development Plan

    Energy Technology Data Exchange (ETDEWEB)

    Malhotra, Mini [ORNL; MacDonald, Michael [Sentech, Inc.; Accawi, Gina K [ORNL; New, Joshua Ryan [ORNL; Im, Piljae [ORNL

    2012-03-01

    The U.S. Department of Energy's (DOE's) Weatherization Assistance Program (WAP) enables low-income families to reduce their energy costs by providing funds to make their homes more energy efficient. In addition, the program funds Weatherization Training and Technical Assistance (T&TA) activities to support a range of program operations. These activities include measuring and documenting performance, monitoring programs, promoting advanced techniques and collaborations to further improve program effectiveness, and training, including developing tools and information resources. The T&TA plan outlines the tasks, activities, and milestones to support the weatherization network with program implementation ramp-up efforts. Weatherization of multifamily buildings has been recognized as an effective way to ramp up weatherization efforts. To support this effort, the 2009 National Weatherization T&TA plan includes the task of expanding the functionality of the Weatherization Assistant, a DOE-sponsored family of energy audit computer programs, to perform audits for large and small multifamily buildings. This report describes the planning effort for a new multifamily energy audit tool for DOE's WAP; the functionality of the Weatherization Assistant is being expanded to also perform energy audits of small and large multifamily buildings. The process covers an assessment of needs that includes input from national experts during two national Web conferences. The assessment of needs is then translated into capability and performance descriptions for the proposed new multifamily energy audit tool, with some description of what might or should be provided in the new tool. The assessment of needs is combined with our best judgment to lay out a strategy for development of the multifamily tool that proceeds in stages, with features of an initial tool (version 1) and a more capable version 2 handled with currently available resources. Additional

  18. Mathematical support for automated geometry analysis of lathe machining of oblique peakless round-nose tools

    Science.gov (United States)

    Filippov, A. V.; Tarasov, S. Yu; Podgornyh, O. A.; Shamarin, N. N.; Filippova, E. O.

    2017-01-01

    Automation of engineering processes requires developing the relevant mathematical support and computer software. Analysis of metal cutting kinematics and tool geometry is a necessary key task at the preproduction stage. This paper is focused on developing a procedure for determining the geometry of lathe machining with oblique peakless round-nose tools using vector/matrix transformations. Such an approach allows integration into modern mathematical software packages, in distinction to the traditional analytic description, which is a very promising advantage for developing automated control of the preproduction process. A kinematic criterion for applicable tool geometry has been developed from the results of this study. The effect of tool blade inclination and curvature on the geometry-dependent process parameters was evaluated.
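    A minimal sketch of the vector/matrix formulation is given below: the cutting-edge direction is a vector, the tool setting is a product of rotation matrices, and geometry-dependent angles are read off the transformed vector. The particular axes, angles, and edge vector are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

def rot_x(a):
    """Rotation about the X axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_z(a):
    """Rotation about the Z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

edge = np.array([0.0, 1.0, 0.0])   # assumed edge direction before setting
lam = np.radians(10.0)             # inclination angle (assumed)
phi = np.radians(45.0)             # approach angle (assumed)

edge_set = rot_z(phi) @ rot_x(lam) @ edge  # edge in the workpiece frame
print(edge_set, "inclination in work frame (deg):",
      np.degrees(np.arcsin(edge_set[2])))
```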

  19. Making advanced analytics work for you.

    Science.gov (United States)

    Barton, Dominic; Court, David

    2012-10-01

    Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberative approach to big data-but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect.

  20. Tools for man-machine interface development in accelerator control applications

    International Nuclear Information System (INIS)

    Kopylov, L.; Mikhev, M.; Trofimov, N.; Yurpalov, V.

    1994-01-01

    For the UNK Project, development of the accelerator control applications is in progress. These applications will use a specific graphical user interface for data presentation and accelerator parameter management. A number of tools have been developed based on the Motif toolkit; they contain a set of problem-oriented screen templates and libraries. Using these tools, full-scale prototype applications for UNK tune and orbit measurement and correction were developed and are described as examples. A subset of these tools allows the creation of synoptic control screens from AutoCAD picture files and Oracle DB equipment descriptions. The basic concepts and a few application examples are presented. ((orig.))

  1. Assessing Development Impacts Associated with Low Emission Development Strategies: Lessons Learned from Pilot Efforts in Kenya and Montenegro

    Energy Technology Data Exchange (ETDEWEB)

    Cox, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Katz, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wurtenberger, L. [Energy Research Centre of the Netherlands (ECN), Petten (Netherlands)

    2014-01-01

    Low emission development strategies (LEDS) articulate economy-wide policies and implementation plans designed to enable a country to meet its long-term development objectives while reducing greenhouse gas emissions. A development impact assessment tool was developed to inform an analytically robust and transparent prioritization of LEDS actions based on their economic, social, and environmental impacts. The graphical tool helps policymakers communicate the development impacts of LEDS options and identify actions that help meet both emissions reduction and development goals. This paper summarizes the adaptation and piloting of the tool in Kenya and Montenegro. The paper highlights strengths of the tool and discusses key needs for improving it.

  2. Soviet-designed pressurized water reactor symptomatic emergency operating instruction analytical procedure: approach, methodology, development and application

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1999-01-01

    A symptom approach to the analytical validation of symptom-based EOPs includes: (1) Identification of the critical safety functions essential to the maintenance of fission product barrier integrity; (2) Identification of the symptoms which manifest an impending challenge to critical safety function maintenance; (3) Development of a symptomatic methodology to delineate bounding plant transient response modes; (4) Specification of bounding scenarios; (5) Development of a systematic calculational approach consistent with the objectives of the methodology; (6) Performance of thermal-hydraulic computer code calculations implementing the analytical methodology; (7) Interpretation of the analytical results on the basis of information available to the operator; (8) Application of the results to the validation of the proposed operator actions; (9) Production of a technical basis document justifying the proposed operator actions. (author)

  3. Software Tools for Development on the Peregrine System

    Science.gov (United States)

    Software tools are available on the Peregrine system to help users develop and manage software at the source code level. Cross-Platform Make and SCons: the "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python

  4. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Nagao, Saichi; Takigawa, Yoshio; Kumakura, Toshimasa

    1999-03-01

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 with MPI and reduces the effort required for parallelization. This paper describes the development purpose, design, utilization, and evaluation of KMtool. (author)
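    To make the measurement concrete, the sketch below does by hand what a tool like KMtool automates: separating wall-clock time spent in computation from time spent in MPI communication. It uses mpi4py for brevity, whereas the actual tool targets programs written in FORTRAN77 with MPI.

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

t0 = MPI.Wtime()
local = sum(i * i for i in range(1_000_000))  # stand-in compute phase
t1 = MPI.Wtime()
total = comm.allreduce(local, op=MPI.SUM)     # communication phase
t2 = MPI.Wtime()

# The compute/communication split is the raw material for deciding
# where parallelization effort pays off.
print(f"rank {rank}: compute {t1 - t0:.3f}s, comm {t2 - t1:.3f}s")
```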

  5. Hanford performance evaluation program for Hanford site analytical services

    International Nuclear Information System (INIS)

    Markel, L.P.

    1995-09-01

    U.S. Department of Energy (DOE) Order 5700.6C, Quality Assurance, and Title 10 of the Code of Federal Regulations, Part 830.120, Quality Assurance Requirements, state that it is the responsibility of DOE contractors to ensure that ''quality is achieved and maintained by those who have been assigned the responsibility for performing the work.'' The Hanford Analytical Services Quality Assurance Plan (HASQAP) is designed to meet the needs of the Richland Operations Office (RL) for maintaining a consistent level of quality in the analytical chemistry services provided by contractor and commercial analytical laboratory operations. Services supporting Hanford environmental monitoring, environmental restoration, and waste management shall therefore meet appropriate quality standards. This performance evaluation program will monitor the quality standards of all analytical laboratories supporting the Hanford Site, including on-site and off-site laboratories. The monitoring and evaluation of laboratory performance can be accomplished with several tools, which this program discusses. Revision 0 focuses primarily on presently available programs using readily available performance evaluation materials provided by DOE, EPA, or commercial sources. Project-specific PE materials and evaluations are described in Section 9.0 and Appendix A

  6. Connectivity among computer-aided engineering methods, procedures, and tools used in developing the SSC collider magnets

    International Nuclear Information System (INIS)

    Kallas, N.; Jalloh, A.R.

    1992-01-01

    The accomplishment of functional productivity for the computer-aided engineering (CAE) environment at the magnet engineering department (ME) of the magnet systems division (MSD) at the Superconducting Super Collider Laboratory (SSCL) involves most of the basic aspects of information engineering. It is highly desirable to arrive at a software and hardware topology that offers total, two-way (back and forth), automatic, and direct connectivity among computer-aided design and drafting (CADD), analysis codes, and office automation tools applicable to the disciplines involved. This paper describes the components, data flow, and practices employed in developing the CAE environment from a systems engineering aspect rather than from the analytical angle. When appropriate, references to case studies are made in order to demonstrate the connectivity of the techniques used

  7. Connectivity among computer-aided engineering methods, procedures, and tools used in developing the SSC collider magnets

    International Nuclear Information System (INIS)

    Kallas, N.; Jalloh, A.R.

    1992-03-01

    The accomplishment of functional productivity for the computer-aided engineering (CAE) environment at the magnet engineering department (ME) of the magnet systems division (MSD) at the Superconducting Super Collider Laboratory (SSCL) involves most of the basic aspects of information engineering. It is highly desirable to arrive at a software and hardware topology that offers total, two-way (back and forth), automatic, and direct connectivity among computer-aided design and drafting (CADD), analysis codes, and office automation tools applicable to the disciplines involved. This paper describes the components, data flow, and practices employed in developing the CAE environment from a systems engineering aspect rather than from the analytical angle. When appropriate, references to case studies are made in order to demonstrate the connectivity of the techniques used

  8. Analytical Models Development of Compact Monopole Vortex Flows

    Directory of Open Access Journals (Sweden)

    Pavlo V. Lukianov

    2017-09-01

    Conclusions. The article contains a series of the latest analytical models describing both the laminar and turbulent dynamics of monopole vortex flows, which have not yet been reflected in traditional publications. Further research must be directed toward analytical models of coherent vortical structures in viscous fluid flows, particularly near curved surfaces, where the 'wall law' known in hydromechanics is violated and heat and mass transfer anomalies take place.
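    A classical example of a compact monopole vortex model of the kind discussed is the Lamb-Oseen vortex, whose azimuthal velocity field spreads viscously in time; it is quoted here as a generic illustration, not as one of the article's new models:

```latex
% Lamb-Oseen vortex: circulation Gamma, kinematic viscosity nu.
v_\theta(r,t) = \frac{\Gamma}{2\pi r}\left[1 - \exp\!\left(-\frac{r^{2}}{4\nu t}\right)\right]
```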

  9. Development of culturally sensitive dialog tools in diabetes education

    Directory of Open Access Journals (Sweden)

    Nana Folmann Hempler

    2015-01-01

    Full Text Available Person-centeredness is a goal in diabetes education, and cultural influences are important to consider in this regard. This report describes the use of a design-based research approach to develop culturally sensitive dialog tools to support person-centered dietary education targeting Pakistani immigrants in Denmark with type 2 diabetes. The approach appears to be a promising method to develop dialog tools for patient education that are culturally sensitive, thereby increasing their acceptability among ethnic minority groups. The process also emphasizes the importance of adequate training and competencies in the application of dialog tools and of alignment between researchers and health care professionals with regards to the educational philosophy underlying their use.

  10. Analytical chemistry of actinides

    International Nuclear Information System (INIS)

    Chollet, H.; Marty, P.

    2001-01-01

    Different characterization methods specifically applied to the actinides are presented in this review, such as ICP/OES (inductively coupled plasma-optical emission spectrometry), ICP/MS (inductively coupled plasma-mass spectrometry), TIMS (thermal ionization-mass spectrometry) and GD/OES (glow discharge optical emission spectrometry). Molecular absorption spectrometry and capillary electrophoresis are also available to complete the range of analytical tools at our disposal. (authors)

  11. The Bristol Radiology Report Assessment Tool (BRRAT): developing a workplace-based assessment tool for radiology reporting skills.

    Science.gov (United States)

    Wallis, A; Edey, A; Prothero, D; McCoubrie, P

    2013-11-01

    To review the development of a workplace-based assessment tool to assess the quality of written radiology reports and assess its reliability, feasibility, and validity. A comprehensive literature review and rigorous Delphi study enabled the development of the Bristol Radiology Report Assessment Tool (BRRAT), which consists of 19 questions and a global assessment score. Three assessors applied the assessment tool to 240 radiology reports provided by 24 radiology trainees. The reliability coefficient for the 19 questions was 0.79 and the equivalent coefficient for the global assessment scores was 0.67. Generalizability coefficients demonstrate that higher numbers of assessors and assessments are needed to reach acceptable levels of reliability for summative assessments, due to assessor subjectivity. The study methodology gives good validity and a strong foundation in best practice. The assessment tool developed for radiology reporting is reliable and most suited to formative assessments. Copyright © 2013 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  12. Effects of Learning Analytics Dashboard: Analyzing the Relations among Dashboard Utilization, Satisfaction, and Learning Achievement

    Science.gov (United States)

    Kim, Jeonghyun; Jo, Il-Hyun; Park, Yeonjeong

    2016-01-01

    The learning analytics dashboard (LAD) is a newly developed learning support tool for virtual classrooms that is believed to allow students to review their online learning behavior patterns intuitively through the provision of visual information. The purpose of this study was to empirically validate the effects of LAD. An experimental study was…

  13. Implementation of a communication and control network for the instruments of a nuclear analytical laboratory

    International Nuclear Information System (INIS)

    Cunya, Eduardo; Baltuano, Oscar; Bedregal, Patricia

    2013-01-01

    This paper describes the implementation of a communication and control network for conventional laboratory instruments and nuclear analytical processes, based on the CANopen fieldbus for controlling devices and machines. The hardware components and software developed, as well as the installation and configuration tools for incorporating new instruments into the network, are presented. (authors).

  14. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    Science.gov (United States)

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing applications in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques, namely solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid-phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during the sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. USING ANALYTIC HIERARCHY PROCESS (AHP) METHOD IN RURAL DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Tülay Cengiz

    2003-04-01

    Rural development is a body of economic and social policies aimed at improving living conditions in rural areas by enabling the rural population to enjoy the economic, social, cultural and technological benefits of city life in place, without migrating. As this description suggests, rural development is a very broad concept. Therefore, in development efforts the problem should be stated clearly and analyzed, and many criteria should be evaluated by experts. The Analytic Hierarchy Process (AHP) can be utilized at these stages of development efforts. AHP is a multi-criteria decision-making method. After decomposing a problem into smaller pieces, the relative importance of each pair of compared elements is determined. The method allows the evaluation of both qualitative and quantitative factors, and it permits the ideas of many experts to be gathered and used in the decision process. Because of these features, the AHP method can be used in rural development work. In this article, cultural factors, an important component of rural development that is often ignored in many studies, were evaluated as an example. As a result of these applications and evaluations, it is concluded that the AHP method can be helpful in rural development efforts.
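
    The abstract gives no numerical detail, so as an illustration only, the minimal Python sketch below (with an invented 3x3 pairwise comparison matrix on Saaty's 1-9 scale) shows the two core AHP computations the article relies on: deriving priority weights from the principal eigenvector and checking judgment consistency.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix (Saaty scale):
# criterion A vs B = 3, A vs C = 5, B vs C = 2.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority weights: principal eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)
RI = 0.58  # Saaty's random index for n = 3
CR = CI / RI
print("weights:", np.round(w, 3), "CR:", round(CR, 3))
```

    A consistency ratio below roughly 0.1 is conventionally taken as acceptable; otherwise the expert judgments should be revisited.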

  16. Managing design excellence tools during the development of new orthopaedic implants.

    Science.gov (United States)

    Défossez, Henri J P; Serhan, Hassan

    2013-11-01

    Design excellence (DEX) tools have been widely used for years in some industries for their potential to facilitate new product development. The medical sector, targeted by cost pressures, has therefore started adopting them. Numerous tools are available; however, only appropriate deployment during the new product development stages can optimize the overall process. The primary study objectives were to describe generic tools, illustrate their implementation and management during the development of new orthopaedic implants, and compile a reference package. Secondary objectives were to present the DEX tool investment costs and savings, since the method can require significant resources for which companies must carefully plan. The publicly available DEX method "Define Measure Analyze Design Verify Validate" was adopted and implemented during the development of a new spinal implant. Several tools proved most successful at developing the correct product, addressing clinical needs, and increasing market penetration potential, while reducing design iterations and manufacturing validations. Cost analysis and a Pugh Matrix coupled with multi-generation planning enabled the development of a strong rationale to activate the project and set the vision and goals. Improved risk management and a product usage map established a robust technical verification-validation program. Design of experiments and process quantification facilitated design for manufacturing of critical features as early as the concept phase. Biomechanical testing with analysis of variance provided a validation model with a recognized statistical performance baseline. Within those tools, only certain ones required minimal resources (i.e., business case, multi-generational plan, project value proposition, Pugh Matrix, critical-to-quality process validation techniques), while others required significant investments (i.e., voice of customer, product usage map, improved risk management, design of experiments, biomechanical testing
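
    The abstract names the Pugh Matrix but does not show one; the following minimal sketch (criteria, concepts and scores all invented, not taken from the spinal implant study) illustrates the mechanics: each concept is scored against a datum design as better (+1), same (0) or worse (-1) per criterion, and the column totals rank the concepts.

```python
# Minimal Pugh matrix sketch: concepts are scored against a datum design
# as better (+1), same (0), or worse (-1) on each criterion; the column
# sums rank the concepts. Criteria and scores here are invented.
criteria = ["cost", "tool life", "cycle time", "clinical fit"]
datum = "current implant"
scores = {               # concept -> score per criterion vs. the datum
    "concept A": [+1, 0, +1, -1],
    "concept B": [0, +1, +1, +1],
    "concept C": [-1, +1, 0, 0],
}
for concept, row in sorted(scores.items(), key=lambda kv: -sum(kv[1])):
    print(f"{concept}: net {sum(row):+d} vs. {datum}")
```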

  17. Development of the Operational Events Groups Ranking Tool

    International Nuclear Information System (INIS)

    Simic, Zdenko; Banov, Reni

    2014-01-01

    Both because of complexity and ageing, facilities like nuclear power plants require feedback from operating experience in order to further improve safety and operational performance. That is the reason why significant effort is dedicated to operating experience feedback. This paper contains a description of the specification and development of the operating events ranking software tool. A robust and consistent way of selecting the most important events for detailed investigation is important because it is not feasible, or even useful, to investigate all of them. Development of the tool is based on comprehensive event characterisation and methodical prioritization. This includes a rich set of event parameters which allow top-level preliminary analysis, different ways of grouping events, and even evaluation of uncertainty propagation to the ranking results. One distinct feature of the implemented method is that the user (i.e., an expert) can determine how important each ranking parameter is, based on pairwise comparisons. For tool demonstration and usability, a sample database was also created. For useful analysis, the whole set of events for 5 years was selected and characterised. Based on the preliminary results, this tool seems valuable for a new preliminary perspective on the data as a whole, and especially for the identification of event groups which should have priority in more detailed assessment. The results consist of different informative views on the importance of event groups and related sensitivity and uncertainty results. This presents a valuable tool for improving the overall picture of specific operating experience and also for helping identify the most important event groups for further assessment. It is clear that completeness and consistency of the input data characterisation is very important to get a full and valuable importance ranking. The method and tool development described in this paper is part of a continuous effort of

  18. The power of event-driven analytics in Large Scale Data Processing

    CERN Multimedia

    CERN. Geneva; Marques, Paulo

    2011-01-01

    FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in...

  19. Learning analytics as a tool for closing the assessment loop in higher education

    OpenAIRE

    Karen D. Mattingly; Margaret C. Rice; Zane L. Berge

    2012-01-01

    This paper examines learning and academic analytics and its relevance to distance education in undergraduate and graduate programs as it impacts students and teaching faculty, and also academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental process and program curriculum. Learning and academic analytics in higher education is used to predict student success by examining how and wha...

  20. Developing a Support Tool for Global Product Development Decisions

    DEFF Research Database (Denmark)

    Søndergaard, Erik Stefan; Ahmed-Kristensen, Saeema

    2016-01-01

    This paper investigates how global product development decisions are made through a multiple-case study in three Danish engineering companies. The paper identifies which information and methods are applied for making decisions and how decision-making can be supported based on previous experience. The paper presents results from 51 decisions made in the three companies, and based on the results of the studies a framework for a decision-support tool is outlined and discussed. The paper rounds off with an identification of future research opportunities in the area of global product development and decision-making.

  1. The Role of Epigenomics in the Study of Cancer Biomarkers and in the Development of Diagnostic Tools.

    Science.gov (United States)

    Verma, Mukesh

    2015-01-01

    Epigenetics plays a key role in cancer development. Genetics alone cannot explain sporadic cancer and cancer development in individuals with no family history or a weak family history of cancer. Epigenetics provides a mechanism to explain the development of cancer in such situations. Alterations in epigenetic profiling may provide important insights into the etiology and natural history of cancer. Because several epigenetic changes occur before histopathological changes, they can serve as biomarkers for cancer diagnosis and risk assessment. Many cancers may remain asymptomatic until relatively late stages; in managing the disease, efforts should be focused on early detection, accurate prediction of disease progression, and frequent monitoring. This chapter describes epigenetic biomarkers as they are expressed during cancer development and their potential use in cancer diagnosis and prognosis. Based on epigenomic information, biomarkers have been identified that may serve as diagnostic tools; some such biomarkers also may be useful in identifying individuals who will respond to therapy and survive longer. The importance of analytical and clinical validation of biomarkers is discussed, along with challenges and opportunities in this field.

  2. Data mining and business analytics with R

    CERN Document Server

    Ledolter, Johannes

    2013-01-01

    Collecting, analyzing, and extracting valuable information from a large amount of data requires easily accessible, robust, computational and analytical tools. Data Mining and Business Analytics with R utilizes the open source software R for the analysis, exploration, and simplification of large high-dimensional data sets. As a result, readers are provided with the needed guidance to model and interpret complicated data and become adept at building powerful models for prediction and classification. Highlighting both underlying concepts and practical computational skills, Data Mining

  3. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    Science.gov (United States)

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage© guides users through the collection of their family health history by relative, generates a pedigree, completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. Copyright © 2010 S. Karger AG, Basel.

  4. Development of a smart city planning support tool using the cooperative method

    Directory of Open Access Journals (Sweden)

    Takeshi Kobayashi

    2015-12-01

    A reduction of environmental burdens is currently required. In particular, proposing a new approach to the construction of a smart city using renewable energy is important. The technological development of a smart city is founded on building equipment and infrastructure. However, planning methods and techniques that use a collaborative approach with residents are only just developing. This study aimed to develop a support tool for the construction of a smart city using renewable energy while facilitating consensus-building among residents, using the method for cooperative housing development. We organized the supporting methods for the construction of a residential area using the cooperative method. Then, we developed supporting tools that interface the computer with these methods. We examined the support techniques for the construction of a residential area using renewable energy technology by analyzing Japanese smart city cases. Moreover, we developed a support tool for the construction of a smart city on a trial basis. We integrated the smart city construction tools and the cooperative housing construction support tool. This tool has a 3D modeling system that helps residents easily understand the spatial image resulting from the examination. We also developed a professional support tool with which residents can consider the cost-effectiveness of renewable energy and its environmental load reduction rate when planning a smart city.

  5. Analytical development and optimization of a graphene–solution interface capacitance model

    Directory of Open Access Journals (Sweden)

    Hediyeh Karimi

    2014-05-01

    Graphene, a new carbon material that shows great potential for a range of applications because of its exceptional electronic and mechanical properties, has attracted considerable attention in recent years. The use of graphene in nanoscale devices plays an important role in achieving more accurate and faster devices. Although there are many experimental studies in this area, there is a lack of analytical models. Quantum capacitance, one of the important properties of field effect transistors (FETs), is our focus here. The quantum capacitance of electrolyte-gated transistors (EGFETs), along with a relevant equivalent circuit, is expressed in terms of Fermi velocity, carrier density, and fundamental physical quantities. The analytical model is compared with experimental data, and the mean absolute percentage error (MAPE) is calculated to be 11.82. In order to decrease the error, a new function of E composed of α and β parameters is suggested. In another attempt, the ant colony optimization (ACO) algorithm is implemented for the optimization and development of an analytical model to obtain a more accurate capacitance model. Based on the given results, the accuracy of the optimized model is more than 97%, which is within an acceptable range.
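
    The mean absolute percentage error quoted above is a standard metric; as a small illustration (the capacitance values below are invented, not the paper's data), MAPE can be computed as:

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

# Invented measured vs. modelled capacitance values, for illustration only.
measured = [1.0, 1.2, 1.5, 1.9, 2.4]
modelled = [0.9, 1.3, 1.4, 2.1, 2.2]
print(f"MAPE = {mape(measured, modelled):.2f}%")
```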

  6. Process-Based Quality (PBQ) Tools Development

    Energy Technology Data Exchange (ETDEWEB)

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts.

  7. Development of materials for the rapid manufacture of die cast tooling

    Science.gov (United States)

    Hardro, Peter Jason

    The focus of this research is to develop a material composition that can be processed by rapid prototyping (RP) in order to produce tooling for the die casting process, where these rapidly produced tools will be superior to traditionally produced tooling by offering one or more of the following advantages: reduced tooling cost, shortened tooling creation time, reduced man-hours for tool creation, increased tool life, and shortened die casting cycle time. By utilizing RP's additive build process and vast material selection, there was a prospect that die cast tooling could be produced more quickly and with superior material properties. To this end, the material properties that influence die life and cycle time were determined, and a list of materials that fulfill these "optimal" properties was highlighted. Physical testing was conducted in order to grade the processability of each of the material systems and to optimize the manufacturing process for the downselected material system. Sample specimens were produced, and microscopy techniques were utilized to determine a number of physical properties of the material system. Additionally, a benchmark geometry was selected and die casting dies were produced from traditional tool materials (H13 steel) and techniques (machining) and from the newly developed materials and RP techniques (selective laser sintering (SLS) and laser engineered net shaping (LENS)). Once the tools were created, a die cast alloy was selected and a preset number of parts were shot into each tool. During tool creation, the manufacturing time and cost were closely monitored, and an economic model was developed to compare traditional tooling to RP tooling. This model allows one to determine, in the early design stages, when it is advantageous to implement RP tooling and when traditional tooling would be best. The results of the physical testing and economic analysis have shown that RP tooling is able to achieve a number of the research objectives, namely

  8. Development of Sustainability Assessment Tool for Malaysian hydropower industry: A case study

    Science.gov (United States)

    Turan, Faiz Mohd; Johan, Kartina; Abu Sofian, Muhammad Irfan

    2018-04-01

    This research deals with the development of a sustainability assessment tool as a medium to assess a hydropower project's compliance with sustainability practice. Given the increasing need to implement sustainability practice, developed countries are utilizing sustainability tools to achieve sustainable development goals; their adoption within ASEAN countries, including Malaysia, is still low. The problem with most tools developed in other countries is that they are not very comprehensive, and their implementation factors are not quantified for the local environment. Hence, there is a need to develop a suitable sustainability assessment tool for the Malaysian hydropower industry to comply with the sustainable development goals, bridging the gap between the governing body and the practitioner. The steps toward achieving this goal are separated into several parts. The first part is to identify sustainability parameters from established tools as a model for comparison, in order to enhance new parameters. The second stage is to convert equivalent quantification values from the model to the newly developed tool. The last stage is to develop a software program as a means of gaining energy company feedback, with systematic sustainability reporting from the surveyor, so as to be able to integrate sustainability assessment, monitoring and reporting for self-improved reporting.

  9. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    Science.gov (United States)

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

    In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Analytic solution to leading order coupled DGLAP evolution equations: A new perturbative QCD tool

    International Nuclear Information System (INIS)

    Block, Martin M.; Durand, Loyal; Ha, Phuoc; McKay, Douglas W.

    2011-01-01

    We have analytically solved the LO perturbative QCD singlet DGLAP equations [V. N. Gribov and L. N. Lipatov, Sov. J. Nucl. Phys. 15, 438 (1972); G. Altarelli and G. Parisi, Nucl. Phys. B126, 298 (1977); Y. L. Dokshitzer, Sov. Phys. JETP 46, 641 (1977)] using Laplace transform techniques. Newly developed, highly accurate numerical inverse Laplace transform algorithms [M. M. Block, Eur. Phys. J. C 65, 1 (2010); M. M. Block, Eur. Phys. J. C 68, 683 (2010)] allow us to write fully decoupled solutions for the singlet structure function F_s(x,Q^2) and the gluon distribution G(x,Q^2) as F_s(x,Q^2) = F_s(F_s0(x_0), G_0(x_0)) and G(x,Q^2) = G(F_s0(x_0), G_0(x_0)), where the x_0 are the Bjorken x values at Q_0^2. Here F_s and G are known functions, found using the LO DGLAP splitting functions, of the initial boundary conditions F_s0(x) ≡ F_s(x,Q_0^2) and G_0(x) ≡ G(x,Q_0^2), i.e., the chosen starting functions at the virtuality Q_0^2. For both G(x) and F_s(x), we are able to either devolve or evolve each separately and rapidly, with very high numerical accuracy, a computational fractional precision of O(10^-9). Armed with this powerful new tool in the perturbative QCD arsenal, we compare our numerical results from the above equations with the published MSTW2008 and CTEQ6L LO gluon and singlet F_s distributions [A. D. Martin, W. J. Stirling, R. S. Thorne, and G. Watt, Eur. Phys. J. C 63, 189 (2009)], starting from their initial values at Q_0^2 = 1 GeV^2 and 1.69 GeV^2, respectively, using their choice of α_s(Q^2). This allows an important independent check on the accuracies of their evolution codes and, therefore, the computational accuracies of their published parton distributions. Our method completely decouples the two LO distributions, at the same time guaranteeing that both G and F_s satisfy the singlet coupled DGLAP equations. It also allows one to easily obtain the effects of the starting functions on the evolved gluon and singlet structure functions, as functions of both Q
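
    The abstract names the technique without displaying it; as a hedged sketch in our own notation (not the paper's), the key step is that with v = ln(1/x) the DGLAP convolutions become Laplace convolutions, so in Laplace space (v -> s) the coupled singlet equations reduce to a linear system that can be solved in closed form and then inverted numerically:

```latex
% Schematic only; \Phi_{ij}(s) denote Laplace-transformed LO splitting
% kernels and \tau = \tau(Q^2) an evolution variable absorbing \alpha_s.
\frac{\partial \hat{F}_s(s,\tau)}{\partial \tau}
  = \Phi_{FF}(s)\,\hat{F}_s(s,\tau) + \Phi_{FG}(s)\,\hat{G}(s,\tau),
\qquad
\frac{\partial \hat{G}(s,\tau)}{\partial \tau}
  = \Phi_{GF}(s)\,\hat{F}_s(s,\tau) + \Phi_{GG}(s)\,\hat{G}(s,\tau).
```

    Solving this 2x2 linear system algebraically and inverting with the numerical inverse Laplace algorithms cited above yields the fully decoupled x-space solutions.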

  11. Online Programs as Tools to Improve Parenting: A meta-analytic review

    NARCIS (Netherlands)

    Prof. Dr. Jo J.M.A Hermanns; Prof. Dr. Ruben R.G. Fukkink; dr. Christa C.C. Nieuwboer

    2013-01-01

    Background. A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting interventions

  12. Online programs as tools to improve parenting: A meta-analytic review

    NARCIS (Netherlands)

    Nieuwboer, C.C.; Fukkink, R.G.; Hermanns, J.M.A.

    2013-01-01

    Background: A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting

  13. Group Analytic Psychotherapy in Brazil.

    Science.gov (United States)

    Penna, Carla; Castanho, Pablo

    2015-10-01

    Group analytic practice in Brazil began quite early. Highly influenced by the Argentinean Pichon-Rivière, it enjoyed a major development from the 1950s to the early 1980s. Beginning in the 1970s, different factors undermined its development and eventually led to its steep decline. From the mid 1980s on, the number of people looking for either group analytic psychotherapy or group analytic training decreased considerably. Group analytic psychotherapy societies struggled to survive and most of them had to close their doors in the 1990s and the following decade. Psychiatric reform and the new public health system have stimulated a new demand for groups in Brazil. Developments in the public and not-for-profit sectors, combined with theoretical and practical research in universities, present promising new perspectives for group analytic psychotherapy in Brazil nowadays.

  14. SNL software manual for the ACS Data Analytics Project.

    Energy Technology Data Exchange (ETDEWEB)

    Stearley, Jon R.; McLendon, William Clarence, III; Rodrigues, Arun F.; Williams, Aaron S.; Hooper, Russell Warren; Robinson, David Gerald; Stickland, Michael G.

    2011-10-01

    In the ACS Data Analytics Project (also known as 'YumYum'), a supercomputer is modeled as a graph of components and dependencies, jobs and faults are simulated, and component fault rates are estimated using the graph structure and job pass/fail outcomes. This report documents the successful completion of all SNL deliverables and tasks, describes the software written by SNL for the project, and presents the data it generates. Readers should understand what the software tools are, how they fit together, and how to use them to reproduce the presented data and additional experiments as desired. The SNL YumYum tools provide the novel simulation and inference capabilities desired by ACS. SNL also developed and implemented a new algorithm, which provides faster estimates, at finer component granularity, on arbitrary directed acyclic graphs.
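
    The report's actual estimation algorithm is not described in this abstract; purely as a hypothetical illustration of the inference problem (component names, job outcomes and the EM-style update are all ours, not SNL's), one can attribute simulated job failures to components as follows, assuming a job fails when at least one component it depends on faults:

```python
import numpy as np

# Toy sketch (not SNL's algorithm): jobs map to the components they use;
# we estimate per-component fault probabilities with EM-style updates that
# share the blame for each failed job among its components.
jobs = [({"cpu0", "nic0"}, True),   # (components used, failed?)
        ({"cpu0", "nic1"}, False),
        ({"cpu1", "nic0"}, True),
        ({"cpu1", "nic1"}, False)]
comps = sorted(set().union(*(c for c, _ in jobs)))
p = {c: 0.1 for c in comps}                      # initial guess

for _ in range(50):
    blame = {c: 0.0 for c in comps}              # expected faults per component
    expo = {c: 0.0 for c in comps}               # jobs in which c participated
    for used, failed in jobs:
        for c in used:
            expo[c] += 1.0
        if failed:
            total = sum(p[c] for c in used) or 1e-12
            for c in used:                       # share the failure by current p
                blame[c] += p[c] / total
    p = {c: blame[c] / expo[c] for c in comps}   # fraction of runs blamed

print({c: round(v, 3) for c, v in p.items()})    # nic0 ends up with the blame
```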

  15. The National Shipbuilding Research Program. Development of a Quick TBT Analytical Method

    Science.gov (United States)

    2000-08-16

    Executive Summary: Concern about the toxic effects of tributyltin has caused the... Antifouling Paints on the Environment: Tributyltin (TBT) has been shown to be highly toxic to certain aquatic organisms at concentrations measured in the... paints, developed in the 1960s, contain the organotin tributyltin (TBT), which has been proven to cause deformations in oysters and sex changes in...

  16. The Development of a Tool for Sustainable Building Design:

    DEFF Research Database (Denmark)

    Tine Ring Hansen, Hanne; Knudstrup, Mary-Ann

    2009-01-01

    The understanding of sustainable building has changed over time along with the architectural interpretation of sustainability. The paper presents the results of a comparative analysis of the indicators found in different internationally acclaimed and Danish certification schemes and standards for sustainable buildings, as well as an analysis of the relationship between the different approaches (e.g. low-energy, environmental, green building, solar architecture, bio-climatic architecture etc.) to sustainable building design and these indicators. The paper furthermore discusses how sustainable architecture will gain more focus in the coming years, thus establishing the need for the development of a new tool and methodology. The paper furthermore describes the background and considerations involved in the development of a design support tool for sustainable building design. A tool which considers...

  17. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master thesis was performed at CERN, and more specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms that are based on the CODESYS development tool into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the SCADA system of Siemens, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and software for automatic generation of the PLC code based on this library, called UAB. The integration aimed to give a solution that is shared by both PLC platforms and was based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and testing the behavior of the library code.

  18. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    Science.gov (United States)

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    control laboratory: it is laborious, time-consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making, and how they can assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  19. Supporting interactive visual analytics of energy behavior in buildings through affine visualizations

    DEFF Research Database (Denmark)

    Nielsen, Matthias; Brewer, Robert S.; Grønbæk, Kaj

    2016-01-01

    Domain experts dealing with big data are typically not familiar with advanced data mining tools. This especially holds true for domain experts within energy management. In this paper, we introduce a visual analytics approach that empowers such users to visually analyze energy behavior based on ...Viz, which interactively maps data from real world buildings. It is an overview+detail interactive visual analytics tool supporting both rapid ad hoc explorations and structured evaluation of hypotheses about patterns and anomalies in resource consumption data mixed with occupant survey data. We have evaluated the approach with five domain experts within energy management, and further with 10 data analytics experts, and found that it was easily attainable and that it supported visual analysis of mixed consumption and survey data. Finally, we discuss future perspectives of affine visual analytics for mixed...

  20. Technical Reviews on Pattern Recognition in Process Analytical Technology

    International Nuclear Information System (INIS)

    Kim, Jong Yun; Choi, Yong Suk; Ji, Sun Kyung; Park, Yong Joon; Song, Kyu Seok; Jung, Sung Hee

    2008-12-01

    Pattern recognition is one of the first and most widely adopted chemometric tools among the many active research areas in chemometrics, such as design of experiments (DoE), pattern recognition, multivariate calibration, and signal processing. Pattern recognition has been used to identify the origin of a wine and the year the vine was grown by using chromatography, the cause of a fire by using GC/MS chromatography, for the detection of explosives and land mines and for cargo and luggage inspection in seaports and airports by using prompt gamma-ray activation analysis, and for source apportionment of environmental pollutants by using stable isotope ratio mass spectrometry. Recently, pattern recognition has been taken into account as a major chemometric tool in the so-called 'process analytical technology (PAT)', a newly developed concept in the area of process analytics proposed by the US Food and Drug Administration (US FDA). For instance, identification of raw materials by pattern recognition analysis plays an important role in the effective quality control of the production process. Recently, pattern recognition techniques have been used to identify the spatial distribution and uniformity of the active ingredients present in products such as tablets by transforming chemical data into visual information.
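
    The review does not prescribe a specific algorithm; principal component analysis is, however, a canonical chemometric pattern-recognition step, so the following self-contained sketch (the 6x4 "spectra" and class labels are invented) shows the usual mean-center-then-project workflow for inspecting class clustering:

```python
import numpy as np

# Minimal chemometric pattern-recognition sketch: project spectra onto
# their first two principal components and inspect class clustering.
X = np.array([[1.0, 2.1, 0.9, 4.0],
              [1.1, 2.0, 1.0, 4.2],
              [0.9, 2.2, 0.8, 3.9],
              [3.0, 0.5, 2.9, 1.0],
              [3.1, 0.4, 3.0, 1.1],
              [2.9, 0.6, 2.8, 0.9]])
labels = ["raw material A"] * 3 + ["raw material B"] * 3

Xc = X - X.mean(axis=0)                 # mean-center (usual preprocessing)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                  # PC1/PC2 scores
explained = (S**2 / (S**2).sum())[:2]

for lab, (pc1, pc2) in zip(labels, scores):
    print(f"{lab}: PC1={pc1:+.2f} PC2={pc2:+.2f}")
print("variance explained:", np.round(explained, 3))
```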

  1. The effects of a learning analytics empowered technology on students' arithmetic skill development

    NARCIS (Netherlands)

    Molenaar, I.; Knoop-van Campen, C.A.N.; Hasselman, F.W.

    2017-01-01

    Learning analytics empowered educational technologies (LA-ET) in primary classrooms allow for blended learning scenarios with teacher-led instruction, class-paced and individually-paced practice. This quasi-experimental study investigates the effects of a LA-ET on the development of students'

  2. Preparation of standard hair material and development of analytical methodology

    International Nuclear Information System (INIS)

    Gangadharan, S.; Ganapathi Iyer, S.; Ali, M.M.; Thantry, S.S.; Verma, R.; Arunachalam, J.; Walvekar, A.P.

    1992-01-01

    In 1976 Indian researchers suggested the possible use of hair as an indicator of environmental exposure and established, through a study of a countrywide student population and the general population of the metropolitan city of Bombay, that human scalp hair could indeed be an effective first-level monitor in a scheme of multilevel monitoring of environmental exposure to inorganic pollutants. It was in this context, and in view of the ready availability of large quantities of scalp hair subjected to minimum treatment by chemicals, that they proposed to participate in the preparation of a standard material of hair. It was also recognized that measurements of trace element concentrations at very low levels require cross-validation by different analytical techniques, even within the same laboratory. The programme of work that has been carried out since the first meeting of the CRP has been aimed at these two objectives: the preparation of a standard material of hair and the development of analytical methodologies for the determination of elements and species of interest. 1 refs., 3 tabs

  3. Analytic hierarchy process analysis for choosing a corporate social entrepreneurship strategy

    Directory of Open Access Journals (Sweden)

    Hadad Shahrazad

    2015-10-01

    After conducting an extensive analysis of both the specialised literature and practice and identifying three types of corporate social entrepreneurship in my PhD thesis titled “Corporate social entrepreneurship - the new paradigm of reshaping and rethinking business”, I decided to determine which of the three approaches is best suited for the Romanian market. The three types of corporate social entrepreneurship (corporate social entrepreneurship as a local development tool, as a market development tool, and as a transformational innovation tool) were organised as the alternatives of a carefully constructed hierarchy with the following criteria: return on investment (which does not necessarily refer to the money that the company invests in the strategy; the term is derived from sustainability and scalability), degree of novelty, pre-entry knowledge, and interest in solving the communities’ social problems. The questionnaire constructed based on the hierarchy using analytic hierarchy processes was distributed to experts (business developers) from the following industries or sectors: beverages, IT, banking, furniture, and automotive. The research reveals which approach is most likely to be employed by Romanian business developers. The results may be generalized to the set of businesses represented by the expert business developers who took part in the research.
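
    The abstract does not state how the expert questionnaires were aggregated; a common AHP convention (assumed here, not confirmed by the paper) is to combine expert pairwise matrices entry-wise with the geometric mean, which preserves reciprocity, before extracting weights. A minimal sketch with two invented 3x3 expert matrices:

```python
import numpy as np

# Group AHP sketch: aggregate expert pairwise matrices entry-wise with the
# geometric mean, then derive weights from the principal eigenvector.
expert1 = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]], float)
expert2 = np.array([[1, 2, 4], [1/2, 1, 3], [1/4, 1/3, 1]], float)

group = np.exp((np.log(expert1) + np.log(expert2)) / 2)  # geometric mean

eigvals, eigvecs = np.linalg.eig(group)
w = eigvecs[:, np.argmax(eigvals.real)].real
w /= w.sum()
print("group weights:", np.round(w, 3))
```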

  4. An Analytic Approach to Developing Transport Threshold Models of Neoclassical Tearing Modes in Tokamaks

    International Nuclear Information System (INIS)

    Mikhailovskii, A.B.; Shirokov, M.S.; Konovalov, S.V.; Tsypin, V.S.

    2005-01-01

    Transport threshold models of neoclassical tearing modes in tokamaks are investigated analytically. An analysis is made of the competition between strong transverse heat transport, on the one hand, and longitudinal heat transport, longitudinal heat convection, longitudinal inertial transport, and rotational transport, on the other hand, which leads to the establishment of the perturbed temperature profile in magnetic islands. It is shown that, in all these cases, the temperature profile can be found analytically by using rigorous solutions to the heat conduction equation in the near and far regions of a chain of magnetic islands and then by matching these solutions. Analytic expressions for the temperature profile are used to calculate the contribution of the bootstrap current to the generalized Rutherford equation for the island width evolution with the aim of constructing particular transport threshold models of neoclassical tearing modes. Four transport threshold models, differing in the underlying competing mechanisms, are analyzed: collisional, convective, inertial, and rotational models. The collisional model constructed analytically is shown to coincide exactly with that calculated numerically; the reason is that the analytical temperature profile turns out to be the same as the numerical profile. The results obtained can be useful in developing the next generation of general threshold models. The first steps toward such models have already been made
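
    The paper's equations are not reproduced in the abstract; as a hedged, schematic reminder in our own notation (generic coefficients, not the paper's), transport threshold models of this kind typically enter a generalized Rutherford equation in which the bootstrap drive is regulated below a threshold width w_d set by the competing transport mechanism:

```latex
% Schematic form only; coefficients and notation are generic, not the paper's.
\frac{\tau_R}{r_s}\,\frac{dw}{dt}
  \;=\; r_s\,\Delta'(w)
  \;+\; a_{bs}\,\sqrt{\epsilon}\,\frac{L_q}{L_p}\,\beta_p\,\frac{w}{w^2 + w_d^2}.
```

    The four threshold models discussed above correspond to different physics fixing w_d.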

  5. Development of an analytical microbial consortia method for enhancing performance monitoring at aerobic wastewater treatment plants.

    Science.gov (United States)

    Razban, Behrooz; Nelson, Kristina Y; McMartin, Dena W; Cullimore, D Roy; Wall, Michelle; Wang, Dunling

    2012-01-01

    An analytical method to produce profiles of bacterial biomass fatty acid methyl esters (FAME) was developed, employing rapid agitation followed by static incubation (RASI) using selective media of wastewater microbial communities. The results were compiled to produce a unique library for comparison and performance analysis at a wastewater treatment plant (WWTP). A total of 146 samples from the aerated WWTP, comprising 73 samples each of secondary and tertiary effluent, were analyzed. For comparison purposes, all samples were evaluated via a similarity index (SI), with secondary effluents producing an SI of 0.88 with 2.7% variation and tertiary samples producing an SI of 0.86 with 5.0% variation. The results also highlighted significant differences between the fatty acid profiles of the tertiary and secondary effluents, indicating considerable shifts in the bacterial community profile between these treatment phases. The WWTP performance results using this method were highly replicable and reproducible, indicating that the protocol has potential as a performance-monitoring tool for aerated WWTPs. The results quickly and accurately reflect shifts in dominant bacterial communities that occur when process operations and performance change.
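
    The abstract does not give the SI formula; one common choice for comparing such relative-abundance profiles is cosine similarity, sketched below with invented FAME abundances (assumptions ours, not the paper's):

```python
import numpy as np

def similarity_index(a, b):
    """Cosine similarity between two FAME profiles (relative abundances).
    The RASI paper's exact SI formula is not stated in the abstract; cosine
    similarity is one common choice for comparing such profiles."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Invented relative abundances of five fatty acids in two effluent samples.
secondary = [0.30, 0.25, 0.20, 0.15, 0.10]
tertiary  = [0.40, 0.15, 0.20, 0.10, 0.15]
print(f"SI = {similarity_index(secondary, tertiary):.2f}")
```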

  6. User Studies: Developing Learning Strategy Tool Software for Children.

    Science.gov (United States)

    Fitzgerald, Gail E.; Koury, Kevin A.; Peng, Hsinyi

    This paper is a report of user studies for developing learning strategy tool software for children. The prototype software demonstrated is designed for children with learning and behavioral disabilities. The tools consist of easy-to-use templates for creating organizational, memory, and learning approach guides for use in classrooms and at home.…

  7. Developing and Validating a New Classroom Climate Observation Assessment Tool.

    Science.gov (United States)

    Leff, Stephen S; Thomas, Duane E; Shapiro, Edward S; Paskewich, Brooke; Wilson, Kim; Necowitz-Hoffman, Beth; Jawad, Abbas F

    2011-01-01

    The climate of school classrooms, shaped by a combination of teacher practices and peer processes, is an important determinant for children's psychosocial functioning and is a primary factor affecting bullying and victimization. Given that there are relatively few theoretically-grounded and validated assessment tools designed to measure the social climate of classrooms, our research team developed an observation tool through participatory action research (PAR). This article details how the assessment tool was designed and preliminarily validated in 18 third-, fourth-, and fifth-grade classrooms in a large urban public school district. The goals of this study are to illustrate the feasibility of a PAR paradigm in measurement development, ascertain the psychometric properties of the assessment tool, and determine associations with different indices of classroom levels of relational and physical aggression.

  8. Importance of implementing an analytical quality control system in a core laboratory.

    Science.gov (United States)

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for the screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compares its data with other laboratories through external quality control. In this way it has a tool to verify that the set objectives are fulfilled and, in case of errors, to allow corrective actions to be taken and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment at set intervals (6 months) to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals in order to ensure that they are meeting pre-determined specifications and, if not, apply the appropriate corrective actions
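
    As a minimal sketch of the statistics involved (control values and the target are invented; the z = 1.65 total-error convention is a common clinical-chemistry choice, not necessarily this laboratory's):

```python
import statistics

# Estimate bias, imprecision (CV) and total analytical error from repeated
# control measurements. TE = |bias| + 1.65 * CV is a common convention.
control_results = [5.1, 4.9, 5.2, 5.0, 4.8, 5.3, 5.1, 5.0]  # mmol/L
target = 5.0                                                 # assigned value

mean = statistics.mean(control_results)
sd = statistics.stdev(control_results)
cv = 100.0 * sd / mean                         # random error, %
bias = 100.0 * (mean - target) / target        # systematic error, %
total_error = abs(bias) + 1.65 * cv            # total analytical error, %

print(f"mean={mean:.2f}  CV={cv:.1f}%  bias={bias:+.1f}%  TE={total_error:.1f}%")
```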

  9. An Integrated Development Tool for a safety application using FBD language

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jun; Lee, Jang Soo; Lee, Dong Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    With the digitalization of nuclear instrumentation and control (I and C) systems, the application program responsible for the safety functions of nuclear I and C systems shall ensure the robustness of the safety function through development, testing, and validation roles across the software development life cycle. The importance of software in nuclear systems increases continuously. Integrated engineering tools to develop, test, and validate safety application programs must handle increasingly complex parts among the many components within nuclear digital I and C systems. This paper introduces the integrated engineering tool (SafeCASE-PLC) developed by our project. The SafeCASE-PLC is a software engineering tool to develop, test, and validate the nuclear application program performed in an automatic controller

  10. Development and implementation of information systems for the DOE's National Analytical Management Program (NAMP)

    International Nuclear Information System (INIS)

    Streets, W. E.

    1999-01-01

    The Department of Energy (DOE) faces a challenging environmental management effort, including environmental protection, environmental restoration, waste management, and decommissioning. This effort requires extensive sampling and analysis to determine the type and level of contamination and the appropriate technology for cleanup, and to verify compliance with environmental regulations. Data obtained from these sampling and analysis activities are used to support environmental management decisions. Confidence in the data is critical, having legal, regulatory, and therefore, economic impact. To promote quality in the planning, management, and performance of these sampling and analysis operations, DOE's Office of Environmental Management (EM) has established the National Analytical Management Program (NAMP). With a focus on reducing the estimated costs of over $200M per year for EM's analytical services, NAMP has been charged with developing products that will decrease the costs for DOE complex-wide environmental management while maintaining quality in all aspects of the analytical data generation. As part of this thrust to streamline operations, NAMP is developing centralized information systems that will allow DOE complex personnel to share information about EM contacts at the various sites, pertinent methodologies for environmental restoration and waste management, costs of analyses, and performance of contracted laboratories

  11. Developing the role of big data and analytics in health professional education.

    Science.gov (United States)

    Ellaway, Rachel H; Pusic, Martin V; Galbraith, Robert M; Cameron, Terri

    2014-03-01

    As we capture more and more data about learners, their learning, and the organization of their learning, our ability to identify emerging patterns and to extract meaning grows exponentially. The insights gained from the analyses of these large amounts of data are only helpful to the extent that they can be the basis for positive action such as knowledge discovery, improved capacity for prediction, and anomaly detection. Big Data involves the aggregation and melding of large and heterogeneous datasets, while education analytics involves looking for patterns in educational practice or performance in single or aggregate datasets. Although it seems likely that the use of education analytics and Big Data techniques will have a transformative impact on health professional education, there is much yet to be done before they can become part of mainstream health professional education practice. If health professional education is to be accountable for how its programs are run and developed, then health professional educators will need to be ready to deal with the complex and compelling dynamics of analytics and Big Data. This article provides an overview of these emerging techniques in the context of health professional education.

  12. Development and Testing of the Church Environment Audit Tool.

    Science.gov (United States)

    Kaczynski, Andrew T; Jake-Schoffman, Danielle E; Peters, Nathan A; Dunn, Caroline G; Wilcox, Sara; Forthofer, Melinda

    2018-05-01

    In this paper, we describe development and reliability testing of a novel tool to evaluate the physical environment of faith-based settings pertaining to opportunities for physical activity (PA) and healthy eating (HE). Tool development was a multistage process including a review of similar tools, stakeholder review, expert feedback, and pilot testing. Final tool sections included indoor opportunities for PA, outdoor opportunities for PA, food preparation equipment, kitchen type, food for purchase, beverages for purchase, and media. Two independent audits were completed at 54 churches. Interrater reliability (IRR) was determined with Kappa and percent agreement. Of 218 items, 102 were assessed for IRR and 116 could not be assessed because they were not present at enough churches. Percent agreement for all 102 items was over 80%. For 42 items, the sample was too homogeneous to assess Kappa. Forty-six of the remaining items had Kappas greater than 0.60 (25 items 0.80-1.00; 21 items 0.60-0.79), indicating substantial to almost perfect agreement. The tool proved reliable and efficient for assessing church environments and identifying potential intervention points. Future work can focus on applications within faith-based partnerships to understand how church environments influence diverse health outcomes.
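
    As a small illustration of the reported IRR statistics (the two raters' binary item ratings below are invented, not study data), percent agreement and Cohen's kappa for one item can be computed as:

```python
# Cohen's kappa and percent agreement for two raters on one binary item.
rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]

n = len(rater1)
p_o = sum(a == b for a, b in zip(rater1, rater2)) / n   # observed agreement
p1 = sum(rater1) / n                                    # rater 1 "present" rate
p2 = sum(rater2) / n                                    # rater 2 "present" rate
p_e = p1 * p2 + (1 - p1) * (1 - p2)                     # chance agreement
kappa = (p_o - p_e) / (1 - p_e)

print(f"agreement = {100*p_o:.0f}%  kappa = {kappa:.2f}")
```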

  13. Modeling decision making as a support tool for policy making on renewable energy development

    International Nuclear Information System (INIS)

    Cannemi, Marco; García-Melón, Mónica; Aragonés-Beltrán, Pablo; Gómez-Navarro, Tomás

    2014-01-01

    This paper presents the findings of a study on decision making models for the analysis of capital-risk investors’ preferences on biomass power plant projects. The aim of the work is to improve the support tools for policy makers in the field of renewable energy development. The Analytic Network Process (ANP) helps to better understand capital-risk investors’ preferences towards different kinds of biomass-fueled power plants. The results of the research allow public administration to better foresee investors’ reactions to the incentive system, or to modify the incentive system to better drive investors’ decisions. Changing the incentive system is seen as a major risk by investors. Therefore, public administration must design better and longer-term incentive systems, forecasting market reactions. For that, two scenarios have been designed, one showing a typical decision making process and another proposing an improved decision making scenario. A case study conducted in Italy has revealed that ANP makes it possible to understand how capital-risk investors interpret the situation and make decisions when investing in biomass power plants; the differences between the interests of public administrations and promoters; how decision making could be influenced by adding new decision criteria; and which case would be ranked best according to the decision models. - Highlights: • We applied ANP to the investors’ preferences on biomass power plants projects. • The aim is to improve the advising tools for renewable energy policy making. • A case study has been carried out with the help of two experts. • We designed two scenarios: decision making as it is and how it could be improved. • Results prove ANP is a fruitful tool enhancing participation and transparency
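
    The abstract does not show the ANP computation; in general, ANP derives global priorities as the limit of powers of a column-stochastic weighted supermatrix. The sketch below uses an invented 4x4 supermatrix (two criteria c1, c2 and two alternatives a1, a2, with feedback), not the paper's model:

```python
import numpy as np

# Minimal ANP sketch: the limit of the powers of a column-stochastic
# weighted supermatrix gives the global priorities.
labels = ["c1", "c2", "a1", "a2"]
W = np.array([
    [0.0, 0.2, 0.6, 0.3],
    [0.2, 0.0, 0.4, 0.7],
    [0.5, 0.4, 0.0, 0.0],
    [0.3, 0.4, 0.0, 0.0],
])  # each column sums to 1

P = W.copy()
for _ in range(200):                 # W^k converges for an irreducible,
    P_next = P @ W                   # aperiodic supermatrix
    if np.allclose(P_next, P, atol=1e-12):
        break
    P = P_next

priorities = P[:, 0]                 # in the limit every column is identical
for lab, p in zip(labels, priorities):
    print(f"{lab}: {p:.3f}")
```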

  14. Value engineering on the designed operator work tools for brick and ring wells production

    Science.gov (United States)

    Ayu Bidiawati J., R.; Muchtiar, Yesmizarti; Wariza, Ragil Okta

    2017-06-01

    Operator working tools for making bricks and ring wells were designed and built, and value engineering was applied to identify and develop the functions of these tools, seeking a balance between cost, reliability and appearance. This study focused on the value of the tools' functional components and attempted to widen the margin between the costs incurred and the value generated. The purpose of this study was to generate alternative tool designs and to determine the performance of each alternative. The technique was developed using the FAST method, which consists of five stages: information, creative, analytical, development and presentation. The analysis concluded that the designed tools have higher value and a better functional description. Four alternative design improvements for the operator working tools were drafted, and the best alternative was selected by rank using an evaluation matrix, as sketched below. The best performance was obtained by alternative II, scoring 98.92 with a value of 0.77.
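
    The record does not reproduce the study's evaluation matrix, but the ranking step can be illustrated with a hedged sketch (the weights and scores below are invented, not the study's data): each alternative's weighted score is the sum of its criterion scores multiplied by the criterion weights, and the highest score ranks first.

      # Hypothetical criterion weights reflecting cost, reliability, appearance.
      criteria_weights = {"cost": 0.4, "reliability": 0.35, "appearance": 0.25}

      # Hypothetical scores (0-100) for four alternative tool designs.
      alternatives = {
          "alt I":   {"cost": 70, "reliability": 80, "appearance": 60},
          "alt II":  {"cost": 85, "reliability": 90, "appearance": 75},
          "alt III": {"cost": 60, "reliability": 85, "appearance": 80},
          "alt IV":  {"cost": 75, "reliability": 70, "appearance": 85},
      }

      def weighted_score(scores, weights):
          """Sum of criterion scores weighted by criterion importance."""
          return sum(weights[c] * s for c, s in scores.items())

      ranked = sorted(alternatives.items(),
                      key=lambda kv: weighted_score(kv[1], criteria_weights),
                      reverse=True)
      for name, scores in ranked:
          print(name, round(weighted_score(scores, criteria_weights), 2))
      # -> alt II 84.25, alt IV 75.75, alt III 73.75, alt I 71.0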

  15. Big Data Analytics and Machine Intelligence Capability Development at NASA Langley Research Center: Strategy, Roadmap, and Progress

    Science.gov (United States)

    Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward

    2016-01-01

    In 2014, a team of researchers, engineers and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, with the aim of identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid- and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.

  16. Experimental and analytical combined thermal approach for local tribological understanding in metal cutting

    International Nuclear Information System (INIS)

    Artozoul, Julien; Lescalier, Christophe; Dudzinski, Daniel

    2015-01-01

    Metal cutting is a highly complex thermo-mechanical process. Knowledge of the temperature in the chip forming zone is essential to understanding it. Conventional experimental methods such as thermocouples only provide global information, which is incompatible with the high stress and temperature gradients found in the chip forming zone. Field measurements are essential to understand this localized thermo-mechanical problem. An experimental protocol was developed using advanced infrared imaging to measure the temperature distribution in both the tool and the chip during an orthogonal or oblique cutting operation. It also provides information on the chip formation process: geometrical characteristics (tool-chip contact length, chip thickness, primary shear angle) and thermo-mechanical quantities (heat flux dissipated in the deformation zone, local interface heat partition ratio). A study was carried out on the effects of cutting conditions (cutting speed, feed and depth of cut) on the temperature distribution along the contact zone for an elementary operation. An analytical thermal model was developed to process the experimental data and extract further information, i.e. local stress or heat flux distributions. - Highlights: • A thermal analytical model is proposed for the orthogonal cutting process. • IR thermography is used during cutting tests. • Combined experimental and modeling approaches are applied. • Heat flux and stress distribution at the tool-chip interface are determined. • The decomposition into sticking and sliding zones is defined.
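
    The authors' analytical model is not given in this record; as a point of reference only, a classical first-order estimate of how frictional heat divides between two bodies in sliding contact (Charron's relation, based on thermal effusivity) can be sketched as follows, with assumed room-temperature material properties:

      import math

      def effusivity(k, rho, c):
          """Thermal effusivity e = sqrt(k * rho * c): conductivity (W/m/K),
          density (kg/m^3), specific heat capacity (J/kg/K)."""
          return math.sqrt(k * rho * c)

      # Assumed properties: AISI 1045 steel chip, uncoated carbide tool.
      e_chip = effusivity(k=50.0, rho=7850.0, c=480.0)
      e_tool = effusivity(k=80.0, rho=14500.0, c=220.0)

      # Fraction of the frictional heat flowing into the chip.
      r_chip = e_chip / (e_chip + e_tool)
      print(f"heat partition to chip: {r_chip:.2f}")  # ~0.46 with these values

    This is a global, constant estimate; the point of the paper's combined experimental-analytical approach is precisely that the measured partition ratio varies locally along the tool-chip contact.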

  17. Getting started with Greenplum for big data analytics

    CERN Document Server

    Gollapudi, Sunila

    2013-01-01

    A standard, tutorial-based approach. "Getting Started with Greenplum for Big Data Analytics" is great for data scientists and data analysts with a basic knowledge of data warehousing and business intelligence platforms who are new to Big Data and looking for a good grounding in how to use the Greenplum platform. It is assumed that you have some experience with database design and programming and are familiar with analytics tools such as R and Weka.

  18. A shipboard comparison of analytic methods for ballast water compliance monitoring

    Science.gov (United States)

    Bradie, Johanna; Broeg, Katja; Gianoli, Claudio; He, Jianjun; Heitmüller, Susanne; Curto, Alberto Lo; Nakata, Akiko; Rolke, Manfred; Schillak, Lothar; Stehouwer, Peter; Vanden Byllaardt, Julie; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah

    2018-03-01

    Promising approaches for indicative analysis of ballast water samples have been developed, but they require study in the field to examine their utility for determining compliance with the International Convention for the Control and Management of Ships' Ballast Water and Sediments. To address this gap, a voyage was undertaken on board the RV Meteor, sailing the North Atlantic Ocean from Mindelo (Cape Verde) to Hamburg (Germany) during June 4-15, 2015. Trials were conducted on local sea water taken up by the ship's ballast system at multiple locations along the route, including open ocean, North Sea, and coastal waters, to evaluate a number of analytic methods that measure the numeric concentration or biomass of viable organisms in two size categories (≥ 50 μm in minimum dimension: 7 techniques; ≥ 10 μm and < 50 μm in minimum dimension), alongside standard microscopy and scientific approaches (e.g. flow cytometry). Several promising indicative methods were identified that correlated highly with microscopy while allowing much quicker processing and requiring less expert knowledge. This study is the first to concurrently use a large number of analytic tools to examine a variety of ballast water samples on board an operational ship in the field. The results are useful for identifying the merits of each method and can serve as a basis for further improvement and development of tools and methodologies for ballast water compliance monitoring.
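
    To make the method comparison concrete, here is a minimal sketch (with hypothetical counts, not the study's data) of correlating an indicative method's organism counts against standard microscopy counts on the same samples:

      import statistics

      # Viable organisms per unit volume for the same samples (assumed values).
      microscopy = [12, 45, 3, 150, 30, 8, 60]   # reference standard
      indicative = [10, 50, 5, 140, 33, 9, 55]   # rapid indicative method

      # Pearson's r (statistics.correlation requires Python 3.10+).
      r = statistics.correlation(microscopy, indicative)
      print(f"Pearson r = {r:.3f}")  # a high r supports the rapid method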

  19. Development of a competency mapping tool for undergraduate professional degree programmes, using mechanical engineering as a case study

    Science.gov (United States)

    Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John

    2018-01-01

    Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that streamlines and standardises the competency mapping process. The available analytics facilitate ongoing programme review, management, and accreditation. The complete mapping and analysis of an Australian mechanical engineering degree programme is described as a case study. Each subject is mapped by evaluating the amount and depth of competence development present. Combining subject results then enables highly detailed programme-level analysis. The mapping process is designed to be administratively light, with aspects of professional development embedded in the software. The effective competence mapping described in this paper enables quantification of learning within a professional degree programme, and provides a mechanism for holistic programme improvement.
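
    The paper's software schema is not given in this record; a minimal sketch of the underlying data structure (the indicator names and the 1-3 depth scale below are assumed for illustration) might aggregate subject-level mappings to programme level like this:

      from collections import defaultdict

      # Each subject maps competency indicators to a depth of development
      # (1-3, e.g. introduced / developed / assessed). Values are hypothetical.
      subject_maps = {
          "Statics":        {"EA1.1": 2, "EA2.1": 1},
          "Thermodynamics": {"EA1.1": 3, "EA1.2": 2},
          "Design Project": {"EA2.1": 3, "EA3.2": 3},
      }

      programme = defaultdict(lambda: {"count": 0, "max_depth": 0})
      for subject, comps in subject_maps.items():
          for indicator, depth in comps.items():
              entry = programme[indicator]
              entry["count"] += 1  # how many subjects develop this indicator
              entry["max_depth"] = max(entry["max_depth"], depth)

      # Programme-level view: flags indicators that are thin or shallow.
      for indicator, entry in sorted(programme.items()):
          print(indicator, entry)

    Rolling subject-level evidence up this way is what enables the programme-wide review and accreditation analytics the paper describes.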

  20. Neutronic analyses and tools development efforts in the European DEMO programme

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, U., E-mail: ulrich.fischer@kit.edu [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Bachmann, C. [European Fusion Development Agreement (EFDA), Garching (Germany); Bienkowska, B. [Association IPPLM-Euratom, IPPLM Warsaw/INP Krakow (Poland); Catalan, J.P. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Drozdowicz, K.; Dworak, D. [Association IPPLM-Euratom, IPPLM Warsaw/INP Krakow (Poland); Leichtle, D. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Fusion for Energy (F4E), Barcelona (Spain); Lengar, I. [MESCS-JSI, Ljubljana (Slovenia); Jaboulay, J.-C. [CEA, DEN, Saclay, DM2S, SERMA, F-91191 Gif-sur-Yvette (France); Lu, L. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Moro, F. [Associazione ENEA-Euratom, ENEA Fusion Division, Frascati (Italy); Mota, F. [Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT), Madrid (Spain); Sanz, J. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Szieberth, M. [Budapest University of Technology and Economics (BME), Budapest (Hungary); Palermo, I. [Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT), Madrid (Spain); Pampin, R. [Fusion for Energy (F4E), Barcelona (Spain); Porton, M. [Euratom/CCFE Fusion Association, Culham Science Centre for Fusion Energy (CCFE), Culham (United Kingdom); Pereslavtsev, P. [Association KIT-Euratom, Karlsruhe Institute of Technology (KIT), Karlsruhe (Germany); Ogando, F. [Universidad Nacional de Educación a Distancia (UNED), Madrid (Spain); Rovni, I. [Budapest University of Technology and Economics (BME), Budapest (Hungary); and others

    2014-10-15

    Highlights: • Evaluation of neutronic tools for application to DEMO nuclear analyses. • Generation of a DEMO model for nuclear analyses based on MC calculations. • Nuclear analyses of the DEMO reactor equipped with a HCLL-type blanket. -- Abstract: The European Fusion Development Agreement (EFDA) recently launched a programme on Power Plant Physics and Technology (PPPT) with the aim of developing a conceptual design of a fusion demonstration reactor (DEMO) addressing key technology and physics issues. A dedicated part of the PPPT programme is devoted to neutronics, which, among other tasks, has to define and verify requirements and boundary conditions for the DEMO systems. The quality of the provided data depends on the capabilities and reliability of the computational tools. Accordingly, the PPPT activities in the area of neutronics include both DEMO nuclear analyses and development efforts on neutronic tools, including their verification and validation. This paper reports on the first neutronics studies performed for DEMO, and on the evaluation and further development of neutronic tools.