WorldWideScience

Sample records for valuable analytical tool

  1. Analytics for smart energy management tools and applications for sustainable manufacturing

    CERN Document Server

    Oh, Seog-Chan

    2016-01-01

    This book introduces the issues and problems that arise when implementing smart energy management for sustainable manufacturing in the automotive industry, together with the analytical tools and applications to deal with them. It uses a number of illustrative examples to explain energy management in automotive manufacturing, which involves most types of manufacturing technology and various levels of energy consumption. It demonstrates how analytical tools can help improve energy management processes, including forecasting, consumption and performance analysis, identification of emerging new technologies, and investment decisions for establishing smart energy consumption practices. It also details practical energy management systems, making it a valuable resource for professionals involved in real energy management processes and allowing readers to implement the procedures and applications presented.

  2. Identity as an Analytical Tool to Explore Students’ Mathematical Writing

    DEFF Research Database (Denmark)

    Iversen, Steffen Møllegaard

    Learning to communicate in, with and about mathematics is a key part of learning mathematics (Niss & Højgaard, 2011). Therefore, understanding how students’ mathematical writing is shaped is important to mathematics education research. In this paper the notion of ‘writer identity’ (Ivanič, 1998; ......; Burgess & Ivanič, 2010) is introduced and operationalised in relation to students’ mathematical writing and through a case study it is illustrated how this analytic tool can facilitate valuable insights when exploring students’ mathematical writing....

  3. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

    ....../purification. Of the analytical methods tested, Cryo-transmission electron microscopy (Cryo-TEM) and atomic force microscopy (AFM) turned out to be advantageous for polymersomes with diameters smaller than 200 nm, whereas confocal microscopy is ideal for diameters >400 nm. Polymersomes in the intermediate diameter range can be characterized...... using freeze fracture Cryo-scanning electron microscopy (FF-Cryo-SEM) and nanoparticle tracking analysis (NTA). Small angle X-ray scattering (SAXS) provides reliable data on bilayer thickness and internal structure, Cryo-TEM on multilamellarity. Taken together, these tools are valuable...

  4. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    Science.gov (United States)

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool for selecting the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool for choosing the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
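
The partial-order ranking behind the Hasse diagram technique can be sketched in a few lines. Everything below is invented for illustration: the procedure names, the three criteria (solvent volume, limit of detection, number of steps), and the scores, with lower assumed better on every criterion. The maximal elements of the dominance relation form the top level of a Hasse diagram.

```python
# Hypothetical procedures scored as (solvent mL, LOD ng/g, number of steps);
# all values are illustrative, not taken from the study.
procedures = {
    "GC-MS":      (10, 0.5, 3),
    "HPLC-FLD":   (25, 0.2, 4),
    "SPME-GC":    (2,  0.8, 2),
    "Soxhlet-GC": (40, 0.9, 5),
}

def dominates(a, b):
    """a dominates b if a is no worse on every criterion and better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Maximal elements of the partial order: procedures not dominated by any other.
maximal = [p for p, s in procedures.items()
           if not any(dominates(t, s) for q, t in procedures.items() if q != p)]
print(sorted(maximal))  # -> ['GC-MS', 'HPLC-FLD', 'SPME-GC']
```

With these numbers "Soxhlet-GC" is dominated by every other procedure, while the remaining three are mutually incomparable, which is exactly the kind of multi-criteria conflict the HDT makes visible.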

  5. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

    Selecting the appropriate analytical methods for characterizing the assembly and morphology of polymer-based vesicles, or polymersomes, is required for them to reach their full potential in biotechnology. This work presents and compares 17 different techniques for their ability to adequately report size....../purification. Of the analytical methods tested, Cryo-transmission electron microscopy (Cryo-TEM) and atomic force microscopy (AFM) turned out to be advantageous for polymersomes with diameters smaller than 200 nm, whereas confocal microscopy is ideal for diameters >400 nm. Polymersomes in the intermediate diameter range can be characterized...... using freeze fracture Cryo-scanning electron microscopy (FF-Cryo-SEM) and nanoparticle tracking analysis (NTA). Small angle X-ray scattering (SAXS) provides reliable data on bilayer thickness and internal structure, Cryo-TEM on multilamellarity. Taken together, these tools are valuable...

  6. Developing a learning analytics tool

    DEFF Research Database (Denmark)

    Wahl, Christian; Belle, Gianna; Clemmensen, Anita Lykke

    This poster describes how learning analytics and collective intelligence can be combined in order to develop a tool for providing support and feedback to learners and teachers regarding students' self-initiated learning activities.

  7. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools, exploring their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts to obtain useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study of perceived user satisfaction with web analytics tools. The empirical study was carried out on employees of 200 Croatian firms, from either the IT or the marketing branch. The paper highlights the support that the web analytics and web metrics tools available on the market have to offer to management, based on the growing need to understand and predict global market trends.

  8. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams can be used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, colored from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Method for effective usage of Google Analytics tools

    Directory of Open Access Journals (Sweden)

    Ирина Николаевна Егорова

    2016-01-01

    Modern Google Analytics tools were investigated with respect to effective channels for attracting users and to bottleneck detection. The investigation made it possible to propose a method for the effective use of Google Analytics tools. The method is based on the analysis of the main traffic indicators, as well as on a deep analysis of goals and their consecutive tweaking. The method makes it possible to increase website conversion and may be useful for SEO and web analytics specialists.

  10. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    Science.gov (United States)

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.

  11. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    Section 385.33, Navigation and Navigable Waters, CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan. § 385.33 Revisions to models and analytical tools. (a) In carrying... ... on a case-by-case basis what documentation is appropriate for revisions to models and analytic tools...

  12. Evaluation of Linked Data tools for Learning Analytics

    NARCIS (Netherlands)

    Drachsler, Hendrik; Herder, Eelco; d'Aquin, Mathieu; Dietze, Stefan

    2013-01-01

    Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013, 8-12 April). Evaluation of Linked Data tools for Learning Analytics. Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  13. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many of these tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not mastered. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.

  14. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of and social...... media conversations about organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work....

  15. Analytical framework and tool kit for SEA follow-up

    International Nuclear Information System (INIS)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran; Jonsson, Daniel K.; Lundberg, Kristina; Tyskeng, Sara; Wallgren, Oskar

    2009-01-01

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate

  16. Motivational interviewing: a valuable tool for the psychiatric advanced practice nurse.

    Science.gov (United States)

    Karzenowski, Abby; Puskar, Kathy

    2011-01-01

    Motivational Interviewing (MI) is well known and respected by many health care professionals. Developed by Miller and Rollnick (2002), it is a way to promote behavior change from within and resolve ambivalence. MI is individualized and is most commonly used in the psychiatric setting; it is a valuable tool for the Psychiatric Advanced Practice Nurse. There are many resources that describe what MI is and the principles used to apply it. However, there is little information about how to incorporate MI into a clinical case. This article provides a summary of articles related to MI, discusses two case studies using MI, and explains why advanced practice nurses should use MI with their patients.

  17. Hypnosis as a Valuable Tool for Surgical Procedures in the Oral and Maxillofacial Area.

    Science.gov (United States)

    Montenegro, Gil; Alves, Luiza; Zaninotto, Ana Luiza; Falcão, Denise Pinheiro; de Amorim, Rivadávio Fernandes Batista

    2017-04-01

    Hypnosis is a valuable tool in the management of patients who undergo surgical procedures in the maxillofacial complex, particularly in reducing and eliminating pain during surgery and aiding patients who have dental fear and are allergic to anesthesia. This case report demonstrates the efficacy of hypnosis in mitigating anxiety, bleeding, and pain during dental surgery without anesthesia during implant placement of tooth 14, the upper left first molar.

  18. Data mining and business analytics with R

    CERN Document Server

    Ledolter, Johannes

    2013-01-01

    Collecting, analyzing, and extracting valuable information from a large amount of data requires easily accessible, robust, computational and analytical tools. Data Mining and Business Analytics with R utilizes the open source software R for the analysis, exploration, and simplification of large high-dimensional data sets. As a result, readers are provided with the needed guidance to model and interpret complicated data and become adept at building powerful models for prediction and classification. Highlighting both underlying concepts and practical computational skills, Data Mining

  19. Guidance for the Design and Adoption of Analytic Tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  20. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes assayed on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
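
The median-based check described above can be sketched in a few lines. All numbers below are invented for illustration: the patient results, the target median, and the allowable-bias percentage (in practice the specification would be derived from biological variation data for the analyte).

```python
# Flag months whose median of patient results deviates from a target median
# by more than an allowable analytical bias. Values are hypothetical.
import statistics

monthly_results = {
    "Jan": [4.1, 4.3, 4.0, 4.2, 4.4],
    "Feb": [4.2, 4.1, 4.3, 4.2, 4.0],
    "Mar": [4.8, 4.9, 4.7, 5.0, 4.8],   # a drifted month
}
target_median = 4.2        # long-term reference median (assumed)
allowable_bias_pct = 5.0   # allowable bias specification (assumed)

flagged = []
for month, results in monthly_results.items():
    median = statistics.median(results)
    deviation_pct = 100 * (median - target_median) / target_median
    if abs(deviation_pct) > allowable_bias_pct:
        flagged.append(month)

print(flagged)  # -> ['Mar']
```

The appeal of the approach is visible even in this sketch: the check needs nothing beyond the patient results the laboratory already produces.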

  1. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    Science.gov (United States)

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  2. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    Directory of Open Access Journals (Sweden)

    Šebjan Urban

    2016-02-01

    Background and Purpose: Information solutions for analytical customer relationship management (aCRM IS) that include the use of analytical tools are becoming increasingly important, due to organizations' need for knowledge of their customers and the ability to manage big data. The objective of the research is, therefore, to determine how the organizations' orientations (process, innovation, and technology), as critical organizational factors, affect the attitude towards the use of the analytical tools of aCRM IS.

  3. Development of computer-based analytical tool for assessing physical protection system

    Energy Technology Data Exchange (ETDEWEB)

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for assessing likely threat scenarios. Several currently available tools, such as EASI and SAPE, can be used instantly, but for our research purposes it is more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  4. Development of computer-based analytical tool for assessing physical protection system

    International Nuclear Information System (INIS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for assessing likely threat scenarios. Several currently available tools, such as EASI and SAPE, can be used instantly, but for our research purposes it is more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  5. Development of computer-based analytical tool for assessing physical protection system

    Science.gov (United States)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for assessing likely threat scenarios. Several currently available tools, such as EASI and SAPE, can be used instantly, but for our research purposes it is more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
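
The network approach to adversary-path modelling can be sketched with a toy facility graph. Node names and per-segment detection probabilities below are invented, and real tools (EASI-style models) also weigh delay and response times, which this sketch omits: here the most critical path is simply the one the detection system is least likely to interrupt.

```python
# adjacency: node -> {neighbor: probability the adversary is detected on that
# segment}; all numbers are hypothetical.
graph = {
    "offsite":  {"fence": 0.2, "gate": 0.5},
    "fence":    {"building": 0.6},
    "gate":     {"building": 0.4},
    "building": {"target": 0.7},
    "target":   {},
}

def all_paths(node, goal, path=None):
    """Enumerate all simple paths from node to goal by depth-first search."""
    path = (path or []) + [node]
    if node == goal:
        yield path
        return
    for nxt in graph[node]:
        if nxt not in path:
            yield from all_paths(nxt, goal, path)

def cumulative_detection(path):
    """Probability that at least one segment along the path detects the adversary."""
    p_miss = 1.0
    for a, b in zip(path, path[1:]):
        p_miss *= 1.0 - graph[a][b]
    return 1.0 - p_miss

# Most critical path = minimum cumulative detection probability.
critical = min(all_paths("offsite", "target"), key=cumulative_detection)
print(critical, round(cumulative_detection(critical), 3))
```

Brute-force enumeration is fine for a small graph like this; a production tool would use a shortest-path algorithm over -log(1 - p) edge weights instead.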

  6. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype open-source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
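
The abstract does not name the disproportionality statistic used; the proportional reporting ratio (PRR) below is one standard choice in pharmacovigilance, shown with invented counts of indexed citations mentioning a drug and/or an adverse event.

```python
# Proportional reporting ratio over a 2x2 contingency table of citation counts.
# All counts are hypothetical.
def prr(a, b, c, d):
    """a: drug & event, b: drug & other events,
    c: other drugs & event, d: other drugs & other events."""
    return (a / (a + b)) / (c / (c + d))

score = prr(a=20, b=80, c=100, d=9800)
print(round(score, 2))  # a PRR well above 2 is commonly treated as a signal
```

Whether the prototype uses PRR, ROR, or another disproportionality measure, the shape of the computation is the same: compare how often the event co-occurs with the drug of interest against its background rate.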

  7. Analytical Web Tool for CERES Products

    Science.gov (United States)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

    The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year-long data set has proven invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health and environmental companies, has shown interest in CERES derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies can offer to both Quality Control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions that address the above. Displaying an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field to identify outliers, which is useful for QC purposes; ii) an "Anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with the on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, the execution was done either on the browser side or on the server side with the help of auxiliary files. Additional challenges came from the use of various open source applications, the multitude of CERES products and the seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the
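
The "Anomaly" map in function (ii) reduces to a per-grid-cell subtraction of the climatological monthly mean from the current month. A minimal sketch on a made-up 2x3 grid of flux values (W m^-2):

```python
# Monthly anomaly = current month minus climatological mean of the same
# calendar month; all values are invented for illustration.
import numpy as np

climatology = np.array([[240.0, 250.0, 260.0],
                        [230.0, 245.0, 255.0]])
current_month = np.array([[243.0, 248.0, 265.0],
                          [228.0, 249.0, 254.0]])

anomaly = current_month - climatology
print(anomaly)
```

On the real product the same elementwise difference runs over a global lat/lon grid, with the climatology built from the mean of the same calendar month across prior years.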

  8. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models that have emerged in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: for example, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  9. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD......: Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable...... analytical bias based on biological variation. RESULTS: Seventy-five percent of the twenty analytes assayed on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance...

  10. LitPathExplorer: a confidence-based visual text analytics tool for exploring literature-enriched pathway models.

    Science.gov (United States)

    Soto, Axel J; Zerva, Chrysoula; Batista-Navarro, Riza; Ananiadou, Sophia

    2018-04-15

    Pathway models are valuable resources that help us understand the various mechanisms underpinning complex biological processes. Their curation is typically carried out through manual inspection of published scientific literature to find information relevant to a model, which is a laborious and knowledge-intensive task. Furthermore, models curated manually cannot be easily updated and maintained with new evidence extracted from the literature without automated support. We have developed LitPathExplorer, a visual text analytics tool that integrates advanced text mining, semi-supervised learning and interactive visualization, to facilitate the exploration and analysis of pathway models using statements (i.e. events) extracted automatically from the literature and organized according to levels of confidence. LitPathExplorer supports pathway modellers and curators alike by: (i) extracting events from the literature that corroborate existing models with evidence; (ii) discovering new events which can update models; and (iii) providing a confidence value for each event that is automatically computed based on linguistic features and article metadata. Our evaluation of event extraction showed a precision of 89% and a recall of 71%. Evaluation of our confidence measure, when used for ranking sampled events, showed an average precision ranging between 61 and 73%, which can be improved to 95% when the user is involved in the semi-supervised learning process. Qualitative evaluation using pair analytics based on the feedback of three domain experts confirmed the utility of our tool within the context of pathway model exploration. LitPathExplorer is available at http://nactem.ac.uk/LitPathExplorer_BI/. sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.

  11. 3D-Printed specimens as a valuable tool in anatomy education: A pilot study.

    Science.gov (United States)

    Garas, Monique; Vaccarezza, Mauro; Newland, George; McVay-Doornbusch, Kylie; Hasani, Jamila

    2018-06-06

    Three-dimensional (3D) printing is a modern technique for creating 3D-printed models that allows reproduction of human structures from MRI and CT scans via fusion of multiple layers of resin materials. To assess the feasibility of this innovative resource as an anatomy educational tool, we conducted a preliminary study on Curtin University undergraduate students to investigate the use of 3D models for anatomy learning as the main goal and, as secondary aims, to assess the effectiveness of different specimen types during the sessions and the personally preferred anatomy learning tools among students. The study consisted of a pre-test, exposure to the test (anatomical test) and a post-test survey. During the pre-test, all participants (both with and without prior experience) were given a brief introduction to laboratory safety and the study procedure; participants were then exposed to 3D, wet and plastinated specimens of the heart, shoulder and thigh to identify the pinned structures (anatomical test). Participants were then given a post-test survey containing five questions. In total, 23 participants completed the anatomical test and post-test survey. A larger proportion of participants (85%) answered correctly for the 3D models than for the wet and plastinated materials, 74% selected 3D models as the most usable tool for identifying the pinned structures, and 45% chose 3D models as their preferred method of anatomy learning. This preliminary small-size study affirms the feasibility of 3D-printed models as a valuable asset in anatomy learning and shows their capability to be used alongside cadaveric materials and other widely used tools in anatomy education. Copyright © 2018 Elsevier GmbH. All rights reserved.

  12. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model that can simulate a wide range of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools

  13. Professional regulation: a potentially valuable tool in responding to "stem cell tourism".

    Science.gov (United States)

    Zarzeczny, Amy; Caulfield, Timothy; Ogbogu, Ubaka; Bell, Peter; Crooks, Valorie A; Kamenova, Kalina; Master, Zubin; Rachul, Christen; Snyder, Jeremy; Toews, Maeghan; Zoeller, Sonja

    2014-09-09

    The growing international market for unproven stem cell-based interventions advertised on a direct-to-consumer basis over the internet ("stem cell tourism") is a source of concern because of the risks it presents to patients as well as their supporters, domestic health care systems, and the stem cell research field. Emerging responses such as public and health provider-focused education and national regulatory efforts are encouraging, but the market continues to grow. Physicians play a number of roles in the stem cell tourism market and, in many jurisdictions, are members of a regulated profession. In this article, we consider the use of professional regulation to address physician involvement in stem cell tourism. Although it is not without its limitations, professional regulation is a potentially valuable tool that can be employed in response to problematic types of physician involvement in the stem cell tourism market. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

    Full Text Available This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data to the creation of map charts in SAS Enterprise Guide. Extended features of map charts are exemplified by producing charts via SAS programming procedures.

  15. Performance Marketing with Google Analytics Strategies and Techniques for Maximizing Online ROI

    CERN Document Server

    Tonkin, Sebastian

    2010-01-01

    An unparalleled author trio shares valuable advice for using Google Analytics to achieve your business goals. Google Analytics is a free tool used by millions of Web site owners across the globe to track how visitors interact with their Web sites, where they arrive from, and which visitors drive the most revenue and sales leads. This book offers clear explanations of practical applications drawn from the real world. The author trio of Google Analytics veterans starts with a broad explanation of performance marketing and gets progressively more specific, closing with step-by-step analysis and a

  16. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Full Text Available Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.

  17. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its ability to directly access real-fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented to make PFSAT conveniently portable among a wide variety of potential users, with a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.

  18. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    Science.gov (United States)

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  19. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    Science.gov (United States)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged: different science communities each apply their own selection of tools to the data to produce scientific results, and only infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of each particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this poster will summarize the results and indicate a direction for future infusion attempts.

  20. Analytical Aerodynamic Simulation Tools for Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Deglaire, Paul

    2010-01-01

    Wind power is a renewable energy source that is today the fastest growing means of reducing CO2 emissions in the electric energy mix. The upwind, three-bladed horizontal axis wind turbine has been the preferred technical choice for more than two decades, and this horizontal axis concept is today widely leading the market. The current PhD thesis covers an alternative type of wind turbine, with straight blades rotating about the vertical axis. A brief overview of the main differences between the horizontal and vertical axis concepts is given; however, the main focus of this thesis is the aerodynamics of the wind turbine blades. Making aerodynamically efficient turbines starts with efficient blades, and making efficient blades requires a good understanding of the physical phenomena and effective simulation tools to model them. The specific aerodynamics of straight-bladed vertical axis turbine flow are reviewed together with the standard aerodynamic simulation tools that have been used in the past by blade and rotor designers. A reasonably fast (regarding computer power) and accurate (regarding comparison with experimental results) simulation method was still lacking in the field prior to the current work, and this thesis aims at designing such a method. Analytical methods can be used to model complex flow if the geometry is simple; therefore, a conformal mapping method is derived to transform any set of sections into a set of standard circles. Analytical procedures are then generalized to simulate moving multibody sections in complex vertical axis flows and the forces experienced by the blades. Finally, the fast semi-analytical aerodynamic algorithm, boosted by fast multipole methods to handle a high number of vortices, is coupled with a simple structural model of the rotor to investigate potential aeroelastic instabilities. Together with these advanced simulation tools, a standard double multiple streamtube model has been developed and used to design several straight bladed

  1. Professional Regulation: A Potentially Valuable Tool in Responding to “Stem Cell Tourism”

    Directory of Open Access Journals (Sweden)

    Amy Zarzeczny

    2014-09-01

    Full Text Available The growing international market for unproven stem cell-based interventions advertised on a direct-to-consumer basis over the internet (“stem cell tourism”) is a source of concern because of the risks it presents to patients as well as their supporters, domestic health care systems, and the stem cell research field. Emerging responses such as public and health provider-focused education and national regulatory efforts are encouraging, but the market continues to grow. Physicians play a number of roles in the stem cell tourism market and, in many jurisdictions, are members of a regulated profession. In this article, we consider the use of professional regulation to address physician involvement in stem cell tourism. Although it is not without its limitations, professional regulation is a potentially valuable tool that can be employed in response to problematic types of physician involvement in the stem cell tourism market.

  2. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    Science.gov (United States)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory, since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. There are a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using only a few microliters of organic solvent, or even none at all. Since minimal amounts of reagents are involved and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since that instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of this methodology and gives examples of the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support

  3. Ecotoxicity on a stick: A novel analytical tool for predicting the ecotoxicity of petroleum contaminated samples

    International Nuclear Information System (INIS)

    Parkerton, T.F.; Stone, M.A.

    1995-01-01

    Hydrocarbons generally elicit toxicity via a nonpolar narcotic mechanism. Recent research suggests that chemicals acting by this mode invoke ecotoxicity when the molar concentration in organism lipid exceeds a critical threshold. Since the ecotoxicity of nonpolar narcotic mixtures appears to be additive, the ecotoxicity of hydrocarbon mixtures thus depends upon: (1) the partitioning of the individual hydrocarbons comprising the mixture from the environment to lipids and (2) the total molar sum of the constituent hydrocarbons in lipids. These insights have led previous investigators to advance the concept of biomimetic extraction as a novel tool for assessing potential narcosis-type or baseline ecotoxicity in aqueous samples. Drawing from this earlier work, the authors have developed a method to quantify Bioavailable Petroleum Hydrocarbons (BPHs) in hydrocarbon-contaminated aqueous and soil/sediment samples. A sample is equilibrated with a solid phase microextraction (SPME) fiber that serves as a surrogate for organism lipids. The total moles of hydrocarbons that partition to the SPME fiber are then quantified using a simple GC/FID procedure. Research conducted to support the development and initial validation of this method will be presented. Results suggest that BPH analyses provide a promising, cost-effective approach for predicting the ecotoxicity of environmental samples contaminated with hydrocarbon mixtures. Consequently, BPH analyses may provide a valuable analytical screening tool for ecotoxicity assessment in product and effluent testing, environmental monitoring and site remediation applications
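    The additivity reasoning above can be illustrated with a toy toxic-unit calculation: partition each constituent from water into lipid, sum the molar concentrations, and compare the total against a critical threshold. All compound names, partition coefficients, and the threshold below are hypothetical placeholders, not values from the paper.

```python
# Narcosis-type toxicity is roughly additive on a molar basis in lipid, so the
# predicted hazard is the molar sum in lipid relative to a critical threshold.
# Lipid-water partition coefficients (K_lw, L/kg) and the critical body burden
# below are illustrative placeholders, not measured values.
critical_body_burden = 0.05  # mol hydrocarbon per kg lipid, assumed threshold

mixture = {
    # name: (aqueous concentration, mol/L; lipid-water partition coefficient, L/kg)
    "toluene":      (2.0e-6, 5.0e2),
    "naphthalene":  (5.0e-7, 2.0e3),
    "phenanthrene": (2.0e-8, 3.0e4),
}

# Total molar load in lipid (mol/kg), assuming equilibrium partitioning.
lipid_load = sum(c_aq * k_lw for c_aq, k_lw in mixture.values())
toxic_units = lipid_load / critical_body_burden
print(f"molar lipid load = {lipid_load:.2e} mol/kg, TU = {toxic_units:.2f}")
```

    A toxic-unit sum at or above 1 would indicate the mixture exceeds the assumed critical body burden; the SPME fiber in the paper plays the role of the lipid phase in this calculation.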

  4. Development of the oil spill response cost-effectiveness analytical tool

    International Nuclear Information System (INIS)

    Etkin, D.S.; Welch, J.

    2005-01-01

    Decision-making during oil spill response operations or contingency planning requires balancing the need to remove as much oil as possible from the environment with the desire to minimize the impact of response operations on the environment they are intended to protect. This paper discusses a computer tool developed to support planning and decision-making during response operations. The Oil Spill Response Cost-Effectiveness Analytical Tool (OSRCEAT) was developed to compare the costs of response with the benefits of response in both hypothetical and actual oil spills. The computer-based analytical tool can assist responders and contingency planners in decision-making processes and act as a basis of discussion in the evaluation of response options. Using inputs on spill parameters, location and response options, OSRCEAT can calculate response cost, the costs of the environmental and socioeconomic impacts of the oil spill, and the impacts of the response itself. Oil damages without any response are contrasted with oil damages with response, with expected improvements. Response damages are then subtracted from the difference in damages with and without response to derive a more accurate response benefit. An OSRCEAT user can test various response options and compare their potential benefits in order to maximize the response benefit. OSRCEAT is best used to compare and contrast the relative benefits and costs of various response options. 50 refs., 19 tabs., 2 figs
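    The benefit arithmetic described in the abstract — damages avoided by the response, corrected for damages the response itself causes — can be sketched directly. All monetary figures and option names are invented for illustration; OSRCEAT's actual damage models are far more detailed.

```python
# Net response benefit as described in the abstract:
#   (damages without response - damages with response) - response-caused damages.
# All monetary values below are hypothetical placeholders.
def net_response_benefit(damages_no_response,
                         damages_with_response,
                         response_damages,
                         response_cost):
    """Return (gross benefit, net benefit, net benefit minus response cost)."""
    gross = damages_no_response - damages_with_response
    net = gross - response_damages  # correct for impacts of the response itself
    return gross, net, net - response_cost

# Compare two hypothetical response options for the same spill.
options = {
    "mechanical recovery": dict(damages_no_response=10_000_000,
                                damages_with_response=6_000_000,
                                response_damages=500_000,
                                response_cost=2_000_000),
    "dispersants":         dict(damages_no_response=10_000_000,
                                damages_with_response=7_500_000,
                                response_damages=200_000,
                                response_cost=1_000_000),
}
for name, p in options.items():
    gross, net, net_minus_cost = net_response_benefit(**p)
    print(f"{name}: net benefit ${net:,}  (after cost: ${net_minus_cost:,})")
```

    Ranking options by net benefit after cost mirrors the tool's intended use: comparing the relative cost-effectiveness of response alternatives rather than producing absolute damage figures.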

  5. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for the storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and this analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  6. Analytical sensitivity analysis of geometric errors in a three axis machine tool

    International Nuclear Information System (INIS)

    Park, Sung Ryung; Yang, Seung Han

    2012-01-01

    In this paper, an analytical method is used to perform a sensitivity analysis of geometric errors in a three axis machine tool. First, an error synthesis model is constructed for evaluating the position volumetric error due to the geometric errors, and then an output variable is defined, such as the magnitude of the position volumetric error. Next, the global sensitivity analysis is executed using an analytical method. Finally, the sensitivity indices are calculated using the quantitative values of the geometric errors
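    For a linearized error synthesis model with independent error sources, first-order sensitivity indices have a closed form, which is one way such an analytical global sensitivity analysis can proceed. The error sources, sensitivity coefficients, and variances below are invented for illustration, not taken from the paper.

```python
# For a linear(ized) error model E = sum_i c_i * g_i with independent geometric
# errors g_i, the first-order (variance-based) sensitivity index is analytic:
#   S_i = c_i**2 * Var(g_i) / Var(E)
# The error sources, coefficients, and variances below are illustrative only.
coeff = {           # d(volumetric error)/d(geometric error), assumed values
    "x_straightness": 1.0,
    "y_squareness":   0.8,
    "z_positioning":  0.5,
}
var_g = {           # variance of each geometric error source (um^2), assumed
    "x_straightness": 4.0,
    "y_squareness":   1.0,
    "z_positioning":  9.0,
}

var_E = sum(coeff[k] ** 2 * var_g[k] for k in coeff)
indices = {k: coeff[k] ** 2 * var_g[k] / var_E for k in coeff}
for k, s in sorted(indices.items(), key=lambda kv: -kv[1]):
    print(f"{k}: S = {s:.2f}")
```

    For a linear model with independent inputs the indices sum to one, so they directly rank which geometric errors dominate the volumetric error budget.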

  7. Process analytical technology (PAT) tools for the cultivation step in biopharmaceutical production

    NARCIS (Netherlands)

    Streefland, M.; Martens, D.E.; Beuvery, E.C.; Wijffels, R.H.

    2013-01-01

    The process analytical technology (PAT) initiative is now 10 years old. This has resulted in the development of many tools and software packages dedicated to PAT application on pharmaceutical processes. However, most applications are restricted to small molecule drugs, mainly for the relatively

  8. MALDI TOF imaging mass spectrometry in clinical pathology: a valuable tool for cancer diagnostics (review).

    Science.gov (United States)

    Kriegsmann, Jörg; Kriegsmann, Mark; Casadonte, Rita

    2015-03-01

    Matrix-assisted laser desorption/ionization (MALDI) time-of-flight (TOF) imaging mass spectrometry (IMS) is an evolving technique in cancer diagnostics that combines the advantages of mass spectrometry (proteomics), detection of numerous molecules, and spatial resolution in histological tissue sections and cytological preparations. This method allows the detection of proteins, peptides, lipids, carbohydrates or glycoconjugates and small molecules. Formalin-fixed paraffin-embedded tissue can also be investigated by IMS; thus, this method seems to be an ideal tool for cancer diagnostics and biomarker discovery. It may add information to the identification of tumor margins and tumor heterogeneity. The technique allows tumor typing, especially identification of the tumor of origin in metastatic tissue, as well as grading, and may provide prognostic information. IMS is a valuable method for the identification of biomarkers and can complement histology, immunohistology and molecular pathology in various fields of histopathological diagnostics, especially with regard to the identification and grading of tumors.

  9. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bonvini, Marco [Whisker Labs, Oakland, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Page, Janie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lin, Guanjing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hu, R. Lilly [Univ. of California, Berkeley, CA (United States)

    2017-08-11

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings, often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.

  10. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    Science.gov (United States)

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products in order to maintain critical quality attributes and build quality into the product. PAT can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adopt PAT tools in pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
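    As a minimal illustration of the chemometric side, the sketch below runs a principal component analysis (via SVD) on simulated spectra in which a single hidden process variable drives one spectral band. The wavelength range, peak shape, and noise level are all assumptions made for the example, not real process measurements.

```python
import numpy as np

# PCA via SVD on synthetic "spectra": a minimal example of the kind of
# chemometric dimensionality reduction applied to PAT spectral data.
rng = np.random.default_rng(0)
wavelengths = np.linspace(900, 1700, 200)           # e.g. an NIR range, nm
peak = np.exp(-((wavelengths - 1200) / 40) ** 2)    # one spectral feature

concentrations = rng.uniform(0.1, 1.0, size=30)     # hidden process variable
spectra = np.outer(concentrations, peak)            # Beer-Lambert-like response
spectra += rng.normal(scale=0.01, size=spectra.shape)  # instrument noise

X = spectra - spectra.mean(axis=0)                  # mean-center (standard in PCA)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)                 # variance explained per PC
scores = U * s                                      # sample scores

print(f"PC1 explains {explained[0]:.1%} of variance")
```

    Because one latent variable generates the signal, the first principal component captures nearly all the variance and its scores track the hidden concentrations, which is exactly the behavior chemometric models such as PCA and PLS exploit on real spectra.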

  11. Physical-based analytical model of flexible a-IGZO TFTs accounting for both charge injection and transport

    NARCIS (Netherlands)

    Ghittorelli, M.; Torricelli, F.; van der Steen, J.-L.; Garripoli, C.; Tripathi, A.K.; Gelinck, G.; Cantatore, E.; Kovács-Vajna, Z.M.

    2015-01-01

    Here we show a new physical-based analytical model of a-IGZO TFTs. TFTs scaling from L=200 μm to L=15 μm and fabricated on plastic foil are accurately reproduced with a unique set of parameters. The model is used to design a zero-VGS inverter. It is a valuable tool for circuit design and technology

  12. Physical-based analytical model of flexible a-IGZO TFTs accounting for both charge injection and transport

    NARCIS (Netherlands)

    Ghittorelli, M.; Torricelli, F.; Steen, J.L. van der; Garripoli, C.; Tripathi, A.; Gelinck, G.H.; Cantatore, E.; Kovacs-Vajna, Z.M.

    2016-01-01

    Here we show a new physical-based analytical model of a-IGZO TFTs. TFTs scaling from L=200 μm to L=15 μm and fabricated on plastic foil are accurately reproduced with a unique set of parameters. The model is used to design a zero-VGS inverter. It is a valuable tool for circuit design and technology

  13. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2016-01-01

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous-energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
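    The kind of verification these files support can be illustrated with the simplest analytic benchmark: in a one-group, absorption-only slab, the uncollided transmission of a normal pencil beam is exactly exp(-Σt·x), so a toy Monte Carlo estimate can be checked against the closed form. The sketch below is a standalone illustration and has nothing to do with MCNP's internals; the cross section and thickness values are arbitrary.

```python
import math
import random

# One-group, absorption-only slab: the transmitted fraction of a normal
# pencil beam is exactly exp(-sigma_t * thickness), so a Monte Carlo
# estimate can be verified against the analytic answer.
def transmission_mc(sigma_t, thickness, n_particles, seed=1):
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_particles):
        # distance to first collision is exponentially distributed, mean 1/sigma_t
        d = -math.log(1.0 - rng.random()) / sigma_t
        if d > thickness:        # no collision inside the slab
            transmitted += 1
    return transmitted / n_particles

sigma_t, thickness = 1.0, 2.0    # cm^-1, cm (illustrative values)
mc = transmission_mc(sigma_t, thickness, 100_000)
exact = math.exp(-sigma_t * thickness)
print(f"MC = {mc:.4f}, analytic = {exact:.4f}")
```

    Agreement within a few statistical standard deviations is the pass criterion in this style of verification; the same idea scales up to the one-group and multigroup benchmark problems the ACE-file tools are built for.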

  14. Micromechanical String Resonators: Analytical Tool for Thermal Characterization of Polymers

    DEFF Research Database (Denmark)

    Bose, Sanjukta; Schmid, Silvan; Larsen, Tom

    2014-01-01

    Resonant microstrings show promise as a new analytical tool for thermal characterization of polymers with only a few nanograms of sample. The detection of the glass transition temperature (Tg) of an amorphous poly(d,l-lactide) (PDLLA) and a semicrystalline poly(l-lactide) (PLLA) is investigated....... The polymers are spray coated on one side of the resonating microstrings. The resonance frequency and quality factor (Q) are measured simultaneously as a function of temperature. Change in the resonance frequency reflects a change in static tensile stress, which yields information about the Young’s modulus...... of the polymer, and a change in Q reflects the change in damping of the polymer-coated string. The frequency response of the microstring is validated with an analytical model. From the frequency-independent tensile stress change, static Tg values of 40.6 and 57.6 °C were measured for PDLLA and PLLA, respectively...
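    The working principle described above rests on the standard resonance formula for a tensile-stressed string, f_n = n/(2L)·sqrt(σ/ρ): as the polymer coating softens at Tg, the tensile stress relaxes and the resonance frequency drops. The sketch below evaluates the formula with assumed dimensions and stress values, not numbers reported in the paper.

```python
import math

def string_resonance_frequency(length_m, stress_pa, density_kg_m3, mode=1):
    """Resonance frequency of a tensile-stressed string: f_n = n/(2L) * sqrt(sigma/rho)."""
    return mode / (2.0 * length_m) * math.sqrt(stress_pa / density_kg_m3)

# Illustrative microstring (assumed values for the example):
L = 500e-6          # 500 um long
sigma = 200e6       # 200 MPa tensile stress
rho = 3000.0        # effective density, kg/m^3

f1 = string_resonance_frequency(L, sigma, rho)          # ~258 kHz
# A stress relaxation (e.g., coating softening near Tg) lowers the frequency:
f1_relaxed = string_resonance_frequency(L, 150e6, rho)
```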

  15. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about implementation of AQbD in the pharmaceutical quality system and also to correlate it with product quality by design and process analytical technology (PAT).

  16. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    Science.gov (United States)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data-driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  17. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    Science.gov (United States)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use Big Data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a Big Data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytics solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available Big Data sets and determining practical analytic, visualization, and predictive technologies.
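    As a minimal illustration of the kind of threshold-based degradation alerting described above, the sketch below flags readings that drift from a trailing-window baseline. The window size, threshold, and simulated sensor values are assumptions for the example, not details from the study.

```python
import random
import statistics

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag indices where a reading deviates from the trailing-window mean
    by more than `threshold` standard deviations. This is a simple baseline;
    operational systems use far richer predictive models."""
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu = statistics.fmean(history)
        sd = statistics.stdev(history)
        if sd > 0 and abs(readings[i] - mu) / sd > threshold:
            anomalies.append(i)
    return anomalies

# Simulated sensor: stable Gaussian noise with one injected degradation spike
random.seed(42)
readings = [10.0 + random.gauss(0, 0.1) for _ in range(100)]
readings[60] = 14.0   # injected fault
flagged = detect_anomalies(readings)
```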

  18. Encyclopedia of analytical surfaces

    CERN Document Server

    Krivoshapko, S N

    2015-01-01

    This encyclopedia presents an all-embracing collection of analytical surface classes. It provides concise definitions and descriptions for more than 500 surfaces and categorizes them into 38 classes of analytical surfaces. All classes are cross-referenced to the original literature in an excellent bibliography. The encyclopedia is of particular interest to structural and civil engineers and serves as a valuable reference for mathematicians.

  19. Comparison of Left Ventricular Hypertrophy by Electrocardiography and Echocardiography in Children Using Analytics Tool.

    Science.gov (United States)

    Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig

    2018-05-17

    Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG) leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. SQL Server 2012 data warehouse incorporated ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%), and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (AROC = 0.50). 83% of subjects with LVH on ECG had normal left ventricle (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between V6R on ECG and echo-derived Z score of left ventricle diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG and LV measurements and qualitative findings by echo, identifying an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.
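    The screening metrics reported above (sensitivity, specificity, positive predictive value) follow directly from a 2x2 confusion matrix. The sketch below computes them from hypothetical counts chosen only to mimic the reported pattern; they are not the study's actual data.

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)   # fraction of true cases the screen catches
    specificity = tn / (tn + fp)   # fraction of non-cases it correctly clears
    ppv = tp / (tp + fp)           # chance a positive screen is a true case
    npv = tn / (tn + fn)           # chance a negative screen is a true non-case
    return {"sensitivity": sensitivity, "specificity": specificity,
            "ppv": ppv, "npv": npv}

# Hypothetical counts (illustrative only):
m = screening_metrics(tp=90, fp=480, fn=10, tn=420)
# High sensitivity (0.90) can coexist with poor PPV (~0.16) when false
# positives greatly outnumber true positives, as in the study's pattern.
```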

  20. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards--the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  1. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  2. Nexusing Charcoal in South Mozambique: A Proposal To Integrate the Nexus Charcoal-Food-Water Analysis With a Participatory Analytical and Systemic Tool

    Directory of Open Access Journals (Sweden)

    Ricardo Martins

    2018-06-01

    Full Text Available Nexus analysis identifies and explores the synergies and trade-offs between energy, food and water systems, considered as interdependent systems interacting with contextual drivers (e.g., climate change, poverty). The nexus is, thus, a valuable analytical and policy-design-supporting tool to address the widely discussed links between bioenergy, food and water. In fact, the Nexus provides a more integrative and broad approach in relation to the single isolated system approach that characterizes many bioenergy analyses and policies of the last decades. In particular, for the South of Mozambique, charcoal production, food insecurity and water scarcity have been related in separate studies and, thus, it would be expected that Nexus analysis has the potential to provide the basis for integrated policies and strategies focused on charcoal as a development factor. However, to date there is no Nexus analysis focused on charcoal in Mozambique, nor is there an assessment of the comprehensiveness and relevance of Nexus analysis when applied to charcoal energy systems. To address these gaps, this work applies the Nexus to the charcoal-food-water system in Mozambique, integrating national, regional and international studies analysing the isolated, or pairs of, systems. This integration results in a novel Nexus analysis graphic for the charcoal-food-water relationship. Then, to assess the comprehensiveness and depth of analysis, this Nexus analysis is critically compared with the 2MBio-A, a systems analytical and design framework based on a design tool specifically developed for Bioenergy (the 2MBio). The results reveal that Nexus analysis is “blind” to specific fundamental social, ecological and socio-historical dynamics of charcoal energy systems. The critical comparison also suggests the need to integrate the high level systems analysis of Nexus with non-deterministic, non-prescriptive participatory analysis tools, like the 2MBio-A, as a means to

  3. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  4. Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.

    Science.gov (United States)

    Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D

    2016-02-01

    Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decisionmaking and health systems operations. The largest value of predictive analytics comes early in the clinical encounter, in which diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  5. Agrobacterium rhizogenes-mediated transformation of Superroot-derived Lotus corniculatus plants: a valuable tool for functional genomics

    Directory of Open Access Journals (Sweden)

    Liu Wei

    2009-06-01

    reach 92% based on GUS detection. The combination of the highly efficient transformation and the regeneration system of Superroot provides a valuable tool for functional genomics studies in L. corniculatus.

  6. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    Science.gov (United States)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  7. The AACSB: A Valuable Tool for the Language Educator.

    Science.gov (United States)

    Bush-Bacelis, Jean L.

    The American Assembly of Collegiate Schools of Business (AACSB), an accrediting agency, may be an overlooked tool for establishing rationale and credibility for globalization of business courses. The 245 member institutions are bound by the agency's accrediting requirements, and many others are influenced by the standards set in those…

  8. Current research relevant to the improvement of γ-ray spectroscopy as an analytical tool

    International Nuclear Information System (INIS)

    Meyer, R.A.; Tirsell, K.G.; Armantrout, G.A.

    1976-01-01

    Four areas of research that will have significant impact on the further development of γ-ray spectroscopy as an accurate analytical tool are considered. The areas considered are: (1) automation; (2) accurate multigamma ray sources; (3) accuracy of the current and future γ-ray energy scale, and (4) new solid state X and γ-ray detectors

  9. VisualUrText: A Text Analytics Tool for Unstructured Textual Data

    Science.gov (United States)

    Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.

    2018-05-01

    The growing amount of unstructured text over the Internet is tremendous. Text repositories come from Web 2.0, business intelligence and social networking applications. It is also believed that 80-90% of future data growth will be available in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text Mining is a well-known technique for discovering interesting patterns and trends, i.e., non-trivial knowledge, from massive unstructured text data. Text Mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing and analyzing unstructured text data and visualizing the cleaned text data in multiple forms such as a Document Term Matrix (DTM), Frequency Graph, Network Analysis Graph, Word Cloud and Dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analysis.
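    A Document Term Matrix of the kind this tool visualizes can be sketched in a few lines. The tokenization here (lowercasing and whitespace splitting) is a simplifying assumption for illustration, not VisualUrText's actual pipeline.

```python
from collections import Counter

def document_term_matrix(documents):
    """Build a simple document-term matrix: rows are documents, columns are
    the sorted vocabulary, entries are raw term counts."""
    tokenized = [doc.lower().split() for doc in documents]
    vocabulary = sorted({term for tokens in tokenized for term in tokens})
    matrix = []
    for tokens in tokenized:
        counts = Counter(tokens)
        matrix.append([counts.get(term, 0) for term in vocabulary])
    return vocabulary, matrix

docs = ["text mining finds patterns",
        "text analytics visualizes text data"]
vocab, dtm = document_term_matrix(docs)
# Each row of dtm counts how often each vocabulary term appears in a document;
# frequency graphs and word clouds are simple aggregations over such a matrix.
```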

  10. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    Science.gov (United States)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analysis, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called Analytical Tools for Thermal InfraRed Engineering (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD), etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
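    One of the system-level figures mentioned above, NETD, can be related to NER through the temperature derivative of blackbody radiance: NETD is the scene-temperature change whose radiance change equals the noise-equivalent radiance. The sketch below illustrates this with an assumed NER value and spectral band; it is not ATTIRE's actual model.

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 * sr * m)."""
    return (2.0 * H * C**2 / wavelength_m**5) / math.expm1(
        H * C / (wavelength_m * K * temp_k))

def netd(ner, wavelength_m, temp_k, dt=0.01):
    """Noise-equivalent temperature difference: NER divided by a
    central-difference estimate of dL/dT at the scene temperature."""
    dldt = (planck_radiance(wavelength_m, temp_k + dt)
            - planck_radiance(wavelength_m, temp_k - dt)) / (2.0 * dt)
    return ner / dldt

# Illustrative long-wave IR band (10 um) viewing a 300 K scene; NER is assumed:
example_netd = netd(ner=2.0e3, wavelength_m=10e-6, temp_k=300.0)
```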

  11. ANALYTICAL MODEL FOR LATHE TOOL DISPLACEMENTS CALCULUS IN THE MANUFACTURING PROCESS

    Directory of Open Access Journals (Sweden)

    Catălin ROŞU

    2014-01-01

    Full Text Available In this paper, we present an analytical model for calculating lathe tool displacements in the manufacturing process. We present the methodology for the displacement calculus step by step, and in the end these relations are implemented in a program for automatic calculation and conclusions are drawn. Only the effects of the bending moments are taken into account (because these produce the largest displacements). The simplifying assumptions and the calculus relations for the displacements (linear and angular ones) are presented in an original way.
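    Treating the tool overhang as a cantilever loaded by the cutting force, the linear and angular displacements follow the classical bending formulas δ = FL³/(3EI) and θ = FL²/(2EI). The sketch below evaluates them with assumed dimensions and loads; it is only a simplified stand-in for the paper's model.

```python
def cantilever_tip_deflection(force_n, overhang_m, e_pa, i_m4):
    """Linear tip deflection under an end load: delta = F*L^3 / (3*E*I)."""
    return force_n * overhang_m**3 / (3.0 * e_pa * i_m4)

def cantilever_tip_slope(force_n, overhang_m, e_pa, i_m4):
    """Angular deflection (slope) at the tip: theta = F*L^2 / (2*E*I)."""
    return force_n * overhang_m**2 / (2.0 * e_pa * i_m4)

# Illustrative square tool shank (values assumed for the example):
b = h = 0.020                 # 20 mm x 20 mm cross-section
I = b * h**3 / 12.0           # second moment of area of a rectangle
E = 210e9                     # Young's modulus of tool steel, Pa
F = 1500.0                    # cutting force component, N
L = 0.050                     # overhang length, m

delta = cantilever_tip_deflection(F, L, E, I)   # metres (~22 um here)
theta = cantilever_tip_slope(F, L, E, I)        # radians
```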

  12. Data-Mining – A Valuable Managerial Tool for Improving Power Plants Efficiency

    Directory of Open Access Journals (Sweden)

    Danubianu Mirela

    2014-05-01

    Full Text Available Energy and environment are top priorities for the EU’s Europe 2020 Strategy. Both fields imply complex approaches and consistent investment. The paper presents an alternative to large investments for improving the efficiency of existing (outdated) power installations: namely, the use of data-mining techniques for analysing existing operational data. Data-mining is based upon exhaustive analysis of operational records, inferring high-value information by simply processing records with advanced mathematical/statistical tools. Results can be: assessment of the consistency of measurements, identification of new hardware needed for improving the quality of data, deducing the most efficient level of operation (internal benchmarking), correlation of consumptions with power/heat production and of technical parameters with environmental impact, scheduling the optimal maintenance time, fuel stock optimization, simulating scenarios for equipment operation, anticipating periods of maximal stress on equipment, identification of medium- and long-term trends, planning and decision support for new investment, etc. The paper presents a data-mining process carried out at the TERMICA - Suceava power plant. The analysis calls for a multidisciplinary approach and a complex team (experts in power and heat production, mechanics, environmental protection, economists, and last but not least IT experts) and can be carried out with lower expenses than an investment in new equipment. Involvement of the company's top management is essential, being the driving force and motivation source for the data-mining team.
The approach presented is self-learning: once established, the data-mining analytical, modelling and simulation procedures and associated parameter databases can adjust themselves by absorbing and processing new relevant information, and can be used on a long-term basis for monitoring the performance of the installation and certifying the soundness of the managerial measures taken.

  13. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics...... (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers’ dynamic diagnostic decision...

  14. Teaching resources in speleology and karst: a valuable educational tool

    Directory of Open Access Journals (Sweden)

    De Waele Jo

    2010-01-01

    Full Text Available There is a growing need in the speleological community for tools that make the teaching of speleology and karst much easier. Despite the existence of a wide range of major academic textbooks, the caver community often has difficult access to such material. Therefore, to fill this gap, the Italian Speleological Society, under the umbrella of the Union Internationale de Spéléologie, has prepared a set of lectures, in a presentation format, on several topics including geology, physics, chemistry, hydrogeology, mineralogy, palaeontology, biology, microbiology, history, archaeology, artificial caves, documentation, etc. These lectures constitute the “Teaching Resources in Speleology and Karst”, available online. This educational tool, thanks to its easily manageable format, can constantly be updated and enriched with new contents and topics.

  15. Analytical Tools for the Routine Use of the UMAE

    International Nuclear Information System (INIS)

    D'Auria, F.; Eramo, A.; Gianotti, W.

    1998-01-01

    UMAE (Uncertainty Methodology based on Accuracy Extrapolation) is a methodology developed to calculate the uncertainties related to thermal-hydraulic code results in nuclear power plant transient analysis. Use of the methodology has shown the need for specific analytical tools to simplify some steps of its application and to make clearer the individual procedures adopted in its development. Three of these have recently been completed and are illustrated in this paper. The first makes it possible to attribute ''weight factors'' to the experimental Integral Test Facilities; results are also shown. The second deals with calculating the accuracy of code results: the computer program compares experimental and calculated trends of any quantity and gives as output the accuracy of the calculation. The third consists of a computer program suitable for obtaining continuous uncertainty bands from single-valued points. (author)

  16. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning by a cubic boron nitride tool have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer's force model, chip formation, and progressive flank wear have been depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool forces under different cutting conditions and tool geometries so that an appropriate model can be used according to user requirements in hard turning.

  17. Technical session: the Atomika TXRF tool series

    International Nuclear Information System (INIS)

    Dobler, M. URL: www.atomika.com

    2000-01-01

    ATOMIKA Instruments GmbH holds worldwide competence as a renowned producer of high-performance metrology tools and analytic devices. ATOMIKA's TXRF products are widely accepted for elemental contamination monitoring on semiconductor materials as well as in chemical analysis. More than 100 companies and institutes have their analytical work based on TXRF tools made by ATOMIKA Instruments. ATOMIKA's TXRF 8300W/8200W wafer contamination monitors are the result of an evolution based on a background of 20 years of competence. Built for the semiconductor industry, the TXRF 8300W/8200W detect metal contaminants on 300 mm or 200 mm silicon wafer surfaces with the highest possible sensitivity. Operating under ambient conditions, with a sealed x-ray tube, and having their own minienvironment (FOUP or SMIF, respectively), the TXRF 8300W/8200W are optimally suited for in-line use. Fab automation (GEM/SECS) is supported by predefined measurement recipes and fully automatic routines. High throughput and uptimes, an ergonomic design according to SEMI standard, plus an unrivaled small footprint of 1.1 m² make the TXRF 8300W/8200W most efficient and economic solutions for industrial wafer monitoring. As the specific tool for multielement trace and thin-layer analysis, the ATOMIKA TXRF 8030C provides simultaneous and fast determination of all elements within the range from sodium to uranium. Sophisticated measurement instrumentation provides detection limits down to the ppt range. On the other hand, performance is decisively facilitated by features such as automatic switching of primary radiation, predefined measurement recipes, or software-driven optimization of the entire measurement process. These features make the TXRF 8030C a valuable analytic tool for a wide range of applications: contamination in water, dust or sediments; quantitative screening in the chemical industry; toxic elements in tissues and biological fluids; radioactive elements; process chemicals in the semiconductor industry

  18. FIGURED WORLDS AS AN ANALYTIC AND METHODOLOGICAL TOOL IN PROFESSIONAL TEACHER DEVELOPMENT

    DEFF Research Database (Denmark)

    Møller, Hanne; Brok, Lene Storgaard

. Hasse (2015) and Holland (1998) have inspired our study; i.e., learning is conceptualized as a social phenomenon, implying that contexts of learning are decisive for learner identity. The concept of Figured Worlds is used to understand the development and the social constitution of emergent interactions......” (Holland et al., 1998, p. 52) and gives a framework for understanding meaning-making in particular pedagogical settings. We exemplify our use of the term Figured Worlds, both as an analytic and methodological tool for empirical studies in kindergarten and school. Based on data sources, such as field notes...

  19. Biosensors: Future Analytical Tools

    Directory of Open Access Journals (Sweden)

    Vikas

    2007-02-01

    Full Text Available Biosensors offer considerable promise for obtaining analytical information in a faster, simpler and cheaper manner than conventional assays. Biosensing approaches are rapidly advancing, and applications ranging from the detection of metabolites, biological/chemical warfare agents, food pathogens and adulterants to genetic screening and programmed drug delivery have been demonstrated. Innovative efforts coupling micromachining and nanofabrication may lead to even more powerful devices that would accelerate the realization of large-scale and routine screening. With a gradual increase in commercialization, a wide range of new biosensors is thus expected to reach the market in the coming years.

  20. Potential of Near-Infrared Chemical Imaging as Process Analytical Technology Tool for Continuous Freeze-Drying.

    Science.gov (United States)

    Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas

    2018-04-03

    Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times for inducing changes in water content. Analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a most promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.

  1. Value Innovation in Learner-Centered Design. How to Develop Valuable Learning Tools

    Science.gov (United States)

    Breuer, Henning; Schwarz, Heinrich; Feller, Kristina; Matsumoto, Mitsuji

    2014-01-01

    This paper shows how to address technological, cultural and social transformations with empirically grounded innovation. Areas in transition such as higher education and learning techniques today bring about new needs and opportunities for innovative tools and services. But how do we find these tools? The paper argues for using a strategy of…

  2. Value Innovation in Learner-Centered Design. How to Develop Valuable Learning Tools.

    Directory of Open Access Journals (Sweden)

    Henning Breuer

    2014-02-01

    Full Text Available This paper shows how to address technological, cultural and social transformations with empirically grounded innovation. Areas in transition such as higher education and learning techniques today bring about new needs and opportunities for innovative tools and services. But how do we find these tools? The paper argues for using a strategy of (user) value innovation that creatively combines ethnographic methods with strategic industry analysis. By focusing on unmet and emerging needs, ethnographic research identifies learner values, needs and challenges but does not determine solutions. Blue-ocean strategy tools can identify new opportunities that alter existing offerings but give weak guidance on what will be most relevant to users. The triangulation of both is illustrated through an innovation project in higher education.

  3. PIXE as an analytical/educational tool

    International Nuclear Information System (INIS)

    Williams, E.T.

    1991-01-01

    The advantages and disadvantages of PIXE as an analytical method will be summarized. The authors will also discuss the advantages of PIXE as a means of providing interesting and feasible research projects for undergraduate science students

  4. Analytic number theory an introductory course

    CERN Document Server

    Bateman, Paul T

    2004-01-01

    This valuable book focuses on a collection of powerful methods of analysis that yield deep number-theoretical estimates. Particular attention is given to counting functions of prime numbers and multiplicative arithmetic functions. Both real variable ("elementary") and complex variable ("analytic") methods are employed.

  5. A REVIEW ON PREDICTIVE ANALYTICS IN DATA MINING

    OpenAIRE

    Arumugam.S

    2016-01-01

    The main process of data mining is to collect, extract and store valuable information, and nowadays many enterprises do this actively. Within advanced analytics, predictive analytics is the branch mainly used to make predictions about unknown future events. Predictive analytics uses various techniques from machine learning, statistics, data mining, modeling, and artificial intelligence to analyze current data and make predictions about futu...

  6. Analytical benchmarks for nuclear engineering applications. Case studies in neutron transport theory

    International Nuclear Information System (INIS)

    2008-01-01

    The developers of computer codes involving neutron transport theory for nuclear engineering applications seldom apply analytical benchmarking strategies to ensure the quality of their programs. A major reason for this is the lack of analytical benchmarks and their documentation in the literature. The few such benchmarks that do exist are difficult to locate, as they are scattered throughout the neutron transport and radiative transfer literature. The motivation for this benchmark compendium, therefore, is to gather several analytical benchmarks appropriate for nuclear engineering applications under one cover. We consider the following three subject areas: neutron slowing down and thermalization without spatial dependence, one-dimensional neutron transport in infinite and finite media, and multidimensional neutron transport in a half-space and an infinite medium. Each benchmark is briefly described, followed by a detailed derivation of the analytical solution representation. Finally, a demonstration of the evaluation of the solution representation includes qualified numerical benchmark results. All accompanying computer codes are suitable for the PC computational environment and can serve as educational tools for courses in nuclear engineering. While this benchmark compilation does not contain all possible benchmarks, by any means, it does include some of the most prominent ones and should serve as a valuable reference. (author)

  7. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    Directory of Open Access Journals (Sweden)

    Alonso Wladimir J

    2012-11-01

    Full Text Available Abstract Background There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working on public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods EPIPOI is freely available software developed in Matlab (The Mathworks Inc.) that runs both on PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts from didactic use in public health workshops to the main analytical tool in published research. Conclusions EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics.
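    The seasonal parameters EPIPOI extracts can be illustrated with a minimal sketch (not EPIPOI's actual Matlab code): fitting the first annual Fourier harmonic to a monthly series recovers the mean level, the seasonal amplitude, and the month of the seasonal peak.

```python
import math

def annual_harmonic(series):
    """Estimate mean level, amplitude and peak timing of the annual cycle
    in a monthly time series (12 points per year) from the first Fourier
    harmonic -- a simplified stand-in for EPIPOI-style seasonality parameters."""
    n = len(series)
    mean = sum(series) / n
    # First-harmonic Fourier coefficients at the annual frequency
    a = sum(x * math.cos(2 * math.pi * i / 12) for i, x in enumerate(series)) * 2 / n
    b = sum(x * math.sin(2 * math.pi * i / 12) for i, x in enumerate(series)) * 2 / n
    amplitude = math.hypot(a, b)
    peak_month = (math.atan2(b, a) * 12 / (2 * math.pi)) % 12
    return mean, amplitude, peak_month

# Synthetic flu-like series over 4 years, peaking at month index 0
series = [100 + 30 * math.cos(2 * math.pi * i / 12) for i in range(48)]
mean, amp, peak = annual_harmonic(series)
```

    On this synthetic input the sketch recovers a mean of 100, an amplitude of 30, and a peak at month 0, showing how such parameters can then be compared across geographic regions.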

  8. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state of the art software and tools are discussed.

  9. Studying Behaviors Among Neurosurgery Residents Using Web 2.0 Analytic Tools.

    Science.gov (United States)

    Davidson, Benjamin; Alotaibi, Naif M; Guha, Daipayan; Amaral, Sandi; Kulkarni, Abhaya V; Lozano, Andres M

    Web 2.0 technologies (e.g., blogs, social networks, and wikis) are increasingly being used by medical schools and postgraduate training programs as tools for information dissemination. These technologies offer the unique opportunity to track metrics of user engagement and interaction. Here, we employ Web 2.0 tools to assess academic behaviors among neurosurgery residents. We performed a retrospective review of all educational lectures, part of the core Neurosurgery Residency curriculum at the University of Toronto, posted on our teaching website (www.TheBrainSchool.net). Our website was developed using publicly available Web 2.0 platforms. Lecture usage was assessed by the number of clicks, and associations were explored with lecturer academic position, timing of examinations, and lecture/subspecialty topic. The overall number of clicks on 77 lectures was 1079. Most of these clicks were occurring during the in-training examination month (43%). Click numbers were significantly higher on lectures presented by faculty (mean = 18.6, standard deviation ± 4.1) compared to those delivered by residents (mean = 8.4, standard deviation ± 2.1) (p = 0.031). Lectures covering topics in functional neurosurgery received the most clicks (47%), followed by pediatric neurosurgery (22%). This study demonstrates the value of Web 2.0 analytic tools in examining resident study behavior. Residents tend to "cram" by downloading lectures in the same month of training examinations and display a preference for faculty-delivered lectures. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  10. Effect of Micro Electrical Discharge Machining Process Conditions on Tool Wear Characteristics: Results of an Analytic Study

    DEFF Research Database (Denmark)

    Puthumana, Govindan; P., Rajeev

    2016-01-01

    Micro electrical discharge machining is one of the established techniques to manufacture high aspect ratio features on electrically conductive materials. This paper presents the results and inferences of an analytical study estimating the effect of process conditions on tool electrode wear characteristics in the micro-EDM process. A new approach with two novel factors anticipated to directly control the material removal mechanism from the tool electrode is proposed, using a discharge energy factor (DEf) and a dielectric flushing factor (DFf). The results showed that the correlation between the tool wear rate (TWR) and these factors is poor. Thus, the individual effects of each factor on TWR are analyzed. The factors selected for the study of individual effects are pulse on-time, discharge peak current, gap voltage and gap flushing pressure. The tool wear rate decreases linearly with an increase in the pulse on...
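    The reported linear decrease of tool wear rate with pulse on-time can be illustrated with a plain ordinary least-squares fit; the data below are hypothetical, invented only to show the shape of such an analysis, not measurements from the paper.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical data: pulse on-time (arbitrary units) vs tool wear rate
on_time = [1, 2, 3, 4, 5]
twr = [10.0, 8.1, 5.9, 4.1, 2.0]
slope, intercept = fit_line(on_time, twr)
```

    A negative fitted slope is what a "TWR decreases linearly with pulse on-time" finding would look like in such a regression.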

  11. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    Science.gov (United States)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities to the enhancement of knowledge and facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. As a result of the void of Earth science data analytics publication material, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of tools and techniques that are available and still needed to support ESDA.

  12. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QDs-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques, which would allow to take advantage of particular features of the nanocrystals such as the versatile surface chemistry and ligand binding ability, the aptitude to generate reactive species, the possibility of encapsulation in different materials while retaining native luminescence providing the means for the implementation of renewable chemosensors or even the utilisation of more drastic and even stability impairing reaction conditions, is hitherto very limited. In this review, we provide insights into the analytical potential of quantum dots focusing on prospects of their utilisation in automated flow-based and flow-related approaches and the future outlook of QDs applications in chemical analysis.

  13. Predictive analytics tools to adjust and monitor performance metrics for the ATLAS Production System

    CERN Document Server

    Barreiro Megino, Fernando Harald; The ATLAS collaboration

    2017-01-01

    Information such as an estimate of the processing time or the possibility of a system outage (abnormal behaviour) helps in monitoring system performance and predicting its next state. The current cyber-infrastructure presents computing conditions in which contention for resources among high-priority data analyses happens routinely, which might lead to significant workload and data-handling interruptions. The lack of means to monitor and predict the behaviour of the analysis process (its duration) and the state of the system itself motivated the design of the built-in situational awareness analytic tools.

  14. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893
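    One simple way to quantify the agreement between two clustering results of the kind XCluSim lets users compare visually is the Rand index: the fraction of item pairs on which the two clusterings agree. The sketch below is illustrative and not part of XCluSim itself.

```python
from itertools import combinations

def rand_index(labels_a, labels_b):
    """Fraction of item pairs treated the same way by both clusterings
    (grouped together in both, or separated in both)."""
    pairs = list(combinations(range(len(labels_a)), 2))
    agree = sum(
        (labels_a[i] == labels_a[j]) == (labels_b[i] == labels_b[j])
        for i, j in pairs
    )
    return agree / len(pairs)

a = [0, 0, 1, 1, 2, 2]
b = [1, 1, 0, 0, 2, 2]  # same grouping as a, different label names
c = [0, 1, 0, 1, 0, 1]  # a very different grouping
```

    The index is 1.0 for identical groupings regardless of label names (a vs b) and drops toward 0 as the partitions diverge (a vs c), giving a scalar complement to visual comparison.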

  15. Big Data Analytics for Industrial Process Control

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schioler, Henrik; Kulahci, Murat

    2017-01-01

    Today, in modern factories, each step in manufacturing produces a wealth of valuable and highly precise information. This provides a great opportunity for understanding the hidden statistical dependencies in the process. Systematic analysis and utilization of advanced analytical methods can...

  16. Business intelligence guidebook from data integration to analytics

    CERN Document Server

    Sherman, Rick

    2015-01-01

    Between the high-level concepts of business intelligence and the nitty-gritty instructions for using vendors’ tools lies the essential, yet poorly-understood layer of architecture, design and process. Without this knowledge, Big Data is belittled – projects flounder, are late and go over budget. Business Intelligence Guidebook: From Data Integration to Analytics shines a bright light on an often neglected topic, arming you with the knowledge you need to design rock-solid business intelligence and data integration processes. Practicing consultant and adjunct BI professor Rick Sherman takes the guesswork out of creating systems that are cost-effective, reusable and essential for transforming raw data into valuable information for business decision-makers. After reading this book, you will be able to design the overall architecture for functioning business intelligence systems with the supporting data warehousing and data-integration applications. You will have the information you need to get a project laun...

  17. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    Energy Technology Data Exchange (ETDEWEB)

    Ragan, Eric D [ORNL; Goodall, John R [ORNL

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
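    As one concrete way to score step-ordering recall against a baseline process, a participant's recalled sequence can be compared to the true sequence with the Kendall rank correlation. This is an illustrative metric, not one prescribed by the paper, and the step names are hypothetical.

```python
from itertools import combinations

def kendall_tau(true_order, recalled_order):
    """Kendall rank correlation between the true order of analytic steps
    and a recalled order (1 = perfect recall, -1 = fully reversed)."""
    rank = {step: i for i, step in enumerate(recalled_order)}
    concordant = discordant = 0
    for i, j in combinations(range(len(true_order)), 2):
        # In the true process, step i came before step j
        if rank[true_order[i]] < rank[true_order[j]]:
            concordant += 1
        else:
            discordant += 1
    return (concordant - discordant) / (concordant + discordant)

true_steps = ["load", "filter", "cluster", "plot", "report"]
perfect = kendall_tau(true_steps, ["load", "filter", "cluster", "plot", "report"])
swapped = kendall_tau(true_steps, ["filter", "load", "cluster", "plot", "report"])
```

    A single swapped pair of adjacent steps lowers the score only slightly, which makes the measure useful for grading partial recall rather than all-or-nothing matches.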

  18. Challenging and valuable

    NARCIS (Netherlands)

    Van Hal, J.D.M.

    2008-01-01

    Challenging and valuable Inaugural speech given on May 7th 2008 on the occasion of the acceptance of the position of Professor Sustainable Housing Transformation at the faculty of Architecture of the Delft University of Technology by Prof. J.D.M. van Hal MSc PhD.

  19. Kalisphera: an analytical tool to reproduce the partial volume effect of spheres imaged in 3D

    International Nuclear Information System (INIS)

    Tengattini, Alessandro; Andò, Edward

    2015-01-01

    In experimental mechanics, where 3D imaging is having a profound effect, spheres are commonly adopted for their simplicity and for the ease of their modeling. In this contribution we develop an analytical tool, ‘kalisphera’, to produce 3D raster images of spheres including their partial volume effect. This allows us to evaluate the metrological performance of existing image-based measurement techniques (knowing a priori the ground truth). An advanced application of ‘kalisphera’ is developed here to identify and accurately characterize spheres in real 3D x-ray tomography images with the objective of improving trinarization and contact detection. The effect of the common experimental imperfections is assessed and the overall performance of the tool tested on real images. (paper)
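    Kalisphera computes the partial volume effect analytically; a numerical stand-in that conveys the idea is to estimate, by regular subsampling, the fraction of a voxel that lies inside a sphere. The sketch below is such an approximation, not kalisphera's algorithm.

```python
def voxel_fill_fraction(voxel_corner, sphere_center, radius, subdiv=10):
    """Approximate the fraction of a unit voxel (lower corner at
    voxel_corner) lying inside a sphere, by sampling a regular
    subdiv x subdiv x subdiv grid of points within the voxel."""
    x0, y0, z0 = voxel_corner
    cx, cy, cz = sphere_center
    inside = 0
    step = 1.0 / subdiv
    for i in range(subdiv):
        for j in range(subdiv):
            for k in range(subdiv):
                x = x0 + (i + 0.5) * step
                y = y0 + (j + 0.5) * step
                z = z0 + (k + 0.5) * step
                if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= radius ** 2:
                    inside += 1
    return inside / subdiv ** 3

# A voxel deep inside a large sphere fills completely; one far outside stays empty
full = voxel_fill_fraction((0, 0, 0), (0.5, 0.5, 0.5), 5.0)
empty = voxel_fill_fraction((10, 10, 10), (0.5, 0.5, 0.5), 5.0)
```

    Voxels straddling the sphere surface would return intermediate fractions, which is exactly the grey-level gradation that the partial volume effect produces in real 3D images.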

  20. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    Science.gov (United States)

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932

  1. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    Science.gov (United States)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a curl of surfaces associated with a pair of rolling centrodes, when it is known the profile of the rack-gear's teeth profile, by direct measuring, as a coordinate matrix, has as goal the determining of the generating quality for an imposed kinematics of the relative motion of tool regarding the blank. In this way, it is possible to determine the generating geometrical error, as a base of the total error. The generation modelling allows highlighting the potential errors of the generating tool, in order to correct its profile, previously to use the tool in machining process. A method developed in CATIA is proposed, based on a new method, namely the method of “relative generating trajectories”. They are presented the analytical foundation, as so as some application for knows models of rack-gear type tools used on Maag teething machines.

  2. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    International Nuclear Information System (INIS)

    Björklund, Anna

    2012-01-01

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. - Research highlights: ► LCA was explored as analytical tool in an SEA process of municipal energy planning. ► The process also integrated LCA with scenario planning and public participation. ► Benefits of using LCA were a systematic framework and wider systems perspective. ► Integration of tools required some methodological challenges to be solved. ► This proved an innovative approach to define alternatives and scope of assessment.

  3. Quality Indicators for Learning Analytics

    Science.gov (United States)

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  4. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    Science.gov (United States)

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems to perform bi-directional data processing-to-visualizations with declarative querying capabilities is needed. We developed ProteoLens as a JAVA-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Languages (DDL) and Data Manipulation languages (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network represented data in standard Graph Modeling Language (GML) formats, and this enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions etc; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges
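    ProteoLens itself is JAVA-based and connects to Oracle or PostgreSQL, but the notion of a network data association rule, an SQL mapping from node IDs to data attributes such as expression levels, can be sketched with Python's stdlib sqlite3 module. The table names and values below are hypothetical.

```python
import sqlite3

# In-memory toy database standing in for the external DBMS ProteoLens queries
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE edges (src TEXT, dst TEXT);
    CREATE TABLE annotations (node TEXT, expression REAL);
    INSERT INTO edges VALUES ('P53', 'MDM2'), ('P53', 'BAX');
    INSERT INTO annotations VALUES ('P53', 2.1), ('MDM2', 0.7), ('BAX', 1.4);
""")

# Association rule: an SQL mapping that attaches an expression level
# to every node appearing in the edge table
rule = """
    SELECT n.node, a.expression
    FROM (SELECT src AS node FROM edges UNION SELECT dst FROM edges) n
    JOIN annotations a ON a.node = n.node
"""
node_attrs = dict(conn.execute(rule).fetchall())
```

    The resulting node-to-attribute mapping is what a visual annotation phase could then use to color or size graph nodes, keeping the data-association step cleanly separate from rendering.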

  5. Recovery and utilization of valuable metals from spent nuclear fuel. 3: Mutual separation of valuable metals

    International Nuclear Information System (INIS)

    Kirishima, K.; Shibayama, H.; Nakahira, H.; Shimauchi, H.; Myochin, M.; Wada, Y.; Kawase, K.; Kishimoto, Y.

    1993-01-01

    In the project ''Recovery and Utilization of Valuable Metals from Spent Fuel,'' the mutual separation process for valuable metals recovered from spent fuel has been studied using a simulated solution containing Pb, Ru, Rh, Pd and Mo. Pd was separated successfully by the DHS (di-hexyl sulfide) solvent extraction method, while Pb was recovered selectively from the raffinate by neutralization precipitation of the other elements. On the other hand, Rh was roughly separated by washing the precipitate with an alkaline solution and then refined with the chelate resin CS-346. An outline of the mutual separation process flow sheet has been established from the combination of these techniques. The experimental results and the process flow sheet for the mutual separation of valuable metals are presented in this paper.

  6. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines also known as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress...

  7. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  8. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    Science.gov (United States)

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
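The pooling method this record names can be sketched in a few lines. The following is a minimal illustration of the DerSimonian-Laird random-effects estimate with the Knapp-Hartung variance adjustment, not Meta-Essentials' actual implementation; the effect sizes and variances in the test are invented toy numbers.

```python
import math

def dersimonian_laird_hk(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) with the
    Knapp-Hartung-adjusted standard error for the overall effect."""
    k = len(effects)
    w = [1.0 / v for v in variances]                        # fixed-effect weights
    sw = sum(w)
    fe = sum(wi * yi for wi, yi in zip(w, effects)) / sw    # fixed-effect mean
    q = sum(wi * (yi - fe) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                      # DL between-study variance
    ws = [1.0 / (v + tau2) for v in variances]              # random-effects weights
    sws = sum(ws)
    mu = sum(wi * yi for wi, yi in zip(ws, effects)) / sws  # pooled effect
    # Knapp-Hartung: rescale the variance by the weighted residual spread
    se_hk = math.sqrt(sum(wi * (yi - mu) ** 2 for wi, yi in zip(ws, effects))
                      / ((k - 1) * sws))
    return mu, tau2, se_hk
```

The 95% confidence interval is then mu ± t(0.975, k-1) · se_hk, i.e. a t quantile with k-1 degrees of freedom replaces the normal z value; that substitution is the Knapp-Hartung adjustment the abstract refers to.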

  9. Deriving Earth Science Data Analytics Requirements

    Science.gov (United States)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools/techniques that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  10. Network analytical tool for monitoring global food safety highlights China.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    Full Text Available BACKGROUND: The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. METHODOLOGY/PRINCIPAL FINDINGS: We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to (i) capture complexity, (ii) analyze trends, and (iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries, which are ranked using (i) Google's PageRank algorithm and (ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. CONCLUSIONS/SIGNIFICANCE: This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios.
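The ranking step this record describes can be illustrated with a minimal power-iteration PageRank over a toy detector-to-transgressor alert graph. The country codes and edges below are hypothetical and this is not the RASFF dataset or the authors' code; it only shows how in-link structure makes a frequently reported country rank highest.

```python
def pagerank(edges, d=0.85, iters=100):
    """Power-iteration PageRank on a directed graph given as
    {node: [out-neighbors]}; dangling mass is spread uniformly."""
    nodes = set(edges)
    for outs in edges.values():
        nodes.update(outs)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        nxt = {v: (1.0 - d) / n for v in nodes}  # teleport term
        for u in nodes:
            outs = edges.get(u, [])
            if outs:
                share = d * rank[u] / len(outs)
                for v in outs:
                    nxt[v] += share
            else:
                # dangling node: distribute its mass over all nodes
                for v in nodes:
                    nxt[v] += d * rank[u] / n
        rank = nxt
    return rank
```

With edges such as `{'DE': ['CN', 'IR'], 'UK': ['CN', 'TR'], 'IT': ['CN']}` (detectors reporting transgressors), the node with the most incoming alerts ends up with the highest score, mirroring how the tool surfaces the dominant transgressor.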

  11. NC CATCH: Advancing Public Health Analytics.

    Science.gov (United States)

    Studnicki, James; Fisher, John W; Eichelberger, Christopher; Bridger, Colleen; Angelon-Gaetz, Kim; Nelson, Debi

    2010-01-01

    The North Carolina Comprehensive Assessment for Tracking Community Health (NC CATCH) is a Web-based analytical system deployed to local public health units and their community partners. The system has the following characteristics: flexible, powerful online analytic processing (OLAP) interface; multiple sources of multidimensional, event-level data fully conformed to common definitions in a data warehouse structure; enabled utilization of available decision support software tools; analytic capabilities distributed and optimized locally with centralized technical infrastructure; two levels of access differentiated by the user (anonymous versus registered) and by the analytical flexibility (Community Profile versus Design Phase); and an emphasis on user training and feedback. The ability of local public health units to engage in outcomes-based performance measurement will be influenced by continuing access to event-level data, developments in evidence-based practice for improving population health, and the application of information technology-based analytic tools and methods.

  12. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    Science.gov (United States)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio, φ, maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
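The derivative-validation step this record mentions (checking analytic derivatives against finite differences) can be sketched with a toy function standing in for a thermodynamic response. The quadratic below is invented for illustration only; it is not the CEA model or the OpenMDAO tool.

```python
def f(x):
    """Toy stand-in for a response such as combustion temperature
    versus equivalence ratio (invented, peaks at x = 1)."""
    return -(x - 1.0) ** 2 + 5.0

def dfdx(x):
    """Hand-coded analytic derivative of f."""
    return -2.0 * (x - 1.0)

def central_fd(fun, x, h=1e-6):
    """Central finite-difference approximation used to validate dfdx."""
    return (fun(x + h) - fun(x - h)) / (2.0 * h)
```

The validation check is simply that `dfdx(x)` and `central_fd(f, x)` agree to within the truncation error of the difference scheme; in a gradient-based optimizer the analytic form is preferred because it costs one evaluation instead of two per input and introduces no step-size error.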

  13. Umbilical artery doppler velocimetry: a valuable tool for antenatal fetal surveillance

    International Nuclear Information System (INIS)

    Khawar, N.; Umber, A.

    2013-01-01

    To determine the relation of the umbilical artery Doppler velocimetry parameter systolic:diastolic ratio (S/D ratio) to fetal well-being and outcome. Setting: Department of Obstetrics and Gynecology, Lady Willingdon Hospital, Lahore. Duration of study: Six months, from 27-02-2008 to 26-08-2008. Subjects and methods: Sixty patients fulfilling the inclusion criteria were included in this study. They were subdivided into two groups. Group 'A' included 30 normal pregnant women with no medical or obstetrical risk factors and group 'B' included 30 pregnant women having risk factors like hypertension, diabetes, Rhesus incompatibility, discordant twins, intrauterine growth restriction and non-immune hydrops fetalis. Results: In comparison of the S/D ratio with risk factors, it was observed that an S/D ratio > 3 was present in 19 patients (31.6%) with hypertension/preeclampsia, 3 patients (5%) with diabetes mellitus, 11 patients (18.3%) with intrauterine growth restriction, 15 patients (25.0%) with oligohydramnios and only 1 patient (1.6%) with twin pregnancy. Women with an S/D ratio > 3 delivered 10 neonates (16.6%) with an Apgar score <4 at 1 minute and 23 (38.3%) with a score <6 at 5 minutes; 23 neonates (38.3%) needed resuscitation and 21 (35.0%) were admitted to the neonatal unit for asphyxia. Conclusion: Umbilical artery Doppler study is an integral tool in evaluating the health of high risk pregnancies. However, it is not appropriate as a screening tool for low risk pregnancies. (author)

  14. English Digital Dictionaries as Valuable Blended Learning Tools for Palestinian College Students

    Science.gov (United States)

    Dwaik, Raghad A. A.

    2015-01-01

    Digital technology has become an indispensable aspect of foreign language learning around the globe especially in the case of college students who are often required to finish extensive reading assignments within a limited time period. Such pressure calls for the use of efficient tools such as digital dictionaries to help them achieve their…

  15. Experimental anti-GBM nephritis as an analytical tool for studying spontaneous lupus nephritis.

    Science.gov (United States)

    Du, Yong; Fu, Yuyang; Mohan, Chandra

    2008-01-01

    Systemic lupus erythematosus (SLE) is an autoimmune disease that results in immune-mediated damage to multiple organs. Among these, kidney involvement is the most common and fatal. Spontaneous lupus nephritis (SLN) in mouse models has provided valuable insights into the underlying mechanisms of human lupus nephritis. However, SLN in mouse models takes 6-12 months to manifest; hence there is clearly the need for a mouse model that can be used to unveil the pathogenic processes that lead to immune nephritis over a shorter time frame. In this article more than 25 different molecules are reviewed that have been studied both in the anti-glomerular basement membrane (anti-GBM) model and in SLN and it was found that these molecules influence both diseases in a parallel fashion, suggesting that the two disease settings share common molecular mechanisms. Based on these observations, the authors believe the experimental anti-GBM disease model might be one of the best tools currently available for uncovering the downstream molecular mechanisms leading to SLN.

  16. The program success story: a valuable tool for program evaluation.

    Science.gov (United States)

    Lavinghouze, Rene; Price, Ann Webb; Smith, Kisha-Ann

    2007-10-01

    Success stories are evaluation tools that have been used by professionals across disciplines for quite some time. They are also proving to be useful in promoting health programs and their accomplishments. The increasing popularity of success stories is due to the innovative and effective way that they increase a program's visibility, while engaging potential participants, partners, and funders in public health efforts. From the community level to the federal level, program administrators are using success stories as vehicles for celebrating achievements, sharing challenges, and communicating lessons learned. Success stories are an effective means to move beyond the numbers and connect to readers, with a cause they can relate to and want to join. This article defines success stories and provides an overview of several types of story formats, how success stories can be systematically collected, and how they are used to communicate program success.

  17. Permanent forestry plots: a potentially valuable teaching resource in undergraduate biology programs for the Caribbean

    Science.gov (United States)

    H. Valles; C.M.S. Carrington

    2016-01-01

    There has been a recent proposal to change the way that biology is taught and learned in undergraduate biology programs in the USA so that students develop a better understanding of science and the natural world. Here, we use this new, recommended teaching-learning framework to assert that permanent forestry plots could be a valuable tool to help develop biology...

  18. Process analytical technology (PAT) for biopharmaceuticals

    DEFF Research Database (Denmark)

    Glassey, Jarka; Gernaey, Krist; Clemens, Christoph

    2011-01-01

    Process analytical technology (PAT), the regulatory initiative for building in quality to pharmaceutical manufacturing, has a great potential for improving biopharmaceutical production. The recommended analytical tools for building in quality, multivariate data analysis, mechanistic modeling, novel...

  19. Are consumer surveys valuable as a service improvement tool in health services? A critical appraisal.

    Science.gov (United States)

    Patwardhan, Anjali; Patwardhan, Prakash

    2009-01-01

    In the recent climate of consumerism and consumer-focused care, health and social care needs to be more responsive than ever before. Consumer needs and preferences can be elicited with accepted validity and reliability only by strict methodological control, customisation of the questionnaire and skilled interpretation. Constructing, conducting, interpreting and implementing surveys to improve service provision requires a trained workforce and infrastructure. This article aims to appraise various aspects of consumer surveys and to assess their value as effective service improvement tools. The customer is the sole reason organisations exist. Consumer surveys are used worldwide as service and quality of care improvement tools by all types of service providers, including health service providers. The article critically appraises the value of consumer surveys as service improvement tools in health services and their future applications. No one type of survey is the best or ideal. The key is the selection of the correct survey methodology, unique and customised for the particular type/aspect of care being evaluated. The method used should reflect the importance of the information required. Methodological rigor is essential for the effectiveness of consumer surveys as service improvement tools. Unfortunately, so far there is no universal consensus on the superiority of one particular methodology over another or any benefit of one specific methodology in a given situation. More training and some dedicated resource allocation are required to develop consumer surveys. More research is needed to develop specific survey methodology and evaluation techniques for improved validity and reliability of the surveys as service improvement tools. Measurement of consumer preferences/priorities, evaluation of services and key performance scores is not easy. Consumer surveys seem impressive tools as they provide the customer a voice for change or modification. However, from a scientific point

  20. Laser-induced plasma spectrometry: truly a surface analytical tool

    International Nuclear Information System (INIS)

    Vadillo, Jose M.; Laserna, J.

    2004-01-01

    For a long period, analytical applications of laser-induced plasma spectrometry (LIPS) were mainly restricted to overall, quantitative determination of elemental composition in bulk solid samples. However, the introduction of new compact and reliable solid-state lasers and technological developments in multidimensional intensified detectors have made it possible to seek new analytical niches for LIPS where its analytical advantages (direct sampling from any material irrespective of its conductive status, without sample preparation and with sensitivity adequate for many elements in different matrices) can be fully exploited. In this sense, the field of surface analysis could benefit from the cited advantages, taking into account in addition the capability of LIPS for spot analysis, line scans, depth profiling, area analysis and compositional mapping with a single instrument in air at atmospheric pressure. This review paper outlines the fundamental principles of laser-induced plasma emission relevant to sample surface studies, discusses the experimental parameters governing the spatial (lateral and in-depth) resolution in LIPS analysis and presents applications concerning surface examination

  1. Artist - analytical RT inspection simulation tool

    International Nuclear Information System (INIS)

    Bellon, C.; Jaenisch, G.R.

    2007-01-01

    The computer simulation of radiography is applicable for different purposes in NDT such as for the qualification of NDT systems, the prediction of its reliability, the optimization of system parameters, feasibility analysis, model-based data interpretation, education and training of NDT/NDE personnel, and others. Within the framework of the integrated project FilmFree the radiographic testing (RT) simulation software developed by BAM is being further developed to meet practical requirements for inspection planning in digital industrial radiology. It combines analytical modelling of the RT inspection process with the CAD-orientated object description applicable to various industrial sectors such as power generation, railways and others. (authors)

  2. A graph algebra for scalable visual analytics.

    Science.gov (United States)

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing requires redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
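The two atomic operators this record names, selection and aggregation, can be illustrated with a minimal sketch over a toy attributed graph. The representation (a node-to-attributes dict plus an edge list) and the data are invented for illustration; this is not the authors' framework.

```python
def select(nodes, edges, pred):
    """Selection operator: keep the nodes satisfying pred,
    together with the edges they induce."""
    kept = {n for n, attrs in nodes.items() if pred(attrs)}
    return ({n: nodes[n] for n in kept},
            [(u, v) for (u, v) in edges if u in kept and v in kept])

def aggregate(nodes, edges, key):
    """Aggregation operator: collapse nodes sharing key(attrs) into
    super-nodes and count the edges between the resulting groups."""
    group = {n: key(attrs) for n, attrs in nodes.items()}
    super_edges = {}
    for u, v in edges:
        e = (group[u], group[v])
        super_edges[e] = super_edges.get(e, 0) + 1
    return super_edges
```

Because both operators return graph-shaped values, they compose: a scalable exploration session can select a subpopulation first and then aggregate it, which is the algebraic closure property such a framework relies on.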

  3. Analytical and between-subject variation of thrombin generation measured by calibrated automated thrombography on plasma samples.

    Science.gov (United States)

    Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona

    2018-05-01

    The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has a considerable analytical variation, and we therefore investigated the analytical and between-subject variation of CAT systematically. Moreover, we assessed the application of an internal standard for normalization to diminish variation. 20 healthy volunteers each donated one blood sample, which was subsequently centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined over eight runs, in which plasma from the same seven volunteers was processed in triplicate, and for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis and subsequently used for normalization. The total analytical variation for TG analysis performed with PPPlow reagent is 3-14% and 9-13% for PPP reagent. This variation can be minimally reduced by using an internal standard, but mainly for ETP (endogenous thrombin potential). The between-subject variation is higher when using PPPlow than PPP, and this variation is considerably higher than the analytical variation. TG thus has a rather high inherent analytical variation, but one considerably lower than the between-subject variation when using PPPlow as reagent.
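The internal-standard normalization this record describes can be sketched as a simple rescaling. This is a minimal sketch under the assumption that each plate's readings are scaled so that its internal-standard reading matches a common target value; the function name and the numbers in the test are invented, not the authors' procedure.

```python
def normalize_to_internal_standard(values, plate_is, target_is):
    """Rescale one plate's thrombin-generation readings (e.g. ETP)
    so the plate's internal-standard reading equals target_is."""
    factor = target_is / plate_is
    return [v * factor for v in values]
```

Dividing out a per-plate factor removes run-to-run drift that affects all wells on a plate equally, which is why the abstract finds the benefit mainly for ETP, the parameter most sensitive to such multiplicative shifts.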

  4. Data Analytics in CRM Processes: A Literature Review

    Directory of Open Access Journals (Sweden)

    Gončarovs Pāvels

    2017-12-01

    Full Text Available Nowadays, the data scarcity problem has been supplanted by the data deluge problem. Marketers and Customer Relationship Management (CRM) specialists have access to rich data on consumer behaviour. The current challenge is the effective utilisation of these data in CRM processes and the selection of appropriate data analytics techniques. Data analytics techniques help find hidden patterns in data. The present paper explores the characteristics of data analytics as an integrated tool in CRM for sales managers. The paper aims at analysing some of the different analytics methods and tools which can be used for continuous improvement of CRM processes. A systematic literature review has been conducted to achieve this goal. The results of the review highlight the most frequently considered CRM processes in the context of data analytics.

  5. Using complaints to enhance quality improvement: developing an analytical tool.

    Science.gov (United States)

    Hsieh, Sophie Yahui

    2012-01-01

    This study aims to construct an instrument for identifying certain attributes or capabilities that might enable healthcare staff to use complaints to improve service quality. PubMed and ProQuest were searched, which in turn expanded access to other literature. Three paramount dimensions emerged for healthcare quality management systems: managerial, operational, and technical (MOT). The paper reveals that the managerial dimension relates to quality improvement program infrastructure. It contains strategy, structure, leadership, people and culture. The operational dimension relates to implementation processes: organizational changes and barriers when using complaints to enhance quality. The technical dimension emphasizes the skills, techniques or information systems required to achieve successfully continuous quality improvement. The MOT model was developed by drawing from the relevant literature. However, individuals have different training, interests and experiences and, therefore, there will be variance between researchers when generating the MOT model. The MOT components can be the guidelines for examining whether patient complaints are used to improve service quality. However, the model needs testing and validating by conducting further research before becoming a theory. Empirical studies on patient complaints did not identify any analytical tool that could be used to explore how complaints can drive quality improvement. This study developed an instrument for identifying certain attributes or capabilities that might enable healthcare professionals to use complaints and improve service quality.

  6. Big Data Analytics Platforms analyze from startups to traditional database players

    Directory of Open Access Journals (Sweden)

    Ionut TARANU

    2015-07-01

    Full Text Available Big data analytics enables organizations to analyze a mix of structured, semi-structured and unstructured data in search of valuable business information and insights. The analytical findings can lead to more effective marketing, new revenue opportunities, better customer service, improved operational efficiency, competitive advantages over rival organizations and other business benefits. With so many emerging trends around big data and analytics, IT organizations need to create conditions that will allow analysts and data scientists to experiment. "You need a way to evaluate, prototype and eventually integrate some of these technologies into the business," says Chris Curran[1]. In this paper we review 10 top big data analytics platforms and compare their key features.

  7. Evaluation of Four Different Analytical Tools to Determine the Regional Origin of Gastrodia elata and Rehmannia glutinosa on the Basis of Metabolomics Study

    Directory of Open Access Journals (Sweden)

    Dong-Kyu Lee

    2014-05-01

    Full Text Available Chemical profiles of medicinal plants can differ depending on the cultivation environment, which may influence their therapeutic efficacy. Accordingly, the regional origin of medicinal plants should be authenticated for correct evaluation of their medicinal and market values. Metabolomics has been found very useful for discriminating the origin of many plants. Choosing an adequate analytical tool is an essential step, because different chemical profiles with different detection ranges will be produced depending on the choice. In this study, four analytical tools, Fourier transform near-infrared spectroscopy (FT-NIR), 1H-nuclear magnetic resonance spectroscopy (1H-NMR), liquid chromatography-mass spectrometry (LC-MS), and gas chromatography-mass spectrometry (GC-MS), were applied in parallel to the same samples of two popular medicinal plants (Gastrodia elata and Rehmannia glutinosa) cultivated either in Korea or China. The classification abilities of the four discriminant models for each plant were evaluated based on the misclassification rate and Q2 obtained from principal component analysis (PCA) and orthogonal projection to latent structures-discriminant analysis (OPLS-DA), respectively. 1H-NMR and LC-MS, which were the best techniques for G. elata and R. glutinosa, respectively, were generally preferable for origin discrimination over the others. Integrating all the results, 1H-NMR is the most prominent technique for discriminating the origins of the two plants. Nonetheless, this study suggests that preliminary screening is essential to determine the most suitable analytical tool and statistical method, which will ensure the dependability of metabolomics-based discrimination.
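The misclassification-rate criterion used in this record to compare discriminant models can be illustrated with a leave-one-out nearest-centroid classifier, a deliberately simplified stand-in for the PCA/OPLS-DA models; the feature vectors and origin labels in the test are invented.

```python
def misclassification_rate(samples, labels):
    """Leave-one-out nearest-centroid error rate: for each sample,
    fit class centroids on the remaining samples and check whether
    the held-out sample lands nearest its own class centroid."""
    n = len(samples)
    errors = 0
    for i in range(n):
        by_class = {}
        for j in range(n):
            if j == i:
                continue
            by_class.setdefault(labels[j], []).append(samples[j])
        centroids = {c: [sum(col) / len(col) for col in zip(*pts)]
                     for c, pts in by_class.items()}
        pred = min(centroids,
                   key=lambda c: sum((a - b) ** 2
                                     for a, b in zip(samples[i], centroids[c])))
        errors += int(pred != labels[i])
    return errors / n
```

A rate near zero means the two origin groups are well separated in the chosen feature space, which is how the study judges which analytical platform (FT-NIR, 1H-NMR, LC-MS or GC-MS) best discriminates Korean from Chinese samples.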

  8. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines.The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  9. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    Science.gov (United States)

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  10. Vulnerability of particularly valuable areas. Summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-01

    This report is part of the scientific basis for the management plan for the North Sea and Skagerrak. The report focuses on the vulnerability of particularly valuable areas to petroleum activities, maritime transport, fisheries, land-based and coastal activities and long-range transboundary pollution. A working group with representatives from many different government agencies, headed by the Institute of Marine Research and the Directorate for Nature Management, has been responsible for drawing up the present report on behalf of the Expert Group for the North Sea and Skagerrak. The present report considers the 12 areas that were identified as particularly valuable during an earlier stage of the management plan process on the environment, natural resources and pollution. There are nine areas along the coast and three open sea areas in the North Sea that were identified according to the same predefined criteria as used for the management plans for the Barents Sea-Lofoten area and the Norwegian Sea. The most important criteria for particularly valuable areas are importance for biological production and importance for biodiversity.(Author)

  11. Vulnerability of particularly valuable areas. Summary

    International Nuclear Information System (INIS)

    2012-01-01

    This report is part of the scientific basis for the management plan for the North Sea and Skagerrak. The report focuses on the vulnerability of particularly valuable areas to petroleum activities, maritime transport, fisheries, land-based and coastal activities and long-range transboundary pollution. A working group with representatives from many different government agencies, headed by the Institute of Marine Research and the Directorate for Nature Management, has been responsible for drawing up the present report on behalf of the Expert Group for the North Sea and Skagerrak. The present report considers the 12 areas that were identified as particularly valuable during an earlier stage of the management plan process on the environment, natural resources and pollution. There are nine areas along the coast and three open sea areas in the North Sea that were identified according to the same predefined criteria as used for the management plans for the Barents Sea-Lofoten area and the Norwegian Sea. The most important criteria for particularly valuable areas are importance for biological production and importance for biodiversity.(Author)

  12. Microgenetic Learning Analytics Methods: Workshop Report

    Science.gov (United States)

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  13. Visualizing the Geography of the Diseases of China: Western Disease Maps from Analytical Tools to Tools of Empire, Sovereignty, and Public Health Propaganda, 1878-1929.

    Science.gov (United States)

    Hanson, Marta

    2017-09-01

    Argument: This article analyzes for the first time the earliest western maps of diseases in China spanning fifty years from the late 1870s to the end of the 1920s. The 24 featured disease maps present a visual history of the major transformations in modern medicine from medical geography to laboratory medicine wrought on Chinese soil. These medical transformations occurred within new political formations from the Qing dynasty (1644-1911) to colonialism in East Asia (Hong Kong, Taiwan, Manchuria, Korea) and hypercolonialism within China (Tianjin, Shanghai, Amoy) as well as the new Republican Chinese nation state (1912-49). As a subgenre of persuasive graphics, physicians marshaled disease maps for various rhetorical functions within these different political contexts. Disease maps in China changed from being mostly analytical tools to functioning as tools of empire, national sovereignty, and public health propaganda legitimating new medical concepts, public health interventions, and political structures governing over human and non-human populations.

  14. Mapping healthcare systems: a policy relevant analytic tool.

    Science.gov (United States)

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L V

    2017-07-01

    In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool - the University of California, San Francisco mapping tool (the Tool) - which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories; and the relationship between these entities. We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare systems structural designs, using a common framework that fosters multi-country comparative analyses. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  15. Predictive analytics can support the ACO model.

    Science.gov (United States)

    Bradley, Paul

    2012-04-01

    Predictive analytics can be used to rapidly spot hard-to-identify opportunities to better manage care--a key tool in accountable care. When considering analytics models, healthcare providers should: Make value-based care a priority and act on information from analytics models. Create a road map that includes achievable steps, rather than major endeavors. Set long-term expectations and recognize that the effectiveness of an analytics program takes time, unlike revenue cycle initiatives that may show a quick return.

  16. Stable isotope ratio analysis: A potential analytical tool for the authentication of South African lamb meat.

    Science.gov (United States)

    Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan

    2016-02-01

    Stable isotope ratios ((13)C/(12)C and (15)N/(14)N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS) to evaluate IRMS as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ(13)C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ⩽ 0.05) δ(15)N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ(15)N values. Discriminant analysis of the δ(13)C and δ(15)N values showed promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider regarding region-of-origin classification for South African lamb. Copyright © 2015 Elsevier Ltd. All rights reserved.
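The kind of origin classification that discriminant analysis performs on isotope pairs can be sketched with a toy nearest-centroid classifier (a simpler stand-in for the LDA used in studies like this one). The regional centroid values below are invented for illustration and are not taken from the study:

```python
import numpy as np

# Hypothetical (delta13C, delta15N) regional means in per mille;
# region labels follow the abstract, the numbers are illustrative only.
centroids = {
    "Ruens":        np.array([-24.5,  6.0]),
    "Free State":   np.array([-18.0,  6.5]),
    "Hantam Karoo": np.array([-23.0, 12.0]),
}

def classify(sample):
    """Assign an isotope pair to the nearest regional centroid (Euclidean)."""
    return min(centroids, key=lambda r: np.linalg.norm(sample - centroids[r]))

unknown = np.array([-22.8, 11.5])   # a hypothetical lamb meat measurement
print(classify(unknown))
```

A real workflow would fit the class means (and pooled covariance, for LDA) from authenticated reference samples and validate with cross-validation before classifying unknowns.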

  17. Advanced web metrics with Google Analytics

    CERN Document Server

    Clifton, Brian

    2012-01-01

    Get the latest information about using the #1 web analytics tool from this fully updated guide. Google Analytics is the free tool used by millions of web site owners to assess the effectiveness of their efforts. Its revised interface and new features will offer even more ways to increase the value of your web site, and this book will teach you how to use each one to best advantage. Featuring new content based on reader and client requests, the book helps you implement new methods and concepts, track social and mobile visitors, use the new multichannel funnel reporting features, understand which

  18. Big Data Analytics for Industrial Process Control

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schioler, Henrik; Kulahci, Murat

    2017-01-01

    Today, in modern factories, each step in manufacturing produces a bulk of valuable as well as highly precise information. This provides a great opportunity for understanding the hidden statistical dependencies in the process. Systematic analysis and utilization of advanced analytical methods can … lead towards more informed decisions. In this article we discuss some of the challenges related to big data analysis in manufacturing and relevant solutions to some of these challenges…

  19. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    Science.gov (United States)

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  20. BIG DATA ANALYTICS USE IN CUSTOMER RELATIONSHIP MANAGEMENT: ANTECEDENTS AND PERFORMANCE IMPLICATIONS

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas

    2016-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data analytics use (BD use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study aims to (1) determine whether organizational BD use improves customer-centric and financial outcomes, and (2) identify the factors influencing BD use. Drawing primarily from market...

  1. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    Science.gov (United States)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring, through statistical indicators, economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given this global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use GeoAnalytics collaborative tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. For example, dynamic web-enabled animation enables statisticians to explore temporal, spatial and multivariate demographic data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of expert domains and are precious in a creative analytics reasoning process. In this context, we introduce a demonstrator, “OECD eXplorer”, a customized tool for interactively analyzing and sharing gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.

  2. Automation of analytical processes. A tool for higher efficiency and safety

    International Nuclear Information System (INIS)

    Groll, P.

    1976-01-01

    The analytical laboratory of a radiochemical facility is usually faced with the fact that numerous analyses of a similar type must be routinely carried out. Automation of such routine analytical procedures helps in increasing the efficiency and safety of the work. A review of the requirements for automation and its advantages is given and demonstrated on three examples. (author)

  3. Electrochemical sensors: a powerful tool in analytical chemistry

    Directory of Open Access Journals (Sweden)

    Stradiotto Nelson R.

    2003-01-01

    Potentiometric, amperometric and conductometric electrochemical sensors have found a number of interesting applications in the areas of environmental, industrial, and clinical analyses. This review presents a general overview of the three main types of electrochemical sensors, describing fundamental aspects, developments and their contribution to the area of analytical chemistry, relating relevant aspects of the development of electrochemical sensors in Brazil.

  4. Three-dimensional analytical field calculation of pyramidal-frustum shaped permanent magnets

    NARCIS (Netherlands)

    Janssen, J.L.G.; Paulides, J.J.H.; Lomonova, E.

    2009-01-01

    This paper presents a novel method to obtain fully analytical expressions of the magnetic field created by a pyramidal-frustum shaped permanent magnet. Conventional analytical tools only provide expressions for cuboidal permanent magnets and this paper extends these tools to more complex shapes. A

  5. Particle induced X-ray emission: a valuable tool for the analysis of metalpoint drawings

    International Nuclear Information System (INIS)

    Duval, A.; Guicharnaud, H.; Dran, J.C.

    2004-01-01

    For several years we have carried out research on metalpoint drawings, a graphic technique mainly employed by European artists during the 15th and 16th centuries. Since a non-destructive and very sensitive analytical technique is required, particle induced X-ray emission (PIXE) analysis with an external beam has been used for this purpose. More than 70 artworks drawn by Italian, Flemish and German artists have been analysed, including leadpoint and silverpoint drawings. Following a short description of the metalpoint technique, the results are compared with the recipes written by Cennino Cennini at the beginning of the 15th century, and specific examples are presented

  6. Combined measurement of plasma cystatin C and low-density lipoprotein cholesterol: A valuable tool for evaluating progressive supranuclear palsy.

    Science.gov (United States)

    Weng, Ruihui; Wei, Xiaobo; Yu, Bin; Zhu, Shuzhen; Yang, Xiaohua; Xie, Fen; Zhang, Mahui; Jiang, Ying; Feng, Zhong-Ping; Sun, Hong-Shuo; Xia, Ying; Jin, Kunlin; Chan, Piu; Wang, Qing; Gao, Xiaoya

    2018-07-01

    Progressive supranuclear palsy (PSP) was previously thought of as a cause of atypical Parkinsonism. Although Cystatin C (Cys C) and low-density lipoprotein cholesterol (LDL-C) are known to play critical roles in Parkinsonism, it is unknown whether they can be used as markers to distinguish PSP patients from healthy subjects and to determine disease severity. We conducted a cross-sectional study to determine plasma Cys C/HDL/LDL-C levels of 40 patients with PSP and 40 healthy age-matched controls. An extended battery of motor and neuropsychological tests, including the PSP-Rating Scale (PSPRS), the Non-Motor Symptoms Scale (NMSS), the Geriatric Depression Scale (GDS) and the Mini-Mental State Examination (MMSE), was used to evaluate disease severity. Receiver operating characteristic (ROC) curves were adopted to assess the prognostic accuracy of Cys C/LDL-C levels in distinguishing PSP from healthy subjects. Patients with PSP exhibited significantly higher plasma levels of Cys C and lower LDL-C. The levels of plasma Cys C were positively and inversely correlated with the PSPRS/NMSS and MMSE scores, respectively. The LDL-C/HDL-C ratio was positively associated with PSPRS/NMSS and GDS scores. The ROC curve for the combination of Cys C and LDL-C yielded a better accuracy for distinguishing PSP from healthy subjects than the separate curves for each parameter. Plasma Cys C and LDL-C may be valuable screening tools for differentiating PSP from healthy subjects, and they could be useful for evaluating PSP intensity and severity. A better understanding of Cys C and LDL-C may yield insights into the pathogenesis of PSP. Copyright © 2018 Elsevier Ltd. All rights reserved.
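The reported gain from combining the two markers can be illustrated with a rank-based (Mann-Whitney) AUC on synthetic data. The marker distributions below are hypothetical, chosen only to mimic the direction of the reported differences (higher Cys C, lower LDL-C in PSP), and the combined score is an arbitrary illustrative weighting:

```python
import numpy as np

def auc(pos, neg):
    """Rank-based AUC: probability that a randomly chosen case (pos)
    scores higher than a randomly chosen control (neg)."""
    pos, neg = np.asarray(pos), np.asarray(neg)
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(1)
n = 40
# Hypothetical marker levels, not the study's data.
cysc_psp, cysc_ctl = rng.normal(1.3, 0.15, n), rng.normal(0.95, 0.15, n)
ldl_psp,  ldl_ctl  = rng.normal(2.4, 0.5, n),  rng.normal(3.0, 0.5, n)

# Combined score: higher Cys C and lower LDL-C both push it up.
comb_psp, comb_ctl = cysc_psp - 0.3 * ldl_psp, cysc_ctl - 0.3 * ldl_ctl

print("AUC Cys C   :", round(auc(cysc_psp, cysc_ctl), 2))
print("AUC LDL-C   :", round(auc(-ldl_psp, -ldl_ctl), 2))
print("AUC combined:", round(auc(comb_psp, comb_ctl), 2))
```

In practice the combination would come from a fitted model (e.g., logistic regression) rather than a hand-picked weight, and the AUCs would be compared with a formal test such as DeLong's.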

  7. Analytic number theory

    CERN Document Server

    Iwaniec, Henryk

    2004-01-01

    Analytic Number Theory distinguishes itself by the variety of tools it uses to establish results, many of which belong to the mainstream of arithmetic. One of the main attractions of analytic number theory is the vast diversity of concepts and methods it includes. The main goal of the book is to show the scope of the theory, both in classical and modern directions, and to exhibit its wealth and prospects, its beautiful theorems and powerful techniques. The book is written with graduate students in mind, and the authors tried to balance between clarity, completeness, and generality. The exercis…

  8. Recovering valuable shale oils, etc

    Energy Technology Data Exchange (ETDEWEB)

    Engler, C

    1922-09-26

    A process is described for the recovery of valuable shale oils or tars, characterized in that the oil shale is heated to about 300°C, or a temperature not essentially exceeding this, and is then treated with a solvent, utilizing this heat.

  9. Heat as a groundwater tracer in shallow and deep heterogeneous media: Analytical solution, spreadsheet tool, and field applications

    Science.gov (United States)

    Kurylyk, Barret L.; Irvine, Dylan J.; Carey, Sean K.; Briggs, Martin A.; Werkema, Dale D.; Bonham, Mariah

    2017-01-01

    Groundwater flow advects heat, and thus, the deviation of subsurface temperatures from an expected conduction‐dominated regime can be analysed to estimate vertical water fluxes. A number of analytical approaches have been proposed for using heat as a groundwater tracer, and these have typically assumed a homogeneous medium. However, heterogeneous thermal properties are ubiquitous in subsurface environments, both at the scale of geologic strata and at finer scales in streambeds. Herein, we apply the analytical solution of Shan and Bodvarsson (2004), developed for estimating vertical water fluxes in layered systems, in 2 new environments distinct from previous vadose zone applications. The utility of the solution for studying groundwater‐surface water exchange is demonstrated using temperature data collected from an upwelling streambed with sediment layers, and a simple sensitivity analysis using these data indicates the solution is relatively robust. Also, a deeper temperature profile recorded in a borehole in South Australia is analysed to estimate deeper water fluxes. The analytical solution is able to match observed thermal gradients, including the change in slope at sediment interfaces. Results indicate that not accounting for layering can yield errors in the magnitude and even direction of the inferred Darcy fluxes. A simple automated spreadsheet tool (Flux‐LM) is presented to allow users to input temperature and layer data and solve the inverse problem to estimate groundwater flux rates from shallow (e.g., regimes.
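The layered Shan-Bodvarsson solution is not reproduced here, but the underlying idea can be sketched with the classic homogeneous steady-state conduction-advection profile (the Bredehoeft-Papadopulos form) that it generalizes. All parameter values below are illustrative:

```python
import numpy as np

def temp_profile(z, L, T_top, T_bot, q, k=2.0, rho_c_w=4.18e6):
    """Steady 1-D conduction-advection temperature profile in a
    homogeneous medium (Bredehoeft-Papadopulos form).
    z       : depth(s) below the top boundary, m
    L       : profile length, m
    q       : vertical Darcy flux, m/s (positive downward)
    k       : bulk thermal conductivity, W/(m K)  [assumed value]
    rho_c_w : volumetric heat capacity of water, J/(m^3 K)
    """
    Pe = rho_c_w * q * L / k                  # thermal Peclet number
    z = np.asarray(z, dtype=float)
    if abs(Pe) < 1e-12:                       # no advection: linear conduction
        return T_top + (T_bot - T_top) * z / L
    return T_top + (T_bot - T_top) * np.expm1(Pe * z / L) / np.expm1(Pe)

z = np.linspace(0.0, 2.0, 5)                          # streambed depths, m
T_down = temp_profile(z, 2.0, 10.0, 14.0, q=1e-6)     # downwelling reach
T_cond = temp_profile(z, 2.0, 10.0, 14.0, q=0.0)      # conduction only
```

Fitting the Peclet number to an observed profile (e.g., by least squares) inverts for the flux q; a layered solution stitches such segments together by enforcing continuity of temperature and heat flux at each sediment interface, which is why ignoring layering can bias both the magnitude and the direction of the inferred flux.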

  10. Quality system implementation for nuclear analytical techniques

    International Nuclear Information System (INIS)

    2004-01-01

    The international effort (UNIDO, ILAC, BIPM, etc.) to establish a functional infrastructure for metrology and accreditation in many developing countries needs to be complemented by assistance to implement high quality practices and high quality output by service providers and producers in the respective countries. Knowledge of how to approach QA systems that justify a formal accreditation is available in only a few countries, and the dissemination of know-how and development of skills is needed bottom up from the working level of laboratories and institutes. Awareness building, convincing of management, introduction of good management practices, technical expertise and good documentation will lead to the creation of a quality culture that assures sustainability and inherent development of quality practices as a prerequisite of economic success. Quality assurance and quality control can be used as a valuable management tool and are a prerequisite for international trade and information exchange. This publication tries to assist quality managers, laboratory managers and staff involved in setting up a QA/QC system in a nuclear analytical laboratory to take appropriate action to start and complete the necessary steps for a successful quality system for ultimate national accreditation. This guidebook contributes to a better understanding of the basic ideas behind ISO/IEC 17025, the international standard for 'General requirements for the competence of testing and calibration laboratories'. It provides basic information and detailed explanation about the establishment of the QC system in analytical and nuclear analytical laboratories. It is suitable training material for training of trainers and familiarizes managers with QC management and implementation. This training material aims to facilitate the implementation of internationally accepted quality principles and to promote attempts by Member States' laboratories to obtain accreditation for nuclear analytical

  11. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Otoshi, Tsunehiko

    2004-01-01

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology, geochemistry and so on. However, given the advantages of NAA, samples available only in small amounts, or precious samples, are the most suitable, because NAA is capable of trace analysis and non-destructive determination. In this paper, among these fields, NAA of atmospheric particulate matter (PM) samples is discussed, emphasizing the use of the data obtained as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a large amount of sample even with a high-volume air sampling device. Therefore, highly sensitive NAA is well suited to determining elements in PM samples. The main components of PM are crust-derived silicates and the like in rural/remote areas, while carbonaceous materials and heavy metals are concentrated in PM in urban areas because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site, and trends in air pollution can be traced by periodic monitoring of PM by NAA. Elemental concentrations in air change with the seasons: for example, crustal elements increase in the dry season, and sea-salt components increase when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM and increase in concentration during winter, when emissions from heating systems are high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of environmental samples, and source apportionment techniques are useful. (author)
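One common ratio-based source-assignment technique of the kind mentioned is the crustal enrichment factor, EF(X) = (X/Al)_sample / (X/Al)_crust, where values far above ~10 hint at a non-crustal (anthropogenic) source. The crustal abundances and PM concentrations below are rough illustrative numbers, not measured values:

```python
# Approximate upper-crust mass fractions (illustrative values only)
crust = {"Al": 8.0e-2, "Fe": 4.0e-2, "Pb": 2.0e-5, "Zn": 7.0e-5}

# Hypothetical NAA results for an urban PM sample, ng per m^3 of air
pm = {"Al": 800.0, "Fe": 500.0, "Pb": 40.0, "Zn": 120.0}

def enrichment_factor(element, ref="Al"):
    """EF = (X/ref)_sample / (X/ref)_crust, with Al as the crustal
    reference element; EF >> 10 suggests an anthropogenic source."""
    return (pm[element] / pm[ref]) / (crust[element] / crust[ref])

for el in ("Fe", "Pb", "Zn"):
    print(el, round(enrichment_factor(el), 1))
```

With these numbers Fe comes out near 1 (crustal), while Pb and Zn are strongly enriched, the signature one would expect from traffic and other urban emissions.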

  12. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    Science.gov (United States)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software

  13. Case study: IBM Watson Analytics cloud platform as Analytics-as-a-Service system for heart failure early detection

    OpenAIRE

    Guidi, Gabriele; Miniati, Roberto; Mazzola, Matteo; Iadanza, Ernesto

    2016-01-01

    In recent years the progress in technology and the increasing availability of fast connections have produced a migration of functionalities in Information Technology services from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. A use case of IBM Watson Analytics, a cloud system for data analytics, applied to the following research scope is also described: detect...

  14. IBM’s Health Analytics and Clinical Decision Support

    Science.gov (United States)

    Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.

    2014-01-01

    Objectives: This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods: Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results: There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion: Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation. PMID:25123736

  15. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private sector businesses to maximize the value of the mass of information to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end user environments · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics · interoperable: integrating with existing environments and easing information sharing across partner agencies · extendable: providing an open source developer essential toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry leading multidimensional analytics that can be run on-demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  16. Competing on analytics.

    Science.gov (United States)

    Davenport, Thomas H

    2006-01-01

    We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.

  17. License to evaluate: Preparing learning analytics dashboards for educational practice

    NARCIS (Netherlands)

    Jivet, Ioana; Scheffel, Maren; Specht, Marcus; Drachsler, Hendrik

    2018-01-01

    Learning analytics can bridge the gap between learning sciences and data analytics, leveraging the expertise of both fields in exploring the vast amount of data generated in online learning environments. A typical learning analytics intervention is the learning dashboard, a visualisation tool built

  18. USE OF BIG DATA ANALYTICS FOR CUSTOMER RELATIONSHIP MANAGEMENT: POINT OF PARITY OR SOURCE OF COMPETITIVE ADVANTAGE?

    OpenAIRE

    Suoniemi, Samppa; Meyer-Waarden, Lars; Munzel, Andreas; Zablah, Alex R.; Straub, Detmar W.

    2017-01-01

    Customer information plays a key role in managing successful relationships with valuable customers. Big data customer analytics use (CA use), i.e., the extent to which customer information derived from big data analytics guides marketing decisions, helps firms better meet customer needs for competitive advantage. This study addresses three research questions: 1. What are the key antecedents of big data customer analytics use? 2. How, and to what extent, does big data...

  19. Ootw Tool Requirements in Relation to JWARS

    Energy Technology Data Exchange (ETDEWEB)

    Hartley III, D.S.; Packard, S.L.

    1998-01-01

    This document reports the results of the Office of the Secretary of Defense/Program Analysis & Evaluation (OSD/PA&E) sponsored project to identify how Operations Other Than War (OOTW) tool requirements relate to the Joint Warfare Simulation (JWARS) and, more generally, to joint analytical modeling and simulation (M&S) requirements. It includes recommendations about which OOTW tools (and functionality within tools) should be included in JWARS, which should be managed as joint analytical M&S tools, and which should be left for independent development.

  20. Demonstrating Success: Web Analytics and Continuous Improvement

    Science.gov (United States)

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  1. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection... is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating...
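    A common way to quantify FMEA outcomes is the Risk Priority Number (RPN = severity × occurrence × detection, each rated 1-10). The sketch below shows how two teams' rankings could be compared on that basis; all failure modes and ratings are hypothetical, not taken from the study.

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number for one failure mode (conventional 1-10 scales)."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMEA ratings are conventionally 1-10")
    return severity * occurrence * detection

def rank_failure_modes(ratings):
    """Sort failure modes by descending RPN."""
    return sorted(ratings, key=lambda m: rpn(*ratings[m]), reverse=True)

# Hypothetical (severity, occurrence, detection) ratings from two teams
team_a = {"wrong wavelength": (7, 3, 4), "column degradation": (5, 6, 3),
          "sample mix-up": (9, 2, 8)}
team_b = {"wrong wavelength": (6, 3, 5), "column degradation": (5, 5, 3),
          "sample mix-up": (9, 3, 7)}

# Agreement between the teams' top-2 risks, mirroring the consistency question
overlap = set(rank_failure_modes(team_a)[:2]) & set(rank_failure_modes(team_b)[:2])
```

    Comparing only the top-ranked modes, as here, is one simple consistency measure; rank-correlation statistics would be a natural refinement.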

  2. Search Analytics for Your Site

    CERN Document Server

    Rosenfeld, Louis

    2011-01-01

    Any organization that has a searchable web site or intranet is sitting on top of hugely valuable and usually under-exploited data: logs that capture what users are searching for, how often each query was searched, and how many results each query retrieved. Search queries are gold: they are real data that show us exactly what users are searching for in their own words. This book shows you how to use search analytics to carry on a conversation with your customers: listen to and understand their needs, and improve your content, navigation and search performance to meet those needs.
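    The kind of query-log mining the book describes can be sketched in a few lines: normalise queries, count frequencies, and collect zero-result searches. The log entries below are made up for illustration.

```python
from collections import Counter

def summarise_search_log(log):
    """Aggregate a site-search log into query frequencies and a list of
    queries that returned no results. Each entry is (query, result_count)."""
    freq = Counter(q.strip().lower() for q, _ in log)
    zero_results = sorted({q.strip().lower() for q, n in log if n == 0})
    return freq, zero_results

# Hypothetical log: (query string, number of results returned)
log = [("opening hours", 12), ("Opening Hours", 12),
       ("interlibrary loan", 3), ("thesis template", 0)]
freq, misses = summarise_search_log(log)
```

    Frequent queries reveal what users want in their own words; zero-result queries flag content or vocabulary gaps.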

  3. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Directory of Open Access Journals (Sweden)

    Christoph Heesen

    Full Text Available BACKGROUND: Prognostic counseling in multiple sclerosis (MS is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. OBJECTIVE: The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-time prognostic risk estimates of patients with MS (final n = 110 and their physicians (n = 6 and compared them with the estimates of OLAP. RESULTS: Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. CONCLUSION: While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic

  4. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Science.gov (United States)

    Heesen, Christoph; Gaissmaier, Wolfgang; Nguyen, Franziska; Stellmann, Jan-Patrick; Kasper, Jürgen; Köpke, Sascha; Lederer, Christian; Neuhaus, Anneke; Daumer, Martin

    2013-01-01

Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-time prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic estimates and clarify its usefulness for patients and physicians.

  5. The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis

    Science.gov (United States)

    Burks, Jason Eric; Sperow, Ken

    2015-01-01

A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. In severe weather situations it provides valuable satellite- and radar-derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess the performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram tool during the OPG assessment confirmed that it will be a valuable asset to operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.

  6. PRIDE and "Database on Demand" as valuable tools for computational proteomics.

    Science.gov (United States)

    Vizcaíno, Juan Antonio; Reisinger, Florian; Côté, Richard; Martens, Lennart

    2011-01-01

    The Proteomics Identifications Database (PRIDE, http://www.ebi.ac.uk/pride ) provides users with the ability to explore and compare mass spectrometry-based proteomics experiments that reveal details of the protein expression found in a broad range of taxonomic groups, tissues, and disease states. A PRIDE experiment typically includes identifications of proteins, peptides, and protein modifications. Additionally, many of the submitted experiments also include the mass spectra that provide the evidence for these identifications. Finally, one of the strongest advantages of PRIDE in comparison with other proteomics repositories is the amount of metadata it contains, a key point to put the above-mentioned data in biological and/or technical context. Several informatics tools have been developed in support of the PRIDE database. The most recent one is called "Database on Demand" (DoD), which allows custom sequence databases to be built in order to optimize the results from search engines. We describe the use of DoD in this chapter. Additionally, in order to show the potential of PRIDE as a source for data mining, we also explore complex queries using federated BioMart queries to integrate PRIDE data with other resources, such as Ensembl, Reactome, or UniProt.

  7. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    Science.gov (United States)

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  8. Explanatory analytics in OLAP

    NARCIS (Netherlands)

    Caron, E.A.M.; Daniëls, H.A.M.

    2013-01-01

    In this paper the authors describe a method to integrate explanatory business analytics in OLAP information systems. This method supports the discovery of exceptional values in OLAP data and the explanation of such values by giving their underlying causes. OLAP applications offer a support tool for

  9. Time-resolved fluorescence microscopy (FLIM) as an analytical tool in skin nanomedicine.

    Science.gov (United States)

    Alexiev, Ulrike; Volz, Pierre; Boreham, Alexander; Brodwolf, Robert

    2017-07-01

The emerging field of nanomedicine provides new approaches for the diagnosis and treatment of diseases, for symptom relief, and for monitoring of disease progression. Topical application of drug-loaded nanoparticles for the treatment of skin disorders is a promising strategy to overcome the stratum corneum, the upper layer of the skin, which represents an effective physical and biochemical barrier. Understanding drug penetration into skin, and the enhanced penetration facilitated by nanocarriers, requires analytical tools that ideally allow one to visualize the skin, its morphology, the drug carriers, drugs, their transport across the skin and possible interactions, as well as effects of the nanocarriers within the different skin layers. Here, we review some recent developments in the field of fluorescence microscopy, namely fluorescence lifetime imaging microscopy (FLIM), for improved characterization of nanocarriers, their interactions and penetration into skin. In particular, FLIM allows for the discrimination of target molecules, e.g. fluorescently tagged nanocarriers, against the autofluorescent tissue background and, due to the environmental sensitivity of the fluorescence lifetime, also offers insights into the local environment of the nanoparticle and its interactions with other biomolecules. Thus, FLIM shows the potential to overcome several limits of intensity-based microscopy. Copyright © 2017 Elsevier B.V. All rights reserved.
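    At the core of FLIM is fitting a fluorescence decay to extract the lifetime τ from I(t) = A·exp(−t/τ). The sketch below fits noise-free synthetic data by linear regression on ln I; real FLIM analysis must additionally deconvolve the instrument response function and handle photon-counting noise.

```python
import math

def fit_lifetime(times, counts):
    """Estimate the lifetime tau of a mono-exponential decay
    I(t) = A*exp(-t/tau) by least-squares regression on ln(I)."""
    xs, ys = times, [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -1.0 / slope  # tau in the same units as t

# Synthetic decay with tau = 2.5 ns, sampled every 0.1 ns
tau_true = 2.5
ts = [0.1 * i for i in range(100)]
signal = [1000.0 * math.exp(-t / tau_true) for t in ts]
tau_est = fit_lifetime(ts, signal)
```

    On noiseless data the log-linear fit recovers τ exactly; with counting noise, maximum-likelihood (Poisson) fitting is preferred.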

  10. Nuclear analytical methods: Past, present and future

    International Nuclear Information System (INIS)

    Becker, D.A.

    1996-01-01

    The development of nuclear analytical methods as an analytical tool began in 1936 with the publication of the first paper on neutron activation analysis (NAA). This year, 1996, marks the 60th anniversary of that event. This paper attempts to look back at the nuclear analytical methods of the past, to look around and to see where the technology is right now, and finally, to look ahead to try and see where nuclear methods as an analytical technique (or as a group of analytical techniques) will be going in the future. The general areas which the author focuses on are: neutron activation analysis; prompt gamma neutron activation analysis (PGNAA); photon activation analysis (PAA); charged-particle activation analysis (CPAA)
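    The quantitative basis of NAA is the textbook activation relation A = N·φ·σ·(1 − e^(−λ·t_irr)). A small illustration with made-up irradiation parameters:

```python
import math

def induced_activity(n_atoms, flux, sigma_cm2, half_life_s, t_irr_s):
    """Induced activity after neutron irradiation:
    A = N * phi * sigma * (1 - exp(-lambda * t_irr))   [decays/s]
    N: target atoms, phi: neutron flux [n/cm^2/s], sigma: cross-section [cm^2].
    """
    lam = math.log(2) / half_life_s
    return n_atoms * flux * sigma_cm2 * (1.0 - math.exp(-lam * t_irr_s))

# Illustrative values: irradiating for exactly one half-life yields
# half of the saturation activity N*phi*sigma.
N, phi, sigma, t_half = 1e18, 1e13, 1e-24, 3600.0
a_half = induced_activity(N, phi, sigma, t_half, t_half)
a_sat = N * phi * sigma
```

    This saturation behaviour is why irradiation times much longer than a few half-lives bring diminishing returns.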

  11. Semantic Document Image Classification Based on Valuable Text Pattern

    Directory of Open Access Journals (Sweden)

    Hossein Pourghassem

    2011-01-01

Full Text Available Knowledge extraction from detected document images is a complex problem in the field of information technology. This problem becomes more intricate given that only a negligible percentage of the detected document images are valuable. In this paper, a segmentation-based classification algorithm is used to analyse the document image. In this algorithm, using a two-stage segmentation approach, regions of the image are detected and then classified into document and non-document (pure) regions in a hierarchical classification. A novel definition of value is proposed to classify document images into valuable or invaluable categories. The proposed algorithm is evaluated on a database consisting of document and non-document images obtained from the Internet. Experimental results show the efficiency of the proposed algorithm for semantic document image classification: it provides an accuracy rate of 98.8% on the valuable/invaluable document image classification problem.

  12. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    Science.gov (United States)

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.
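    A staple of the chemometric data analysis summarised above is principal component analysis: mean-centre the data, then find the directions of maximal variance. A minimal two-variable sketch using the closed-form eigendecomposition of the 2×2 covariance matrix (made-up data):

```python
import math

def pca_2d(data):
    """First principal component of 2-D data: mean-centre, build the 2x2
    covariance matrix, and take its largest eigenvalue/eigenvector."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    centred = [(x - mx, y - my) for x, y in data]
    sxx = sum(x * x for x, _ in centred) / (n - 1)
    syy = sum(y * y for _, y in centred) / (n - 1)
    sxy = sum(x * y for x, y in centred) / (n - 1)
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]] in closed form
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    vx, vy = lam - syy, sxy            # unnormalised eigenvector
    norm = math.hypot(vx, vy)
    return lam, (vx / norm, vy / norm)

# Perfectly collinear points along y = 2x: all variance on one component
var1, axis = pca_2d([(0, 0), (1, 2), (2, 4), (3, 6)])
```

    For realistic multivariate spectra one would use a full SVD-based PCA, but the centring-plus-eigenanalysis logic is the same.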

  13. Real-time particle size analysis using focused beam reflectance measurement as a process analytical technology tool for a continuous granulation-drying-milling process.

    Science.gov (United States)

    Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C

    2013-06-01

Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between D20, D50, and D80 length-weighted chord length and sieve particle size was observed (p < 0.05).
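    FBRM reports percentiles (D20, D50, D80) of a length-weighted chord-length distribution. The sketch below computes such a Dxx statistic; the weighting convention is one common choice and the chord values are illustrative, not the instrument's exact algorithm.

```python
def chord_percentile(chords, p):
    """Length-weighted chord-length percentile (p=50 gives D50):
    each chord contributes weight proportional to its own length."""
    weighted = sorted(chords)
    total = sum(weighted)
    target = total * p / 100.0
    running = 0.0
    for c in weighted:
        running += c
        if running >= target:
            return c
    return weighted[-1]

chords_um = [10, 20, 20, 40, 80]   # hypothetical chord lengths in microns
d20 = chord_percentile(chords_um, 20)
d50 = chord_percentile(chords_um, 50)
d80 = chord_percentile(chords_um, 80)
```

    Tracking D20/D50/D80 inline is what lets such a continuous line flag drift in granule size in real time.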

  14. Electronic tongue: An analytical gustatory tool

    Directory of Open Access Journals (Sweden)

    Rewanthwar Swathi Latha

    2012-01-01

Full Text Available Taste is an important organoleptic property governing the acceptance of products administered through the mouth. However, the majority of drugs available are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents. Thus, taste assessment is an important quality control parameter for evaluating taste-masked formulations. The primary method for the taste measurement of drug substances and formulations is by human panelists. The use of sensory panelists is very difficult and problematic in industry due to the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists, maintaining motivation, and panel maintenance are significantly difficult when working with unpleasant products. Furthermore, Food and Drug Administration (FDA)-unapproved molecules cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system, called the electronic tongue (e-tongue) or artificial tongue, which can assess taste, has been replacing sensory panelists. Thus, the e-tongue offers benefits such as reduced reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.
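    The pattern-recognition step behind an e-tongue can be illustrated with a deliberately minimal nearest-centroid classifier over multichannel sensor readings. All channel values below are made up; real systems pair the sensor array with chemometric models such as PCA or PLS.

```python
import math

def classify_taste(reading, centroids):
    """Assign a multichannel sensor reading to the nearest taste centroid
    by Euclidean distance (a minimal stand-in for a trained model)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(reading, centroids[label]))

# Hypothetical 3-channel centroids learned from reference solutions
centroids = {"bitter": (0.9, 0.2, 0.1),
             "sweet":  (0.1, 0.8, 0.3),
             "masked": (0.4, 0.5, 0.2)}
label = classify_taste((0.85, 0.25, 0.15), centroids)
```

    In taste-masking studies, the distance of a formulation's reading from the "bitter" centroid can serve as an objective masking score.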

  15. Valuable Internet Advertising and Customer Satisfaction Cycle(VIACSC)

    OpenAIRE

    Muhammad Awais; Tanzila Samin; Muhammad Bilal

    2012-01-01

Nowadays it is very important for business persons to attract their target customers towards their products through valuable modes of promotion and communication. The increasing use of the World Wide Web has completely changed the scenario of the business sector. Customized products and services, customer preferences, and the @ and dot-com craze have elevated the importance of internet advertising. This research paper investigates valuable internet advertising, which will help to enhance the value of intern...

  16. Learning analytics fundaments, applications, and trends : a view of the current state of the art to enhance e-learning

    CERN Document Server

    2017-01-01

    This book provides a conceptual and empirical perspective on learning analytics, its goal being to disseminate the core concepts, research, and outcomes of this emergent field. Divided into nine chapters, it offers reviews oriented on selected topics, recent advances, and innovative applications. It presents the broad learning analytics landscape and in-depth studies on higher education, adaptive assessment, teaching and learning. In addition, it discusses valuable approaches to coping with personalization and huge data, as well as conceptual topics and specialized applications that have shaped the current state of the art. By identifying fundamentals, highlighting applications, and pointing out current trends, the book offers an essential overview of learning analytics to enhance learning achievement in diverse educational settings. As such, it represents a valuable resource for researchers, practitioners, and students interested in updating their knowledge and finding inspirations for their future work.

  17. Analytic plane wave solutions for the quaternionic potential step

    International Nuclear Information System (INIS)

    De Leo, Stefano; Ducati, Gisele C.; Madureira, Tiago M.

    2006-01-01

    By using the recent mathematical tools developed in quaternionic differential operator theory, we solve the Schroedinger equation in the presence of a quaternionic step potential. The analytic solution for the stationary states allows one to explicitly show the qualitative and quantitative differences between this quaternionic quantum dynamical system and its complex counterpart. A brief discussion on reflected and transmitted times, performed by using the stationary phase method, and its implication on the experimental evidence for deviations of standard quantum mechanics is also presented. The analytic solution given in this paper represents a fundamental mathematical tool to find an analytic approximation to the quaternionic barrier problem (up to now solved by numerical method)

  18. Molecular data and radiative transfer tools for ALMA

    NARCIS (Netherlands)

Tak, F.F.S. van der; Hogerheijde, M.

    2007-01-01

    Abstract: This paper presents an overview of several modeling tools for analyzing molecular line observations at submillimeter wavelengths. These tools are already proving to be very valuable for the interpretation of data from current telescopes, and will be indispensable for data obtained with

  19. EnergiTools(R) - a power plant performance monitoring and diagnosis tool

    International Nuclear Information System (INIS)

    Ancion, P.V.; Bastien, R.; Ringdahl, K.

    2000-01-01

    Westinghouse EnergiTools(R) is a performance diagnostic tool for power generation plants that combines the power of on-line process data acquisition with advanced diagnostics methodologies. The system uses analytical models based on thermodynamic principles combined with knowledge of component diagnostic experts. An issue in modeling expert knowledge is to have a framework that can represent and process uncertainty in complex systems. In such experiments, it is nearly impossible to build deterministic models for the effects of faults on symptoms. A methodology based on causal probabilistic graphs, more specifically on Bayesian belief networks, has been implemented in EnergiTools(R) to capture the fault-symptom relationships. The methodology estimates the likelihood of the various component failures using the fault-symptom relationships. The system also has the ability to use neural networks for processes that are difficult to model analytically. An application is the estimation of the reactor power in nuclear power plant by interpreting several plant indicators. EnergiTools(R) is used for the on-line performance monitoring and diagnostics at Vattenfall Ringhals nuclear power plants in Sweden. It has led to the diagnosis of various performance issues with plant components. Two case studies are presented. In the first case, an overestimate of the thermal power due to a faulty instrument was found, which led to a plant operation below its optimal power. The paper shows how the problem was discovered, using the analytical thermodynamic calculations. The second case shows an application of EnergiTools(R) for the diagnostic of a condenser failure using causal probabilistic graphs
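    The fault-symptom reasoning described above can be sketched, under a strong conditional-independence simplification of the belief-network approach, as naive Bayesian inference over candidate faults. All priors and likelihoods below are hypothetical.

```python
def fault_posterior(priors, likelihoods, symptoms):
    """Posterior fault probabilities from observed symptoms via Bayes' rule,
    assuming symptoms are conditionally independent given the fault."""
    scores = {}
    for fault, prior in priors.items():
        p = prior
        for s, present in symptoms.items():
            ps = likelihoods[fault][s]          # P(symptom | fault)
            p *= ps if present else (1.0 - ps)
        scores[fault] = p
    z = sum(scores.values())
    return {f: p / z for f, p in scores.items()}

priors = {"fouled condenser": 0.02, "sensor drift": 0.05, "no fault": 0.93}
likelihoods = {
    "fouled condenser": {"high backpressure": 0.90, "low output": 0.80},
    "sensor drift":     {"high backpressure": 0.30, "low output": 0.40},
    "no fault":         {"high backpressure": 0.01, "low output": 0.02},
}
post = fault_posterior(priors, likelihoods,
                       {"high backpressure": True, "low output": True})
```

    A full causal probabilistic graph relaxes the independence assumption, but the likelihood-weighting of rival fault hypotheses works the same way.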

  20. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    International Nuclear Information System (INIS)

    Hervas, Miriam; Lopez, Miguel Angel; Escarpa, Alberto

    2009-01-01

In this work, electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. Analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 μg L⁻¹ and an EC₅₀ of 0.079 μg L⁻¹ were obtained, allowing the assessment of the detection of zearalenone mycotoxin. In addition, an excellent accuracy with a high recovery yield ranging between 95 and 108% has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.
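    Competitive immunoassays like this one are typically calibrated with a four-parameter logistic (4PL) curve, on which the EC₅₀ is the half-maximal (inflection) point. The sketch below uses illustrative curve parameters; only the 0.079 μg/L EC₅₀ comes from the abstract.

```python
def four_pl(conc, top, bottom, ec50, hill):
    """Four-parameter logistic response of a competitive immunoassay:
    signal falls from `top` to `bottom` as analyte concentration rises,
    reaching the midpoint at conc = ec50."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

# Hypothetical calibration; EC50 = 0.079 ug/L as reported for zearalenone
top, bottom, ec50, hill = 100.0, 5.0, 0.079, 1.0
signal_at_ec50 = four_pl(ec50, top, bottom, ec50, hill)
near_zero_signal = four_pl(1e-9, top, bottom, ec50, hill)
```

    Fitting the four parameters to standards (e.g. by nonlinear least squares) then lets unknown samples be read off the inverted curve.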

  1. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: an anticipated analytical tool for food safety.

    Science.gov (United States)

    Hervás, Miriam; López, Miguel Angel; Escarpa, Alberto

    2009-10-27

In this work, electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. Analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 μg L⁻¹ and an EC₅₀ of 0.079 μg L⁻¹ were obtained, allowing the assessment of the detection of zearalenone mycotoxin. In addition, an excellent accuracy with a high recovery yield ranging between 95 and 108% has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.

  2. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    Energy Technology Data Exchange (ETDEWEB)

    Hervas, Miriam; Lopez, Miguel Angel [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain); Escarpa, Alberto, E-mail: alberto.escarpa@uah.es [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain)

    2009-10-27

In this work, electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. Analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 μg L⁻¹ and an EC₅₀ of 0.079 μg L⁻¹ were obtained, allowing the assessment of the detection of zearalenone mycotoxin. In addition, an excellent accuracy with a high recovery yield ranging between 95 and 108% has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.

  3. ANALYTICAL ANARCHISM: THE PROBLEM OF DEFINITION AND DEMARCATION

    OpenAIRE

    Konstantinov M.S.

    2012-01-01

In this paper, a new trend of anarchist thought, analytical anarchism, is considered for the first time in our country's scholarship. Critical analysis of the key propositions of the basic versions of this trend, the anarcho-capitalist and the egalitarian, is used as a methodological tool. The study proposes a classification of discernible trends within analytical anarchism on the basis of value criteria, and identifies conceptual and methodological problems in the definition of analytical anarchism and its ...

  4. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  5. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay.

    Science.gov (United States)

    Pohanka, Miroslav

    2015-06-11

Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). The assay is therefore expected to be of practical applicability.
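    The RGB analysis step can be sketched as averaging channel values over a photographed region of interest and applying a linear calibration. The pixel values and calibration coefficients below are hypothetical, not taken from the paper; in a real assay they would be fitted to standards.

```python
def mean_channels(pixels):
    """Average (R, G, B) over a test-strip region of interest.
    `pixels` is a list of (r, g, b) tuples taken from the photo."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def activity_from_red(red_mean, slope=-0.5, intercept=100.0):
    """Hypothetical linear calibration: indigo formation darkens the red
    channel as BChE activity rises. Slope/intercept are placeholders."""
    return slope * red_mean + intercept

roi = [(120, 60, 140), (118, 62, 138), (122, 58, 142)]  # made-up pixels
r_mean, g_mean, b_mean = mean_channels(roi)
activity = activity_from_red(r_mean)
```

    Averaging over many pixels suppresses sensor noise; choosing the channel most sensitive to the chromophore (here assumed to be red) is part of method development.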

  6. Facebook: A Potentially Valuable Educational Tool?

    Science.gov (United States)

    Voivonta, Theodora; Avraamidou, Lucy

    2018-01-01

    This paper is concerned with the educational value of Facebook and specifically how it can be used in formal educational settings. As such, it provides a review of existing literature of how Facebook is used in higher education paying emphasis on the scope of its use and the outcomes achieved. As evident in existing literature, Facebook has been…

  7. Facebook : A potentially valuable educational tool?

    NARCIS (Netherlands)

    Voivonta, Theodora; Avraamidou, Lucy

    2018-01-01

    This paper is concerned with the educational value of Facebook and specifically how it can be used in formal educational settings. As such, it provides a review of existing literature of how Facebook is used in higher education paying emphasis on the scope of its use and the outcomes achieved. As

  8. Group Decision Making with the Analytic Hierarchy Process in Benefit-Risk Assessment: A Tutorial

    NARCIS (Netherlands)

    Hummel, J. Marjan; Bridges, John; IJzerman, Maarten Joost

    2014-01-01

    The analytic hierarchy process (AHP) has been increasingly applied as a technique for multi-criteria decision analysis in healthcare. The AHP can aid decision makers in selecting the most valuable technology for patients, while taking into account multiple, and even conflicting, decision criteria.
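The mechanics of the AHP can be made concrete with a minimal sketch (illustrative numbers and the common geometric-mean approximation of the principal eigenvector, not the tutorial's own worked example):

```python
# Minimal AHP sketch: derive priority weights from a reciprocal
# pairwise-comparison matrix via the geometric-mean (row) approximation.
import math

def ahp_weights(matrix):
    """Normalised geometric-mean weights for a square reciprocal matrix."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Three decision criteria; criterion 0 is moderately preferred to 1
# and strongly preferred to 2 (Saaty's 1-9 scale).
pairwise = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

The resulting weights sum to one and rank the criteria; a full AHP analysis would also check the consistency ratio of the comparison matrix before using the weights.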

  9. Analytical solutions of nonlocal Poisson dielectric models with multiple point charges inside a dielectric sphere

    Science.gov (United States)

    Xie, Dexuan; Volkmer, Hans W.; Ying, Jinyong

    2016-04-01

    The nonlocal dielectric approach has led to new models and solvers for predicting electrostatics of proteins (and other biomolecules), but how to validate and compare them remains a challenge. To promote such a study, in this paper, two typical nonlocal dielectric models are revisited. Their analytical solutions are then found as simple series expansions for a dielectric sphere containing any number of point charges. As a special case, the analytical solution of the corresponding Poisson dielectric model is also derived in simple series form, which significantly improves the well-known Kirkwood double series expansion. Furthermore, a convolution of one nonlocal dielectric solution with a commonly used nonlocal kernel function is obtained, along with the reaction parts of these local and nonlocal solutions. To turn these new series solutions into a valuable research tool, they are programmed as a free Fortran software package, which can input point charge data directly from a Protein Data Bank file. Consequently, different validation tests can be quickly performed on different proteins. Finally, a test example for a protein with 488 atomic charges is reported to demonstrate the differences between the local and nonlocal models as well as the importance of using the reaction parts to develop local and nonlocal dielectric solvers.
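The flavor of the Kirkwood-type series can be illustrated by its simplest special case (notation assumed here for illustration, not taken from the paper): a single charge at the center of the dielectric sphere, for which the series collapses to a single Born-like term.

```latex
% A single charge q at the center of a sphere of radius a with interior
% permittivity \epsilon_p, embedded in a local solvent of permittivity
% \epsilon_s; the reaction potential inside the sphere is the constant
\varphi_{\mathrm{rxn}}(0)
  = \frac{q}{4\pi\epsilon_0\, a}
    \left( \frac{1}{\epsilon_s} - \frac{1}{\epsilon_p} \right)
% i.e., only the n = 0 term of the double series survives; higher-order
% terms appear only for off-center charges.
```

The multi-charge solutions of the paper generalize this by summing such terms over all charges and all orders of the expansion.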

  10. Proceedings of the 11. ENQA: Brazilian meeting on analytical chemistry. Challenges for analytical chemistry in the 21st century. Book of Abstracts

    International Nuclear Information System (INIS)

    2001-01-01

    The 11th National Meeting on Analytical Chemistry was held from 18 to 21 September 2001 at the Convention Center of UNICAMP, under the theme Challenges for Analytical Chemistry in the 21st Century. The meeting discussed the development of new methods and analytical tools needed to meet new challenges. The papers presented topics related to the different sub-areas of analytical chemistry, such as environmental chemistry; chemometric techniques; X-ray fluorescence analysis; spectroscopy; separation processes; electroanalytical chemistry; and others. Lectures on the Past and Future of Analytical Chemistry and on Ethics in Science were also included.

  11. Analytical studies related to Indian PHWR containment system performance

    International Nuclear Information System (INIS)

    Haware, S.K.; Markandeya, S.G.; Ghosh, A.K.; Kushwaha, H.S.; Venkat Raj, V.

    1998-01-01

    Build-up of pressure in a multi-compartment containment after a postulated accident, and the growth, transport, and removal of aerosols in the containment, are complex processes of vital importance in determining the source term. The release of hydrogen and its combustion increase the overpressure. In order to analyze these complex processes and to enable proper estimation of the source term, well-tested analytical tools are necessary. This paper gives a detailed account of the analytical tools developed or adapted for PSA level 2 studies. (author)

  12. Modeling of the Global Water Cycle - Analytical Models

    Science.gov (United States)

    Yongqiang Liu; Roni Avissar

    2005-01-01

    Both numerical and analytical models of coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions often difficult to interpret, analytical...

  13. Urban Health Indicator Tools of the Physical Environment: a Systematic Review.

    Science.gov (United States)

    Pineo, Helen; Glonti, Ketevan; Rutter, Harry; Zimmermann, Nici; Wilkinson, Paul; Davies, Michael

    2018-04-16

    Urban health indicator (UHI) tools provide evidence about the health impacts of the physical urban environment which can be used in built environment policy and decision-making. Where UHI tools provide data at the neighborhood (and lower) scale they can provide valuable information about health inequalities and environmental deprivation. This review performs a census of UHI tools and explores their nature and characteristics (including how they represent, simplify or address complex systems) to increase understanding of their potential use by municipal built environment policy and decision-makers. We searched seven bibliographic databases, four key journals and six practitioner websites and conducted Google searches between January 27, 2016 and February 24, 2016 for UHI tools. We extracted data from primary studies and online indicator systems. We included 198 documents which identified 145 UHI tools comprising 8006 indicators, from which we developed a taxonomy. Our taxonomy classifies the significant diversity of UHI tools with respect to topic, spatial scale, format, scope and purpose. The proportions of UHI tools which measure data at the neighborhood and lower scale, and present data via interactive maps, have both increased over time. This is particularly relevant to built environment policy and decision-makers, reflects growing analytical capability and offers the potential for improved understanding of the complexity of influences on urban health (an aspect noted as a particular challenge by some indicator producers). The relation between urban health indicators and health impacts attributable to modifiable environmental characteristics is often indirect. Furthermore, the use of UHI tools in policy and decision-making appears to be limited, thus raising questions about the continued development of such tools by multiple organisations duplicating scarce resources. Further research is needed to understand the requirements of built environment policy and

  14. Sustainability Tools Inventory Initial Gap Analysis

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  15. Locally analytic vectors in representations of locally

    CERN Document Server

    Emerton, Matthew J

    2017-01-01

    The goal of this memoir is to provide the foundations for the locally analytic representation theory that is required in three of the author's other papers on this topic. In the course of writing those papers the author found it useful to adopt a particular point of view on locally analytic representation theory: namely, regarding a locally analytic representation as being the inductive limit of its subspaces of analytic vectors (of various "radii of analyticity"). The author uses the analysis of these subspaces as one of the basic tools in his study of such representations. Thus in this memoir he presents a development of locally analytic representation theory built around this point of view. The author has made a deliberate effort to keep the exposition reasonably self-contained and hopes that this will be of some benefit to the reader.

  16. Medicaid Analytic eXtract (MAX) Chartbooks

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract Chartbooks are research tools and reference guides on Medicaid enrollees and their Medicaid experience in 2002 and 2004. Developed for...

  17. Capturing student mathematical engagement through differently enacted classroom practices: applying a modification of Watson's analytical tool

    Science.gov (United States)

    Patahuddin, Sitti Maesuri; Puteri, Indira; Lowrie, Tom; Logan, Tracy; Rika, Baiq

    2018-04-01

    This study examined student mathematical engagement through the intended and enacted lessons taught by two teachers in two different middle schools in Indonesia. The intended lesson was developed using the ELPSA learning design to promote mathematical engagement. Based on the premise that students react to mathematical tasks in words and actions, the analysis focused on identifying the types of mathematical engagement promoted through the intended lesson and performed by students during the lesson. Using a modified version of Watson's (2007) analytical tool, students' engagement was captured from what the participants did or said mathematically. We found that teachers' enacted practices influenced student mathematical engagement. The teacher who demonstrated content in explicit ways tended to limit the richness of the engagement, whereas the teacher who presented activities in an open-ended manner fostered engagement.

  18. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    Science.gov (United States)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82
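The kind of extraction quoted above can be sketched with plain regular expressions (a toy stand-in for illustration only; the project itself fuses Apache Tika, Solr, DeepDive, and D3):

```python
# Toy sketch: pull a measurement (spatial resolution) and a nearby
# accuracy figure out of unstructured abstract text with regexes.
import re

text = ("In this study, hyperspectral images with high spatial resolution "
        "(1 m) were analyzed to detect cutleaf teasel in two areas. "
        "Classification of cutleaf teasel reached a users accuracy of 82 to 84%.")

resolution = re.search(r"spatial resolution\s*\((\d+)\s*m\)", text)
accuracy = re.search(r"accuracy of (\d+(?:\s*to\s*\d+)?)%", text)
print(resolution.group(1), accuracy.group(1))  # 1 82 to 84
```

At scale, such patterns would be replaced by learned extractors and entity linking over the full document corpus, but the input/output shape of the problem is the same.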

  19. Recovering valuable liquid hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Pier, M

    1931-06-11

    A process for recovering valuable liquid hydrocarbons from coking coal, mineral coal, or oil shale through treatment with hydrogen under pressure at elevated temperature is described. Catalysts and grinding oil may be used in the process if necessary. The process provides for deashing the coal prior to hydrogenation and for preventing the coking and swelling of the deashed material. During the treatment with hydrogen, the coal is either mixed with coal low in bituminous material, such as lean coal or active coal, as a diluent or the bituminous constituents which cause the coking and swelling are removed by extraction with solvents. (BLM)

  20. Tool for Collaborative Autonomy, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Over the last 25 years, UAS have proven to be very valuable tools for performing a wide range of operations such as environmental disaster relief, search and rescue...

  1. Online Analytical Processing (OLAP): A Fast and Effective Data Mining Tool for Gene Expression Databases

    Directory of Open Access Journals (Sweden)

    Alkharouf Nadim W.

    2005-01-01

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.

  2. Online analytical processing (OLAP): a fast and effective data mining tool for gene expression databases.

    Science.gov (United States)

    Alkharouf, Nadim W; Jamison, D Curtis; Matthews, Benjamin F

    2005-06-30

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.
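A cube-style roll-up can be sketched in a few lines (illustrative made-up numbers and a dict-based aggregation, not the Analysis Services 2000 implementation): aggregate expression measurements along the gene or time-point dimension, as a cube browser would.

```python
# Dict-based sketch of an OLAP-style roll-up over expression "facts".
from collections import defaultdict

# (gene, timepoint, replicate, expression ratio) -- made-up values
facts = [
    ("cyp93a1", "6h", 1, 2.1), ("cyp93a1", "6h", 2, 1.9),
    ("cyp93a1", "12h", 1, 3.4),
    ("chs7",    "6h", 1, 0.8), ("chs7",   "12h", 1, 1.1),
]

def rollup(facts, dim):
    """Average the measure along one dimension (0 = gene, 1 = timepoint)."""
    groups = defaultdict(list)
    for row in facts:
        groups[row[dim]].append(row[3])
    return {k: sum(v) / len(v) for k, v in groups.items()}

by_gene = rollup(facts, 0)
by_time = rollup(facts, 1)
print(round(by_gene["cyp93a1"], 2))  # -> 2.47
```

In a real OLAP cube the same grouping is precomputed for every combination of dimensions, which is what makes slicing and dicing large expression databases fast.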

  3. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David; Wolverton, Michael J.; Bruce, Joseph R.; Burtner, Edwin R.; Endert, Alexander

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  4. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    Science.gov (United States)

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…
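A minimal clustering sketch in this spirit (assumed engagement scores and a toy 1-D k-means, not the study's method or data) shows the kind of analysis LA tools automate:

```python
# Tiny 1-D k-means (Lloyd's algorithm) grouping students by an
# engagement score; illustrative data only.
def kmeans_1d(xs, centers, iters=20):
    """Returns final centers and the cluster label of each point."""
    for _ in range(iters):
        labels = [min(range(len(centers)), key=lambda i: abs(x - centers[i]))
                  for x in xs]
        for i in range(len(centers)):
            members = [x for x, l in zip(xs, labels) if l == i]
            if members:
                centers[i] = sum(members) / len(members)
    return centers, labels

scores = [0.9, 1.1, 0.8, 4.0, 4.2, 3.9]   # hours of weekly LMS activity
centers, labels = kmeans_1d(scores, [0.0, 5.0])
print(labels)  # [0, 0, 0, 1, 1, 1]
```

Production LA pipelines would use a library implementation over many features per student, but the disposition-grouping idea is the same.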

  5. Hemodynamic exercise testing. A valuable tool in the selection of cardiac transplantation candidates.

    Science.gov (United States)

    Chomsky, D B; Lang, C C; Rayos, G H; Shyr, Y; Yeoh, T K; Pierson, R N; Davis, S F; Wilson, J R

    1996-12-15

    Peak exercise oxygen consumption (Vo2), a noninvasive index of peak exercise cardiac output (CO), is widely used to select candidates for heart transplantation. However, peak exercise Vo2 can be influenced by noncardiac factors such as deconditioning, motivation, or body composition and may yield misleading prognostic information. Direct measurement of the CO response to exercise may avoid this problem and more accurately predict prognosis. Hemodynamic and ventilatory responses to maximal treadmill exercise were measured in 185 ambulatory patients with chronic heart failure who had been referred for cardiac transplantation (mean left ventricular ejection fraction, 22 +/- 7%; mean peak Vo2, 12.9 +/- 3.0 mL.min-1.kg-1). The CO response to exercise was normal in 83 patients and reduced in 102. By univariate analysis, patients with normal CO responses had a better 1-year survival rate (95%) than did those with reduced CO responses (72%) (P < .0001). The 1-year survival rate of patients with a peak Vo2 of >14 mL.min-1.kg-1 (88%) was not significantly different from that of patients with a peak Vo2 of 10 to 14 mL.min-1.kg-1 (89%). By Cox regression analysis, exercise CO response was the strongest independent predictor of survival (risk ratio, 4.3), with peak Vo2 dichotomized at 10 mL.min-1.kg-1 (risk ratio, 3.3) as the only other independent predictor. Patients with reduced CO responses and a peak Vo2 of < or = 10 mL.min-1.kg-1 had an extremely poor 1-year survival rate (38%). Both the CO response to exercise and peak exercise Vo2 provide valuable independent prognostic information in ambulatory patients with heart failure. These variables should be used in combination to select potential heart transplantation candidates.

  6. An Educational Tool for Interactive Parallel and Distributed Processing

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2011-01-01

    In this paper we try to describe how the Modular Interactive Tiles System (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming. This is done by providing an educational hands-on tool that allows a change of representation of the abstract problems related to designing interactive parallel and distributed systems. Indeed, MITS seems to bring a series of goals into the education, such as parallel programming, distributedness, communication protocols, master dependency, software behavioral models, adaptive interactivity, feedback, connectivity, topology, island modeling, and user and multiuser interaction, which can hardly be found in other tools. Finally, we introduce the system of modular interactive tiles as a tool for easy, fast, and flexible hands-on exploration of these issues, and through examples show how to implement interactive…

  7. Ravens reconcile after aggressive conflicts with valuable partners.

    Science.gov (United States)

    Fraser, Orlaith N; Bugnyar, Thomas

    2011-03-25

    Reconciliation, a post-conflict affiliative interaction between former opponents, is an important mechanism for reducing the costs of aggressive conflict in primates and some other mammals as it may repair the opponents' relationship and reduce post-conflict distress. Opponents who share a valuable relationship are expected to be more likely to reconcile as for such partners the benefits of relationship repair should outweigh the risk of renewed aggression. In birds, however, post-conflict behavior has thus far been marked by an apparent absence of reconciliation, suggested to result either from differing avian and mammalian strategies or because birds may not share valuable relationships with partners with whom they engage in aggressive conflict. Here, we demonstrate the occurrence of reconciliation in a group of captive subadult ravens (Corvus corax) and show that it is more likely to occur after conflicts between partners who share a valuable relationship. Furthermore, former opponents were less likely to engage in renewed aggression following reconciliation, suggesting that reconciliation repairs damage caused to their relationship by the preceding conflict. Our findings suggest not only that primate-like valuable relationships exist outside the pair bond in birds, but that such partners may employ the same mechanisms in birds as in primates to ensure that the benefits afforded by their relationships are maintained even when conflicts of interest escalate into aggression. These results provide further support for a convergent evolution of social strategies in avian and mammalian species.

  8. Google Analytics – Index of Resources

    Science.gov (United States)

    Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.

  9. Applications of DHDECMP extraction chromatography to nuclear analytical chemistry

    International Nuclear Information System (INIS)

    Marsh, S.F.; Simi, O.R.

    1981-01-01

    Dihexyl-N,N-diethylcarbamylmethylenephosphonate (DHDECMP) is a highly selective extractant for actinides and lanthanides. This reagent, extensively studied for process-scale operations, also has valuable analytical applications. Extraction chromatographic columns of DHDECMP, supported on inert, porous, polymer beads effectively separate most metallic impurity elements from the retained inner transition elements. The retained elements can be separated into individual fractions of (1) lanthanides, (2) americium, (3) plutonium, and (4) uranium by mixed-solvent anion exchange

  10. Ultrasonic trap as analytical tool; Die Ultraschallfalle als analytisches Werkzeug

    Energy Technology Data Exchange (ETDEWEB)

    Leiterer, Jork

    2009-11-30

    nanoparticles. This comprises fields of research like biomineralisation, protein agglomeration, distance dependent effects of nanocrystalline quantum dots and the in situ observation of early crystallization stages. In summary, the results of this work open a broad area of application for the ultrasonic trap as an analytical tool. [Translated from German] The ultrasonic trap offers a special possibility for handling samples on the microliter scale. Acoustic levitation positions the sample contact-free in a gaseous environment, removing the influence of solid surfaces. In this work, the possibilities of the ultrasonic trap for use in analytics are investigated experimentally. By coupling it with typical contactless analytical methods such as spectroscopy and X-ray scattering, the advantages of this levitation technique are demonstrated on a variety of materials, ranging from inorganic, organic, and pharmaceutical substances to proteins, nano- and microparticles. It is shown that acoustic levitation reliably enables contact-free sample handling for spectroscopic methods (LIF, Raman) and, for the first time, for X-ray scattering (EDXD, SAXS, WAXS) and X-ray fluorescence (XRF, XANES) methods. For all of these methods the wall-less sample mounting proved advantageous: the results are comparable to those obtained with conventional sample holders and in some cases exceed them in data quality. A particular success is the integration of the acoustic levitator into the experimental set-ups of the measuring stations at the synchrotron. The use of the ultrasonic trap at BESSY was established within the scope of this work and currently forms the basis of intensive interdisciplinary research.
    In addition, the trap's potential for preconcentration was recognized and applied to the study of evaporation-controlled processes. The

  11. Analytical Methods for Biomass Characterization during Pretreatment and Bioconversion

    Energy Technology Data Exchange (ETDEWEB)

    Pu, Yunqiao [ORNL; Meng, Xianzhi [University of Tennessee, Knoxville (UTK); Yoo, Chang Geun; Li, Mi; Ragauskas, Arthur J [ORNL

    2016-01-01

    Lignocellulosic biomass has been introduced as a promising resource for alternative fuels and chemicals because of its abundance and its potential to complement petroleum resources. Biomass is a complex biopolymer, and its compositional and structural characteristics vary widely depending on species as well as growth environment. Because of this complexity and variety, understanding the physicochemical characteristics of biomass is key to its effective utilization. Characterization not only provides critical information about biomass during pretreatment and bioconversion, but also gives valuable insights into how to utilize it. For a better understanding of biomass characteristics, a good grasp and proper selection of analytical methods are necessary. This chapter introduces existing analytical approaches that are widely employed for biomass characterization during pretreatment and conversion processes. Diverse analytical methods using Fourier transform infrared (FTIR) spectroscopy, gel permeation chromatography (GPC), and nuclear magnetic resonance (NMR) spectroscopy for biomass characterization are reviewed. In addition, methods for assessing biomass accessibility by analyzing its surface properties are also summarized in this chapter.

  12. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  13. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase

  14. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    Science.gov (United States)

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool

  15. Analyticity and the Global Information Field

    Directory of Open Access Journals (Sweden)

    Evgeni A. Solov'ev

    2015-03-01

    The relation between analyticity in mathematics and the concept of a global information field in physics is reviewed. Mathematics is complete only in the complex plane. In the complex plane, a very powerful tool appears: analyticity. According to this property, if an analytic function is known on a countable set of points having an accumulation point, then it is known everywhere. This mysterious property has profound consequences in quantum physics. Analyticity allows one to obtain asymptotic (approximate) results in terms of some singular points in the complex plane which accumulate all necessary data on a given process. As an example, slow atomic collisions are presented, where the cross-sections of inelastic transitions are determined by branch points of the adiabatic energy surface at a complex internuclear distance. Common aspects of the non-local nature of analyticity and a recently introduced interpretation of classical electrodynamics and quantum physics as theories of a global information field are discussed.

  16. Predictive Analytics to Support Real-Time Management in Pathology Facilities.

    Science.gov (United States)

    Lessard, Lysanne; Michalowski, Wojtek; Chen Li, Wei; Amyot, Daniel; Halwani, Fawaz; Banerjee, Diponkar

    2016-01-01

    Predictive analytics can provide valuable support to the effective management of pathology facilities. The introduction of new tests and technologies in anatomical pathology will increase the volume of specimens to be processed, as well as the complexity of pathology processes. In order for predictive analytics to address managerial challenges associated with the volume and complexity increases, it is important to pinpoint the areas where pathology managers would most benefit from predictive capabilities. We illustrate common issues in managing pathology facilities with an analysis of the surgical specimen process at the Department of Pathology and Laboratory Medicine (DPLM) at The Ottawa Hospital, which processes all surgical specimens for the Eastern Ontario Regional Laboratory Association. We then show how predictive analytics could be used to support management. Our proposed approach can be generalized beyond the DPLM, contributing to a more effective management of pathology facilities and in turn to quicker clinical diagnoses.

  17. Storytelling: a leadership and educational tool.

    Science.gov (United States)

    Kowalski, Karren

    2015-06-01

    A powerful tool that leaders and educators can use to engage listeners, both staff and learners, is storytelling. Stories demonstrate important points, valuable lessons, and the behaviors preferred by the leader. Copyright 2015, SLACK Incorporated.

  18. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay

    Directory of Open Access Journals (Sweden)

    Miroslav Pohanka

    2015-06-01

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results' relevance.
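The color-channel analysis step described above can be sketched as follows. The pixel values, activities, and the linear blue-channel calibration are illustrative assumptions for the sketch, not the paper's actual data or model:

```python
from statistics import mean

def channel_means(pixels):
    """Average each color channel over the photographed strip region.

    pixels: iterable of (R, G, B) tuples, 0-255, as read from the
    smartphone image of the indoxylacetate strip (hypothetical input).
    """
    rs, gs, bs = zip(*pixels)
    return mean(rs), mean(gs), mean(bs)

def fit_line(xs, ys):
    """Ordinary least-squares slope/intercept for a calibration curve
    (channel value vs. known BChE activity)."""
    mx, my = mean(xs), mean(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical calibration: blue-channel intensity drops as indigo forms.
activities = [0.0, 1.0, 2.0, 3.0]          # BChE activity, arbitrary units
blue_vals  = [200.0, 170.0, 140.0, 110.0]  # mean blue-channel reading
slope, intercept = fit_line(blue_vals, activities)

# Hypothetical 2x2 patch of the strip photo (R, G, B values, 0-255).
patch = [(120, 90, 155), (118, 92, 154), (121, 91, 156), (119, 90, 155)]
_, _, unknown_blue = channel_means(patch)
predicted = slope * unknown_blue + intercept  # estimated BChE activity
```

A real workflow would average a larger region of interest and calibrate against the reference Ellman's assay rather than assumed standards.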

  19. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    Science.gov (United States)

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposed a rapid process development method, which used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool for studying the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were determined rapidly by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly, showing a decreased adsorption capacity with increasing flow rate. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.
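The reported agreement between the at-line DART-MS readings and the reference HPLC concentrations is a Pearson correlation coefficient; a minimal sketch of that comparison, using made-up paired readings rather than the paper's data:

```python
from math import sqrt
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired ginkgolide A readings (arbitrary units).
dart = [1.0, 2.1, 2.9, 4.2, 5.0]
hplc = [1.1, 2.0, 3.1, 4.0, 5.2]
r = pearson_r(dart, hplc)   # close to 1 for well-correlated methods
```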

  20. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    Science.gov (United States)

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

    The medical curriculum is the main tool representing the entire undergraduate medical education. Due to its complexity and multilayered structure, it is of limited use to teachers in medical education for quality improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course with teachers from an undergraduate medical program, applying visual analytics methods. We found that visual analytics can positively impact analytical reasoning and decision making in medical education by realizing variables capable of enhancing human perception and cognition of complex curriculum data. The positive results, derived from our evaluation of a medical curriculum at a small scale, signify the need to extend this method to an entire medical curriculum. As our approach sustains low levels of complexity, it opens a promising new direction in medical education informatics research.

  1. Nonlinear optics an analytical approach

    CERN Document Server

    Mandel, Paul

    2010-01-01

    Based on the author's extensive teaching experience and lecture notes, this textbook provides a substantially analytical rather than descriptive presentation of nonlinear optics. Divided into five parts, with most chapters corresponding to a two-hour lecture, the book begins with a unique account of the historical development from Kirchhoff's law for black-body radiation to Planck's quantum hypothesis and Einstein's discovery of spontaneous emission, providing all the explicit proofs. The subsequent sections deal with matter quantization, ultrashort pulse propagation in 2-level media, cavity nonlinear optics, and chi(2) and chi(3) media. Suitable for graduate and PhD students in nonlinear optics or photonics, the book also represents a valuable reference for researchers in these fields.

  2. Analytical characterization using surface-enhanced Raman scattering (SERS) and microfluidic sampling

    International Nuclear Information System (INIS)

    Wang, Chao; Yu, Chenxu

    2015-01-01

    With the rapid development of analytical techniques, it has become much easier to detect chemical and biological analytes, even at very low detection limits. In recent years, techniques based on vibrational spectroscopy, such as surface enhanced Raman spectroscopy (SERS), have been developed for non-destructive detection of pathogenic microorganisms. SERS is a highly sensitive analytical tool that can be used to characterize chemical and biological analytes interacting with SERS-active substrates. However, it has always been a challenge to obtain consistent and reproducible SERS spectroscopic results under complicated experimental conditions. Microfluidics, a tool for highly precise manipulation of small-volume liquid samples, can be used to overcome the major drawbacks of SERS-based techniques. High reproducibility of SERS measurements can be obtained in the continuous flow generated inside microfluidic devices. This article provides a thorough review of the principles, concepts and methods of SERS-microfluidic platforms, and the applications of such platforms in trace analysis of chemical and biological analytes. (topical review)

  3. Analytical Techniques in the Pharmaceutical Sciences

    DEFF Research Database (Denmark)

    Leurs, Ulrike; Mistarz, Ulrik Hvid; Rand, Kasper Dyrberg

    2016-01-01

    Mass spectrometry (MS) offers the capability to identify, characterize and quantify a target molecule in a complex sample matrix and has developed into a premier analytical tool in drug development science. Through specific MS-based workflows including customized sample preparation, coupling...

  4. The 'secureplan' bomb utility: A PC-based analytic tool for bomb defense

    International Nuclear Information System (INIS)

    Massa, R.J.

    1987-01-01

    This paper illustrates a recently developed, PC-based software system for simulating the effects of an infinite variety of hypothetical bomb blasts on structures and personnel in the immediate vicinity of such blasts. The system incorporates two basic rectangular geometries in which blast assessments can be made: an external configuration (highly vented) and an internal configuration (vented and unvented). A variety of explosives can be used; each is translated to an equivalent TNT weight. Provisions in the program account for bomb cases (person, satchel, case and vehicle), mixes of explosives and shrapnel aggregates, and detonation altitudes. The software permits architects, engineers, security personnel and facility managers, without specific knowledge of explosives, to incorporate realistic construction hardening, screening programs, barriers and stand-off provisions in the design and/or operation of diverse facilities. System outputs, generally represented as peak incident or reflected overpressure or impulses, are both graphic and analytic and integrate damage threshold data for common construction materials, including window glazing. The effects of bomb blasts on humans are estimated in terms of temporary and permanent hearing damage, lung damage (lethality) and whole-body translation injury. The software system has been used in the field in providing bomb defense services to a number of commercial clients since July of 1986. In addition to the design of siting, screening and hardening components of bomb defense programs, the software has proven very useful in post-incident analysis and repair scenarios and as a teaching tool for bomb defense training.
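The TNT-equivalence and stand-off reasoning mentioned above is commonly expressed through an equivalence factor and the Hopkinson-Cranz scaled distance. A sketch under that standard convention, with assumed factors rather than values from the paper:

```python
def tnt_equivalent(mass_kg, factor):
    """TNT-equivalent charge weight: actual explosive mass times an
    equivalence factor (roughly the ratio of the explosive's heat of
    detonation to that of TNT)."""
    return mass_kg * factor

def scaled_distance(standoff_m, tnt_mass_kg):
    """Hopkinson-Cranz scaled distance Z = R / W**(1/3), the usual input
    to empirical peak-overpressure curves."""
    return standoff_m / tnt_mass_kg ** (1.0 / 3.0)

# Illustrative equivalence factors (assumed, not from the paper).
FACTORS = {"TNT": 1.00, "C4": 1.34, "ANFO": 0.82}

w = tnt_equivalent(10.0, FACTORS["C4"])   # kg of TNT equivalent
z = scaled_distance(20.0, w)              # m / kg^(1/3)
```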

  5. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides

    International Nuclear Information System (INIS)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez

    2013-01-01

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for the application of corrections for differences in chemical composition, density and height of the samples analyzed. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained based on efficiency calibrations by Monte Carlo simulation using the DETEFF program.
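A toy illustration of Monte Carlo efficiency calibration in the spirit of DETEFF. This sketch treats only the geometric efficiency of an assumed on-axis point source above a disc detector; a real simulation would also transport photons through the sample, air and detector materials:

```python
import math
import random

def geometric_efficiency(n_events=200_000, det_radius=1.0,
                         source_height=2.0, seed=1):
    """Fraction of isotropically emitted photons from a point source on
    the axis of a disc detector that reach the disc (geometry only)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_events):
        cos_t = rng.uniform(-1.0, 1.0)   # isotropic polar direction
        if cos_t <= 0.0:                 # emitted away from the detector
            continue
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        r_at_plane = source_height * sin_t / cos_t
        if r_at_plane <= det_radius:
            hits += 1
    return hits / n_events

eff = geometric_efficiency()
# Analytic check: Omega / 4*pi = (1 - h / sqrt(h**2 + R**2)) / 2,
# which is about 0.0528 for h = 2, R = 1.
```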

  6. Applying Pragmatics Principles for Interaction with Visual Analytics.

    Science.gov (United States)

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  7. Born analytical or adopted over time? a study investigating if new analytical tools can ensure the survival of market oriented startups.

    OpenAIRE

    Skogen, Hege Janson; De la Cruz, Kai

    2017-01-01

    Masteroppgave(MSc) in Master of Science in Strategic Marketing Management - Handelshøyskolen BI, 2017 This study investigates whether the prevalence of technological advances within quantitative analytics moderates the effect market orientation has on firm performance, and if startups can take advantage of the potential opportunities to ensure their own survival. For this purpose, the authors review previous literature in marketing orientation, startups, marketing analytics, an...

  8. Average of delta: a new quality control tool for clinical laboratories.

    Science.gov (United States)

    Jones, Graham R D

    2016-01-01

    Average of normals is a tool used to control assay performance using the average of a series of results from patients' samples. Delta checking is a process of identifying errors in individual patient results by reviewing the difference from previous results of the same patient. This paper introduces a novel alternate approach, average of delta, which combines these concepts to use the average of a number of sequential delta values to identify changes in assay performance. Models for average of delta and average of normals were developed in a spreadsheet application. The model assessed the expected scatter of average of delta and average of normals functions and the effect of assay bias for different values of analytical imprecision and within- and between-subject biological variation and the number of samples included in the calculations. The final assessment was the number of patients' samples required to identify an added bias with 90% certainty. The model demonstrated that with larger numbers of delta values, the average of delta function was tighter (lower coefficient of variation). The optimal number of samples for bias detection with average of delta was likely to be between 5 and 20 for most settings and that average of delta outperformed average of normals when the within-subject biological variation was small relative to the between-subject variation. Average of delta provides a possible additional assay quality control tool which theoretical modelling predicts may be more valuable than average of normals for analytes where the group biological variation is wide compared with within-subject variation and where there is a high rate of repeat testing in the laboratory patient population. © The Author(s) 2015.
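The average-of-delta idea described above can be sketched as follows; the window size and sample data are illustrative stand-ins, not the paper's model:

```python
from statistics import mean

def deltas(paired_results):
    """Delta-check values: current minus previous result per patient.
    paired_results: list of (previous, current) tuples."""
    return [cur - prev for prev, cur in paired_results]

def average_of_delta(paired_results, window=10):
    """Mean of the most recent `window` delta values; a sustained drift
    away from zero suggests a change in assay performance."""
    return mean(deltas(paired_results)[-window:])

# Stable assay: deltas scatter around zero.
stable = [(5.0, 5.1), (6.2, 6.0), (4.8, 4.9), (5.5, 5.4), (6.0, 6.1)]
# After a +0.5 assay bias appears, every new result is shifted upward.
biased = [(prev, cur + 0.5) for prev, cur in stable]

drift_ok = average_of_delta(stable, window=5)      # near 0
drift_biased = average_of_delta(biased, window=5)  # near +0.5
```

In practice the optimal window (the paper suggests roughly 5 to 20 samples) trades detection speed against the scatter from biological variation.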

  9. An analytical approach to managing complex process problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramstad, Kari; Andersen, Espen; Rohde, Hans Christian; Tydal, Trine

    2006-03-15

    The oil companies are continuously investing time and money to ensure optimum regularity on their production facilities. High regularity increases profitability, reduces the workload on the offshore organisation and, most importantly, reduces discharges to air and sea. There are a number of mechanisms and tools available for achieving high regularity. Most of these are related to maintenance, system integrity, well operations and process conditions. However, all of these tools are only effective if quick and proper analysis of fluids and deposits is carried out. In fact, analytical backup is a powerful tool used to maintain optimised oil production, and should as such be given high priority. The present Operator (Hydro Oil and Energy) and Chemical Supplier (MI Production Chemicals) have developed a cooperation to ensure that analytical backup is provided efficiently to the offshore installations. The Operator's Research and Development (R and D) departments and the Chemical Supplier have complementary specialties in both personnel and equipment, and this is utilized to give the best possible service when required by production technologists or operations. In order for the Operator's Research departments, Health, Safety and Environment (HSE) departments and Operations to approve analytical work performed by the Chemical Supplier, a number of analytical tests are carried out following procedures agreed by both companies. In the present paper, three field case examples of analytical cooperation for managing process problems are presented: (1) deposition in a complex platform processing system; (2) contaminated production chemicals; (3) improved monitoring of scale inhibitor, suspended solids and ions. In each case the Research Centre, Operations and the Chemical Supplier have worked closely together to achieve fast solutions and Best Practice. (author)

  10. Are patient surveys valuable as a service-improvement tool in health services? An overview

    Directory of Open Access Journals (Sweden)

    Patwardhan A

    2012-05-01

    Anjali Patwardhan (Nationwide Children's Hospital, Columbus, OH, USA), Charles H Spencer (Ohio State University, Columbus, OH, USA). Abstract: Improving the quality of care in international health services was made a high priority in 1977. The World Health Assembly passed a resolution to greatly improve "Health for all" by the year 2000. Since 1977, the use of patient surveys for quality improvement has become a common practice in the health-care industry. The use of surveys reflects the concept that patient satisfaction is closely linked with that of organizational performance, which is in turn closely linked with organizational culture. This article is a review of the role of patient surveys as a quality-improvement tool in health care. The article explores the characteristics, types, merits, and pitfalls of various patient surveys, as well as the impact of their wide-ranging application in dissimilar scenarios to identify gaps in service provision. It is demonstrated that the conducting of patient surveys and using the results to improve the quality of care are two different processes. The value of patient surveys depends on the interplay between these two processes and several other factors that can influence the final outcome. The article also discusses the business aspect of the patient surveys in detail. Finally, the authors make future recommendations on how the patient survey tool can be best used to improve the quality of care in the health-care sector. Keywords: patient surveys, quality improvement, service gaps

  11. Fourier transform ion cyclotron resonance mass spectrometry: the analytical tool for heavy oil and bitumen characterization

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Thomas B.P; Brown, Melisa; Hsieh, Ben; Larter, Steve [Petroleum Reservoir Group (prg), Department of Geoscience, University of Calgary, Alberta (Canada)

    2011-07-01

    Fourier transform ion cyclotron resonance mass spectrometry (FTICRMS), developed in the 1970s by Marshall and Comisarow at the University of British Columbia, has become a commercially available tool capable of analyzing several hundred thousand components in a petroleum mixture at once. This analytical technology will probably usher in a dramatic revolution in geochemical capability, equal to or greater than the molecular revolution that occurred when GCMS technologies became cheaply available. The molecular resolution and information content given by FTICRMS petroleum analysis can be compared to the information in the human genome. With current GCMS-based petroleum geochemical protocols perhaps a few hundred components can be quantitatively determined, but with FTICRMS, 1000 times this number of components could possibly be resolved. However, fluid and other properties depend on the interaction of this multitude of hydrocarbon and non-hydrocarbon components, not the components themselves, and access to the full potential of this new petroleomics will depend on the definition of this interactome.

  12. Risk analysis of analytical validations by probabilistic modification of FMEA.

    Science.gov (United States)

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling the detection not only of technical risks, but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
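The probabilistic modification can be sketched by replacing the categorical occurrence and detection scores with relative frequencies; the failure modes and probabilities below are hypothetical examples, not values from the paper:

```python
def rpn(occurrence, detection, severity):
    """Traditional FMEA Risk Priority Number from categorical 1-10 scores."""
    return occurrence * detection * severity

def undetected_frequency(p_occurrence, p_nondetection):
    """Probabilistic variant: occurrence and non-detection as estimated
    relative frequencies; their product estimates how often the failure
    mode occurs and slips through undetected."""
    return p_occurrence * p_nondetection

# Hypothetical failure modes for an analytical procedure.
modes = {
    "wrong dilution":  (0.02, 0.10),   # occurs in 2% of runs, missed 10% of the time
    "mislabeled vial": (0.005, 0.50),
}

per_mode = {name: undetected_frequency(p_occ, p_miss)
            for name, (p_occ, p_miss) in modes.items()}
total = sum(per_mode.values())  # undetected-failure frequency of the whole procedure
```

Summing per-mode frequencies gives the quantitative procedure-level estimate the abstract mentions, which the categorical RPN cannot provide.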

  14. Rapid assessment as an evaluation tool for polio national ...

    African Journals Online (AJOL)

    Rapid assessment as an evaluation tool for polio national immunisation days in Brong Ahafo region, Ghana. ... TM Akande, M Eshetu, G Bonsu ... Conclusion: Rapid assessment is a valuable tool for evaluation of NIDs; it enables timely intervention in covering missed children and helps in careful interpretation of the usual ...

  15. An Educational Tool for Creating Distributed Physical Games

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Pagliarini, Luigi

    2011-01-01

    The development of physical interactive games demands extensive knowledge in engineering, computer science and gaming. In this paper we describe how the Modular Interactive Tiles System (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming for physical games development. This is done by providing an educational tool that allows a change of representation of the problems related to game design from a virtual to a physical representation. Indeed, MITS seems to be a valuable system for bringing into education a vast number of issues (such as parallel programming, distribution, communication protocols, master dependency, connectivity, topology, island modeling, software behavioral models, adaptive interactivity, feedback, user and multi-user game interaction, etc.). This can both improve the education-related issues in computer...

  16. A Web-Based Geovisual Analytical System for Climate Studies

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2012-12-01

    Climate studies involve petabytes of spatiotemporal datasets that are produced and archived at distributed computing resources. Scientists need an intuitive and convenient tool to explore the distributed spatiotemporal data. Geovisual analytical tools have the potential to provide such an intuitive and convenient method for scientists to access climate data, discover the relationships between various climate parameters, and communicate the results across different research communities. However, implementing a geovisual analytical tool for complex climate data in a distributed environment poses several challenges. This paper reports our research and development of a web-based geovisual analytical system to support the analysis of climate data generated by a climate model. Using the ModelE developed by the NASA Goddard Institute for Space Studies (GISS) as an example, we demonstrate that the system is able to (1) manage large volume datasets over the Internet; (2) visualize 2D/3D/4D spatiotemporal data; (3) broker various spatiotemporal statistical analyses for climate research; and (4) support interactive data analysis and knowledge discovery. This research also provides an example for managing, disseminating, and analyzing Big Data in the 21st century.

  17. Hanford performance evaluation program for Hanford site analytical services

    International Nuclear Information System (INIS)

    Markel, L.P.

    1995-09-01

    The U.S. Department of Energy (DOE) Order 5700.6C, Quality Assurance, and Title 10 of the Code of Federal Regulations, Part 830.120, Quality Assurance Requirements, state that it is the responsibility of DOE contractors to ensure that "quality is achieved and maintained by those who have been assigned the responsibility for performing the work." The Hanford Analytical Services Quality Assurance Plan (HASQAP) is designed to meet the needs of the Richland Operations Office (RL) for maintaining a consistent level of quality for the analytical chemistry services provided by contractor and commercial analytical laboratory operations. Therefore, services supporting Hanford environmental monitoring, environmental restoration, and waste management analytical services shall meet appropriate quality standards. This performance evaluation program will monitor the quality standards of all analytical laboratories supporting the Hanford Site, including on-site and off-site laboratories. The monitoring and evaluation of laboratory performance can be completed by the use of several tools. This program will discuss the tools that will be utilized for laboratory performance evaluations. Revision 0 will primarily focus on presently available programs using readily available performance evaluation materials provided by DOE, EPA or commercial sources. Discussion of project-specific PE materials and evaluations will be described in Section 9.0 and Appendix A.

  18. Analytical chemistry of actinides

    International Nuclear Information System (INIS)

    Chollet, H.; Marty, P.

    2001-01-01

    Different characterization methods specifically applied to the actinides are presented in this review, such as ICP/OES (inductively coupled plasma-optical emission spectrometry), ICP/MS (inductively coupled plasma-mass spectrometry), TIMS (thermal ionization mass spectrometry) and GD/OES (glow discharge optical emission spectrometry). Molecular absorption spectrometry and capillary electrophoresis are also available to complete the excellent range of analytical tools at our disposal. (authors)

  19. Analytical tools for thermal infrared engineering: a thermal sensor simulation package

    Science.gov (United States)

    Jaggi, Sandeep

    1992-09-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration. To perform system design trade-offs and analysis, and to establish system parameters, ASDL has developed a software package for the analytical simulation of sensor systems. This package, called Analytical Tools for Thermal InfraRed Engineering (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as SNR, NER, NETD, etc. This paper describes the uses of the package and the physics used to derive the performance parameters. In addition, ATTIRE can be used as a tutorial for understanding the distribution of thermal flux or solar irradiance over selected bandwidths of the spectrum. This spectrally distributed incident flux can then be analyzed as it propagates through the subsystems that constitute the entire sensor. ATTIRE provides a variety of functions ranging from plotting black-body curves for varying bandwidths and computing the integral flux, to performing transfer-function analysis of the sensor system. The package runs from a menu-driven interface in a PC-DOS environment. Each subsystem of the sensor is represented by windows and icons. A user-friendly mouse-controlled point-and-click interface allows the user to simulate various aspects of a sensor. The package can simulate a theoretical sensor system. Trade-off studies can easily be done by changing the appropriate parameters and monitoring the effect on system performance. The package can provide plots of system performance versus any system parameter. A parameter (such as the entrance aperture of the optics) can be varied and its effect on another parameter (e.g., NETD) can be plotted. A third parameter (e.g., the
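The black-body band-flux computation described above can be illustrated with Planck's law and simple numerical integration. This is a generic sketch of the physics, not ATTIRE's code; the 8-12 micrometre band and 300 K scene temperature are chosen as a typical thermal-infrared example:

```python
import math

# Physical constants (SI)
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
KB = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Planck spectral radiance B(lambda, T) in W * m^-2 * sr^-1 * m^-1."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = math.expm1(H * C / (wavelength_m * KB * temp_k))
    return a / b

def band_radiance(lo_m, hi_m, temp_k, steps=10_000):
    """Trapezoid-rule integral of the spectral radiance over a band."""
    dx = (hi_m - lo_m) / steps
    total = 0.5 * (planck_radiance(lo_m, temp_k) + planck_radiance(hi_m, temp_k))
    for i in range(1, steps):
        total += planck_radiance(lo_m + i * dx, temp_k)
    return total * dx

# In-band radiance of a 300 K scene over the 8-12 micrometre window,
# roughly a quarter of the total black-body radiance at that temperature.
L_band = band_radiance(8e-6, 12e-6, 300.0)
```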

  20. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    Energy Technology Data Exchange (ETDEWEB)

    Irsyad, M. Indra al [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Ministry of Energy and Mineral Resources, Jakarta (Indonesia); Halog, Anthony Basco, E-mail: a.halog@uq.edu.au [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Nepal, Rabindra [Massey Business School, Massey University, Palmerston North (New Zealand); Koesrindartoto, Deddy P. [School of Business and Management, Institut Teknologi Bandung, Bandung (Indonesia)

    2017-12-20

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and informal economies are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of systems thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  1. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    International Nuclear Information System (INIS)

    Irsyad, M. Indra al; Halog, Anthony Basco; Nepal, Rabindra; Koesrindartoto, Deddy P.

    2017-01-01

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and informal economies are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of systems thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  2. Learning analytics as a tool for closing the assessment loop in higher education

    OpenAIRE

    Karen D. Mattingly; Margaret C. Rice; Zane L. Berge

    2012-01-01

    This paper examines learning and academic analytics and its relevance to distance education in undergraduate and graduate programs as it impacts students and teaching faculty, and also academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental process and program curriculum. Learning and academic analytics in higher education is used to predict student success by examining how and wha...

  3. Column-Oriented Databases, an Alternative for Analytical Environment

    Directory of Open Access Journals (Sweden)

    Gheorghe MATEI

    2010-12-01

    Full Text Available It is widely accepted that a data warehouse is the central place of a Business Intelligence system. It stores all data that is relevant for the company, data that is acquired from both internal and external sources. Such a repository stores data from more years than a transactional system can, and offers valuable information to its users to make the best decisions, based on accurate and reliable data. As the volume of data stored in an enterprise data warehouse becomes larger and larger, new approaches are needed to make the analytical system more efficient. This paper presents column-oriented databases, which are considered an element of the new generation of DBMS technology. The paper emphasizes the need for and the advantages of these databases in an analytical environment and makes a short presentation of two DBMSs built with a columnar approach.
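
    A toy sketch makes the analytical advantage of the columnar layout concrete: an aggregate query only has to scan the columns it touches, instead of every field of every record. The data and query below are invented for illustration:

```python
# Row-oriented layout: each record is stored together, so an aggregate
# over one attribute still touches every field of every row.
rows = [
    {"id": 1, "region": "EU", "amount": 120.0},
    {"id": 2, "region": "US", "amount": 80.0},
    {"id": 3, "region": "EU", "amount": 200.0},
]

# Column-oriented layout: each attribute is stored contiguously, so an
# analytical scan reads only the columns the query needs.
columns = {
    "id": [1, 2, 3],
    "region": ["EU", "US", "EU"],
    "amount": [120.0, 80.0, 200.0],
}

# The same query against both layouts: total amount for the EU region.
row_total = sum(r["amount"] for r in rows if r["region"] == "EU")
col_total = sum(a for reg, a in zip(columns["region"], columns["amount"])
                if reg == "EU")
assert row_total == col_total == 320.0
```

    In a real column store the contiguous columns additionally compress well and vectorize well, which is where most of the speedup comes from.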

  4. Spectroscopy applied to feed additives of the European Union Reference Laboratory: a valuable tool for traceability.

    Science.gov (United States)

    Omar, Jone; Slowikowski, Boleslaw; Boix, Ana; von Holst, Christoph

    2017-08-01

    Feed additives need to be authorised to be placed on the market according to Regulation (EU) No. 1831/2003. Next to laying down the procedural requirements, the regulation creates the European Union Reference Laboratory for Feed Additives (EURL-FA) and requires that applicants send samples to the EURL-FA. Once authorised, the characteristics of the marketed feed additives should correspond to those deposited in the sample bank of the EURL-FA. For this purpose, the submitted samples were subjected to near-infrared (NIR) and Raman spectroscopy for spectral characterisation. These techniques have the valuable potential of characterising the feed additives in a non-destructive manner without any complicated sample preparation. This paper describes the capability of spectroscopy for a rapid characterisation of products to establish whether specific authorisation criteria are met. This study is based on the analysis of feed additive samples from different categories and functional groups, namely products containing (1) selenium, (2) zinc and manganese, (3) vitamins and (4) essential oils such as oregano and thyme oil. The use of chemometrics turned out to be crucial, especially in cases where the differentiation of spectra by visual inspection was very difficult.

  5. Training the next generation analyst using red cell analytics

    Science.gov (United States)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the areas of security, risk and intelligence training.

  6. The Vicinity of Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    Program documentation plays a vital role in almost all programming processes. Program documentation flows between separate tools of a modularized environment, and in between the components of an integrated development environment as well. In this paper we discuss the flow of program documentation between program development tools. In the central part of the paper we introduce a mapping of documentation flow between program development tools. In addition we discuss a set of locally developed tools which are related to program documentation. The use of test cases as examples in an interface documentation tool is a noteworthy and valuable contribution to the documentation flow. As an additional contribution we identify several circular relationships which illustrate feedback of documentation to the program editor from other tools in the development environment.

  7. Online Programs as Tools to Improve Parenting: A meta-analytic review

    NARCIS (Netherlands)

    Prof. Dr. Jo J.M.A Hermanns; Prof. Dr. Ruben R.G. Fukkink; dr. Christa C.C. Nieuwboer

    2013-01-01

    Background. A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting interventions

  8. Online programs as tools to improve parenting: A meta-analytic review

    NARCIS (Netherlands)

    Nieuwboer, C.C.; Fukkink, R.G.; Hermanns, J.M.A.

    2013-01-01

    Background: A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting

  9. Switchgrass a valuable biomass crop for energy

    CERN Document Server

    2012-01-01

    The demand for renewable energy is growing steadily, both from policy and from industry, which seeks environmentally friendly feedstocks. The recent policies enacted by the EU, USA and other industrialized countries foresee an increased interest in the cultivation of energy crops; there is clear evidence that switchgrass is one of the most promising biomass crops for energy production and the bio-based economy and compounds. Switchgrass: A Valuable Biomass Crop for Energy provides a comprehensive guide to switchgrass in terms of agricultural practices, potential uses and markets, and environmental and social benefits. Considering this potential energy source from its biology, breeding and crop physiology to its growth and management to the economic, social and environmental impacts, Switchgrass: A Valuable Biomass Crop for Energy brings together chapters from a range of experts in the field, including a foreword from Kenneth P. Vogel, to collect and present the environmental benefits and characteristics of this a ...

  10. Supporting interactive visual analytics of energy behavior in buildings through affine visualizations

    DEFF Research Database (Denmark)

    Nielsen, Matthias; Brewer, Robert S.; Grønbæk, Kaj

    2016-01-01

    Domain experts dealing with big data are typically not familiar with advanced data mining tools. This especially holds true for domain experts within energy management. In this paper, we introduce a visual analytics approach that empowers such users to visually analyze energy behavior based......Viz, that interactively maps data from real world buildings. It is an overview+detail interactive visual analytics tool supporting both rapid ad hoc explorations and structured evaluation of hypotheses about patterns and anomalies in resource consumption data mixed with occupant survey data. We have evaluated the approach with five domain experts within energy management, and further with 10 data analytics experts, and found that it was easily attainable and that it supported visual analysis of mixed consumption and survey data. Finally, we discuss future perspectives of affine visual analytics for mixed...

  11. Valuable human capital: the aging health care worker.

    Science.gov (United States)

    Collins, Sandra K; Collins, Kevin S

    2006-01-01

    With the workforce growing older and the supply of younger workers diminishing, it is critical for health care managers to understand the factors necessary to capitalize on their vintage employees. Retaining this segment of the workforce has a multitude of benefits including the preservation of valuable intellectual capital, which is necessary to ensure that health care organizations maintain their competitive advantage in the consumer-driven market. Retaining the aging employee is possible if health care managers learn the motivators and training differences associated with this category of the workforce. These employees should be considered a valuable resource of human capital because without their extensive expertise, intense loyalty and work ethic, and superior customer service skills, health care organizations could suffer severe economic repercussions in the near future.

  12. Web analytics as tool for improvement of website taxonomies

    DEFF Research Database (Denmark)

    Jonasen, Tanja Svarre; Ådland, Marit Kristine; Lykke, Marianne

    The poster examines how web analytics can be used to provide information about users and inform design and redesign of taxonomies. It uses a case study of the website Cancer.dk by the Danish Cancer Society. The society is a private organization with an overall goal to prevent the development...... provides information about e.g. subjects of interest, searching behaviour, browsing patterns in website structure as well as tag clouds, page views. The poster discusses benefits and challenges of the two web metrics, with a model of how to use search and tag data for the design of taxonomies, e.g. choice...

  13. Constraint-Referenced Analytics of Algebra Learning

    Science.gov (United States)

    Sutherland, Scot M.; White, Tobin F.

    2016-01-01

    The development of the constraint-referenced analytics tool for monitoring algebra learning activities presented here came from the desire to firstly, take a more quantitative look at student responses in collaborative algebra activities, and secondly, to situate those activities in a more traditional introductory algebra setting focusing on…

  14. Energy threat to valuable land

    International Nuclear Information System (INIS)

    Caufield, C.

    1982-01-01

    Having considered the varying estimates of future UK energy requirements which have been made, the impact on the environment arising from the use of valuable sites for energy production is examined. It is shown that energy installations of all kinds clash with areas of natural beauty or ecological importance. As an example, a recent investigation of potential sites for nuclear power stations found that most of them were on or next to sites of special scientific interest, and other areas officially designated to be regarded as special or to be protected in some way. (U.K.)

  15. Machine learning and predictive data analytics enabling metrology and process control in IC fabrication

    Science.gov (United States)

    Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.

    2015-03-01

    Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV and DSA), device architectures (FinFET, nanowire, graphene) and patterning scale (a few nanometers). These changes require tight controls on processes and measurements to achieve the required device performance, and challenge metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models but can be relatively easily learned by computing machines and used to predict or extrapolate. This paper introduces the predictive metrology approach, which has been applied to three different applications. Machine learning and predictive analytics have been leveraged to accurately predict dimensions of EUV resist patterns down to 18 nm half pitch by leveraging resist shrinkage patterns. These patterns could not be directly and accurately measured due to metrology tool limitations. Machine learning has also been applied to predict the electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As the wafer goes through various processes, its associated cost multiplies. It may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can be very valuable in enabling timely actionable decisions such as rework, scrap, or feeding forward or back predicted information, or information derived from prediction, to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.
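
    As an illustration of the predictive idea only (the authors' models and fab data are not public), the sketch below fits an ordinary least-squares line to hypothetical dose/CD calibration pairs and then predicts a dimension at a setting that was never measured:

```python
# Ordinary least-squares fit y = b0 + b1*x, standing in for a trained
# predictive-metrology model.  All numbers are invented for illustration.
def fit_line(xs, ys):
    """Return intercept b0 and slope b1 of the least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    b0 = my - b1 * mx
    return b0, b1

# Hypothetical calibration set: exposure dose (mJ/cm^2) vs measured CD (nm);
# the CD shrinks as the dose grows.
dose = [20.0, 22.0, 24.0, 26.0, 28.0]
cd = [21.0, 20.0, 19.1, 18.0, 17.1]

b0, b1 = fit_line(dose, cd)
predicted_cd = b0 + b1 * 25.0   # predict CD at an unmeasured dose
print(round(predicted_cd, 2))
```

    Real predictive metrology replaces this single-predictor line with multivariate, often nonlinear, learned models, but the workflow (calibrate on measured pairs, predict where direct measurement is impractical) is the same.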

  16. Risk analysis of analytical validations by probabilistic modification of FMEA

    DEFF Research Database (Denmark)

    Barends, D.M.; Oldenhof, M.T.; Vredenbregt, M.J.

    2012-01-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence
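
    The contrast between the categorical RPN and a probabilistic scoring can be sketched as follows. The failure modes and numbers are invented, and the probabilistic formula here is our own illustration of the idea (relative frequencies for occurrence and detection, categorical severity), not the authors' exact formulation:

```python
# Traditional FMEA: three categorical 1-10 scores multiplied into an RPN.
def rpn(occurrence, detection, severity):
    return occurrence * detection * severity

# Probabilistic variant: occurrence and non-detection become estimated
# relative frequencies (probabilities); severity stays a categorical score.
def probabilistic_priority(p_occurrence, p_not_detected, severity):
    return p_occurrence * p_not_detected * severity

# Hypothetical failure modes of an NIR screening procedure:
# (probability of occurrence, probability the failure goes undetected, severity)
failure_modes = {
    "wrong NIR reference spectrum": (0.02, 0.10, 8),
    "operator transcription error": (0.05, 0.30, 5),
}

# Rank failure modes by the probabilistic priority, highest first.
ranked = sorted(failure_modes.items(),
                key=lambda kv: probabilistic_priority(*kv[1]),
                reverse=True)
for name, (p_occ, p_miss, sev) in ranked:
    print(name, probabilistic_priority(p_occ, p_miss, sev))
```

    Note how the ranking can differ from a categorical RPN: a frequent but low-severity human error may outrank a rare technical failure once realistic frequencies replace 1-10 category scores.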

  17. Parasites as valuable stock markers for fisheries in Australasia, East Asia and the Pacific Islands.

    Science.gov (United States)

    Lester, R J G; Moore, B R

    2015-01-01

    Over 30 studies in Australasia, East Asia and the Pacific Islands region have collected and analysed parasite data to determine the ranges of individual fish, many leading to conclusions about stock delineation. Parasites used as biological tags have included both those known to have long residence times in the fish and those thought to be relatively transient. In many cases the parasitological conclusions have been supported by other methods especially analysis of the chemical constituents of otoliths, and to a lesser extent, genetic data. In analysing parasite data, authors have applied multiple different statistical methodologies, including summary statistics, and univariate and multivariate approaches. Recently, a growing number of researchers have found non-parametric methods, such as analysis of similarities and cluster analysis, to be valuable. Future studies into the residence times, life cycles and geographical distributions of parasites together with more robust analytical methods will yield much important information to clarify stock structures in the area.

  18. Big data analytics in immunology: a knowledge-based approach.

    Science.gov (United States)

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  19. Big Data Analytics in Immunology: A Knowledge-Based Approach

    Directory of Open Access Journals (Sweden)

    Guang Lan Zhang

    2014-01-01

    Full Text Available With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.

  20. Nuclear analytical techniques in Cuban Sugar Industry

    International Nuclear Information System (INIS)

    Diaz Riso, O.; Griffith Martinez, J.

    1996-01-01

    This paper reviews the applications of nuclear analytical techniques in the Cuban sugar industry. The most complete elemental composition of final molasses (34 elements) and natural zeolites (38 elements), the latter employed as an auxiliary agent in sugar technological processes, has been determined by means of Instrumental Neutron Activation Analysis (INAA) and X-Ray Fluorescence Analysis (XRFA). The trace-element relationships in the sugar cane soil-plant system and the elemental composition of different types of Cuban sugar (raw, blanco directo and refined) were also studied. As a result, valuable information is given on the possibilities of using these products in animal and human foodstuffs as well as in other applications.

  1. Visualizing Cloud Properties and Satellite Imagery: A Tool for Visualization and Information Integration

    Science.gov (United States)

    Chee, T.; Nguyen, L.; Smith, W. L., Jr.; Spangenberg, D.; Palikonda, R.; Bedka, K. M.; Minnis, P.; Thieman, M. M.; Nordeen, M.

    2017-12-01

    Providing public access to research products, including cloud macro- and microphysical properties and satellite imagery, is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a web-based visualization tool and API that allow end users to easily create customized cloud product and satellite imagery, ground site data and satellite ground track information that is generated dynamically. The tool has two uses: one to visualize the dynamically created imagery, and the other to provide access to the dynamically generated imagery directly at a later time. Internally, we leverage our practical experience with large, scalable application practices to develop a system that has the largest potential for scalability as well as the ability to be deployed on the cloud to accommodate scalability issues. We build upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product information, satellite imagery, ground site data and satellite track information accessible and easily searchable. This tool is the culmination of our prior experience with dynamic imagery generation and provides a way to build a "mash-up" of dynamically generated imagery and related kinds of information that are visualized together to add value to disparate but related information. In support of NASA strategic goals, our group aims to make as much scientific knowledge, observations and products as possible available to the citizen science, research and interested communities, as well as to automated systems that acquire the same information for data mining or other analytic purposes. This tool and the underlying APIs provide a valuable research tool to a wide audience, both as a standalone research tool and as an easily accessed data source that can be mined or used with existing tools.

  2. Development of the Operational Events Groups Ranking Tool

    International Nuclear Information System (INIS)

    Simic, Zdenko; Banov, Reni

    2014-01-01

    Both because of complexity and ageing, facilities like nuclear power plants require feedback from operating experience in order to further improve safety and operational performance. That is the reason why significant effort is dedicated to operating experience feedback. This paper describes the specification and development of a software tool for ranking operational events. A robust and consistent way of selecting the most important events for detailed investigation is important because it is not feasible, or even useful, to investigate all of them. Development of the tool is based on comprehensive event characterisation and methodical prioritization. This includes a rich set of event parameters which allow top-level preliminary analysis, different ways of grouping, and even evaluation of uncertainty propagation to the ranking results. One distinct feature of the implemented method is that the user (i.e., an expert) can determine how important a particular ranking parameter is, based on pairwise comparisons. To demonstrate the tool and its usability, a sample database was also created: the whole set of events for a five-year period was selected and characterised. Based on the preliminary results, this tool seems valuable for a new preliminary perspective on the data as a whole, and especially for the identification of event groups which should have priority in more detailed assessment. The results consist of different informative views on the importance of event groups and the related sensitivity and uncertainty results. This presents a valuable tool for improving the overall picture of specific operating experience and also for helping to identify the most important event groups for further assessment. It is clear that completeness and consistency of the input data characterisation are very important to obtain a full and valuable importance ranking.
    Method and tool development described in this paper is part of a continuous effort of
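
    The pairwise-comparison weighting of ranking parameters can be illustrated with the geometric-mean method familiar from the Analytic Hierarchy Process (AHP); the paper does not publish its exact scheme, and the parameters and expert judgments below are hypothetical:

```python
from math import prod

def weights_from_pairwise(matrix):
    """Geometric-mean (AHP-style) weights from a pairwise comparison matrix."""
    n = len(matrix)
    geo = [prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(geo)
    return [g / total for g in geo]                   # normalize to sum to 1

# a[i][j] = how much more important parameter i is judged to be than j.
# Three hypothetical ranking parameters: safety relevance, frequency, cost.
a = [
    [1.0,     3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 0.5, 1.0],
]

w = weights_from_pairwise(a)
print([round(x, 3) for x in w])   # weights sum to 1; safety dominates
```

    The geometric-mean rule is a standard approximation to the principal-eigenvector weights and is easy for an expert to audit: each weight is just the normalized geometric mean of that parameter's row of judgments.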

  3. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  4. Getting started with Greenplum for big data analytics

    CERN Document Server

    Gollapudi, Sunila

    2013-01-01

    Standard tutorial-based approach. "Getting Started with Greenplum for Big Data Analytics" is great for data scientists and data analysts with a basic knowledge of Data Warehousing and Business Intelligence platforms who are new to Big Data and who are looking to get a good grounding in how to use the Greenplum Platform. It's assumed that you will have some experience with database design and programming as well as be familiar with analytics tools like R and Weka.

  5. European multicenter analytical evaluation of the Abbott ARCHITECT STAT high sensitive troponin I immunoassay.

    Science.gov (United States)

    Krintus, Magdalena; Kozinski, Marek; Boudry, Pascal; Capell, Nuria Estañ; Köller, Ursula; Lackner, Karl; Lefèvre, Guillaume; Lennartz, Lieselotte; Lotz, Johannes; Herranz, Antonio Mora; Nybo, Mads; Plebani, Mario; Sandberg, Maria B; Schratzberger, Wolfgang; Shih, Jessie; Skadberg, Øyvind; Chargui, Ahmed Taoufik; Zaninotto, Martina; Sypniewska, Grazyna

    2014-11-01

    International recommendations highlight the superior value of cardiac troponins (cTns) for early diagnosis of myocardial infarction, along with analytical requirements of improved precision and detectability. In this multicenter study, we investigated the analytical performance of a new high sensitive cardiac troponin I (hs-cTnI) assay and its 99th percentile upper reference limit (URL). Laboratories from nine European countries evaluated the ARCHITECT STAT high sensitive troponin I (hs-TnI) immunoassay on the ARCHITECT i2000SR/i1000SR immunoanalyzers. Imprecision, limit of blank (LoB), limit of detection (LoD), limit of quantitation (LoQ), linearity of dilution, interferences, sample type, method comparisons, and 99th percentile URLs were evaluated in this study. Total imprecision of 3.3%-8.9%, 2.0%-3.5% and 1.5%-5.2% was determined for the low, medium and high controls, respectively. The lowest cTnI concentration corresponding to a total CV of 10% was 5.6 ng/L. Common interferences, sample dilution and carryover did not affect the hs-cTnI results. Slight, but statistically significant, differences with sample type were found. Concordance between the investigated hs-cTnI assay and a contemporary cTnI assay at the 99th percentile cut-off was found to be 95%. TnI was detectable in 75% and 57% of the apparently healthy population using the lower (1.1 ng/L) and upper (1.9 ng/L) limits of the LoD range provided by the ARCHITECT STAT hs-TnI package insert, respectively. The 99th percentile values were gender dependent. The new ARCHITECT STAT hs-TnI assay, with improved analytical features, meets the criteria of a high-sensitivity Tn test and will be a valuable diagnostic tool.
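
    The reported 10% CV concentration is typically read off a precision profile (measured CV as a function of concentration). The sketch below interpolates such a profile log-linearly between bracketing points; the profile values are invented for illustration, not the study's data:

```python
import math

# Hypothetical precision profile: (concentration in ng/L, total CV in %).
# CV falls as concentration rises, as is typical for immunoassays.
profile = [
    (2.0, 18.0),
    (5.0, 11.0),
    (10.0, 7.0),
    (40.0, 4.0),
]

def conc_at_cv(target_cv, profile):
    """Concentration where CV crosses target_cv, by log-linear interpolation
    between the bracketing profile points."""
    for (c1, v1), (c2, v2) in zip(profile, profile[1:]):
        if v1 >= target_cv >= v2:
            f = (v1 - target_cv) / (v1 - v2)
            return math.exp(math.log(c1) + f * (math.log(c2) - math.log(c1)))
    raise ValueError("target CV outside profile range")

# Functional-sensitivity style readout: concentration at 10% total CV.
print(round(conc_at_cv(10.0, profile), 1))
```

    With this invented profile the 10% CV point lands near 6 ng/L; with the assay's actual precision data the same construction would yield the study's 5.6 ng/L figure.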

  6. Customer Intelligence Analytics on Social Networks

    Directory of Open Access Journals (Sweden)

    Brano MARKIĆ

    2016-08-01

    Discovering the needs, habits and behavior of consumers is the primary task of marketing analytics. It requires integrating marketing and analytical skills with IT skills. Such knowledge integration allows access to data (structured and unstructured), their analysis, and the discovery of information about the opinions, attitudes, needs and behavior of customers. The paper sets out the hypothesis that software tools can collect data (messages) from social networks, analyze the content of those messages and learn the attitudes of customers toward a product, service or tourist destination, with the ultimate goal of improving customer relations. Experimental results are based on analysis of the content of the social network Facebook using packages and functions of the R language, which proved a satisfactory and powerful environment for analyzing textual data from social networks for marketing analytics.
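The content-analysis step described above (done in R by the authors) can be illustrated with a minimal lexicon-based sentiment scorer; the word lists and messages below are invented for illustration, not data from the study:

```python
# Tiny, hand-made sentiment lexicons (illustrative only)
POSITIVE = {"great", "love", "excellent", "good", "beautiful"}
NEGATIVE = {"bad", "poor", "terrible", "dirty", "slow"}

def sentiment_score(message: str) -> int:
    """Positive-word count minus negative-word count (no punctuation
    stripping or stemming; a real pipeline would add both)."""
    words = message.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Hypothetical customer messages about a tourist destination
reviews = [
    "Great hotel, love the beautiful beach",
    "Terrible service and dirty rooms",
]
scores = [sentiment_score(m) for m in reviews]
```

Aggregating such scores over all collected messages gives a crude picture of customer attitudes toward the product or destination.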

  7. Data Mining Tools in Science Education

    OpenAIRE

    Premysl Zaskodny

    2012-01-01

    The main principle of the paper is Data Mining in Science Education (DMSE) as problem solving. Its main goal is the delimitation of a complex data mining tool and a partial data mining tool of DMSE. The paper proceeds through data preprocessing in science education, data processing in science education, a description of the curricular process as a complex data mining tool (CP-DMSE), a description of analytical synthetic modeling as a partial data mining tool (ASM-DMSE) and finally...

  8. Preparing valuable hydrocarbons by hydrogenation

    Energy Technology Data Exchange (ETDEWEB)

    Pier, M

    1930-08-22

    A process is described for the preparation of valuable hydrocarbons by treatment of carbonaceous materials such as coal, tars, mineral oils, and their distillation and conversion products, and for the refining of the liquid hydrocarbon mixtures obtained, at raised temperature and under pressure, preferably in the presence of catalysts, using hydrogen-containing gases purified after being obtained by distilling solid fuels. The process is characterized by purification of the hydrogen-containing gases, for the practically complete removal of oxygen, by heating at ordinary or higher pressure in the presence of a catalyst containing silver and oxides of metals of group VI of the periodic system.

  9. Real-time imaging as an emerging process analytical technology tool for monitoring of fluid bed coating process.

    Science.gov (United States)

    Naidu, Venkata Ramana; Deshpande, Rucha S; Syed, Moinuddin R; Wakte, Pravin S

    2018-07-01

    A direct imaging system (Eyecon™) was used as a Process Analytical Technology (PAT) tool to monitor a fluid bed coating process. Eyecon™ generated real-time on-screen images and particle size and shape information for two identically manufactured laboratory-scale batches. Eyecon™ can measure particle size increases with an accuracy of ±1 μm on particles in the size range 50-3000 μm, and it captured data every 2 s during the entire process. The moving average of the D90 particle size values recorded by Eyecon™ was calculated over every 30 min to estimate the radial coating thickness of the coated particles. After completion of the coating process, the radial coating thickness was found to be 11.3 and 9.11 μm, with standard deviations of ±0.68 and ±1.8 μm for Batch 1 and Batch 2, respectively. The coating thickness was also correlated with percent weight build-up by gel permeation chromatography (GPC) and dissolution. GPC indicated weight build-up of 10.6% and 9.27% for Batch 1 and Batch 2, respectively. In conclusion, a weight build-up of 10% can be correlated with a 10 ± 2 μm increase in pellet coating thickness, indicating the potential applicability of real-time imaging as an endpoint determination tool for fluid bed coating processes.
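The thickness estimate implied above — smoothing the D90 readings, then halving the diameter increase to obtain the radial coating thickness — can be sketched as follows (the D90 readings are hypothetical, not the batch data):

```python
def moving_average(xs, window):
    """Trailing moving average over a fixed window of readings."""
    return [sum(xs[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(xs))]

def radial_coating_thickness(d90_initial_um, d90_final_um):
    """Radial (one-sided) thickness is half of the D90 diameter increase."""
    return (d90_final_um - d90_initial_um) / 2.0

# Hypothetical D90 readings (um) taken over the course of a coating run
d90 = [1000.0, 1004.0, 1008.0, 1013.0, 1018.0, 1022.6]
smoothed = moving_average(d90, window=3)
thickness_um = radial_coating_thickness(d90[0], d90[-1])  # 11.3 um
```

The smoothing step suppresses frame-to-frame noise in the imaging data before the endpoint estimate is taken.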

  10. Measuring the bright side of being blue: a new tool for assessing analytical rumination in depression.

    Directory of Open Access Journals (Sweden)

    Skye P Barbic

    BACKGROUND: Diagnosis and management of depression occur frequently in the primary care setting. Current diagnostic and treatment practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR) is an example of an adaptive response in depression that is characterized by enhanced cognitive function helping an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. METHODS: Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field tested with 579 young adults, 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set, and traditional psychometric analyses to compare items to existing rating scales. RESULTS: Data were high quality (0.81; evidence for divergent validity. Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26; df = 76, p = 0.07), with high reliability (rp = 0.86), an ordered response scale structure, and no item bias (gender, age, time). CONCLUSION: Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ) that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major depression.

  11. The use of selective adsorbents in capillary electrophoresis-mass spectrometry for analyte preconcentration and microreactions: a powerful three-dimensional tool for multiple chemical and biological applications.

    Science.gov (United States)

    Guzman, N A; Stubbs, R J

    2001-10-01

    Much attention has recently been directed to the development and application of on-line sample preconcentration and microreactions in capillary electrophoresis using selective adsorbents based on chemical or biological specificity. The basic principle involves two interacting chemical or biological systems with high selectivity and affinity for each other. These molecular interactions in nature usually involve noncovalent and reversible chemical processes. Properly bound to a solid support, an "affinity ligand" can selectively adsorb a "target analyte" found in a simple or complex mixture over a wide range of concentrations. As a result, the isolated analyte is enriched and highly purified. When this affinity technique, allowing noncovalent chemical interactions and biochemical reactions to occur, is coupled on-line to high-resolution capillary electrophoresis and mass spectrometry, a powerful source of chemical and biological information is created. This paper describes the concept of biological recognition and affinity interaction on-line with high-resolution separation, the fabrication of an "analyte concentrator-microreactor", the optimization of adsorption and desorption conditions, the coupling to mass spectrometry, and various applications of clinical and pharmaceutical interest.

  12. Measurement of very low amounts of arsenic in soils and waters: is ICP-MS the indispensable analytical tool?

    Science.gov (United States)

    López-García, Ignacio; Marín-Hernández, Juan Jose; Perez-Sirvent, Carmen; Hernandez-Cordoba, Manuel

    2017-04-01

    The toxicity of arsenic and its wide distribution in nature need no emphasis nowadays, and the value of reliable analytical tools for arsenic determination at very low levels is clear. Leaving aside atomic fluorescence spectrometers specifically designed for this purpose, the task is currently carried out using inductively coupled plasma mass spectrometry (ICP-MS), a powerful but expensive technique that is not available in all laboratories. However, as the recent literature clearly shows, similar or even better analytical performance for the determination of several elements can be achieved by replacing the ICP-MS instrument with an AAS spectrometer (commonly present in any laboratory, with low acquisition and maintenance costs), provided that a simple microextraction step is used to preconcentrate the sample. This communication reports the optimization and results of a new analytical procedure based on this idea and focused on the determination of very low concentrations of arsenic in waters and in extracts from soils and sediments. The procedure is based on a micro-solid-phase extraction process for the separation and preconcentration of arsenic that uses magnetic particles covered with silver nanoparticles functionalized with the sodium salt of 2-mercaptoethane-sulphonate (MESNa). This composite is easily prepared in the laboratory. After the sample is treated with a small amount (only a few milligrams) of the magnetic material, the solid phase is separated by means of a magnetic field and then introduced into an electrothermal atomizer (ETAAS) for arsenic determination. The preconcentration factor is close to 200, with a detection limit below 0.1 µg L-1 arsenic. Speciation of As(III) and As(V) can be achieved by means of two extractions carried out at different acidities. The results for total arsenic were verified using certified reference materials.

  13. Analytical chemistry experiment

    International Nuclear Information System (INIS)

    Park, Seung Jo; Paeng, Seong Gwan; Jang, Cheol Hyeon

    1992-08-01

    This book covers analytical chemistry experiments in eight chapters. It explains general precautions for experimental work; the handling, storage and classification of reagents; the handling of glassware; general operations during experiments such as heating, cooling, filtering, distillation, extraction, evaporation and drying; glassworking, including how to cut and bend glass tubing; volumetric analysis, covering neutralization and precipitation titrations; gravimetric analysis, covering solubility products and filtering and washing; and microbiological experiments with the necessary tools for sterilization, disinfection and incubation, together with appendixes.

  14. Big Data Analytics in Chemical Engineering.

    Science.gov (United States)

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-06-07

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.

  15. Auxiliary fields as a tool for computing analytical solutions of the Schroedinger equation

    International Nuclear Information System (INIS)

    Silvestre-Brac, Bernard; Semay, Claude; Buisseret, Fabien

    2008-01-01

    We propose a new method to obtain approximate solutions of the Schroedinger equation for an arbitrary potential that possesses bound states. This method, relying on the auxiliary field technique, allows one to find analytical solutions in many cases. It offers a convenient way to study the qualitative features of the energy spectrum of bound states in any potential. In particular, we illustrate our method by solving the case of central potentials of power-law and logarithmic form. For these types of potentials, we propose very accurate analytical energy formulae which greatly improve the corresponding formulae found in the literature.

  16. Auxiliary fields as a tool for computing analytical solutions of the Schroedinger equation

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre-Brac, Bernard [LPSC Universite Joseph Fourier, Grenoble 1, CNRS/IN2P3, Institut Polytechnique de Grenoble, Avenue des Martyrs 53, F-38026 Grenoble-Cedex (France); Semay, Claude; Buisseret, Fabien [Groupe de Physique Nucleaire Theorique, Universite de Mons-Hainaut, Academie universitaire Wallonie-Bruxelles, Place du Parc 20, B-7000 Mons (Belgium)], E-mail: silvestre@lpsc.in2p3.fr, E-mail: claude.semay@umh.ac.be, E-mail: fabien.buisseret@umh.ac.be

    2008-07-11

    We propose a new method to obtain approximate solutions of the Schroedinger equation for an arbitrary potential that possesses bound states. This method, relying on the auxiliary field technique, allows one to find analytical solutions in many cases. It offers a convenient way to study the qualitative features of the energy spectrum of bound states in any potential. In particular, we illustrate our method by solving the case of central potentials of power-law and logarithmic form. For these types of potentials, we propose very accurate analytical energy formulae which greatly improve the corresponding formulae found in the literature.

  17. The challenge of big data in public health: an opportunity for visual analytics.

    Science.gov (United States)

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  18. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    Science.gov (United States)

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low vacuum SEM. It provides multiscale and multimodal analyses as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy) as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper firstly presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to the impact of electron beam-induced contamination or cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and the SEM settings and methodology. The deletion of the adverse effect of cathodoluminescence is solved by using a SEM beam shutter during Raman acquisition. In contrast, this interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In a second part, this study highlights the interest and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Process simulation of heavy water plants - a powerful analytical tool

    International Nuclear Information System (INIS)

    Miller, A.I.

    1978-10-01

    The commercially oriented designs of Canadian Girdler-Sulphide (GS) plants have proved sensitive to process conditions. That, combined with the large scale of the units, has made computer simulation of their behaviour a natural and profitable development. Atomic Energy of Canada Limited has developed a family of steady-state simulations describing all of the Canadian plants. Modelling of plant conditions has demonstrated that the simulation description is very precise, and it has become an integral part of the industry's assessments of both plant operation and decisions on capital expenditures. The simulation technique has also found extensive use in the detailed design of both the rehabilitated Glace Bay plant and the new La Prade plant. It has opened new insights into plant design and uncovered a radical and significant flowsheet change for future designs, as well as many less dramatic but still valuable changes. (author)

  20. The Earth Data Analytic Services (EDAS) Framework

    Science.gov (United States)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.

  1. Analytical study in 1D nuclear waste migration

    International Nuclear Information System (INIS)

    Perez Guerrero, Jesus S.; Heilbron Filho, Paulo L.; Romani, Zrinka V.

    1999-01-01

    The simulation of nuclear waste migration phenomena is governed mainly by a diffusive-convective equation that includes the effects of hydrodynamic dispersion (mechanical dispersion and molecular diffusion), radioactive decay and chemical interaction. For some special problems (depending on the boundary conditions, and when the domain is considered infinite or semi-infinite) an analytical solution may be obtained using classical analytical methods such as the Laplace transform or separation of variables. The hybrid Generalized Integral Transform Technique (GITT) is a powerful tool that can be applied to linear diffusive-convective problems to obtain formal analytical solutions. The aim of this work is to illustrate that the GITT may be used to obtain a formal analytical solution for the study of radioactive waste migration in saturated porous media. A test case considering the 241Am radionuclide is presented. (author)
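For the class of semi-infinite problems mentioned above there is a classical closed form for 1D advection-dispersion with first-order decay and a constant-concentration inlet, against which formal solutions are often benchmarked. A sketch of that textbook solution (not the GITT result itself; all parameter values below are illustrative):

```python
import math

def c_semi_infinite(x, t, v, D, lam, c0=1.0):
    """Concentration c(x, t) for 1D advection-dispersion with first-order
    decay rate lam, seepage velocity v, dispersion coefficient D, and a
    constant inlet concentration c(0, t) = c0 on a semi-infinite domain
    (classical closed-form solution; requires t > 0)."""
    u = v * math.sqrt(1.0 + 4.0 * lam * D / v**2)
    denom = 2.0 * math.sqrt(D * t)
    a = math.exp(x * (v - u) / (2.0 * D)) * math.erfc((x - u * t) / denom)
    b = math.exp(x * (v + u) / (2.0 * D)) * math.erfc((x + u * t) / denom)
    return 0.5 * c0 * (a + b)

# Illustrative parameters: v = 1 m/yr, D = 0.5 m^2/yr, lambda = 0.1 1/yr
profile = [c_semi_infinite(x, t=5.0, v=1.0, D=0.5, lam=0.1) for x in (0.0, 1.0, 2.0)]
```

At the inlet the identity erfc(-z) + erfc(z) = 2 recovers c = c0 exactly, and at long times the solution tends to the steady profile c0·exp(x(v - u)/2D), which makes the closed form easy to sanity-check.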

  2. Analytical Model-Based Design Optimization of a Transverse Flux Machine

    Energy Technology Data Exchange (ETDEWEB)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz; Husain, Iqbal; Muljadi, Eduard

    2017-02-16

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
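The PSO stage of such a design loop can be sketched generically. The bowl-shaped objective below is only a stand-in for the (negated) torque density that the authors evaluate through their MEC model, and all parameter values are illustrative:

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization: inertia w, cognitive pull c1
    toward each particle's best, social pull c2 toward the global best."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp each coordinate to its geometric design constraint
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

# Stand-in objective with known minimum at (1, -2); in the paper's setting
# the two variables would be quantities like magnet length and rotor thickness.
best, best_f = pso_minimize(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
                            bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

Because each candidate is scored through a fast analytical model rather than an FEA solve, the swarm can afford many objective evaluations per iteration, which is the source of the reported time savings.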

  3. Fabrication of SU-8 microstructures for analytical microfluidic applications

    OpenAIRE

    Tuomikoski, Santeri

    2007-01-01

    Miniaturization of analytical devices has been an ongoing trend aimed at improving the performance of analytical tools. Such systems were originally microfabricated from silicon and glass, but polymers have become increasingly popular as alternative materials, mostly because material costs are lower and fabrication processes are easier; however, both points depend heavily on the fabrication method and the particular polymer. In this thesis the usability of the epoxy polymer SU-8 has bee...

  4. Effective 1.0: An Analytic Effective Action Analysis Library

    OpenAIRE

    Hetherington, James P. J.; Stephens, Philip

    2006-01-01

    Effective is a C++ library which provides the user with a toolbox to study the effective action of an arbitrary field theory. From the field content, gauge groups and representations, an appropriate action is generated symbolically. The effective potential, mass spectrum, field couplings and vacuum expectation values are then obtained automatically; tree-level results are obtained analytically, while many tools, both numeric and analytic, provide a variety of approaches to deal with the one-loop cor...

  5. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    Science.gov (United States)

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  6. Improving web site performance using commercially available analytical tools.

    Science.gov (United States)

    Ogle, James A

    2010-10-01

    It is easy to accurately measure web site usage and to quantify key parameters such as page views, site visits, and more complex variables using commercially available tools that analyze web site log files and search engine use. This information can be used strategically to guide the design or redesign of a web site (templates, look-and-feel, and navigation infrastructure) to improve overall usability. The data can also be used tactically to assess the popularity and use of new pages and modules that are added and to rectify problems that surface. This paper describes software tools used to: (1) inventory search terms that lead to available content; (2) propose synonyms for commonly used search terms; (3) evaluate the effectiveness of calls to action; (4) conduct path analyses to targeted content. The American Academy of Orthopaedic Surgeons (AAOS) uses SurfRay's Behavior Tracking software (Santa Clara CA, USA, and Copenhagen, Denmark) to capture and archive the search terms that have been entered into the site's Google Mini search engine. The AAOS also uses Unica's NetInsight program to analyze its web site log files. These tools provide the AAOS with information that quantifies how well its web sites are operating and insights for making improvements to them. Although it is easy to quantify many aspects of an association's web presence, it also takes human involvement to analyze the results and then recommend changes. Without a dedicated resource to do this, the work often is accomplished only sporadically and on an ad hoc basis.
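The kind of log-file mining described above — counting page views and inventorying the search terms that lead to content — reduces to simple aggregation once the log lines are parsed. A minimal sketch; the log format, paths, and search terms below are invented for illustration, not AAOS data:

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Hypothetical simplified access-log lines: "METHOD path?query STATUS"
log_lines = [
    "GET /search?q=knee+replacement 200",
    "GET /search?q=acl+tear 200",
    "GET /search?q=knee+replacement 200",
    "GET /articles/acl-tear 200",
]

# Page views per path (the "page views" metric)
page_views = Counter(urlparse(line.split()[1]).path for line in log_lines)

# Inventory of search terms entered into the site search engine
search_terms = Counter(
    parse_qs(urlparse(line.split()[1]).query)["q"][0]
    for line in log_lines
    if urlparse(line.split()[1]).path == "/search"
)
```

Ranking `search_terms` by frequency is the raw material for the synonym-proposal and path-analysis steps the paper describes; as the abstract notes, interpreting the counts still takes human judgment.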

  7. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    Science.gov (United States)

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome them, many analytical techniques have been developed in recent years: enrichment methods improve the sensitivity of detection, while HPLC and mass spectrometry methods facilitate the separation of glycopeptides/proteins and enhance their detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed; multiple analytical techniques are compared, and the advantages and disadvantages of each are highlighted. © 2017 Elsevier Inc. All rights reserved.

  8. Plasma tissue inhibitor of metalloproteinases-1 as a biological marker? Pre-analytical considerations

    DEFF Research Database (Denmark)

    Lomholt, Anne Fog; Frederiksen, Camilla; Christensen, Ib Jarle

    2007-01-01

    Tissue Inhibitor of Metalloproteinases-1 (TIMP-1) may be a valuable biological marker in colorectal cancer (CRC). However, prospective validation of TIMP-1 as a biological marker should include a series of pre-analytical considerations. TIMP-1 is stored in platelets, which may degranulate during ... collection and storage. The aim of this study was to evaluate the influence of platelet TIMP-1 contamination on plasma TIMP-1 levels in healthy volunteers.

  9. Learning analytics as a tool for closing the assessment loop in higher education

    Directory of Open Access Journals (Sweden)

    Karen D. Mattingly

    2012-09-01

    This paper examines learning and academic analytics and their relevance to distance education in undergraduate and graduate programs as they impact students, teaching faculty, and academic institutions. The focus is the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental processes and program curricula. Learning and academic analytics in higher education are used to predict student success by examining how and what students learn and how success is supported by academic programs and institutions. The paper examines what is being done to support students, whether or not it is effective (and if not, why), and what educators can do. It also examines how these data can be used to create new metrics and inform a continuous cycle of improvement, and it presents examples of working models from a sample of institutions of higher education: the Graduate School of Medicine at the University of Wollongong, the University of Michigan, Purdue University, and the University of Maryland, Baltimore County. Finally, the paper identifies considerations and recommendations for using analytics and offers suggestions for future research.

  10. OPTIMIZING USABILITY OF AN ECONOMIC DECISION SUPPORT TOOL: PROTOTYPE OF THE EQUIPT TOOL.

    Science.gov (United States)

    Cheung, Kei Long; Hiligsmann, Mickaël; Präger, Maximilian; Jones, Teresa; Józwiak-Hagymásy, Judit; Muñoz, Celia; Lester-George, Adam; Pokhrel, Subhash; López-Nicolás, Ángel; Trapero-Bertran, Marta; Evers, Silvia M A A; de Vries, Hein

    2018-01-01

    Economic decision-support tools can provide valuable information for tobacco control stakeholders, but their usability may affect their adoption. This study illustrates a mixed-method usability evaluation of an economic decision-support tool for tobacco control, using the EQUIPT Return on Investment (ROI) tool prototype as a case study. A cross-sectional mixed-methods design was used, including a heuristic evaluation, a thinking-aloud approach, and a questionnaire testing and exploring the usability of the ROI tool. A total of sixty-six users evaluated the tool (thinking aloud) and completed the questionnaire; for the heuristic evaluation, four experts evaluated the interface. Twenty-one percent of the respondents perceived good usability. In total, 118 usability problems were identified, of which twenty-six were categorized as most severe, indicating high priority to fix them before implementation. Combining user-based and expert-based evaluation methods is recommended, as each was shown to identify unique usability problems. The evaluation provides input to optimize the usability of a decision-support tool and may serve as a vantage point for other developers conducting usability evaluations to refine similar tools before wide-scale implementation. Such studies could reduce implementation gaps by optimizing usability, in turn enhancing the research impact of such interventions.

  11. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    Science.gov (United States)

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper continues our previous research on the development of micro-TLC methodology under temperature-controlled conditions. Its main goal is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One advantage of planar chromatography over its column counterpart is that each TLC run can be performed on previously unused stationary phase, so complex samples with a heavy biological matrix loading can be fractionated or separated. In the present studies, the components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarities ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from the surface waters of Middle Pomerania in the northern part of Poland can easily be observed on the resulting micro-TLC chromatograms. This approach can serve as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. We also demonstrate that the micro-TLC-based analytical approach can be applied as an effective method for the search for internal standard (IS) substances. Generally, the described methodology can be applied for fast fractionation or screening of the steroid fraction from biological and environmental samples.

  12. Case Study: IBM Watson Analytics Cloud Platform as Analytics-as-a-Service System for Heart Failure Early Detection

    Directory of Open Access Journals (Sweden)

    Gabriele Guidi

    2016-07-01

    In recent years, progress in technology and the increasing availability of fast connections have produced a migration of functionality in information technology services, from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. It also describes a use case of IBM Watson Analytics, a cloud system for data analytics, applied to the following research scope: detecting the presence or absence of heart failure using nothing more than the electrocardiographic signal, in particular through the analysis of heart rate variability. The results obtained are comparable with those in the literature in terms of accuracy and predictive power. Advantages and drawbacks of cloud versus static approaches are discussed in the last sections.

  13. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    Science.gov (United States)

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the application of multi-criteria decision analysis (MCDA) to choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economical and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing analytical procedures with respect to their greenness. In all three scenarios, the rankings placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction-, solid-phase extraction- and stir bar sorptive extraction-based procedures were placed low. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choices are in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involve similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
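    The outranking computation behind such a ranking is compact enough to sketch. Below is a minimal PROMETHEE II implementation using the "usual" preference function; the procedures, criteria, weights and scores are invented for illustration and are not the ones used in the study.

```python
import numpy as np

def promethee_ii(scores, weights, maximize):
    """Net outranking flows (PROMETHEE II) with the 'usual' preference
    function: alternative a is preferred to b on a criterion (P = 1)
    whenever it is strictly better, otherwise P = 0."""
    X = np.asarray(scores, dtype=float)
    n, m = X.shape
    w = np.asarray(weights, dtype=float)
    pi = np.zeros((n, n))                 # aggregated preference indices
    for j in range(m):
        col = X[:, j] if maximize[j] else -X[:, j]
        pi += w[j] * (col[:, None] > col[None, :]).astype(float)
    phi_plus = pi.sum(axis=1) / (n - 1)   # how strongly a outranks others
    phi_minus = pi.sum(axis=0) / (n - 1)  # how strongly a is outranked
    return phi_plus - phi_minus

# Invented example: three procedures scored on cost (minimize),
# recovery (maximize) and solvent use (minimize)
net_flow = promethee_ii([[100, 95, 50], [80, 90, 10], [120, 99, 80]],
                        weights=[0.3, 0.4, 0.3],
                        maximize=[False, True, False])
ranking = np.argsort(-net_flow)           # best alternative first
```

    With these invented numbers, the second procedure (cheap, low solvent use) ranks first. Richer preference functions (linear, Gaussian) only change how `pi` is accumulated.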

  14. Analytical and numerical tools for vacuum systems

    CERN Document Server

    Kersevan, R

    2007-01-01

    Modern particle accelerators have reached a level of sophistication which requires a thorough analysis of all their sub-systems. Among the latter, the vacuum system is often a major contributor to the operating performance of a particle accelerator. The vacuum engineer nowadays has a large choice of computational schemes and tools for the correct analysis, design, and engineering of the vacuum system. This paper is a review of the different types of algorithms and methodologies which have been developed and employed in the field since the birth of vacuum technology. The different levels of detail, from simple back-of-the-envelope calculations to more complex numerical analysis, are discussed by means of comparisons. The domain of applicability of each method is discussed, together with its pros and cons.

  15. Forecasting Hotspots-A Predictive Analytics Approach.

    Science.gov (United States)

    Maciejewski, R; Hafen, R; Rudolph, S; Larew, S G; Mitchell, M A; Cleveland, W S; Ebert, D S

    2011-04-01

    Current visual analytics systems provide users with the means to explore trends in their data. Linked views and interactive displays provide insight into correlations among people, events, and places in space and time. Analysts search for events of interest through statistical tools linked to visual displays, drill down into the data, and form hypotheses based upon the available information. However, current systems stop short of predicting events. In spatiotemporal data, analysts are searching for regions of space and time with unusually high incidences of events (hotspots). In the cases where hotspots are found, analysts would like to predict how these regions may grow in order to plan resource allocation and preventative measures. Furthermore, analysts would also like to predict where future hotspots may occur. To facilitate such forecasting, we have created a predictive visual analytics toolkit that provides analysts with linked spatiotemporal and statistical analytic views. Our system models spatiotemporal events through the combination of kernel density estimation for event distribution and seasonal trend decomposition by loess smoothing for temporal predictions. We provide analysts with estimates of error in our modeling, along with spatial and temporal alerts to indicate the occurrence of statistically significant hotspots. Spatial data are distributed based on a modeling of previous event locations, thereby maintaining a temporal coherence with past events. Such tools allow analysts to perform real-time hypothesis testing, plan intervention strategies, and allocate resources to correspond to perceived threats.
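    The toolkit itself is not reproduced here, but its two statistical ingredients, kernel density estimation for the spatial event distribution and seasonal decomposition for temporal prediction, can be sketched in miniature. In this illustration the loess smoothing is replaced by a simple moving-average trend, and all event locations, counts and bandwidths are invented:

```python
import numpy as np

def gaussian_kde_2d(points, grid_x, grid_y, bandwidth=1.0):
    """Evaluate a 2-D Gaussian kernel density estimate of event
    locations on a rectangular grid."""
    gx, gy = np.meshgrid(np.asarray(grid_x, dtype=float),
                         np.asarray(grid_y, dtype=float))
    dens = np.zeros_like(gx)
    for x, y in np.asarray(points, dtype=float):
        dens += np.exp(-((gx - x) ** 2 + (gy - y) ** 2) / (2 * bandwidth ** 2))
    return dens / (2 * np.pi * bandwidth ** 2 * len(points))

def seasonal_forecast(counts, period):
    """One-step-ahead forecast of a periodic count series: estimate the
    trend with a moving average, average the detrended values per season,
    then add the next season's component to the last trend value."""
    c = np.asarray(counts, dtype=float)
    trend = np.convolve(c, np.ones(period) / period, mode="same")
    detrended = c - trend
    seasonal = np.array([detrended[i::period].mean() for i in range(period)])
    return trend[-1] + seasonal[len(c) % period]

# Invented events: two clustered incidents and one outlier on a 10x10 grid
density = gaussian_kde_2d([[2.0, 3.0], [2.5, 3.5], [8.0, 8.0]],
                          np.linspace(0, 10, 21), np.linspace(0, 10, 21))
```

    Grid cells where `density` exceeds a chosen control limit would be flagged as hotspots; the forecast function extends the alerting one period into the future.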

  16. Comparative analytics of infusion pump data across multiple hospital systems.

    Science.gov (United States)

    Catlin, Ann Christine; Malloy, William X; Arthur, Karen J; Gaston, Cindy; Young, James; Fernando, Sudheera; Fernando, Ruchith

    2015-02-15

    A Web-based analytics system for conducting in-house evaluations and cross-facility comparisons of alert data generated by smart infusion pumps is described. The Infusion Pump Informatics (IPI) project, a collaborative effort led by research scientists at Purdue University, was launched in 2009 to provide advanced analytics and tools for workflow analyses to assist hospitals in determining the significance of smart-pump alerts and reducing nuisance alerts. The IPI system allows facility-specific analyses of alert patterns and trends, as well as cross-facility comparisons of alert data uploaded by more than 55 participating institutions using different types of smart pumps. Tools accessible through the IPI portal include (1) charts displaying aggregated or breakout data on the top drugs associated with alerts, numbers of alerts per device or care area, and override-to-alert ratios, (2) investigative reports that can be used to characterize and analyze pump-programming errors in a variety of ways (e.g., by drug, by infusion type, by time of day), and (3) "drill-down" workflow analytics enabling users to evaluate alert patterns, both internally and in relation to patterns at other hospitals, in a quick and efficient stepwise fashion. The formation of the IPI analytics system to support a community of hospitals has been successful in providing sophisticated tools for member facilities to review, investigate, and efficiently analyze smart-pump alert data, not only within a member facility but also across other member facilities, to further enhance smart-pump drug library design. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
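    As a toy illustration of the kind of aggregation behind such dashboards, the snippet below computes per-drug override-to-alert ratios from a flat event list. The event schema, action labels and drug names are invented and are not the actual IPI database fields:

```python
from collections import defaultdict

def override_to_alert_ratios(events):
    """Aggregate pump alert events into per-drug override-to-alert ratios.
    Each event is a (drug, action) pair, where action is 'override' or
    'reprogram'; the schema is illustrative only."""
    alerts = defaultdict(int)
    overrides = defaultdict(int)
    for drug, action in events:
        alerts[drug] += 1
        if action == "override":
            overrides[drug] += 1
    return {drug: overrides[drug] / alerts[drug] for drug in alerts}

# Invented events: heparin alerts overridden half the time, insulin always
ratios = override_to_alert_ratios([("heparin", "override"),
                                   ("heparin", "reprogram"),
                                   ("insulin", "override"),
                                   ("insulin", "override")])
```

    A ratio close to 1.0 suggests the alert rarely changes clinician behavior, i.e., a candidate nuisance alert worth reviewing in the drug library.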

  17. Production of Fatty Acid-Derived Valuable Chemicals in Synthetic Microbes

    International Nuclear Information System (INIS)

    Yu, Ai-Qun; Pratomo Juwono, Nina Kurniasih; Leong, Susanna Su Jan; Chang, Matthew Wook

    2014-01-01

    Fatty acid derivatives, such as hydroxy fatty acids, fatty alcohols, fatty acid methyl/ethyl esters, and fatty alka(e)nes, have a wide range of industrial applications including plastics, lubricants, and fuels. Currently, these chemicals are obtained mainly through chemical synthesis, which is complex and costly, and their availability from natural biological sources is extremely limited. Metabolic engineering of microorganisms has provided a platform for effective production of these valuable biochemicals. Notably, synthetic biology-based metabolic engineering strategies have been extensively applied to refactor microorganisms for improved biochemical production. Here, we reviewed: (i) the current status of metabolic engineering of microbes that produce fatty acid-derived valuable chemicals, and (ii) the recent progress of synthetic biology approaches that assist metabolic engineering, such as mRNA secondary structure engineering, sensor-regulator system, regulatable expression system, ultrasensitive input/output control system, and computer science-based design of complex gene circuits. Furthermore, key challenges and strategies were discussed. Finally, we concluded that synthetic biology provides useful metabolic engineering strategies for economically viable production of fatty acid-derived valuable chemicals in engineered microbes.

  18. Production of Fatty Acid-Derived Valuable Chemicals in Synthetic Microbes

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Ai-Qun; Pratomo Juwono, Nina Kurniasih [Department of Biochemistry, Yong Loo Lin School of Medicine, National University of Singapore, Singapore (Singapore); Synthetic Biology Research Program, National University of Singapore, Singapore (Singapore); Leong, Susanna Su Jan [Department of Biochemistry, Yong Loo Lin School of Medicine, National University of Singapore, Singapore (Singapore); Synthetic Biology Research Program, National University of Singapore, Singapore (Singapore); Singapore Institute of Technology, Singapore (Singapore); Chang, Matthew Wook, E-mail: bchcmw@nus.edu.sg [Department of Biochemistry, Yong Loo Lin School of Medicine, National University of Singapore, Singapore (Singapore); Synthetic Biology Research Program, National University of Singapore, Singapore (Singapore)

    2014-12-23

    Fatty acid derivatives, such as hydroxy fatty acids, fatty alcohols, fatty acid methyl/ethyl esters, and fatty alka(e)nes, have a wide range of industrial applications including plastics, lubricants, and fuels. Currently, these chemicals are obtained mainly through chemical synthesis, which is complex and costly, and their availability from natural biological sources is extremely limited. Metabolic engineering of microorganisms has provided a platform for effective production of these valuable biochemicals. Notably, synthetic biology-based metabolic engineering strategies have been extensively applied to refactor microorganisms for improved biochemical production. Here, we reviewed: (i) the current status of metabolic engineering of microbes that produce fatty acid-derived valuable chemicals, and (ii) the recent progress of synthetic biology approaches that assist metabolic engineering, such as mRNA secondary structure engineering, sensor-regulator system, regulatable expression system, ultrasensitive input/output control system, and computer science-based design of complex gene circuits. Furthermore, key challenges and strategies were discussed. Finally, we concluded that synthetic biology provides useful metabolic engineering strategies for economically viable production of fatty acid-derived valuable chemicals in engineered microbes.

  19. Tool to Prioritize Energy Efficiency Investments

    Energy Technology Data Exchange (ETDEWEB)

    Farese, Philip [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gelman, Rachel [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hendron, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2012-08-01

    To provide analytic support to the U.S. Department of Energy's Building Technologies Program (BTP), NREL developed a Microsoft Excel-based tool to provide an open and objective comparison of the hundreds of investment opportunities available to BTP. This tool uses established methodologies to evaluate the energy savings of those investments and the cost of those savings.
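    The tool itself is Excel-based and not reproduced here, but one established metric for comparing efficiency investments is the levelized cost of saved energy: the annualized capital cost divided by the annual energy savings. The sketch below, with invented measures and numbers, shows the idea; it is not the BTP tool's actual methodology:

```python
def cost_of_saved_energy(measures, discount_rate=0.03):
    """Rank efficiency measures by levelized cost of saved energy:
    annualized capital cost divided by annual energy savings. Each
    measure is (name, capital_cost, annual_kwh_saved, lifetime_years)."""
    ranked = []
    for name, capital, savings, lifetime in measures:
        # capital recovery factor annualizes the up-front investment
        growth = (1 + discount_rate) ** lifetime
        crf = discount_rate * growth / (growth - 1)
        ranked.append((name, capital * crf / savings))
    return sorted(ranked, key=lambda item: item[1])

# Invented measures: (name, $ capital, kWh saved per year, lifetime in years)
ranked = cost_of_saved_energy([("LED retrofit", 1000, 2000, 10),
                               ("insulation", 5000, 3000, 30)])
```

    Measures whose cost of saved energy falls below the retail electricity price are, to first order, worth funding first.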

  20. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    Science.gov (United States)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross-usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as the needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. The objectives are to provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; to bring in guest speakers to describe external efforts and teach us about the broader use of data analytics; to perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, and document the specific data analytics expertise needed to perform Earth science data analytics; and to seek graduate data science student internship opportunities in data analytics.

  1. Evaluating Modeling Sessions Using the Analytic Hierarchy Process

    NARCIS (Netherlands)

    Ssebuggwawo, D.; Hoppenbrouwers, S.J.B.A.; Proper, H.A.; Persson, A.; Stirna, J.

    2008-01-01

    In this paper, which is methodological in nature, we propose to use an established method from the field of Operations Research, the Analytic Hierarchy Process (AHP), in the integrated, stakeholder-oriented evaluation of enterprise modeling sessions: their language, process, tool (medium), and
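    AHP's core computation, deriving priority weights from a pairwise-comparison matrix and checking their consistency, is small enough to sketch. The comparison judgments below are invented for illustration and are not taken from the paper:

```python
import numpy as np

def ahp_priorities(pairwise):
    """Derive AHP priority weights as the principal eigenvector of a
    pairwise-comparison matrix, plus Saaty's consistency ratio (CR);
    CR below 0.1 is conventionally considered acceptable."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))              # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                  # normalize to weights
    ci = (eigvals[k].real - n) / (n - 1)          # consistency index
    random_index = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
    cr = ci / random_index if random_index else 0.0
    return w, cr

# Invented judgments: language 3x as important as process, 5x as tool
w, cr = ahp_priorities([[1, 3, 5],
                        [1 / 3, 1, 2],
                        [1 / 5, 1 / 2, 1]])
```

    For this near-consistent matrix the weights come out roughly 0.65/0.23/0.12 with a consistency ratio well under the 0.1 threshold.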

  2. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    VHTR, one of the Generation IV reactor concepts, has a relatively high operation temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process; in other words, tritium is a crucial safety issue in the fission reactor system. Consequently, it is necessary to understand the behavior of tritium, and developing a tool that enables this is vital. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, is developed using a chemical process code called gPROMS. BOTANIC was then verified against analytic solutions and benchmark codes, namely the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-diluted assumption, flexible application, and the adoption of a distributed permeation model. Owing to these features, BOTANIC can analyze systems with a wide range of tritium levels and achieves higher accuracy, as it has the capacity to solve distributed models. BOTANIC was successfully developed and verified: the results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will focus on total system verification.
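    BOTANIC itself is not public, but the kind of analytic solution such codes are verified against can be illustrated. The sketch below evaluates steady-state, diffusion-limited permeation of a hydrogen isotope through a single metal wall (Richardson's law, with an Arrhenius-type permeability); all numerical values are invented placeholders, not VHTR design data:

```python
import math

def arrhenius(pre_exponential, activation_energy, temperature):
    """Arrhenius-type permeability Phi = Phi0 * exp(-E / (R * T))."""
    return pre_exponential * math.exp(-activation_energy / (8.314 * temperature))

def permeation_flux(permeability, thickness, p_upstream, p_downstream):
    """Steady-state, diffusion-limited flux through a metal wall
    (Richardson's law): J = (Phi / d) * (sqrt(p_up) - sqrt(p_down));
    the square roots reflect Sieverts-law dissociative absorption."""
    return permeability / thickness * (
        math.sqrt(p_upstream) - math.sqrt(p_downstream))

# Invented values: Phi0 in mol/(m s Pa^0.5), E in J/mol, a 2 mm wall,
# 0.01 Pa tritium partial pressure upstream, vacuum downstream
flux = permeation_flux(arrhenius(1.0e-7, 6.0e4, 1123.0), 2.0e-3, 1.0e-2, 0.0)
```

    A distributed code like BOTANIC chains many such wall and bulk elements; this single-membrane formula is the closed-form case used for verification.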

  3. The Spectrum of Learning Analytics

    Directory of Open Access Journals (Sweden)

    Gerd Kortemeyer

    2017-06-01

    "Learning Analytics" became a buzzword during the hype surrounding the advent of "big data" and MOOCs; however, the concept has been around for over two decades. When the first online courses became available, it was used as a tool to increase student success in particular courses, frequently combined with the hope of conducting educational research. In recent years, the same term started to be used at the institutional level to increase retention and decrease time-to-degree. These two applications, within particular courses on the one hand and at the institutional level on the other, are at the two extremes of the spectrum of Learning Analytics, and they frequently appear to be worlds apart. The survey describes affordances, theories and approaches in these two categories.

  4. Next-Generation Tools For Next-Generation Surveys

    Science.gov (United States)

    Murray, S. G.

    2017-04-01

    The next generation of large-scale galaxy surveys, across the electromagnetic spectrum, loom on the horizon as explosively game-changing datasets, in terms of our understanding of cosmology and structure formation. We are on the brink of a torrent of data that is set to both confirm and constrain current theories to an unprecedented level, and potentially overturn many of our conceptions. One of the great challenges of this forthcoming deluge is to extract maximal scientific content from the vast array of raw data. This challenge requires not only well-understood and robust physical models, but a commensurate network of software implementations with which to efficiently apply them. The halo model, a semi-analytic treatment of cosmological spatial statistics down to nonlinear scales, provides an excellent mathematical framework for exploring the nature of dark matter. This thesis presents a next-generation toolkit based on the halo model formalism, intended to fulfil the requirements of next-generation surveys. Our toolkit comprises three tools: (i) hmf, a comprehensive and flexible calculator for halo mass functions (HMFs) within extended Press-Schechter theory, (ii) the MRP distribution for extremely efficient analytic characterisation of HMFs, and (iii) halomod, an extension of hmf which provides support for the full range of halo model components. In addition to the development and technical presentation of these tools, we apply each to the task of physical modelling. With hmf, we determine the precision of our knowledge of the HMF, due to uncertainty in our knowledge of the cosmological parameters, over the past decade of cosmic microwave background (CMB) experiments. We place rule-of-thumb uncertainties on the predicted HMF for the Planck cosmology, and find that current limits on the precision are driven by modelling uncertainties rather than those from cosmological parameters. With the MRP, we create and test a method for robustly fitting the HMF to observed data.
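    The Press-Schechter machinery that hmf generalizes can be illustrated in miniature. The sketch below evaluates the original Press-Schechter mass function for a toy power-law sigma(M); the mean density, normalization and slope are invented placeholders, whereas hmf integrates a real transfer function and supports many fitting functions beyond Press-Schechter:

```python
import numpy as np

DELTA_C = 1.686      # spherical-collapse overdensity threshold
RHO_M = 8.5e10       # mean matter density in Msun/Mpc^3 (illustrative value)
GAMMA = 0.25         # slope of the toy power-law sigma(M) below (assumed)

def sigma_m(mass, sigma_8=0.8, m_8=6e14):
    """Toy power-law rms mass fluctuation sigma(M); a real HMF calculator
    integrates the linear power spectrum instead."""
    return sigma_8 * (mass / m_8) ** (-GAMMA)

def ps_mass_function(mass):
    """Press-Schechter dn/dln M for the toy sigma(M) above:
    (rho_m / M) * f(nu) * |dln sigma / dln M|, with
    f(nu) = sqrt(2/pi) * nu * exp(-nu^2 / 2) and nu = delta_c / sigma."""
    nu = DELTA_C / sigma_m(mass)
    f_ps = np.sqrt(2.0 / np.pi) * nu * np.exp(-nu ** 2 / 2.0)
    return RHO_M / mass * f_ps * GAMMA   # |dln sigma / dln M| = GAMMA here

dndlnm = ps_mass_function(1e14)
```

    Even in this toy form the characteristic exponential cutoff at cluster masses is visible: the abundance drops by orders of magnitude between 1e13 and 1e15 solar masses.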

  5. Analytics Platform for ATLAS Computing Services

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and this analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning tools like Spark, Jupyter, R, S...

  6. Practical applications of surface analytic tools in tribology

    Science.gov (United States)

    Ferrante, J.

    1980-01-01

    Many of the widely used tools currently available for surface analysis are described. Those which have the highest applicability for elemental and/or compound analysis in problems of interest to tribology, and which are truly surface sensitive (that is, probing fewer than 10 atomic layers), are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  7. Managing knowledge business intelligence: A cognitive analytic approach

    Science.gov (United States)

    Surbakti, Herison; Ta'a, Azman

    2017-10-01

    The purpose of this paper is to identify and analyze the integration of Knowledge Management (KM) and Business Intelligence (BI) in order to achieve a competitive edge in the context of intellectual capital. The methodology includes a review of the literature and an analysis of interview data from managers in the corporate sector, together with models established by different authors. BI technologies are strongly associated with KM processes for attaining competitive advantage. KM is strongly influenced by human and social factors, and an efficient system run under BI tactics and technologies can turn these into the most valuable assets. Predictive analytics, moreover, is grounded in the field of BI. Extracting tacit knowledge for use as a new source for BI analysis remains a major challenge. Advanced analytic methods that address the diversity of the data corpus, structured and unstructured, require a cognitive approach to provide estimative results and to yield actionable descriptive, predictive and prescriptive results. This is a major challenge nowadays, and this paper aims to elaborate on it in detail in this initial work.

  8. Surveillance of emerging drugs of abuse in Hong Kong: validation of an analytical tool.

    Science.gov (United States)

    Tang, Magdalene H Y; Ching, C K; Tse, M L; Ng, Carol; Lee, Caroline; Chong, Y K; Wong, Watson; Mak, Tony W L

    2015-04-01

    To validate a locally developed chromatography-based method to monitor emerging drugs of abuse whilst performing regular drug testing in abusers. Cross-sectional study. Eleven regional hospitals, seven social service units, and a tertiary level clinical toxicology laboratory in Hong Kong. A total of 972 drug abusers and high-risk individuals were recruited from acute, rehabilitation, and high-risk settings between 1 November 2011 and 31 July 2013. A subset of the participants was of South Asian ethnicity. In total, 2000 urine or hair specimens were collected. Proof of concept that surveillance of emerging drugs of abuse can be performed whilst conducting routine drug of abuse testing in patients. The method was successfully applied to 2000 samples with three emerging drugs of abuse detected in five samples: PMMA (paramethoxymethamphetamine), TFMPP [1-(3-trifluoromethylphenyl)piperazine], and methcathinone. The method also detected conventional drugs of abuse, with codeine, methadone, heroin, methamphetamine, and ketamine being the most frequently detected drugs. Other findings included the observation that South Asians had significantly higher rates of using opiates such as heroin, methadone, and codeine; and that ketamine and cocaine had significantly higher detection rates in acute subjects compared with the rehabilitation population. This locally developed analytical method is a valid tool for simultaneous surveillance of emerging drugs of abuse and routine drug monitoring of patients at minimal additional cost and effort. Continued, proactive surveillance and early identification of emerging drugs will facilitate prompt clinical, social, and legislative management.

  9. New Therapies Offer Valuable Options for Patients with Melanoma

    Science.gov (United States)

    Two phase III clinical trials of new therapies for patients with metastatic melanoma presented in June at the 2011 ASCO conference confirmed that vemurafenib and ipilimumab (Yervoy™) offer valuable new options for the disease.

  10. Temperature based validation of the analytical model for the estimation of the amount of heat generated during friction stir welding

    Directory of Open Access Journals (Sweden)

    Milčić Dragan S.

    2012-01-01

    Friction stir welding is a solid-state welding technique that utilizes the thermomechanical influence of a rotating welding tool on the parent material, resulting in a monolithic joint (weld). At the contact between the welding tool and the parent material, significant stirring and deformation of the parent material occur, and during this process mechanical energy is partially transformed into heat. The generated heat affects the temperature of the welding tool and the parent material, so the proposed analytical model for estimating the amount of generated heat can be verified via temperature: the analytically determined heat is used for a numerical estimation of the temperature of the parent material, and this temperature is compared to the experimentally determined temperature. The numerical solution is obtained using the finite difference method, an explicit scheme with an adaptive grid, considering the influence of temperature on the material's conductivity, the contact conditions between the welding tool and the parent material, the material flow around the welding tool, etc. The analytical model shows that 60-100% of the mechanical power delivered to the welding tool is transformed into heat, while the comparison of results shows a maximal relative difference between the analytical and experimental temperatures of about 10%.
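    The paper's numerical scheme uses an adaptive grid and temperature-dependent properties; the constant-property, one-dimensional special case of such an explicit finite-difference march is small enough to sketch. All material values below are invented for illustration and are not the paper's weld parameters:

```python
import numpy as np

def heat_explicit_1d(t_init, alpha, dx, dt, steps):
    """March the 1-D heat equation dT/dt = alpha * d2T/dx2 forward in time
    with the explicit FTCS finite-difference scheme. The two end nodes are
    held fixed (Dirichlet boundaries)."""
    r = alpha * dt / dx ** 2          # grid Fourier number
    assert r <= 0.5, "explicit scheme unstable: reduce dt or refine dx"
    T = np.asarray(t_init, dtype=float).copy()
    for _ in range(steps):
        # interior update from the previous time level (vectorized)
        T[1:-1] = T[1:-1] + r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    return T

# Invented case: a bar initially at 20 C whose ends are held at 100 C,
# alpha = 1e-5 m^2/s on a 1 cm grid; r = 0.4 satisfies the stability limit
T = heat_explicit_1d([100] + [20] * 9 + [100],
                     alpha=1e-5, dx=0.01, dt=4.0, steps=200)
```

    The stability condition r <= 0.5 is what forces explicit schemes onto small time steps; the adaptive grid in the paper concentrates nodes where gradients near the tool are steepest.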

  11. Recent Development in Big Data Analytics for Business Operations and Risk Management.

    Science.gov (United States)

    Choi, Tsan-Ming; Chan, Hing Kai; Yue, Xiaohang

    2017-01-01

    "Big data" is an emerging topic and has attracted the attention of many researchers and practitioners in industrial systems engineering and cybernetics. Big data analytics would definitely lead to valuable knowledge for many organizations. Business operations and risk management can be a beneficiary as there are many data collection channels in the related industrial systems (e.g., wireless sensor networks, Internet-based systems, etc.). Big data research, however, is still in its infancy. Its focus is rather unclear and related studies are not well amalgamated. This paper aims to present the challenges and opportunities of big data analytics in this unique application domain. Technological development and advances for industrial-based business systems, reliability and security of industrial systems, and their operational risk management are examined. Important areas for future research are also discussed and revealed.

  12. Direct analysis in real time mass spectrometry, a process analytical technology tool for real-time process monitoring in botanical drug manufacturing.

    Science.gov (United States)

    Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin

    2014-03-01

    A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The resulting multivariate data were analyzed with a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R(2) and Q(2). The accuracy and diagnostic capability of the batch model were then validated with the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess model performance. The present study demonstrates that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a particular account of effective compositions and can potentially be used to improve batch quality and process consistency for samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
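    The multi-way PLS model requires the original batch data, but the flavor of batch-level monitoring can be sketched with a PCA control model scored by Hotelling's T-squared statistic, a common simplification of the same idea. The fingerprints below are invented numbers, not DART-MS spectra:

```python
import numpy as np

def fit_monitor(batches, n_components=2):
    """Fit a PCA control model on fingerprints of known-good batches
    (one row per batch) and return a scorer that gives Hotelling's T^2
    for a new batch; a large T^2 flags an abnormal batch."""
    X = np.asarray(batches, dtype=float)
    mu = X.mean(axis=0)
    sd = X.std(axis=0, ddof=1)
    sd[sd == 0] = 1.0                           # guard constant variables
    Z = (X - mu) / sd                           # autoscale
    _, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T                     # retained loadings
    var = S[:n_components] ** 2 / (len(X) - 1)  # score variances
    def t2(x):
        t = ((np.asarray(x, dtype=float) - mu) / sd) @ P
        return float(np.sum(t ** 2 / var))
    return t2

# Invented fingerprints of five normal batches (columns = three signals)
normal = np.array([[1.0, 2.0, 0.50],
                   [1.2, 1.8, 0.40],
                   [0.8, 2.2, 0.60],
                   [1.1, 2.1, 0.50],
                   [0.9, 1.9, 0.45]])
score = fit_monitor(normal)
```

    In practice a control limit for T-squared is derived from an F-distribution on the training scores; batches exceeding it are investigated via their variable contributions, as in the study.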

  13. Sign Language Legislation as a Tool for Sustainability

    Science.gov (United States)

    Pabsch, Annika

    2017-01-01

    This article explores three models of sustainability (environmental, economic, and social) and identifies characteristics of a sustainable community necessary to sustain the Deaf community as a whole. It is argued that sign language legislation is a valuable tool for achieving sustainability for the generations to come.

  14. WetDATA Hub: Democratizing Access to Water Data to Accelerate Innovation through Data Visualization, Predictive Analytics and Artificial Intelligence Applications

    Science.gov (United States)

    Sarni, W.

    2017-12-01

    Water scarcity and poor quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. The public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3) global hub for water data analytics and technology innovation; its vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. Roadmap: (1) a portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks; (2) initial activities providing education, awareness, and tools to stakeholders to support the implementation of the Colorado State Water Plan; (3) leveraging the Western States Water Council Water Data Exchange database; (4) development of visualization, predictive analytics, and AI tools to engage with stakeholders and provide actionable data and information. Tools: Education, providing information on water issues and risks at the local, state, national, and global scale; Visualizations, developing data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan; Predictive analytics, accessing publicly available water databases and using machine learning to develop water availability forecasting tools, plus time-lapse images to support city and urban planning.

  15. A case report on inVALUABLE: insect value chain in a circular bioeconomy

    DEFF Research Database (Denmark)

    Heckmann, L.-H.; Andersen, J.L.; Eilenberg, J.

    2018-01-01

    The vision of inVALUABLE is to create a sustainable resource-efficient industry for animal production based on insects. inVALUABLE focuses on the R&D demand for scaling up production of insects in Denmark and assessing the application potential of particularly mealworms. The inVALUABLE consortium partners span the entire value chain and include entrepreneurs, experts in biology, biotechnology, automation, processing and food tech and safety. This paper provides an overview of the goal, activities and some preliminary results obtained during the first year of the project.

  16. Which energy mix for the UK (United Kingdom)? An evolutive descriptive mapping with the integrated GAIA (graphical analysis for interactive aid)–AHP (analytic hierarchy process) visualization tool

    International Nuclear Information System (INIS)

    Ishizaka, Alessio; Siraj, Sajid; Nemery, Philippe

    2016-01-01

    Although Multi-Criteria Decision Making methods have been extensively used in energy planning, their descriptive use has rarely been considered. In this paper, we add an evolutionary description phase as an extension to the AHP (analytic hierarchy process) method that helps policy makers gain insights into their decision problems. The proposed extension has been implemented in an open-source software tool that allows users to visualize differences of opinion within a decision process, as well as the evolution of preferences over time. The method was tested in a two-phase experiment to understand the evolution of opinions on energy sources. Participants were asked to provide their preferences for different energy sources for the next twenty years for the United Kingdom. They were first asked to compare the options intuitively, without using any structured approach, and were then given three months to compare the same set of options after collecting detailed information on the technical, economic, environmental and social impacts created by each of the selected energy sources. The proposed visualization method allows us to quickly discover the preference directions, as well as the changes in preferences from the first to the second phase. The proposed tool can help policy makers better understand energy planning problems, leading towards better planning and decisions in the energy sector. - Highlights: • We introduce a descriptive visual analysis tool for the analytic hierarchy process. • The method has been implemented as an open-source preference elicitation tool. • We analyse user preferences in the energy sector using this method. • The tool also provides a way to visualize temporal preference changes. • The main negative temporal shift in the ranking was found for nuclear energy.
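    The prioritisation step at the core of AHP can be illustrated with a short sketch: priorities are the normalised principal eigenvector of a Saaty-style pairwise comparison matrix, and the principal eigenvalue yields a consistency index. The matrix values below are illustrative, not taken from the study.

    ```python
    import numpy as np

    # Illustrative pairwise comparison matrix for three hypothetical
    # energy options (entries are Saaty-scale judgements, not study data).
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    def ahp_priorities(M):
        """Priorities = principal right eigenvector of M, normalised to sum to 1.
        Also returns the principal eigenvalue for the consistency check."""
        vals, vecs = np.linalg.eig(M)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        return w / w.sum(), vals[k].real

    w, lam = ahp_priorities(A)
    n = A.shape[0]
    ci = (lam - n) / (n - 1)  # consistency index; near zero means consistent judgements
    ```

    With these judgements the first option dominates (w ≈ [0.65, 0.23, 0.12]) and the consistency index is near zero, i.e. the pairwise judgements are nearly transitive.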

  17. Design and implementation of a web-based PET-CT reporting assessment and e-portfolio tool

    International Nuclear Information System (INIS)

    Subesinghe, M.; Goldstone, A.R.; Patel, C.N.; Chowdhury, F.U.; Scarsbrook, A.F.

    2015-01-01

    Highlights: • We describe a simple internet-based reporting tool to enhance PET-CT training. • Automatically created competency based metrics are valuable in monitoring progress. • This tool provides robust evidence of competency in PET-CT reporting

  18. Multidimensional integral representations problems of analytic continuation

    CERN Document Server

    Kytmanov, Alexander M

    2015-01-01

    The monograph is devoted to integral representations for holomorphic functions in several complex variables, such as Bochner-Martinelli, Cauchy-Fantappiè, Koppelman, multidimensional logarithmic residue etc., and their boundary properties. The applications considered are problems of analytic continuation of functions from the boundary of a bounded domain in C^n. In contrast to the well-known Hartogs-Bochner theorem, this book investigates functions with the one-dimensional property of holomorphic extension along complex lines, and addresses the problem of obtaining multidimensional boundary analogues of the Morera theorem. This book is a valuable resource for specialists in complex analysis and theoretical physics, as well as graduate and postgraduate students with an understanding of standard university courses in complex, real and functional analysis, as well as algebra and geometry.

  19. Nuclear analytical techniques in Cuban sugar industry

    International Nuclear Information System (INIS)

    Diaz R, O.; Griffith M, J.

    1997-01-01

    This paper is a review concerning the application of Nuclear Analytical Techniques in the Cuban sugar industry. The most complete elemental composition of final molasses (34 elements) and natural zeolites (38 elements), the latter employed as an auxiliary agent in sugar technological processes, has been determined by means of instrumental Neutron Activation Analysis (INAA) and X-Ray Fluorescence Analysis (XRFA). The trace elemental sugar cane soil-plant relationship and the elemental composition of different types of Cuban sugar (raw, blanco-directo and refined) were also studied. As a result, valuable information is given on the possibilities of using these products in animal and human foodstuffs, as well as in other applications. (author). 34 refs., 6 figs., 1 tab

  20. Big Data as a Revolutionary Tool in Finance

    Directory of Open Access Journals (Sweden)

    Aureliano Angel Bressan

    2015-08-01

    A data-driven culture has been arising as a research field and analytic tool in Finance and Management since the advent of structured, semi-structured and unstructured socio-economic and demographic information from social media, mobile devices, blogs and consumer product reviews. Big Data, the expression that encompasses this revolution, requires new tools for financial professionals and academic researchers, because the sheer size of the data demands more powerful means of manipulation. In this sense, Machine Learning techniques allow more effective ways to model the complex relationships that arise from the interaction of different types of data, regarding issues such as Operational and Reputational Risk, Portfolio Management, Business Intelligence and Predictive Analytics. The following books can be a good start for those interested in this new field.

  1. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    Energy Technology Data Exchange (ETDEWEB)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z. [and others]

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, equipment, and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  2. Process Analytical Technology for High Shear Wet Granulation: Wet Mass Consistency Reported by In-Line Drag Flow Force Sensor Is Consistent With Powder Rheology Measured by At-Line FT4 Powder Rheometer.

    Science.gov (United States)

    Narang, Ajit S; Sheverev, Valery; Freeman, Tim; Both, Douglas; Stepaniuk, Vadim; Delancy, Michael; Millington-Smith, Doug; Macias, Kevin; Subramanian, Ganeshkumar

    2016-01-01

    A drag flow force (DFF) sensor, which measures the force exerted by the wet mass in a granulator on a thin cylindrical probe, was shown to be a promising process analytical technology for real-time, in-line, high-resolution monitoring of wet mass consistency during high shear wet granulation. Our previous studies indicated that this process analytical technology tool could be correlated to a granulation end point established independently through drug product critical quality attributes. In this study, the measurements of flow force by a DFF sensor, taken during wet granulation of 3 placebo formulations with different binder content, are compared with concurrent at-line FT4 Powder Rheometer characterization of wet granules collected at different time points of the processing. The wet mass consistency measured by the DFF sensor correlated well with the granulation's resistance to flow and interparticulate interactions as measured by the FT4 Powder Rheometer. This indicates that the force pulse magnitude measured by the DFF sensor reflects fundamental material properties (e.g., shear viscosity and granule size/density) as they change during the granulation process. These studies indicate that the DFF sensor can be a valuable tool for wet granulation formulation and process development and scale-up, as well as for routine monitoring and control during manufacturing. Copyright © 2016. Published by Elsevier Inc.

  3. An analytical simulation technique for cone-beam CT and pinhole SPECT

    International Nuclear Information System (INIS)

    Zhang Xuezhu; Qi Yujin

    2011-01-01

    This study was aimed at developing an efficient simulation technique that runs on an ordinary PC. The work involved derivation of mathematical operators, generation of analytic phantoms, and development of effective analytical projectors for cone-beam CT and pinhole SPECT imaging. Computer simulations based on the analytical projectors were developed using a ray-tracing method for cone-beam CT and a voxel-driven method, including degrading blurring, for pinhole SPECT. The 3D Shepp-Logan, Jaszczak and Defrise phantoms were used for simulation evaluations and image reconstructions. The reconstructed phantom images agreed well with the phantoms. The results showed that the analytical simulation technique is an efficient tool for studying cone-beam CT and pinhole SPECT imaging. (authors)
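    The core of a ray-tracing projector of this kind is the line integral of the phantom volume along each source-detector ray. A minimal sketch (uniform sampling with nearest-neighbour lookup, a simplified stand-in for exact Siddon-style voxel traversal; all names and sizes are illustrative):

    ```python
    import numpy as np

    def line_integral(vol, p0, p1, n_samples=200):
        """Approximate the integral of a voxel volume along the segment
        p0 -> p1 by uniform sampling with nearest-neighbour lookup."""
        p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
        ts = np.linspace(0.0, 1.0, n_samples)
        pts = p0[None, :] + ts[:, None] * (p1 - p0)[None, :]
        # Round sample positions to the nearest voxel index, clipped to the grid.
        idx = np.clip(np.rint(pts).astype(int), 0, np.array(vol.shape) - 1)
        samples = vol[idx[:, 0], idx[:, 1], idx[:, 2]]
        seg = np.linalg.norm(p1 - p0) / (n_samples - 1)
        return samples.sum() * seg

    # Sanity check on a uniform phantom: a ray of length 7 through unit voxels
    # should integrate to roughly 7.
    phantom = np.ones((8, 8, 8))
    value = line_integral(phantom, (0, 3, 3), (7, 3, 3))
    ```

    A full cone-beam projector would evaluate this integral for every detector pixel, with the ray geometry determined by the source position and detector grid.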

  4. Learning, Learning Analytics, Activity Visualisation and Open learner Model

    DEFF Research Database (Denmark)

    Bull, Susan; Kickmeier-Rust, Michael; Vatrapu, Ravi

    2013-01-01

    This paper draws on visualisation approaches in learning analytics, considering how classroom visualisations can come together in practice. We suggest an open learner model in situations where many tools and activity visualisations produce more visual information than can be readily interpreted....

  5. Development of CAD implementing the algorithm of boundary elements’ numerical analytical method

    Directory of Open Access Journals (Sweden)

    Yulia V. Korniyenko

    2015-03-01

    Until recently, algorithms for the numerical-analytical boundary elements method were implemented as programs written in the MATLAB language. Each program had a local character, i.e. it was used to solve one particular problem: calculation of a beam, frame, arch, etc. Constructing matrices in these programs was carried out "manually" and was therefore time-consuming. This research was aimed at a reasoned choice of programming language for the development of a new CAD system that implements the algorithm of the numerical-analytical boundary elements method and provides visualization tools for the initial objects and calculation results. The research conducted shows that, among the wide variety of programming languages, the most efficient one for developing such a CAD system is Java. This language provides tools not only for the development of the calculating part of the CAD system, but also for building the graphic interface for constructing geometrical models and interpreting calculation results.

  6. An Analytical Study of Tools and Techniques for Movie Marketing

    Directory of Open Access Journals (Sweden)

    Garima Maik

    2014-08-01

    The Bollywood (Hindi) movie industry is one of the fastest-growing sectors in the media and entertainment space, creating numerous business and employment opportunities. Movies in India are a major source of entertainment for all sections of society. They face competition not only from other movie industries and movies but also from other sources of entertainment such as adventure sports, amusement parks, theatre and drama, pubs and discothèques. A great deal of manpower, man-hours, creative talent, and money is put into building a quality feature film, and Bollywood continuously works to offer its vast audience something new. It is therefore important for the movie and production team to stand out and capture the attention of the widest possible audience. Movie makers today employ a range of tools and techniques to market their movies, leaving no stone unturned: teasers, first looks, theatrical trailer releases, music launches, city tours, producer and director interviews, movie premieres, the release itself, and post-release follow-up, all to pull viewers to the cineplex. Today's audience, which consists mainly of young people, wants photos, videos, meet-ups, gossip, debate, collaboration and content creation, needs that are best met through digital platforms. However, traditional media such as newspapers, radio, and television are not obsolete: they reach a mass audience and play a major role in effective marketing. This study analyses these tools for their effectiveness through a consumer survey, bringing out the effectiveness and relative importance of the various tools employed by movie marketers to generate maximum returns on investment, using data reduction techniques such as factor analysis and statistical techniques such as the chi-square test, with data visualization using pie charts.

  7. Predictive Data Tools Find Uses in Schools

    Science.gov (United States)

    Sparks, Sarah D.

    2011-01-01

    The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention. Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help…

  8. Moving standard deviation and moving sum of outliers as quality tools for monitoring analytical precision.

    Science.gov (United States)

    Liu, Jiakai; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-01

    An increase in analytical imprecision (expressed as CVa) can introduce additional variability (i.e. noise) into patient results, which poses a challenge to the optimal management of patients. Relatively little work has been done to address the need for continuous monitoring of analytical imprecision. Through numerical simulations, we describe the use of the moving standard deviation (movSD) and the recently described moving sum of outlier (movSO) patient results as means for detecting increased analytical imprecision, and compare their performance against internal quality control (QC) and average of normals (AoN) approaches. The power to detect an increase in CVa is suboptimal under routine internal QC procedures. The AoN technique almost always had the highest average number of patient results affected before error detection (ANPed), indicating that it generally had the worst capability for detecting an increased CVa. On the other hand, the movSD and movSO approaches were able to detect an increased CVa at significantly lower ANPed, particularly for measurands that displayed a relatively small ratio of biological variation to CVa. CONCLUSION: The movSD and movSO approaches are effective in detecting an increase in CVa for high-risk measurands with small biological variation. Their performance is relatively poor when the biological variation is large. However, the clinical risk of an increase in analytical imprecision is attenuated for these measurands, as the increased analytical imprecision adds only marginally to the total variation and is less likely to impact clinical care. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
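    The movSD statistic itself is simple: a standard deviation computed over a sliding window of recent patient results, which rises when analytical imprecision increases. A minimal sketch (the window size and the simulated precision shift are illustrative, not the settings used in the paper):

    ```python
    import random
    from collections import deque
    from statistics import stdev

    def moving_sd(results, window=50):
        """Yield the standard deviation of the most recent `window` patient
        results; yields None until the window has filled."""
        buf = deque(maxlen=window)
        for x in results:
            buf.append(x)
            yield stdev(buf) if len(buf) == window else None

    # Simulated analyte stream: stable precision, then a step increase in CVa.
    random.seed(1)
    stable = [random.gauss(100, 2) for _ in range(200)]
    degraded = [random.gauss(100, 6) for _ in range(200)]
    sds = [s for s in moving_sd(stable + degraded, window=50) if s is not None]
    ```

    In practice a control rule (e.g. movSD exceeding a limit derived from the stable baseline) would trigger the alert; truncation limits and patient-mix filtering, discussed in this literature, are omitted here.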

  9. Human Functions, Machine Tools, and the Role of the Analyst

    Directory of Open Access Journals (Sweden)

    Gordon R. Middleton

    2015-09-01

    In an era of rapidly increasing technical capability, the intelligence focus is often on the modes of collection and tools of analysis rather than on the analysts themselves. Data are proliferating, and so are tools to help analysts deal with the flood of data and the increasingly demanding timeline for intelligence production; but the role of the analyst in such a data-driven environment needs to be understood in order to support key management decisions (e.g., training and investment priorities). This paper describes a model of the analytic process and analyzes the roles played by humans and machine tools in each process element. It concludes that human analytic functions are as critical in the intelligence process as they have ever been, and perhaps even more so due to the advance of technology in the intelligence business. Human functions performed by analysts are critical in nearly every step of the process: particularly at the front end, in defining and refining the problem statement; at the end, in generating knowledge, presenting the story in understandable terms, and tailoring the presentation of the results of the analysis to various audiences; and in determining when to initiate iterative loops in the process. The paper concludes with observations on the necessity of enabling expert analysts, of tools to deal with big data, and of developing analysts in advanced analytic methods as well as in techniques for the optimal use of advanced tools, and offers suggestions for further quantitative research.

  10. ‘Positioning’ in the conversation analytic approach

    DEFF Research Database (Denmark)

    Day, Dennis; Kjærbeck, Susanne

    2013-01-01

    While the concept of ‘positioning’ is used at all levels of analysis in the former, there appears to be no such analytical concept in EM/CA. The aim of this article is to inquire whether EM/CA tools for the analysis of identities and relations in talk might be considered interesting from the perspective of positioning theory. To do so...

  11. Analytical solution for Van der Pol-Duffing oscillators

    International Nuclear Information System (INIS)

    Kimiaeifar, A.; Saidi, A.R.; Bagheri, G.H.; Rahimpour, M.; Domairry, D.G.

    2009-01-01

    In this paper, the single-well, double-well and double-hump Van der Pol-Duffing oscillators are studied. The governing equation is solved analytically, for the first time, using a new kind of analytic technique for nonlinear problems, namely the 'Homotopy Analysis Method' (HAM). The present solution yields an expression that can be used over a wide time range for the whole domain of the response. Comparisons of the obtained solutions with numerical results show that this method is effective and convenient for solving this problem, and a capable tool for this kind of nonlinear problem.
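    The numerical benchmark that analytic solutions like HAM are compared against can be produced with a classic fourth-order Runge-Kutta integration of the oscillator. A sketch assuming the common form x'' = μ(1 − x²)x' − αx − βx³ (one of several sign conventions; the parameter values below are illustrative):

    ```python
    def vdp_duffing_rk4(mu, alpha, beta, x0, v0, dt=0.01, steps=5000):
        """Integrate x'' = mu*(1 - x**2)*x' - alpha*x - beta*x**3
        with the classic RK4 scheme; returns the displacement trajectory."""
        def f(x, v):
            return v, mu * (1 - x * x) * v - alpha * x - beta * x ** 3

        x, v = x0, v0
        traj = [x]
        for _ in range(steps):
            k1x, k1v = f(x, v)
            k2x, k2v = f(x + 0.5 * dt * k1x, v + 0.5 * dt * k1v)
            k3x, k3v = f(x + 0.5 * dt * k2x, v + 0.5 * dt * k2v)
            k4x, k4v = f(x + dt * k3x, v + dt * k3v)
            x += dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6
            v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6
            traj.append(x)
        return traj

    # Small initial displacement grows onto a finite-amplitude limit cycle.
    traj = vdp_duffing_rk4(mu=1.0, alpha=1.0, beta=0.5, x0=0.1, v0=0.0)
    ```

    An analytic approximation such as a HAM series can then be checked term by term against such a trajectory over the time range of interest.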

  12. Analytical strategic environmental assessment (ANSEA) developing a new approach to SEA

    International Nuclear Information System (INIS)

    Dalkmann, Holger; Herrera, Rodrigo Jiliberto; Bongardt, Daniel

    2004-01-01

    The objective of analytical strategic environmental assessment (ANSEA) is to provide a decision-centred approach to the SEA process. The ANSEA project evolved from the realisation that, in many cases, SEA, as currently practised, is not able to ensure an appropriate integration of environmental values. The focus of SEA is on predicting impacts, but the tool takes no account of the decision-making processes it is trying to influence. At strategic decision-making levels, in turn, it is often difficult to predict impacts with the necessary exactitude. The decision-making sciences could teach some valuable lessons here. Instead of focusing on the quantitative prediction of environmental consequences, the ANSEA approach concentrates on the integration of environmental objectives into decision-making processes. Thus, the ANSEA approach provides a framework for analysing and assessing the decision-making processes of policies, plans and programmes (PPP). To enhance environmental integration into the decision-making process, decision windows (DW) can be identified. The approach is designed to be objective and transparent to ensure that environmental considerations are taken into account, or--from an ex-post perspective--to allow an evaluation of how far environmental considerations have been integrated into the decision-making process under assessment. The paper describes the concepts and the framework of the ANSEA approach and discusses its relation to SEA and the EC Directive

  13. Multiplier ideal sheaves and analytic methods in algebraic geometry

    International Nuclear Information System (INIS)

    Demailly, J.-P.

    2001-01-01

    Our main purpose here is to describe a few analytic tools which are useful to study questions such as linear series and vanishing theorems for algebraic vector bundles. One of the early successes of analytic methods in this context is Kodaira's use of the Bochner technique in relation with the theory of harmonic forms, during the decade 1950-60. The idea is to represent cohomology classes by harmonic forms and to prove vanishing theorems by means of suitable a priori curvature estimates. We pursue the study of L2 estimates, in relation with the Nullstellensatz and with the extension problem. We show how subadditivity can be used to derive an approximation theorem for (almost) plurisubharmonic functions: any such function can be approximated by a sequence of (almost) plurisubharmonic functions which are smooth outside an analytic set, and which define the same multiplier ideal sheaves. From this, we derive a generalized version of the hard Lefschetz theorem for cohomology with values in a pseudo-effective line bundle; namely, the Lefschetz map is surjective when the cohomology groups are twisted by the relevant multiplier ideal sheaves. These notes are essentially written with the idea of serving as an analytic toolbox for algebraic geometers. Although efficient algebraic techniques exist, our feeling is that the analytic techniques are very flexible and offer a large variety of guidelines for more algebraic questions (including applications to number theory which are not discussed here). We made a special effort to assume as few prerequisites as possible and to be as self-contained as possible; hence the rather long preliminary sections dealing with basic facts of complex differential geometry.

  14. Multiplier ideal sheaves and analytic methods in algebraic geometry

    Energy Technology Data Exchange (ETDEWEB)

    Demailly, J -P [Universite de Grenoble I, Institut Fourier, Saint-Martin d' Heres (France)

    2001-12-15

    Our main purpose here is to describe a few analytic tools which are useful to study questions such as linear series and vanishing theorems for algebraic vector bundles. One of the early successes of analytic methods in this context is Kodaira's use of the Bochner technique in relation with the theory of harmonic forms, during the decade 1950-60. The idea is to represent cohomology classes by harmonic forms and to prove vanishing theorems by means of suitable a priori curvature estimates. We pursue the study of L2 estimates, in relation with the Nullstellensatz and with the extension problem. We show how subadditivity can be used to derive an approximation theorem for (almost) plurisubharmonic functions: any such function can be approximated by a sequence of (almost) plurisubharmonic functions which are smooth outside an analytic set, and which define the same multiplier ideal sheaves. From this, we derive a generalized version of the hard Lefschetz theorem for cohomology with values in a pseudo-effective line bundle; namely, the Lefschetz map is surjective when the cohomology groups are twisted by the relevant multiplier ideal sheaves. These notes are essentially written with the idea of serving as an analytic toolbox for algebraic geometers. Although efficient algebraic techniques exist, our feeling is that the analytic techniques are very flexible and offer a large variety of guidelines for more algebraic questions (including applications to number theory which are not discussed here). We made a special effort to assume as few prerequisites as possible and to be as self-contained as possible; hence the rather long preliminary sections dealing with basic facts of complex differential geometry.

  15. Strategic engineering for cloud computing and big data analytics

    CERN Document Server

    Ramachandran, Muthu; Sarwar, Dilshad

    2017-01-01

    This book demonstrates the use of a wide range of strategic engineering concepts, theories and applied case studies to improve the safety, security and sustainability of complex and large-scale engineering and computer systems. It first details the concepts of system design, life cycle, impact assessment and security to show how these ideas can be brought to bear on the modeling, analysis and design of information systems, with a focused view on cloud-computing systems and big data analytics. This informative book is a valuable resource for graduate students, researchers and industry-based practitioners working in engineering, information and business systems, as well as strategy.

  16. Endothelial cell cultures as a tool in biomaterial research

    NARCIS (Netherlands)

    Kirkpatrick, CJ; Otto, M; van Kooten, T; Krump, [No Value; Kriegsmann, J; Bittinger, F

    1999-01-01

    Progress in biocompatibility and tissue engineering would today be inconceivable without the aid of in vitro techniques. Endothelial cell cultures represent a valuable tool not just in haemocompatibility testing, but also in the concept of designing hybrid organs. In the past endothelial cells (EC)

  17. Tools for Educational Data Mining: A Review

    Science.gov (United States)

    Slater, Stefan; Joksimovic, Srecko; Kovanovic, Vitomir; Baker, Ryan S.; Gasevic, Dragan

    2017-01-01

    In recent years, a wide array of tools have emerged for the purposes of conducting educational data mining (EDM) and/or learning analytics (LA) research. In this article, we hope to highlight some of the most widely used, most accessible, and most powerful tools available for the researcher interested in conducting EDM/LA research. We will…

  18. Spectroelectrochemistry: A valuable tool for the study of organometallic-alkyne, -vinylidene, -cumulene, -alkynyl and related complexes

    International Nuclear Information System (INIS)

    Low, Paul J.; Bock, Sören

    2013-01-01

    This review presents a highly selective summary of spectroelectrochemical methods used in the study of metal alkyne, acetylide, vinylidene and allenylidene complexes. The review is illustrated predominantly by selected examples from the authors’ group that formed the basis of a lecture at the recent ISE Annual Meeting. Emphasis is placed on the use of spectroelectrochemical methods to study redox-induced ligand isomerisation reactions and to determine molecular electronic structure, which complement the conventional tools available to the synthetic chemist for the characterisation of molecular compounds. The role of computational studies in supporting the interpretation of spectroscopic data is also briefly discussed.

  19. Developing an analytical tool for evaluating EMS system design changes and their impact on cardiac arrest outcomes: combining geographic information systems with register data on survival rates

    Directory of Open Access Journals (Sweden)

    Sund Björn

    2013-02-01

    Abstract Background Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm by developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study is also an attempt to validate the proposed model used to generate the outcome measures. Methods and results This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent; reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. Conclusion We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g. (1) static changes such as trimming the emergency call handling time, or (2) dynamic changes such as the location of emergency resources or which resources should carry a defibrillator.
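    The link between simulated response times and survival in models of this kind is often approximated by an exponential decay of survival probability with elapsed time. A toy sketch of that relationship; p0 and k below are illustrative parameters chosen so the outputs roughly echo the reported magnitudes, NOT estimates from the Stockholm study:

    ```python
    import math

    def survival_probability(response_min, p0=0.25, k=0.12):
        """Toy model: OHCA survival probability declines exponentially
        with EMS response time (minutes). p0 and k are illustrative."""
        return p0 * math.exp(-k * response_min)

    # In the study, response times come from a GIS driving-time simulation;
    # here we just compare a baseline time with a one-minute improvement.
    baseline = survival_probability(15.5)
    faster = survival_probability(14.5)
    ```

    Feeding a distribution of GIS-simulated response times through such a curve, rather than a single value, is what yields system-level survival estimates for each EMS design option.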

  20. Tools for studying dry-cured ham processing by using computed tomography.

    Science.gov (United States)

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    Accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), covering the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w), in terms of both content and distribution, throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
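    Once a CT slice has been converted to predicted salt or water content, line-profile and ROI tools reduce to simple array operations. A minimal sketch on a synthetic slice (NumPy-based; the gradient and all values are illustrative, not calibrated predictions):

    ```python
    import numpy as np

    def line_profile(slice_img, row):
        """Predicted-content values along one row of the slice,
        e.g. from the surface towards the centre of the ham."""
        return slice_img[row, :].copy()

    def roi_mean(slice_img, r0, r1, c0, c1):
        """Mean predicted content inside a rectangular region of interest."""
        return float(slice_img[r0:r1, c0:c1].mean())

    # Synthetic 'salt content' slice: higher values near the left (surface) edge,
    # mimicking salt diffusing inwards during curing.
    slice_img = np.tile(np.linspace(6.0, 2.0, 8), (8, 1))

    profile = line_profile(slice_img, 4)        # salt gradient along one row
    surface = roi_mean(slice_img, 0, 8, 0, 2)   # near-surface ROI
    core = roi_mean(slice_img, 0, 8, 6, 8)      # core ROI
    ```

    Distribution diagrams are the same idea extended to the whole slice, e.g. histogramming the predicted content per voxel at each sampling stage of the process.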

  1. What makes a sustainability tool valuable, practical and useful in real-world healthcare practice? A mixed-methods study on the development of the Long Term Success Tool in Northwest London.

    Science.gov (United States)

    Lennox, Laura; Doyle, Cathal; Reed, Julie E; Bell, Derek

    2017-09-24

    Although improvement initiatives show benefits to patient care, they often fail to sustain. Models and frameworks exist to address this challenge, but issues with design, clarity and usability have been barriers to their use in healthcare settings. This work aimed to collaborate with stakeholders to develop a sustainability tool that is relevant to people in healthcare settings and practical for use in improvement initiatives. Tool development was conducted in six stages. A scoping literature review, group discussions and a stakeholder engagement event explored literature findings and their resonance with stakeholders in healthcare settings. Interviews, small-scale trialling and piloting explored the design and tested the practicality of the tool in improvement initiatives. National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care for Northwest London (CLAHRC NWL). CLAHRC NWL improvement initiative teams and staff. The iterative design process and engagement of stakeholders informed the articulation of the sustainability factors identified from the literature and guided tool design for practical application. Key iterations of factors and tool design are discussed. The development process resulted in the Long Term Success Tool (LTST). The Tool supports those implementing improvements to reflect on 12 sustainability factors, identifying risks in order to increase the chances of achieving sustainability over time. The Tool is designed to provide a platform for improvement teams to share their own views on sustainability, as well as to learn about the different views held within their team, prompting discussion and action. The development of the LTST has reinforced the importance of working with stakeholders to design strategies which respond to their needs and preferences and can practically be implemented in real-world settings. Further research is required to study the use and effectiveness of the tool in practice and assess engagement with

  2. Anisotropic Multishell Analytical Modeling of an Intervertebral Disk Subjected to Axial Compression.

    Science.gov (United States)

    Demers, Sébastien; Nadeau, Sylvie; Bouzid, Abdel-Hakim

    2016-04-01

    Studies on intervertebral disk (IVD) response to various loads and postures are essential to understand the disk's mechanical functions and to suggest preventive and corrective actions in the workplace. The experimental and finite-element (FE) approaches are well suited for these studies, but validating their findings is difficult, partly due to the lack of alternative methods. Analytical modeling could allow methodological triangulation and help validate FE models. This paper presents an analytical method based on thin-shell, beam-on-elastic-foundation and composite-materials theories to evaluate the stresses in the anulus fibrosus (AF) of an axisymmetric disk composed of multiple thin lamellae. Large deformations of the soft tissues are accounted for using an iterative method, and the anisotropic material properties are derived from a published biaxial experiment. The results are compared to those obtained by FE modeling. They demonstrate the capability of the analytical model to evaluate the stresses at any location of the simplified AF, and also that anisotropy reduces stresses in the lamellae. This novel model is a preliminary step in developing valuable analytical models of IVDs and represents distinctive groundwork that is able to sustain future refinements. This paper suggests important features that may be included to improve model realism.
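The beam-on-elastic-foundation theory the authors build on has a classical closed form for an infinite beam on a Winkler foundation under a point load (Hetényi's solution). The sketch below evaluates that textbook formula with made-up stiffness values, not the paper's disk parameters:

```python
import math

def winkler_deflection(x, P, k, EI):
    """Deflection of an infinite beam on a Winkler elastic foundation under
    a point load P at x = 0:
        w(x) = (P*beta)/(2k) * exp(-beta|x|) * (cos(beta|x|) + sin(beta|x|)),
    with the characteristic length parameter beta = (k / (4*EI))**0.25."""
    beta = (k / (4.0 * EI)) ** 0.25
    bx = beta * abs(x)
    return (P * beta) / (2.0 * k) * math.exp(-bx) * (math.cos(bx) + math.sin(bx))

# Hypothetical values: P in N, k in N/m^2 (foundation modulus), EI in N*m^2
P, k, EI = 100.0, 1.0e6, 50.0
w0 = winkler_deflection(0.0, P, k, EI)
print(f"deflection under the load: {w0 * 1000:.3f} mm")
```

The deflection is largest under the load and decays in a damped oscillation away from it, which is the qualitative behavior the multishell disk model exploits.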

  3. Mastering Search Analytics Measuring SEO, SEM and Site Search

    CERN Document Server

    Chaters, Brent

    2011-01-01

    Many companies still approach Search Engine Optimization (SEO) and paid search as separate initiatives. This in-depth guide shows you how to use these programs as part of a comprehensive strategy-not just to improve your site's search rankings, but to attract the right people and increase your conversion rate. Learn how to measure, test, analyze, and interpret all of your search data with a wide array of analytic tools. Gain the knowledge you need to determine the strategy's return on investment. Ideal for search specialists, webmasters, and search marketing managers, Mastering Search Analyt

  4. SM4AM: A Semantic Metamodel for Analytical Metadata

    DEFF Research Database (Denmark)

    Varga, Jovan; Romero, Oscar; Pedersen, Torben Bach

    2014-01-01

    Next generation BI systems emerge as platforms where traditional BI tools meet semi-structured and unstructured data coming from the Web. In these settings, the user-centric orientation represents a key characteristic for the acceptance and wide usage by numerous and diverse end users in their data....... We present SM4AM, a Semantic Metamodel for Analytical Metadata created as an RDF formalization of the Analytical Metadata artifacts needed for user assistance exploitation purposes in next generation BI systems. We consider the Linked Data initiative and its relevance for user assistance...

  5. Data science and big data analytics discovering, analyzing, visualizing and presenting data

    CERN Document Server

    2014-01-01

    Data Science and Big Data Analytics is about harnessing the power of data for new insights. The book covers the breadth of activities, methods, and tools that Data Scientists use. The content focuses on concepts, principles and practical applications that are applicable to any industry and technology environment, and the learning is supported and explained with examples that you can replicate using open-source software. This book will help you: become a contributor on a data science team; deploy a structured lifecycle approach to data analytics problems; apply appropriate analytic techniques and

  6. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    Science.gov (United States)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics applied to raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  7. Advanced Dynamics Analytical and Numerical Calculations with MATLAB

    CERN Document Server

    Marghitu, Dan B

    2012-01-01

    Advanced Dynamics: Analytical and Numerical Calculations with MATLAB provides a thorough, rigorous presentation of kinematics and dynamics while using MATLAB as an integrated tool to solve problems. Topics presented are explained thoroughly and directly, allowing fundamental principles to emerge through applications from areas such as multibody systems, robotics, spacecraft and design of complex mechanical devices. This book differs from others in that it uses symbolic MATLAB for both theory and applications. Special attention is given to solutions that are solved analytically and numerically using MATLAB. The illustrations and figures generated with MATLAB reinforce visual learning while an abundance of examples offer additional support. This book also: Provides solutions analytically and numerically using MATLAB Illustrations and graphs generated with MATLAB reinforce visual learning for students as they study Covers modern technical advancements in areas like multibody systems, robotics, spacecraft and des...

  8. Novel extractants with high selectivity for valuable metals in seawater. Calixarene derivatives

    International Nuclear Information System (INIS)

    Kakoi, Takahiko; Goto, Masahiro

    1997-01-01

    Seawater contains various valuable metals such as uranium and lithium. Therefore, attempts are being made to develop highly selective extractants which recognize target metal ions in reclaimed seawater. In this review, we have focused on the application of novel calixarene-based extractants. Calixarene, a cyclic host compound built from linked phenol rings, can be formed with several different ring sizes and can carry various kinds of functional groups for targeting metal ions in seawater. Calixarene derivatives are therefore capable of selectively extracting valuable metals such as uranium, alkaline metals, heavy metals, rare earth metals and noble metals by varying the ring size and functional groups. This novel host compound has given promising results which position it as a potential extractant for the separation of valuable metal ions from seawater. (author)

  9. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication of a general guide, known as GUM, the Guide to the Expression of Uncertainty in Measurement. This publication, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized, and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration
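GUM's central recipe, the law of propagation of uncertainty, combines the standard uncertainties of the inputs through sensitivity coefficients. A numerical sketch of that recipe; the measurement function and the input uncertainties below are invented for illustration:

```python
import math

def combined_uncertainty(f, x, u, h=1e-6):
    """GUM law of propagation for uncorrelated inputs:
    u_c(y)^2 = sum_i (df/dx_i)^2 * u(x_i)^2,
    with sensitivity coefficients estimated by central differences."""
    uc2 = 0.0
    for i, ui in enumerate(u):
        step = h * max(1.0, abs(x[i]))
        xp, xm = list(x), list(x)
        xp[i] += step
        xm[i] -= step
        ci = (f(xp) - f(xm)) / (2 * step)   # sensitivity coefficient df/dx_i
        uc2 += (ci * ui) ** 2
    return math.sqrt(uc2)

# Hypothetical counting measurement: y = counts / (efficiency * mass)
f = lambda v: v[0] / (v[1] * v[2])
x = [12000.0, 0.30, 0.50]      # net counts, detection efficiency, mass (kg)
u = [110.0, 0.006, 0.001]      # standard uncertainties of the inputs

uc = combined_uncertainty(f, x, u)
print(f"y = {f(x):.1f}, u_c(y) = {uc:.1f}")
```

For a purely multiplicative model like this one, the relative uncertainties combine in quadrature, which gives an analytic cross-check on the numerical result.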

  10. Non-commutative tools for topological insulators

    International Nuclear Information System (INIS)

    Prodan, Emil

    2010-01-01

    This paper reviews several analytic tools for the field of topological insulators, developed with the aid of non-commutative calculus and geometry. The set of tools includes bulk topological invariants defined directly in the thermodynamic limit and in the presence of disorder, whose robustness is shown to have nontrivial physical consequences for the bulk states. The set of tools also includes a general relation between the current of an observable and its edge index, a relation that can be used to investigate the robustness of the edge states against disorder. The paper focuses on the motivations behind creating such tools and on how to use them.

  11. Two analytical models for evaluating performance of Gigabit Ethernet Hosts

    International Nuclear Information System (INIS)

    Salah, K.

    2006-01-01

    Two analytical models are developed to study the impact of interrupt overhead on the operating system performance of network hosts subjected to Gigabit network traffic. Under heavy network traffic, system performance is negatively affected by the interrupt overhead caused by incoming traffic. In particular, excessive latency and significant degradation in system throughput can be experienced. User applications may also livelock as the CPU power is mostly consumed by interrupt handling and protocol processing. In this paper we present and compare two analytical models that capture host behavior and evaluate its performance. The first model is based on Markov processes and queueing theory, while the second, which is more accurate but more complex, is a pure Markov process. For the most part, both models give mathematically equivalent closed-form solutions for a number of important system performance metrics. These metrics include throughput, latency, the stability condition, CPU utilization by interrupt handling and protocol processing, and CPU availability for user applications. The analysis yields insight into understanding and predicting the impact of system and network choices on the performance of interrupt-driven systems subjected to light and heavy network loads. More importantly, our analytical work can also be valuable in improving host performance. The paper gives guidelines and recommendations to address design and implementation issues. Simulation and reported experimental results show that our analytical models are valid and give a good approximation. (author)
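The receive-livelock phenomenon these models capture can be illustrated with a much simpler deterministic sketch (not the paper's Markov or queueing models); the per-packet costs below are hypothetical:

```python
def delivered_throughput(lam, t_int=4e-6, t_proto=10e-6):
    """Simplified interrupt-driven host: each arriving packet costs t_int
    seconds of (preempting) interrupt handling; protocol processing costs
    t_proto seconds and only runs on whatever CPU time interrupts leave over.
    lam is the offered load in packets per second."""
    u_int = min(1.0, lam * t_int)          # CPU fraction consumed by interrupts
    capacity = (1.0 - u_int) / t_proto     # packets/s the protocol stack can finish
    return min(lam, capacity)

for lam in (20_000, 60_000, 90_000, 240_000):   # offered load, packets/s
    print(f"{lam:>7} pps offered -> {delivered_throughput(lam):>10.0f} pps delivered")
```

Past the saturation point the delivered throughput falls as the load rises, and as interrupt handling alone consumes the whole CPU the host delivers nothing: the livelock the abstract describes.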

  12. Teaching Theory Construction With Initial Grounded Theory Tools: A Reflection on Lessons and Learning.

    Science.gov (United States)

    Charmaz, Kathy

    2015-12-01

    This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.

  13. Healthcare predictive analytics: An overview with a focus on Saudi Arabia.

    Science.gov (United States)

    Alharthi, Hana

    2018-03-08

    Despite a newfound wealth of data and information, the healthcare sector is lacking in actionable knowledge. This is largely because healthcare data, though plentiful, tend to be inherently complex and fragmented. Health data analytics, with an emphasis on predictive analytics, is emerging as a transformative tool that can enable more proactive and preventative treatment options. This review considers the ways in which predictive analytics has been applied in the for-profit business sector to generate well-timed and accurate predictions of key outcomes, with a focus on features that may be applicable to healthcare-specific applications. Published medical research presenting assessments of predictive analytics technology in medical applications is reviewed, with particular emphasis on how hospitals have integrated predictive analytics into their day-to-day healthcare services to improve quality of care. This review also highlights the numerous challenges of implementing predictive analytics in healthcare settings and concludes with a discussion of current efforts to implement healthcare data analytics in a developing country, Saudi Arabia. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.

  14. Kidney Paired Donation and the "Valuable Consideration" Problem: The Experiences of Australia, Canada, and the United States.

    Science.gov (United States)

    Toews, Maeghan; Giancaspro, Mark; Richards, Bernadette; Ferrari, Paolo

    2017-09-01

    As organ donation rates remain unable to meet the needs of individuals waiting for transplants, it is necessary to identify reasons for this shortage and develop solutions to address it. The introduction of kidney paired donation (KPD) programs represents one such innovation, and KPD has become a valuable tool in donation systems around the world. Although KPD has been successful in increasing kidney donation and transplantation, there are lingering questions about its legality. Donation through KPD is made in exchange for, and with the expectation of, a reciprocal kidney donation and transplantation. It is this reciprocity that has caused concern about whether KPD complies with existing law. Organ donation systems around the world are almost universally structured to legally prohibit the commercial exchange of organs. Australia, Canada, and the United States have accomplished this goal by prohibiting the exchange of an organ for "valuable consideration," a legal term that has not historically been limited to monetary exchange. Whether or not KPD programs violate this legislative prohibition will depend on the specific legislative provision being considered, and on the legal system and case law of the particular jurisdiction in question. This article compares the experiences of Australia, Canada, and the United States in determining the legality of KPD and highlights the need for legal clarity and flexibility as donation and transplantation systems continue to evolve.
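The reciprocity that raises the "valuable consideration" question can be made concrete: incompatible donor-recipient pairs form a directed compatibility graph, and a simple two-way KPD exchange is a 2-cycle in that graph. The pairs and compatibilities below are invented for illustration:

```python
# Each entry: pair -> set of pairs whose recipient its donor could give to.
compat = {
    "A": {"B", "C"},
    "B": {"A"},
    "C": {"D"},
    "D": {"C"},
}

def two_way_exchanges(compat):
    """Find all reciprocal (2-cycle) kidney paired donations: pair X's donor
    matches pair Y's recipient and vice versa."""
    swaps = []
    for x, targets in compat.items():
        for y in targets:
            if x < y and x in compat.get(y, set()):
                swaps.append((x, y))
    return swaps

print(two_way_exchanges(compat))   # → [('A', 'B'), ('C', 'D')]
```

Each swap found here is exactly the "donation in exchange for a reciprocal donation" whose legal status the article examines.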

  15. Moss Biomonitoring as a Tool for Radiological Exposure Assessment

    International Nuclear Information System (INIS)

    Barisic, D.; Vekic, B.; Kusan, V.; Spiric, Z.; Frontasyeva, M.

    2013-01-01

    The purpose of this study is to provide an insight into the atmospheric deposition of airborne radionuclides in Croatia by using the moss biomonitoring technique. Moss samples were collected during the summer of 2010 from 161 locations in Croatia, evenly distributed across the entire country. Sampling was performed in accordance with the LRTAP Convention - ICP Vegetation protocol and the sampling strategy of the European Programme on Biomonitoring of Heavy Metal Atmospheric Deposition. In addition to the comprehensive qualitative and quantitative chemical analyses of all collected samples, determined by NAA, ICP-AES and AAS, 22 of the 161 moss samples were subjected to gamma-spectrometric analysis to assess the activity of the naturally occurring radionuclides. The activities of 40K, 232Th, 137Cs, 226Ra and 238U were determined using a low-background HPGe detector system coupled with an 8192-channel CANBERRA analyzer. The detector system was calibrated using mixed gamma standards supplied by Eckert and Ziegler (Analytics USA). Preliminary research results on the atmospheric deposition of airborne radionuclides in Croatia using the moss biomonitoring technique confirm that it may serve as a valuable tool for radiological exposure assessment. This approach has the potential for simple, accurate, reliable and affordable environmental radiation control. (author)
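Converting an HPGe net peak area into a specific activity uses the standard relation A = N / (ε · I_γ · t_live · m). The numbers below are illustrative, not measured values from the study:

```python
def specific_activity(net_counts, efficiency, gamma_intensity, live_time_s, mass_kg):
    """Specific activity in Bq/kg from a background-subtracted
    full-energy peak area in a gamma spectrum."""
    return net_counts / (efficiency * gamma_intensity * live_time_s * mass_kg)

# Hypothetical 137Cs measurement via its 661.7 keV line
a = specific_activity(
    net_counts=5400,        # net peak area (counts)
    efficiency=0.025,       # full-energy peak efficiency at 661.7 keV (assumed)
    gamma_intensity=0.851,  # emission probability of the 661.7 keV line
    live_time_s=80_000,
    mass_kg=0.040,          # dry moss sample mass (assumed)
)
print(f"137Cs: {a:.0f} Bq/kg")
```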

  16. A workflow learning model to improve geovisual analytics utility.

    Science.gov (United States)

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and its utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application.
To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on

  17. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry

    Directory of Open Access Journals (Sweden)

    Marek Tobiszewski

    2015-06-01

    Full Text Available The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.

  18. Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.

    Science.gov (United States)

    Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek

    2015-06-12

    The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.
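Two of the simplest metrics surveyed here, Sheldon's E-factor and atom economy, reduce to one-line formulas. A sketch with invented masses and rounded molecular weights:

```python
def e_factor(total_input_mass_kg, product_mass_kg):
    """E-factor = mass of waste / mass of product, where waste is every
    input (reagents, solvents, auxiliaries) that does not end up in product."""
    return (total_input_mass_kg - product_mass_kg) / product_mass_kg

def atom_economy(product_mw, reactant_mws):
    """Atom economy (%) = MW(product) / sum(MW of stoichiometric reactants) * 100."""
    return 100.0 * product_mw / sum(reactant_mws)

# Hypothetical batch: 120 kg of total inputs yield 15 kg of product
print(f"E-factor:     {e_factor(120.0, 15.0):.1f}")
# Esterification with rounded MWs: acid (60) + alcohol (46) -> ester (88) + water (18)
print(f"atom economy: {atom_economy(88.0, [60.0, 46.0]):.1f}%")
```

A lower E-factor and a higher atom economy both indicate a greener process; the multivariate metrics discussed in the paper extend this idea across many criteria at once.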

  19. Equity Analytics: A Methodological Approach for Quantifying Participation Patterns in Mathematics Classroom Discourse

    Science.gov (United States)

    Reinholz, Daniel L.; Shah, Niral

    2018-01-01

    Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…

  20. Updates in metabolomics tools and resources: 2014-2015.

    Science.gov (United States)

    Misra, Biswapriya B; van der Hooft, Justin J J

    2016-01-01

    Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platform (MS or NMR spectroscopy based) used for data acquisition. Improved machinery in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources--in the form of tools, software, and databases--is currently lacking. Thus, here we provide an overview of freely available, open-source tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of recent developments, in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and resources for data processing, data annotation, and data visualization in MS- and NMR-based metabolomics. Most of the tools described in this review are dedicated to untargeted metabolomics workflows; however, some more specialist tools are described as well. All tools and resources described, including their analytical and computational platform dependencies, are summarized in an overview table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Analytic webs support the synthesis of ecological data sets.

    Science.gov (United States)

    Ellison, Aaron M; Osterweil, Leon J; Clarke, Lori; Hadley, Julian L; Wise, Alexander; Boose, Emery; Foster, David R; Hanson, Allen; Jensen, David; Kuzeja, Paul; Riseman, Edward; Schultz, Howard

    2006-06-01

    A wide variety of data sets produced by individual investigators are now synthesized to address ecological questions that span a range of spatial and temporal scales. It is important to facilitate such syntheses so that "consumers" of data sets can be confident that both input data sets and synthetic products are reliable. Necessary documentation to ensure the reliability and validation of data sets includes both familiar descriptive metadata and formal documentation of the scientific processes used (i.e., process metadata) to produce usable data sets from collections of raw data. Such documentation is complex and difficult to construct, so it is important to help "producers" create reliable data sets and to facilitate their creation of required metadata. We describe a formal representation, an "analytic web," that aids both producers and consumers of data sets by providing complete and precise definitions of scientific processes used to process raw and derived data sets. The formalisms used to define analytic webs are adaptations of those used in software engineering, and they provide a novel and effective support system for both the synthesis and the validation of ecological data sets. We illustrate the utility of an analytic web as an aid to producing synthetic data sets through a worked example: the synthesis of long-term measurements of whole-ecosystem carbon exchange. Analytic webs are also useful validation aids for consumers because they support the concurrent construction of a complete, Internet-accessible audit trail of the analytic processes used in the synthesis of the data sets. Finally we describe our early efforts to evaluate these ideas through the use of a prototype software tool, SciWalker. We indicate how this tool has been used to create analytic webs tailored to specific data-set synthesis and validation activities, and suggest extensions to it that will support additional forms of validation. The process metadata created by SciWalker is

  2. Exploring Higher Education Governance: Analytical Models and Heuristic Frameworks

    Directory of Open Access Journals (Sweden)

    Burhan FINDIKLI

    2017-08-01

    Full Text Available Governance in higher education, both at institutional and systemic levels, has experienced substantial changes within recent decades because of a range of world-historical processes such as massification, growth, globalization, marketization, public sector reforms, and the emergence of the knowledge economy and society. These developments have made governance arrangements and decision-making processes in higher education more complex and multidimensional than ever before, and have forced scholars to build new analytical and heuristic tools and strategies to grasp the intricacy and diversity of higher education governance dynamics. This article provides a systematic discussion of how, and through which tools, prominent scholars of higher education have analyzed governance in this sector, examining certain heuristic frameworks and analytical models. Additionally, the article shows how social scientific analysis of governance in higher education has proceeded in a cumulative way, with revisions and syntheses rather than radical conceptual and theoretical ruptures, from Burton R. Clark’s seminal work to the present, revealing conceptual and empirical junctures between them.

  3. Analytical Chemistry in the Regulatory Science of Medical Devices.

    Science.gov (United States)

    Wang, Yi; Guan, Allan; Wickramasekara, Samanthi; Phillips, K Scott

    2018-06-12

    In the United States, regulatory science is the science of developing new tools, standards, and approaches to assess the safety, efficacy, quality, and performance of all Food and Drug Administration-regulated products. Good regulatory science facilitates consumer access to innovative medical devices that are safe and effective throughout the Total Product Life Cycle (TPLC). Because the need to measure things is fundamental to the regulatory science of medical devices, analytical chemistry plays an important role, contributing to medical device technology in two ways: It can be an integral part of an innovative medical device (e.g., diagnostic devices), and it can be used to support medical device development throughout the TPLC. In this review, we focus on analytical chemistry as a tool for the regulatory science of medical devices. We highlight recent progress in companion diagnostics, medical devices on chips for preclinical testing, mass spectrometry for postmarket monitoring, and detection/characterization of bacterial biofilm to prevent infections.

  4. Travel fosters tool use in wild chimpanzees.

    Science.gov (United States)

    Gruber, Thibaud; Zuberbühler, Klaus; Neumann, Christof

    2016-07-19

    Ecological variation influences the appearance and maintenance of tool use in animals, either due to necessity or opportunity, but little is known about the relative importance of these two factors. Here, we combined long-term behavioural data on feeding and travelling with six years of field experiments in a wild chimpanzee community. In the experiments, subjects engaged with natural logs, which contained energetically valuable honey that was only accessible through tool use. Engagement with the experiment was highest after periods of low fruit availability involving more travel between food patches, while instances of actual tool-using were significantly influenced by prior travel effort only. Additionally, combining data from the main chimpanzee study communities across Africa supported this result, insofar as groups with larger travel efforts had larger tool repertoires. Travel thus appears to foster tool use in wild chimpanzees and may also have been a driving force in early hominin technological evolution.

  5. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    Science.gov (United States)

    2011-12-01

    In this FHWA-sponsored pool funded study, a set of decision making tools, based on the Analytic Hierarchy Process (AHP), was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...
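The AHP prioritization at the core of such a tool set can be sketched in a few lines. The criteria and pairwise judgments below are hypothetical illustrations (not from the study), and numpy is assumed to be available:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three criteria in an
# ABC-vs-conventional construction decision (e.g., cost, schedule
# impact, safety). Entry A[i, j] expresses how strongly criterion i
# is preferred over criterion j, with A[j, i] = 1 / A[i, j].
A = np.array([
    [1.0,   3.0,  0.5],
    [1 / 3, 1.0,  0.25],
    [2.0,   4.0,  1.0],
])

# AHP priority weights are the normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
weights = w / w.sum()

# Consistency index CI = (lambda_max - n) / (n - 1); values near zero
# indicate the pairwise judgments are close to internally consistent.
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
```

In a full AHP, CI would further be divided by a random-consistency index to obtain the consistency ratio, and the resulting criterion weights would then be used to score the ABC and conventional alternatives.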

  6. Description of JNC's analytical method and its performance for FBR cores

    International Nuclear Information System (INIS)

    Ishikawa, M.

    2000-01-01

    The description of JNC's analytical method and its performance for FBR cores includes: an outline of JNC's Analytical System Compared with ERANOS; a standard data base for FBR Nuclear Design in JNC; JUPITER Critical Experiment; details of Analytical Method and Its Effects on JUPITER; performance of JNC Analytical System (effective multiplication factor keff, control rod worth, and sodium void reactivity); design accuracy of a 600 MWe-class FBR Core. JNC developed a consistent analytical system for FBR core evaluation, based on JENDL library, f-table method, and three dimensional diffusion/transport theory, which includes comprehensive sensitivity tools to improve the prediction accuracy of core parameters. JNC system was verified by analysis of JUPITER critical experiment, and other facilities. Its performance can be judged quite satisfactory for FBR-core design work, though there is room for further improvement, such as more detailed treatment of cross-section resonance regions

  7. Five-axis Control Processing Using NC Machine Tools : A Tool Posture Decision Using the Tangent Slope at a Cut Point on a Work

    OpenAIRE

    小島, 龍広; 西田, 知照; 扇谷, 保彦

    2003-01-01

    This report deals with the way to decide tool posture and the way to analytically calculate tool path for the work shape requiring 5-axis control machining. In the tool path calculation, basic equations are derived using the principle that the tangent slope at a cut point on a work and the one at a cutting point on a tool edge are identical. A tool posture decision procedure using the tangent slope at each cut point on a work is proposed for any shape of tool edge. The validity of the way t...

  8. Analytical Tools for Functional Assessment of Architectural Layouts

    Science.gov (United States)

    Bąkowski, Jarosław

    2017-10-01

    Functional layout of the building, understood as a layout or set of the facility rooms (or groups of rooms) with a system of internal communication, creates an environment and a place of mutual relations between the occupants of the object. Achieving optimal (from the occupants’ point of view) spatial arrangement is possible through activities that often go beyond the stage of architectural design. Adopted in the architectural design, most often during a trial and error process or on the basis of previous experience (evidence-based design), functional layout is subject to continuous evaluation and dynamic change since the beginning of its use. Such verification of the occupancy phase makes it possible to plan future transformations, as well as to develop model solutions for use in other settings. In broader terms, the research hypothesis is to examine whether and how the collected datasets concerning the facility and its utilization can be used to develop methods for assessing functional layout of buildings. In other words, if it is possible to develop an objective method of assessing functional layouts based on a set of buildings’ parameters: technical, technological and functional ones and whether the method allows developing a set of tools enhancing the design methodology of complex functional objects. By linking the design with the construction phase it is possible to build parametric models of functional layouts, especially in the context of sustainable design or lean design in every aspect: ecological (by reducing the property’s impact on environment), economic (by optimizing its cost) and social (through the implementation of high-performance work environment). Parameterization of size and functional connections of the facility become part of the analyses, as well as the element of model solutions. The “lean” approach means the process of analysis of the existing scheme and consequently - finding weak points as well as means for eliminating these

  9. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    International Nuclear Information System (INIS)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than had development at the ACMUs. The ACMU work and further refinements in the ACs have resulted in a reduction in

  10. Line lessons: Enbridge's Northern Line provides valuable information

    Energy Technology Data Exchange (ETDEWEB)

    Ross, E.

    2000-02-01

    Experiences gained from the 14-year old Norman Wells crude oil pipeline in the Northwest Territories may provide operators with valuable insights in natural gas pipeline developments in northern Canada. The Norman Wells line is the first and only long-distance pipeline in North America buried in permafrost and has proven to be a veritable laboratory on pipeline behaviour in extremely cold climates where the permafrost is also discontinuous. The line was built by Enbridge with a 'limit state' design, i.e. it was built to move within the permafrost within certain limits, the amount of movement depending upon the area in which the line was built. This technology, which is still cutting edge, allows the pipeline to react to the freeze-thaw cycle without being affected by the heaving and resettling. The knowledge gained from the Norman Wells Line has come in very useful in the more recent AltaGas Services project transporting natural gas from a nearby well into the town of Inuvik. Enbridge also contributed to the development of various pipeline inspection tools such as the 'Geopig', which travels within the pipeline and can pinpoint the location of problems practically within a matter of inches, and the 'Rolligon', an amphibious vehicle with five-foot diameter rubber tires that displaces only two pounds per square inch, leaving barely a track as it travels along the right-of-way during times other than winter.

  11. Evaluating Business Intelligence/Business Analytics Software for Use in the Information Systems Curriculum

    Science.gov (United States)

    Davis, Gary Alan; Woratschek, Charles R.

    2015-01-01

    Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…

  12. Parallel Aircraft Trajectory Optimization with Analytic Derivatives

    Science.gov (United States)

    Falck, Robert D.; Gray, Justin S.; Naylor, Bret

    2016-01-01

    Trajectory optimization is an integral component for the design of aerospace vehicles, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables which motivates the use of gradient based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto based collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the non-linear analysis evaluations and the derivative computations themselves. The constraint aggregation results showed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.

  13. Model and Analytic Processes for Export License Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.; Wood, Thomas W.; Daly, Don S.; Brothers, Alan J.; Sanfilippo, Antonio P.; Cook, Diane; Holder, Larry

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision-framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An

  14. Manufacturing data analytics using a virtual factory representation.

    Science.gov (United States)

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach within the frameworks of Design Science Research Methodology and prototyping to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. The use of manufacturing simulation models is presented as data analytics applications themselves and for supporting other data analytics applications by serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. Virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities and thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  15. A Unified Channel Charges Expression for Analytic MOSFET Modeling

    Directory of Open Access Journals (Sweden)

    Hugues Murray

    2012-01-01

    Full Text Available Based on a 1D Poisson's equation resolution, we present an analytic model of inversion charges allowing calculation of the drain current and transconductance in the Metal Oxide Semiconductor Field Effect Transistor. The drain current and transconductance are described by analytical functions including mobility corrections and short channel effects (CLM, DIBL. The comparison with the Pao-Sah integral shows excellent accuracy of the model in all inversion modes from strong to weak inversion in submicron MOSFETs. All calculations are encoded with a simple C program and give instantaneous results that provide an efficient tool for microelectronics users.
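For contrast with the unified charge-based formulation the abstract describes, the classic regional (square-law) approximation it improves upon can be sketched in a few lines; the threshold voltage and gain factor here are hypothetical illustrative values, not taken from the paper:

```python
def drain_current(vgs, vds, vth=0.7, k=2e-4):
    """Textbook long-channel square-law MOSFET drain current (amperes).

    k = mu_n * Cox * W / L (A/V^2); all voltages in volts. Subthreshold
    conduction, CLM and DIBL are deliberately ignored here -- removing
    these regional discontinuities is exactly what unified charge models
    like the one in the paper are for.
    """
    vov = vgs - vth                 # overdrive voltage
    if vov <= 0:
        return 0.0                  # cut-off region
    if vds < vov:
        return k * (vov * vds - vds**2 / 2)   # triode (linear) region
    return 0.5 * k * vov**2                   # saturation region
```

The piecewise form is continuous at vds = vgs - vth, but its derivatives are not, which is one reason single-expression charge-based models are preferred for circuit simulation.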

  16. Library improvement through data analytics

    CERN Document Server

    Farmer, Lesley S J

    2017-01-01

    This book shows how to act on and make sense of data in libraries. Using a range of techniques, tools and methodologies it explains how data can be used to help inform decision making at every level. Sound data analytics is the foundation for making an evidence-based case for libraries, in addition to guiding myriad organizational decisions, from optimizing operations for efficiency to responding to community needs. Designed to be useful for beginners as well as those with a background in data, this book introduces the basics of a six point framework that can be applied to a variety of library settings for effective system-based, data-driven management. Library Improvement Through Data Analytics includes: - the basics of statistical concepts - recommended data sources for various library functions and processes, and guidance for using census, university, or government data in analysis - techniques for cleaning data - matching data to appropriate data analysis methods - how to make descriptive statistics m...
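The descriptive statistics such a framework starts from can be computed with Python's standard library alone; the circulation counts below are invented purely for illustration:

```python
import statistics as st

# Hypothetical monthly checkout counts for one branch library
# (illustrative data only, not drawn from the book).
checkouts = [412, 389, 455, 501, 478, 390, 365, 420, 510, 495, 460, 430]

summary = {
    "mean": st.mean(checkouts),                # average monthly checkouts
    "median": st.median(checkouts),            # robust central tendency
    "stdev": st.stdev(checkouts),              # sample standard deviation
    "range": max(checkouts) - min(checkouts),  # spread of the series
}
```

Summaries like this feed the evidence-based comparisons (across branches, years, or services) on which data-driven library management decisions rest.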

  17. 35th International Symposium on Environmental Analytical Chemistry - ISEAC 35. Book of Abstracts

    International Nuclear Information System (INIS)

    Namiestnik, J.; Gdaniec-Pietryka, M.; Klimaszewska, K.; Gorecka, A.; Sagajdakow, A.; Jakubowska, N.

    2008-01-01

    The ISEAC 35 is organized by the International Association of Environmental Analytical Chemistry (IAEAC), the Committee on Analytical Chemistry of the Polish Academy of Science (PAS), and the Chemical Faculty of Gdansk University of Technology (GUT). The Symposium includes a number of invited lectures treating frontier topics of environmental analytical chemistry, such as: (a) miniaturized spectroscopic tools for environmental survey analysis, (b) remote sensing in marine research, (c) xenobiotics in natural waters, (d) sampling and sample handling for environmental analysis. Book of Abstracts contains abstracts of 9 invited lectures, 62 oral presentations and 250 posters.

  18. Field Trips as Valuable Learning Experiences in Geography Courses

    Science.gov (United States)

    Krakowka, Amy Richmond

    2012-01-01

    Field trips have been acknowledged as valuable learning experiences in geography. This article uses Kolb's (1984) experiential learning model to discuss how students learn and how field trips can help enhance learning. Using Kolb's experiential learning theory as a guide in the design of field trips helps ensure that field trips contribute to…

  19. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    Science.gov (United States)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also include data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data Analytics takes on a practitioner's approach to applying expertise and skills to solve issues and gain subject knowledge. Data Science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations and other information, to better understand our Earth. The large variety of datasets (temporal spatial differences, data types, formats, etc.) invites the need for data analytics skills that understand the science domain, and data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from a to

  20. WEB ANALYTICS COMBINED WITH EYE TRACKING FOR SUCCESSFUL USER EXPERIENCE DESIGN: A CASE STUDY

    OpenAIRE

    Magdalena BORYS; Monika CZWÓRNÓG; Tomasz RATAJCZYK

    2016-01-01

    The authors propose a new approach for the mobile user experience design process by means of web analytics and eye-tracking. The proposed method was applied to design the LUT mobile website. In the method, to create the mobile website design, data of various users and their behaviour were gathered and analysed using the web analytics tool. Next, based on the findings from web analytics, the mobile prototype for the website was created and validated in eye-tracking usability testing. The analy...

  1. Analytic Theology as Sapiential Theology: A Response to Jordan Wessling

    Directory of Open Access Journals (Sweden)

    Vanhoozer Kevin J.

    2017-10-01

    Full Text Available This article responds to Jordan Wessling’s paper that engages a concern I expressed about analytic theology not doing justice to the sapiential requirements of theology. I examine Wessling’s summary of my paper, conclude that his description is accurate and fair, appreciate his proposed solution, then go on to restate why I think he may not have fully allayed my concern. I suggest that analytic theology is a vital tool in the theologian’s toolkit, but that ultimately more is needed in order to interpret Scripture theologically.

  2. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    Science.gov (United States)

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  3. Sustainability Tools Inventory - Initial Gaps Analysis | Science ...

    Science.gov (United States)

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities to achieve sustainability. It contributes to SHC 1.61.4

  4. HC StratoMineR: A web-based tool for the rapid analysis of high content datasets

    NARCIS (Netherlands)

    Omta, W.; Heesbeen, R. van; Pagliero, R.; Velden, L. van der; Lelieveld, D.; Nellen, M.; Kramer, M.; Yeong, M.; Saeidi, A.; Medema, R.; Spruit, M.; Brinkkemper, S.; Klumperman, J.; Egan, D.

    2016-01-01

    High-content screening (HCS) can generate large multidimensional datasets and when aligned with the appropriate data mining tools, it can yield valuable insights into the mechanism of action of bioactive molecules. However, easy-to-use data mining tools are not widely available, with the result that

  5. HC StratoMineR : A Web-Based Tool for the Rapid Analysis of High-Content Datasets

    NARCIS (Netherlands)

    Omta, Wienand A; van Heesbeen, Roy G; Pagliero, Romina J; van der Velden, Lieke M; Lelieveld, Daphne; Nellen, Mehdi; Kramer, Maik; Yeong, Marley; Saeidi, Amir M; Medema, Rene H; Spruit, Marco; Brinkkemper, Sjaak; Klumperman, Judith; Egan, David A

    2016-01-01

    High-content screening (HCS) can generate large multidimensional datasets and when aligned with the appropriate data mining tools, it can yield valuable insights into the mechanism of action of bioactive molecules. However, easy-to-use data mining tools are not widely available, with the result that

  6. Air Traffic Management Cost Assessment Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The Robust Analytics Air Traffic Management Cost Assessment Tool (ACAT) provides the comprehensive capability to analyze the impacts of NASA air traffic management...

  7. A Modeling approach for analysis and improvement of spindle-holder-tool assembly dynamics

    OpenAIRE

    Budak, Erhan; Ertürk, A.; Erturk, A.; Özgüven, H. N.; Ozguven, H. N.

    2006-01-01

    The most important information required for chatter stability analysis is the dynamics of the involved structures, i.e. the frequency response functions (FRFs) which are usually determined experimentally. In this study, the tool point FRF of a spindle-holder-tool assembly is analytically determined by using the receptance coupling and structural modification techniques. Timoshenko’s beam model is used for increased accuracy. The spindle is also modeled analytically with elastic supports repre...

  8. Using Learning Analytics to Understand the Design of an Intelligent Language Tutor – Chatbot Lucy

    OpenAIRE

    Yi Fei Wang; Stephen Petrina

    2013-01-01

    The goal of this article is to explore how learning analytics can be used to predict and advise the design of an intelligent language tutor, chatbot Lucy. With its focus on using student-produced data to understand the design of Lucy to assist English language learning, this research can be a valuable component for language-learning designers to improve second language acquisition. In this article, we present students’ learning journey and data trails, the chatting log architecture and result...

  9. The Application of State-of-the-Art Analytic Tools (Biosensors and Spectroscopy in Beverage and Food Fermentation Process Monitoring

    Directory of Open Access Journals (Sweden)

    Shaneel Chandra

    2017-09-01

    Full Text Available The production of several agricultural products and foods is linked with fermentation. Traditional methods used to control and monitor the quality of the products and processes are based on the use of simple chemical analysis. However, these methods are time-consuming and do not provide sufficient relevant information to characterize the chemical changes during the process. Commonly used methods applied in the agriculture and food industries to monitor fermentation are those based on simple or single-point sensors, where only one parameter is measured (e.g., temperature or density). These sensors are used several times per day and are often the only source of data available from which the conditions and rate of fermentation are monitored. In the modern food industry, an ideal method to control and monitor the fermentation process should enable a direct, rapid, precise, and accurate determination of several target compounds, with minimal to no sample preparation or reagent consumption. Here, state-of-the-art advancements in both the application of sensors and analytical tools to monitor beverage and food fermentation processes will be discussed.

  10. Substrate topography: A valuable in vitro tool, but a clinical red herring for in vivo tenogenesis.

    Science.gov (United States)

    English, Andrew; Azeem, Ayesha; Spanoudes, Kyriakos; Jones, Eleanor; Tripathi, Bhawana; Basu, Nandita; McNamara, Karrina; Tofail, Syed A M; Rooney, Niall; Riley, Graham; O'Riordan, Alan; Cross, Graham; Hutmacher, Dietmar; Biggs, Manus; Pandit, Abhay; Zeugolis, Dimitrios I

    2015-11-01

    Controlling the cell-substrate interactions at the bio-interface is becoming an inherent element in the design of implantable devices. Modulation of cellular adhesion in vitro, through topographical cues, is a well-documented process that offers control over subsequent cellular functions. However, it is still unclear whether surface topography can be translated into a clinically functional response in vivo at the tissue/device interface. Herein, we demonstrated that anisotropic substrates with a groove depth of ∼317nm and ∼1988nm promoted human tenocyte alignment parallel to the underlying topography in vitro. However, the rigid poly(lactic-co-glycolic acid) substrates used in this study upregulated the expression of chondrogenic and osteogenic genes, indicating possible tenocyte trans-differentiation. Of significant importance is that none of the topographies assessed (∼37nm, ∼317nm and ∼1988nm groove depth) induced extracellular matrix orientation parallel to the substrate orientation in a rat patellar tendon model. These data indicate that two-dimensional imprinting technologies are useful tools for in vitro cell phenotype maintenance, rather than for organised neotissue formation in vivo, should multifactorial approaches that consider both surface topography and substrate rigidity be established. Herein, we ventured to assess the influence of parallel grooves, ranging from nano- to micro-level, on tenocyte response in vitro and on host response using a tendon and a subcutaneous model. In vitro analysis indicates that anisotropically ordered micro-scale grooves, as opposed to nano-scale grooves, maintain physiological cell morphology. The rather rigid PLGA substrates appeared to induce trans-differentiation towards a chondrogenic and/or osteogenic lineage, as evidenced by TILDA gene analysis. In vivo data in both tendon and subcutaneous models indicate that none of the substrates induced bidirectional host cell and tissue growth. Collectively, these

  11. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    Science.gov (United States)

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting of risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  12. Application of high-resolution melting analysis for authenticity testing of valuable Dendrobium commercial products.

    Science.gov (United States)

    Dong, Xiaoman; Jiang, Chao; Yuan, Yuan; Peng, Daiyin; Luo, Yuqin; Zhao, Yuyang; Huang, Luqi

    2018-01-01

    The accurate identification of botanical origin in commercial products is important to ensure food authenticity and safety for consumers. The Dendrobium species have long been commercialised as functional food supplements and herbal medicines in Asia. Three valuable Dendrobium species, namely Dendrobium officinale, D. huoshanense and D. moniliforme, are often mutually adulterated in trade products in pursuit of higher profit. In this paper, a rapid and reliable semi-quantitative method for identifying the botanical origin of Dendrobium products in terminal markets was developed using high-resolution melting (HRM) analysis with specific primer pairs targeting the trnL-F region. The HRM analysis method detected amounts of D. moniliforme adulterants as low as 1% in D. huoshanense or D. officinale products. The results demonstrate that HRM analysis is a fast and effective tool both for verifying the authenticity of these Dendrobium species and for the semi-quantitative determination of the purity of their processed products. © 2017 Society of Chemical Industry.
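    The HRM readout behind this method is the melting temperature (Tm), commonly taken as the peak of the negative first derivative -dF/dT of the fluorescence melt curve; a shifted Tm flags an amplicon from a different species. The curves below are synthetic illustrations, not data from the study.

```python
import math

# Hedged illustration of the HRM readout (synthetic curves, not data from the
# study above): Tm is taken as the peak of -dF/dT, and a shifted Tm flags an
# amplicon from a different species, e.g. an adulterant's trnL-F sequence.

def tm_from_melting_curve(temps, fluorescence):
    """Temperature at the peak of the negative first derivative -dF/dT."""
    best_t, best_slope = None, float("-inf")
    for i in range(1, len(temps)):
        slope = -(fluorescence[i] - fluorescence[i - 1]) / (temps[i] - temps[i - 1])
        if slope > best_slope:
            best_slope, best_t = slope, 0.5 * (temps[i] + temps[i - 1])
    return best_t

# Synthetic sigmoidal melt curves on a 75-90 C ramp: a reference amplicon
# melting near 82 C and a sequence variant melting near 84 C.
temps = [75 + 0.1 * k for k in range(151)]
ref = [1 / (1 + math.exp((t - 82.0) / 0.4)) for t in temps]
variant = [1 / (1 + math.exp((t - 84.0) / 0.4)) for t in temps]

print(tm_from_melting_curve(temps, ref))      # close to 82
print(tm_from_melting_curve(temps, variant))  # close to 84
```

    Real HRM instruments additionally normalize the curves and compare their shapes; the derivative peak alone is the simplest discriminator.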

  13. The Culture Audit: A Leadership Tool for Assessment and Strategic Planning in Diverse Schools and Colleges

    Science.gov (United States)

    Bustamante, Rebecca M.

    2006-01-01

    This module is designed to introduce educational leaders to an organizational assessment tool called a "culture audit." Literature on organizational cultural competence suggests that culture audits are a valuable tool for determining how well school policies, programs, and practices respond to the needs of diverse groups and prepare…

  14. Origins, Phytochemistry, Pharmacology, Analytical Methods and Safety of Cortex Moutan (Paeonia suffruticosa Andrew): A Systematic Review

    Directory of Open Access Journals (Sweden)

    Zhiqiang Wang

    2017-06-01

    Full Text Available Cortex Moutan (CM), a well-known traditional Chinese medicine, is commonly used for treating various diseases in China and other eastern Asian countries. Recorded in the pharmacopeias of several countries, CM is now drawing increasing attention and is under extensive study in various fields. Phytochemical studies indicate that CM contains many valuable secondary metabolites, such as monoterpene glycosides and phenols. Ample evidence from pharmacological research suggests that CM has a wide spectrum of activities, such as anti-inflammatory, anti-oxidant, anti-tumor, anti-diabetic, cardiovascular protective, neuroprotective, and hepatoprotective effects. Moreover, various analytical methods have been established for the quality evaluation and safety control of CM. This review synopsizes updated information concerning the origins, phytochemistry, pharmacology, analytical methods and safety of CM, aiming to provide useful references for modern CM research and application. In conclusion, continuing pharmacological investigations of CM should be conducted to unravel its pharmacological mechanisms. Further research is necessary to develop a comprehensive and applicable analytical approach for quality evaluation and to establish harmonized criteria for CM.

  15. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective: To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix

  16. ReGaTE: Registration of Galaxy Tools in Elixir

    DEFF Research Database (Denmark)

    Doppelt-Azeroual, Olivia; Mareuil, Fabien; Deveaud, Eric

    2017-01-01

    such popular environment is the Galaxy framework, with currently more than 80 publicly available Galaxy servers around the world. In the context of a generic registry for bioinformatics software, such as bio.tools, Galaxy instances constitute a major source of valuable content. Yet there has been, to date...... of their services while enriching the software discovery function that bio.tools provides for its users. The source code of ReGaTE is freely available on Github at https://github.com/C3BI-pasteur-fr/ReGaTE....

  17. Analytic solution to leading order coupled DGLAP evolution equations: A new perturbative QCD tool

    International Nuclear Information System (INIS)

    Block, Martin M.; Durand, Loyal; Ha, Phuoc; McKay, Douglas W.

    2011-01-01

    We have analytically solved the LO perturbative QCD singlet DGLAP equations [V. N. Gribov and L. N. Lipatov, Sov. J. Nucl. Phys. 15, 438 (1972)][G. Altarelli and G. Parisi, Nucl. Phys. B126, 298 (1977)][Y. L. Dokshitzer, Sov. Phys. JETP 46, 641 (1977)] using Laplace transform techniques. Newly developed, highly accurate, numerical inverse Laplace transform algorithms [M. M. Block, Eur. Phys. J. C 65, 1 (2010)][M. M. Block, Eur. Phys. J. C 68, 683 (2010)] allow us to write fully decoupled solutions for the singlet structure function F_s(x,Q^2) and the gluon distribution G(x,Q^2) as F_s(x,Q^2) = F_s(F_s0(x_0), G_0(x_0)) and G(x,Q^2) = G(F_s0(x_0), G_0(x_0)), where the x_0 are the Bjorken x values at Q_0^2. Here F_s and G are known functions--found using LO DGLAP splitting functions--of the initial boundary conditions F_s0(x) ≡ F_s(x,Q_0^2) and G_0(x) ≡ G(x,Q_0^2), i.e., the chosen starting functions at the virtuality Q_0^2. For both G(x) and F_s(x), we are able to either devolve or evolve each separately and rapidly, with very high numerical accuracy--a computational fractional precision of O(10^-9). Armed with this powerful new tool in the perturbative QCD arsenal, we compare our numerical results from the above equations with the published MSTW2008 and CTEQ6L LO gluon and singlet F_s distributions [A. D. Martin, W. J. Stirling, R. S. Thorne, and G. Watt, Eur. Phys. J. C 63, 189 (2009)], starting from their initial values at Q_0^2 = 1 GeV^2 and 1.69 GeV^2, respectively, using their choice of α_s(Q^2). This allows an important independent check on the accuracies of their evolution codes and, therefore, the computational accuracies of their published parton distributions. Our method completely decouples the two LO distributions, at the same time guaranteeing that both G and F_s satisfy the singlet coupled DGLAP equations. It also allows one to easily obtain the effects of the starting functions on the evolved gluon and singlet structure functions, as functions of both Q
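    Schematically, the decoupled Laplace-transform solution described in this record has the structure below. This is a simplified sketch of the form, with the kernel operators K_ij (built from the LO splitting functions and the chosen α_s) left implicit rather than reproduced from the paper.

```latex
% Work in v = ln(1/x) and Laplace-transform in v:
v \equiv \ln(1/x), \qquad
\hat{F}_s(s,Q^2) \equiv \mathcal{L}\left[ F_s(e^{-v},Q^2);\, s \right].

% In Laplace space the coupled LO singlet equations become algebraic and
% decouple, so each evolved distribution is a known linear functional of
% the two starting distributions at Q_0^2:
F_s(x,Q^2) = \mathcal{K}_{FF}\left[F_{s0}\right](x,Q^2)
           + \mathcal{K}_{FG}\left[G_{0}\right](x,Q^2),

G(x,Q^2)   = \mathcal{K}_{GF}\left[F_{s0}\right](x,Q^2)
           + \mathcal{K}_{GG}\left[G_{0}\right](x,Q^2).
```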

  18. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    International Nuclear Information System (INIS)

    Etmektzoglou, A; Mishra, P; Svatos, M

    2015-01-01

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation, and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline, all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly

  19. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Etmektzoglou, A; Mishra, P; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)

    2015-06-15

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation, and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline, all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly
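    As a rough illustration of what such a spreadsheet computes before emitting Developer Mode XML, the sketch below generates control points for a parameterized circle traced by the couch, in the spirit of the "virtual isocenter" trajectories mentioned above. Axis names, units, radius, and MU sampling are invented for illustration; the actual TrueBeam XML schema is not reproduced here.

```python
import math

# Hedged sketch of parameterized trajectory generation (illustrative only;
# field names, radius, and sampling are assumptions, not the SAGE tool's).

def circle_trajectory(radius_cm, n_points, total_mu):
    """Couch (lat, long) control points tracing a circle, MU spread evenly."""
    points = []
    for k in range(n_points + 1):  # n_points segments -> n_points + 1 points
        theta = 2 * math.pi * k / n_points
        points.append({
            "mu": total_mu * k / n_points,
            "couch_lat_cm": radius_cm * math.cos(theta),
            "couch_long_cm": radius_cm * math.sin(theta),
        })
    return points

traj = circle_trajectory(radius_cm=5.0, n_points=8, total_mu=200.0)
for p in traj[:3]:
    print(p)
```

    Changing the two or three parameters (radius, point count, total MU) regenerates the whole family of trajectories, which is the spreadsheet-parameterization idea the abstract describes.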

  20. Antifungal proteins from moulds: analytical tools and potential application to dry-ripened foods.

    Science.gov (United States)

    Delgado, Josué; Owens, Rebecca A; Doyle, Sean; Asensio, Miguel A; Núñez, Félix

    2016-08-01

    Moulds growing on the surface of dry-ripened foods contribute to their sensory qualities, but some of them are able to produce mycotoxins that pose a hazard to consumers. Small cysteine-rich antifungal proteins (AFPs) from moulds are highly stable to pH and proteolysis and exhibit a broad inhibition spectrum against filamentous fungi, providing new opportunities to control hazardous moulds in fermented foods. The analytical tools for characterizing the cellular targets and affected pathways are reviewed. Strategies currently employed to study these mechanisms of action include 'omics' approaches that have come to the forefront in recent years, developing in tandem with genome sequencing of relevant organisms. These techniques contribute to a better understanding of the response of moulds to AFPs, allowing the design of complementary strategies to maximize their effect or overcome the limitations of using AFPs on foods. AFPs alter chitin biosynthesis, and some fungi react by inducing the cell wall integrity (CWI) pathway. However, moulds able to increase the chitin content of the cell wall, by increasing proteins in either the CWI or the calmodulin-calcineurin signalling pathway, will resist AFPs. Similarly, AFPs increase intracellular levels of reactive oxygen species (ROS), and moulds that increase the G-protein complex β subunit CpcB and/or the enzymes needed to efficiently produce glutathione may evade apoptosis. Unknown aspects that need to be addressed include the interaction with mycotoxin production by less sensitive toxigenic moulds. However, significant steps have been taken to encourage the use of AFPs in intermediate-moisture foods, particularly mould-ripened cheese and meat products.

  1. ANALYTICAL EMPLOYMENT OF STABLE ISOTOPES OF CARBON, NITROGEN, OXYGEN AND HYDROGEN FOR FOOD AUTHENTICATION

    Directory of Open Access Journals (Sweden)

    E. Novelli

    2011-04-01

    Full Text Available Stable isotopes of carbon, nitrogen, oxygen and hydrogen were used for analytical purposes to discriminate the type of production (farming vs. fishing) in the case of sea bass and the geographical origin in the case of milk. These results corroborate similar experimental evidence and confirm the potential of this analytical tool to support food traceability.

  2. Evaluation of analytical performance based on partial order methodology.

    Science.gov (United States)

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparison of objects based on their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through, e.g., interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable to the comparison of analytical methods, or simply as a tool for optimizing an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate analytical performance taking into account all indicators simultaneously and thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which simultaneously are used for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, (2) a "distance" to the reference laboratory, and (3) a classification based on the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
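    The core ordinal idea can be shown in a tiny sketch: one laboratory dominates another only when it is at least as good on every indicator (here |bias from the true value|, standard deviation, |skewness|; smaller is better) and strictly better on at least one; otherwise the two are incomparable. Laboratory names and values below are invented.

```python
# Hedged illustration of partial-order comparison of laboratories
# (indicators and numbers are invented, not from the study above).

def dominates(a, b):
    """a dominates b: no worse on every indicator, strictly better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

labs = {
    "Lab1": (0.2, 0.05, 0.1),
    "Lab2": (0.4, 0.08, 0.3),   # worse than Lab1 on all three indicators
    "Lab3": (0.1, 0.09, 0.2),   # incomparable with Lab1: better bias, worse sd
}

# Minimal elements of the partial order: labs no other lab dominates,
# i.e. the set of best performers that a simple linear score would conflate.
minimal = [n for n, p in labs.items()
           if not any(dominates(q, p) for m, q in labs.items() if m != n)]
print(sorted(minimal))
```

    Unlike a weighted sum, the partial order keeps Lab1 and Lab3 incomparable instead of forcing an arbitrary ranking between them.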

  3. Big data analytics from strategic planning to enterprise integration with tools, techniques, NoSQL, and graph

    CERN Document Server

    Loshin, David

    2013-01-01

    Big Data Analytics will assist managers in providing an overview of the drivers for introducing big data technology into the organization and for understanding the types of business problems best suited to big data analytics solutions, understanding the value drivers and benefits, strategic planning, developing a pilot, and eventually planning to integrate back into production within the enterprise.
    - Guides the reader in assessing the opportunities and value proposition
    - Overview of big data hardware and software architectures
    - Presents a variety of te

  4. IBM SPSS modeler essentials effective techniques for building powerful data mining and predictive analytics solutions

    CERN Document Server

    McCormick, Keith; Wei, Bowen

    2017-01-01

    IBM SPSS Modeler allows quick, efficient predictive analytics and insight building from your data, and is a popularly used data mining tool. This book will guide you through the data mining process, and presents relevant statistical methods which are used to build predictive models and conduct other analytic tasks using IBM SPSS Modeler. From ...

  5. Semantic Interaction for Sensemaking: Inferring Analytical Reasoning for Model Steering.

    Science.gov (United States)

    Endert, A; Fiaux, P; North, C

    2012-12-01

    Visual analytic tools aim to support the cognitively demanding task of sensemaking. Their success often depends on the ability to leverage capabilities of mathematical models, visualization, and human intuition through flexible, usable, and expressive interactions. Spatially clustering data is one effective metaphor for users to explore similarity and relationships between information, adjusting the weighting of dimensions or characteristics of the dataset to observe the change in the spatial layout. Semantic interaction is an approach to user interaction in such spatializations that couples these parametric modifications of the clustering model with users' analytic operations on the data (e.g., direct document movement in the spatialization, highlighting text, search, etc.). In this paper, we present results of a user study exploring the ability of semantic interaction in a visual analytic prototype, ForceSPIRE, to support sensemaking. We found that semantic interaction captures the analytical reasoning of the user through keyword weighting, and aids the user in co-creating a spatialization based on the user's reasoning and intuition.
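    A minimal sketch of the keyword-weighting idea described above: when the user drags two documents together, the terms they share are up-weighted, which increases the pair's similarity in the recomputed layout. The update rule and numbers are invented for illustration and are not ForceSPIRE's actual model.

```python
# Hedged sketch of semantic interaction via keyword re-weighting
# (invented update rule; illustrative only).

def upweight_shared_terms(weights, doc_a, doc_b, boost=1.5):
    """Boost weights of terms the two user-moved documents share, renormalize."""
    shared = set(doc_a) & set(doc_b)
    new = {t: (w * boost if t in shared else w) for t, w in weights.items()}
    total = sum(new.values())
    return {t: w / total for t, w in new.items()}

def weighted_overlap(weights, doc_a, doc_b):
    """Similarity: total weight of the terms the two documents share."""
    return sum(weights[t] for t in set(doc_a) & set(doc_b))

weights = {"bank": 0.25, "river": 0.25, "loan": 0.25, "water": 0.25}
a, b = ["river", "water", "bank"], ["river", "water"]

before = weighted_overlap(weights, a, b)
weights = upweight_shared_terms(weights, a, b)   # user dragged a and b together
after = weighted_overlap(weights, a, b)
print(before, after)   # similarity of the moved pair increases
```

    The point is the direction of inference: the interaction (a direct document move) updates the model parameters, rather than the user editing weights explicitly.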

  6. New Analytical Monographs on TCM Herbal Drugs for Quality Proof.

    Science.gov (United States)

    Wagner, Hildebert; Bauer, Rudolf; Melchart, Dieter

    2016-01-01

    Regardless of specific national drug regulations, there is an international consensus that all TCM drugs must meet stipulated high quality standards focusing on authentication, identification and chemical composition. In addition, the safety of all TCM drugs prescribed by physicians has to be guaranteed. During the 25-year history of the TCM hospital Bad Kötzting, 171 TCM drugs underwent analytical quality proof, including thin-layer as well as high-pressure liquid chromatography. From now on, mass spectrometry will also be available as an analytical tool. The findings are compiled and already published in three volumes of analytical monographs. One more volume will be published shortly, and a fifth volume is in preparation. The main issues of the analytical procedure for TCM drugs, such as authenticity, botanical nomenclature, variability of plant species and parts, and processing, are pointed out, and possible ways to overcome them are sketched. © 2016 S. Karger GmbH, Freiburg.

  7. Microplasmas for chemical analysis: analytical tools or research toys?

    International Nuclear Information System (INIS)

    Karanassios, Vassili

    2004-01-01

    An overview of the activities of the research groups that have been involved in fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular, into a microplasma device (MPD), battery operation of a MPD and of a mini in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. Finally, an overall assessment of the state-of-the-art of analytical microplasma research is provided.

  8. A modeling tool to support decision making in future hydropower development in Chile

    Science.gov (United States)

    Vicuna, S.; Hermansen, C.; Cerda, J. P.; Olivares, M. A.; Gomez, T. I.; Toha, E.; Poblete, D.; Mao, L.; Falvey, M. J.; Pliscoff, P.; Melo, O.; Lacy, S.; Peredo, M.; Marquet, P. A.; Maturana, J.; Gironas, J. A.

    2017-12-01

    Modeling tools support planning by providing transparent means to assess the outcome of natural resources management alternatives within technical frameworks in the presence of conflicting objectives. Such tools, when employed to model different scenarios, complement discussion in a policy-making context. Examples of practical use of this type of tool exist, such as Canadian public forest management, but they are not common, especially in the context of developing countries. We present a tool to support the selection from a portfolio of potential future hydropower projects in Chile. This tool, developed by a large team of researchers under the guidance of the Chilean Energy Ministry, is especially relevant in the context of evident regionalism, skepticism and change in societal values in a country that has achieved sustained growth alongside increased demands from society. The tool operates at the scale of a river reach (between 1 and 5 km long) on a domain that can be defined according to the scale needs of the related discussion, and its application can vary from river basins to regions or other spatial configurations that may be of interest. The tool addresses both available hydropower potential and the existence (inferred or observed) of other ecological, social, cultural and productive characteristics of the territory which are valuable to society, and provides a means to evaluate their interaction. The occurrence of each of these other valuable characteristics in the territory is measured by generating a presence-density score for each. Considering the level of constraint each characteristic imposes on hydropower development, they are weighted against each other and an aggregate score is computed. With this information, optimal trade-offs are computed between additional hydropower capacity and valuable local characteristics over the entire domain, using the classical 0-1 knapsack optimization algorithm. Various scenarios of different weightings and hydropower
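    The selection step named in this record can be sketched with the classical 0-1 knapsack dynamic program: maximize added capacity subject to a cap on the aggregate territorial-impact score. Project names, scores, and the budget below are invented for illustration.

```python
# Hedged sketch of the 0-1 knapsack selection step (illustrative data only):
# pick river reaches maximizing added MW under an aggregate impact budget.

def knapsack(projects, budget):
    """projects: (name, integer impact score, capacity in MW); returns best
    (total capacity, chosen names) with total impact <= budget."""
    best = [(0.0, ()) for _ in range(budget + 1)]
    for name, impact, mw in projects:
        # iterate budgets downward so each project is used at most once
        for b in range(budget, impact - 1, -1):
            cand_mw = best[b - impact][0] + mw
            if cand_mw > best[b][0]:
                best[b] = (cand_mw, best[b - impact][1] + (name,))
    return best[budget]

projects = [("ReachA", 4, 120.0), ("ReachB", 3, 90.0),
            ("ReachC", 2, 70.0), ("ReachD", 5, 130.0)]
capacity, chosen = knapsack(projects, budget=7)
print(capacity, sorted(chosen))
```

    With a budget of 7, the optimum takes ReachA and ReachB (210 MW at impact 7) rather than the single largest project; this is exactly the trade-off curve the tool traces as the budget and the weightings vary.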

  9. Microfabricated tools for manipulation and analysis of magnetic microcarriers

    International Nuclear Information System (INIS)

    Tondra, Mark; Popple, Anthony; Jander, Albrecht; Millen, Rachel L.; Pekas, Nikola; Porter, Marc D.

    2005-01-01

    Tools for manipulating and detecting magnetic microcarriers are being developed with microscale features. Microfabricated giant magnetoresistive (GMR) sensors and wires are used for detection, and for creating high local field gradients. Microfluidic structures are added to control flow, and positioning of samples and microcarriers. These tools are designed for work in analytical chemistry and biology

  10. MVT a most valuable theorem

    CERN Document Server

    Smorynski, Craig

    2017-01-01

    This book is about the rise and supposed fall of the mean value theorem. It discusses the evolution of the theorem and the concepts behind it, how the theorem relates to other fundamental results in calculus, and modern re-evaluations of its role in the standard calculus course. The mean value theorem is one of the central results of calculus. It was called “the fundamental theorem of the differential calculus” because of its power to provide simple and rigorous proofs of basic results encountered in a first-year course in calculus. In mathematical terms, the book is a thorough treatment of this theorem and some related results in the field; in historical terms, it is not a history of calculus or mathematics, but a case study in both. MVT: A Most Valuable Theorem is aimed at those who teach calculus, especially those setting out to do so for the first time. It is also accessible to anyone who has finished the first semester of the standard course in the subject and will be of interest to undergraduate mat...

  11. Tools for Observation: Art and the Scientific Process

    Science.gov (United States)

    Pettit, E. C.; Coryell-Martin, M.; Maisch, K.

    2015-12-01

    Art can support the scientific process during different phases of a scientific discovery. Art can help explain and extend the scientific concepts for the general public; in this way art is a powerful tool for communication. Art can aid the scientist in processing and interpreting the data towards an understanding of the concepts and processes; in this way art is a powerful (if often subconscious) tool to inform the process of discovery. Less often acknowledged, art can help engage students and inspire scientists during the initial development of ideas, observations, and questions; in this way art is a powerful tool to develop scientific questions and hypotheses. When we use art as a tool for communication of scientific discoveries, it helps break down barriers and makes science concepts less intimidating and more accessible and understandable for the learner. Scientists themselves use artistic concepts and processes, directly or indirectly, to help deepen their understanding. Teachers are following suit by using art more to stimulate students' creative thinking and problem solving. We show the value of teaching students to use the artistic "way of seeing" to develop their skills in observation, questioning, and critical thinking. In this way, art can be a powerful tool to engage students (from elementary to graduate) in the beginning phase of a scientific discovery, which is catalyzed by inquiry and curiosity. Through qualitative assessment of the Girls on Ice program, we show that many of the specific techniques taught by art teachers are valuable for science students to develop their observation skills. In particular, the concepts of contour drawing, squinting, gesture drawing, inverted drawing, and others can provide valuable training for student scientists. These art techniques encourage students to let go of preconceptions and "see" the world (the "data") in new ways; they help students focus on both large-scale patterns and small-scale details.

  12. Interactions between plant growth and soil nutrient cycling under elevated CO2: a meta-analysis

    NARCIS (Netherlands)

    Graaff, de M.A.; Groenigen, van K.J.; Six, J.; Hungate, B.; Kessel, van C.

    2006-01-01

    Free-air carbon dioxide enrichment (FACE) and open top chamber (OTC) studies are valuable tools for evaluating the impact of elevated atmospheric CO2 on nutrient cycling in terrestrial ecosystems. Using meta-analytic techniques, we summarized the results of 117 studies on plant biomass production,

  13. Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data

    Science.gov (United States)

    Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.

    2014-12-01

    Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to effectively implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies that are critical components to the background of ASDC data scientists include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; as well as processing algorithms. However, as prices of data storage and processing decrease, sources of data increase, and technologies advance - granting more people access to data in real or near-real time - data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming increasingly challenging to accomplish. For example, NASA Earth Science Data and Information System (ESDIS) alone grew from having just over 4 PBs of data in 2009 to nearly 6 PBs of data in 2011. This amount then increased to roughly 10 PBs of data in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to be able to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater towards business data, which is predominantly unstructured. Unfortunately, there are very few known analytics tools that interface well to archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify

  14. Mass spectrometry as a quantitative tool in plant metabolomics

    Science.gov (United States)

    Jorge, Tiago F.; Mata, Ana T.

    2016-01-01

    Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of the vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967

  15. An educational tool for interactive parallel and distributed processing

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2012-01-01

    In this article we try to describe how the modular interactive tiles system (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming. This is done by providing a hands-on educational tool that allows a change in the representation...... of abstract problems related to designing interactive parallel and distributed systems. Indeed, the MITS seems to bring a series of goals into education, such as parallel programming, distributedness, communication protocols, master dependency, software behavioral models, adaptive interactivity, feedback......, connectivity, topology, island modeling, and user and multi-user interaction, which can rarely be found in other tools. Finally, we introduce the system of modular interactive tiles as a tool for easy, fast, and flexible hands-on exploration of these issues, and through examples we show how to implement...

  16. Per-operative vibration analysis: a valuable tool for defining correct stem insertion: preliminary report.

    Science.gov (United States)

    Mulier, Michiel; Pastrav, Cesar; Van der Perre, Georges

    2008-01-01

    Defining the stem insertion end point during total hip replacement still relies on the surgeon's feeling. When a custom-made stem prosthesis with an optimal fit into the femoral canal is used, the risk of per-operative fractures is even greater than with standard prostheses. Vibration analysis is used in other clinical settings and has been tested in the laboratory as a means to detect optimal stem insertion. The first per-operative use of vibration analysis during non-cemented custom-made stem insertion in 30 patients is reported here. Thirty patients eligible for total hip replacement with an uncemented stem prosthesis were included. The neck of the stem was connected to a shaker that emitted white noise as the excitation signal and to an impedance head that measured the frequency response. The response signal was sent to a computer that analyzed the frequency response function after each insertion phase. A technician present in the operating theatre but outside the laminar airflow provided feedback to the surgeon. The correlation index between the frequency response functions measured during the last two insertion hammering sessions was >0.99 in 86.7% of the cases. In four cases the surgeon stopped the insertion procedure because of a perceived risk of fracture. Two special cases illustrating the potential benefit of per-operative vibration analysis are described. The results of intra-operative vibration analysis indicate that this technique may be a useful tool to assist the orthopaedic surgeon in defining the insertion end point of the stem. The development of a more user-friendly device is therefore warranted.
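The end-point criterion reported above, a correlation index above 0.99 between the frequency response functions of successive hammering sessions, can be sketched as follows. This is an illustrative reconstruction with synthetic FRFs; the study's actual signal processing chain is not described in the abstract.

```python
import numpy as np

def frf_correlation(frf_a: np.ndarray, frf_b: np.ndarray) -> float:
    """Pearson correlation between the magnitudes of two frequency
    response functions measured after successive insertion phases."""
    mag_a, mag_b = np.abs(frf_a), np.abs(frf_b)
    return float(np.corrcoef(mag_a, mag_b)[0, 1])

# Synthetic example: two nearly identical FRFs, as expected once the
# stem has stopped advancing in the femoral canal (hypothetical data).
freqs = np.linspace(100, 5000, 512)
frf_1 = 1.0 / (1 + 1j * freqs / 1500)
frf_2 = 1.0 / (1 + 1j * freqs / 1510)   # slightly shifted resonance

if frf_correlation(frf_1, frf_2) > 0.99:
    print("insertion end point reached")
```

A real implementation would compare successive FRFs acquired from the impedance head after each hammering session and stop when the index stabilizes above the threshold.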

  17. λ5-Phosphorus-Containing α-Diazo Compounds: A Valuable Tool for Accessing Phosphorus-Functionalized Molecules.

    Science.gov (United States)

    Marinozzi, Maura; Pertusati, Fabrizio; Serpi, Michaela

    2016-11-23

    The compounds characterized by the presence of a λ5-phosphorus functionality at the α-position with respect to the diazo moiety, here referred to as λ5-phosphorus-containing α-diazo compounds (PCDCs), represent a vast class of extremely versatile reagents in organic chemistry and are particularly useful in the preparation of phosphonate- and phosphine oxide-functionalized molecules. Indeed, thanks to the high reactivity of the diazo moiety, PCDCs can be induced to undergo a wide variety of chemical transformations. Among them are carbon-hydrogen as well as heteroatom-hydrogen insertion reactions, cyclopropanation, ylide formation, Wolff rearrangement, and cycloaddition reactions. PCDCs can be easily prepared from readily accessible precursors by a variety of methods, such as diazotization, Bamford-Stevens-type elimination, and diazo transfer reactions. This versatility, along with their relative stability and manageability, makes them appealing tools in organic synthesis. This Review aims to demonstrate the ongoing utility of PCDCs in the modern preparation of different classes of phosphorus-containing compounds, phosphonates in particular. Furthermore, to address the lack of previous collective papers, this Review also summarizes the methods for PCDC preparation.

  18. MALDI-TOF mass spectrometry following short incubation on a solid medium is a valuable tool for rapid pathogen identification from positive blood cultures.

    Science.gov (United States)

    Kohlmann, Rebekka; Hoffmann, Alexander; Geis, Gabriele; Gatermann, Sören

    2015-01-01

    approach allowed an optimized treatment recommendation. MALDI-TOF MS following 4h pre-culture is a valuable tool for rapid pathogen identification from positive blood cultures, allowing easy integration in diagnostic routine and the opportunity of considerably earlier treatment adaptation. Copyright © 2015 Elsevier GmbH. All rights reserved.

  19. Salt Lakes of the African Rift System: A Valuable Research ...

    African Journals Online (AJOL)

    Salt Lakes of the African Rift System: A Valuable Research Opportunity for Insight into Nature's Concentrated Multi-Electrolyte Science. JYN Philip, DMS Mosha. Abstract. The Tanzanian rift system salt lakes present significant cultural, ecological, recreational and economic values. Beyond the wealth of minerals, resources ...

  20. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    Science.gov (United States)

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects on health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most commonly employed nonstarch carbohydrates added to or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most widely employed analytical techniques for the characterization of these molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  1. CorRECTreatment: a web-based decision support tool for rectal cancer treatment that uses the analytic hierarchy process and decision tree.

    Science.gov (United States)

    Suner, A; Karakülah, G; Dicle, O; Sökmen, S; Çelikoğlu, C C

    2015-01-01

    The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priorities of the criteria, with a decision tree formed using these priorities, was updated and applied to data from 388 patients collected retrospectively. The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. Later, a web-based decision support tool named corRECTreatment was developed. The compatibility of the treatment recommendations by expert opinion and by the decision support tool was examined for consistency. Two surgeons were requested to recommend a treatment and an overall survival value for 20 different cases that we selected from among the most common and rare treatment options in the patient data set and turned into scenarios. In the AHP analyses of the criteria, it was found that the matrices generated for both decision steps were consistent (consistency ratio < 0.1). When compared with the decisions of the experts, the consistency value for the most frequent cases was found to be 80% for the first decision step and 100% for the second decision step. Similarly, for rare cases consistency was 50% for the first decision step and 80% for the second decision step. The decision model and corRECTreatment, developed by applying these methods to real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and to facilitate projections about treatment options.
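The AHP consistency check reported above can be illustrated with a small sketch. The 3×3 pairwise comparison matrix below is hypothetical, not one of the study's actual criterion matrices; the Saaty random-index table and the consistency ratio CR = CI/RI are standard AHP machinery.

```python
import numpy as np

# Saaty's random index (RI) values for matrix sizes 1..10.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

def ahp_priorities_and_cr(A: np.ndarray):
    """Return the AHP priority vector (normalized principal eigenvector)
    and the consistency ratio CR = CI / RI of a pairwise comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)   # consistency index
    return w, ci / RI[n]

# Hypothetical pairwise comparisons among three criteria.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_priorities_and_cr(A)
print(w, cr)  # CR < 0.1 means the judgments are acceptably consistent
```

In the study's two-step model, a priority vector of this kind would be computed for each decision step and fed into the decision tree.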

  2. Proactive Supply Chain Performance Management with Predictive Analytics

    Directory of Open Access Journals (Sweden)

    Nenad Stefanovic

    2014-01-01

    Today’s business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on the specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment.
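A KPI predictive model of the kind the paper describes can be sketched, in minimal form, as an autoregressive least-squares fit of a KPI time series. The KPI series below is synthetic, and the model is a deliberately simple stand-in for the paper's actual data mining models.

```python
import numpy as np

def fit_ar_model(series: np.ndarray, lags: int = 3) -> np.ndarray:
    """Least-squares fit of an autoregressive model predicting the next
    KPI value from the previous `lags` observations (plus an intercept)."""
    X = np.column_stack([series[i:len(series) - lags + i]
                         for i in range(lags)])
    y = series[lags:]
    coef, *_ = np.linalg.lstsq(np.column_stack([X, np.ones(len(y))]),
                               y, rcond=None)
    return coef

def predict_next(series: np.ndarray, coef: np.ndarray, lags: int = 3) -> float:
    """One-step-ahead KPI projection from the most recent observations."""
    return float(series[-lags:] @ coef[:-1] + coef[-1])

# Synthetic on-time-delivery KPI with a mild upward trend (hypothetical).
kpi = 0.90 + 0.001 * np.arange(60) + 0.002 * np.sin(np.arange(60))
coef = fit_ar_model(kpi)
print(round(predict_next(kpi, coef), 3))
```

In a portal setting, projections like this would be recomputed as new KPI measurements arrive and surfaced alongside the monitored values.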

  3. Proactive supply chain performance management with predictive analytics.

    Science.gov (United States)

    Stefanovic, Nenad

    2014-01-01

    Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on the specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment.

  4. Proactive Supply Chain Performance Management with Predictive Analytics

    Science.gov (United States)

    Stefanovic, Nenad

    2014-01-01

    Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on the specialized metamodel which allows modelling of any supply chain configuration and at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment. PMID:25386605

  5. A survey of open source tools for business intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2005-01-01

    The industrial use of open source Business Intelligence (BI) tools is not yet common. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we consider the capabilities of a number of open source tools for BI....... In the paper, we consider three Extract-Transform-Load (ETL) tools, three On-Line Analytical Processing (OLAP) servers, two OLAP clients, and four database management systems (DBMSs). Further, we describe the licenses that the products are released under. It is argued that the ETL tools are still not very...

  6. Cross learning synergies between Operation Management content and the use of generic analytic tools

    Directory of Open Access Journals (Sweden)

    Frederic Marimon

    2017-06-01

    By presenting both objectives simultaneously, students are found to be more motivated to work deeply on both. Students know that the theoretical content will be put into practice through certain tools, which strengthens their interest in the conceptual issues of the chapter. In turn, because students know that they will use a generic tool in a known context, their interest in these tools is reinforced. The result is a cross-learning synergy.

  7. Perspectives on making big data analytics work for oncology.

    Science.gov (United States)

    El Naqa, Issam

    2016-12-01

    Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5Vs hallmarks of big data. This data comprises a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitutes a mix of structured (tabulated) and unstructured (electronic documents) content that needs to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables) ≫ n (samples) inference problem of statistical learning is challenged in the Big data realm, and this is particularly true for oncology applications where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the Big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we will present these effects as they pertain to oncology and engage small thinking methodologies to counter these effects ranging from
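The p ≫ n inference problem mentioned above is commonly countered with regularization. The sketch below shows closed-form ridge regression on a synthetic p ≫ n dataset (30 samples, 500 features), which remains well-posed where ordinary least squares is not; the data and dimensions are illustrative, not drawn from any oncology study.

```python
import numpy as np

def ridge_fit(X: np.ndarray, y: np.ndarray, lam: float = 1.0) -> np.ndarray:
    """Closed-form ridge regression: (X'X + lam*I)^-1 X'y. Usable even
    when the number of features p far exceeds the number of samples n."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(1)
n, p = 30, 500                 # e.g. 30 patients, 500 -omics features
X = rng.normal(size=(n, p))
true_w = np.zeros(p)
true_w[:5] = 2.0               # only 5 informative features (synthetic)
y = X @ true_w + rng.normal(scale=0.1, size=n)

# Plain least squares is ill-posed here (X'X is singular for p > n);
# the ridge penalty lam*I makes the system invertible and the fit stable.
w = ridge_fit(X, y, lam=10.0)
print(float(np.corrcoef(X @ w, y)[0, 1]))
```

Sparsity-inducing penalties (lasso, elastic net) follow the same idea and are often preferred when only a handful of the p features are expected to matter.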

  8. Impact of population and economic growth on carbon emissions in Taiwan using an analytic tool STIRPAT

    Directory of Open Access Journals (Sweden)

    Jong-Chao Yeh

    2017-01-01

    Carbon emission has increasingly become an issue of global concern because of climate change. Unfortunately, Taiwan was listed among the top 20 countries for carbon emissions in 2014. In order to provide appropriate measures to control carbon emission, there is an urgent need to address how factors such as population and economic growth impact the emission of carbon dioxide in developing countries. In addition to total population, both the percentage of the population living in urban areas (i.e., the urbanization percentage) and the non-dependent population may serve as limiting factors. On the other hand, total energy-driven gross domestic product (GDP) and the percentage of GDP generated by the manufacturing industries were assessed to see their respective degrees of impact on carbon emission. Therefore, based on national data for the period 1990–2014 in Taiwan, the analytic tool Stochastic Impacts by Regression on Population, Affluence and Technology (STIRPAT) was employed to see how well the aforementioned factors describe their individual potential impact on global warming, as measured by the total amount of carbon emitted into the atmosphere. Seven scenarios of the STIRPAT model were proposed and tested statistically for significance. As a result, two models were suggested to predict carbon emissions due to population and economic growth by the year 2025 in Taiwan.
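STIRPAT is the stochastic form of the IPAT identity, I = a·P^b·A^c·T^d·e, usually estimated as the log-log regression ln I = ln a + b ln P + c ln A + d ln T + ln e, where the fitted coefficients are elasticities. The sketch below fits this specification to synthetic data with known elasticities; the study's actual Taiwanese national data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic annual series: population P, affluence A (GDP per capita),
# technology proxy T, with known elasticities b=1.2, c=0.8, d=-0.3.
n = 25  # e.g. 25 annual observations, as in a 1990-2014 window
P = rng.uniform(20e6, 24e6, n)
A = rng.uniform(10e3, 25e3, n)
T = rng.uniform(0.2, 0.5, n)
I = 1e-6 * P**1.2 * A**0.8 * T**-0.3 * np.exp(rng.normal(0, 0.01, n))

# Estimate ln I = ln a + b ln P + c ln A + d ln T by ordinary least squares.
X = np.column_stack([np.ones(n), np.log(P), np.log(A), np.log(T)])
beta, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
print(np.round(beta[1:], 2))  # recovered elasticities b, c, d
```

Each STIRPAT "scenario" in a study like this corresponds to a different choice of regressors in the design matrix above.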

  9. Web-based Visual Analytics for Extreme Scale Climate Science

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Evans, Katherine J [ORNL; Harney, John F [ORNL; Jewell, Brian C [ORNL; Shipman, Galen M [ORNL; Smith, Brian E [ORNL; Thornton, Peter E [ORNL; Williams, Dean N. [Lawrence Livermore National Laboratory (LLNL)

    2014-01-01

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  10. Room temperature phosphorescence in the liquid state as a tool in analytical chemistry

    International Nuclear Information System (INIS)

    Kuijt, Jacobus; Ariese, Freek; Brinkman, Udo A.Th.; Gooijer, Cees

    2003-01-01

    A wide-ranging overview of room temperature phosphorescence in the liquid state (RTPL) is presented, with a focus on recent developments. RTPL techniques like micelle-stabilized (MS)-RTP, cyclodextrin-induced (CD)-RTP, and heavy atom-induced (HAI)-RTP are discussed. These techniques are mainly applied in the stand-alone format, but coupling with some separation techniques appears to be feasible. Applications of direct, sensitized and quenched phosphorescence are also discussed. As regards sensitized and quenched RTP, emphasis is on the coupling with liquid chromatography (LC) and capillary electrophoresis (CE), but stand-alone applications are also reported. Further, the application of RTPL in immunoassays and in RTP optosensing - the optical sensing of analytes based on RTP - is reviewed. Next to the application of RTPL in quantitative analysis, its use for the structural probing of protein conformations and for time-resolved microscopy of labelled biomolecules is discussed. Finally, an overview is presented of the various analytical techniques which are based on the closely related phenomenon of long-lived lanthanide luminescence. The paper closes with a short evaluation of the state of the art in RTP and a discussion of future perspectives.

  11. The production of audiovisual teaching tools in minimally invasive surgery.

    Science.gov (United States)

    Tolerton, Sarah K; Hugh, Thomas J; Cosman, Peter H

    2012-01-01

    Audiovisual learning resources have become valuable adjuncts to formal teaching in surgical training. This report discusses the process and challenges of preparing an audiovisual teaching tool for laparoscopic cholecystectomy. The relative value in surgical education and training, for both the creator and the viewer, is addressed. This audiovisual teaching resource was prepared as part of the Master of Surgery program at the University of Sydney, Australia. The different methods of video production used to create operative teaching tools are discussed. Collating and editing material for an audiovisual teaching resource can be a time-consuming and technically challenging process. However, quality learning resources can now be produced even with limited prior video editing experience. With minimal cost and suitable guidance to ensure clinically relevant content, most surgeons should be able to produce short, high-quality educational videos of both open and minimally invasive surgery. Despite the challenges faced during the production of audiovisual teaching tools, these resources are now relatively easy to produce using readily available software. These resources are particularly attractive to surgical trainees when real-time operative footage is used. They serve as valuable adjuncts to formal teaching, particularly in the setting of minimally invasive surgery. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  12. A Reference Architecture for a Cloud-Based Tools as a Service Workspace

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali; Sheng, Quan Z.

    2015-01-01

    Software Architecture (SA) plays a critical role in developing and evolving cloud-based applications. We present a Reference Architecture (RA) for designing Cloud-based Tools as a service work SPACE (TSPACE) - a platform for provisioning chain of tools following the Software as a Service (SaaS...... evaluate the RA in terms of completeness and feasibility. Our proposed RA can provide valuable guidance and insights for designing and implementing concrete software architectures of TSPACE....

  13. Jet substructure with analytical methods

    Energy Technology Data Exchange (ETDEWEB)

    Dasgupta, Mrinal [University of Manchester, Consortium for Fundamental Physics, School of Physics and Astronomy, Manchester (United Kingdom); Fregoso, Alessandro; Powling, Alexander [University of Manchester, School of Physics and Astronomy, Manchester (United Kingdom); Marzani, Simone [Durham University, Institute for Particle Physics Phenomenology, Durham (United Kingdom)

    2013-11-15

    We consider the mass distribution of QCD jets after the application of jet-substructure methods, specifically the mass-drop tagger, pruning, trimming and their variants. In contrast to most current studies employing Monte Carlo methods, we carry out analytical calculations at the next-to-leading order level, which are sufficient to extract the dominant logarithmic behaviour for each technique, and compare our findings to exact fixed-order results. Our results should ultimately lead to a better understanding of these jet-substructure methods which in turn will influence the development of future substructure tools for LHC phenomenology. (orig.)

  14. Electronic Corpora as Translation Tools

    DEFF Research Database (Denmark)

    Laursen, Anne Lise; Mousten, Birthe; Jensen, Vigdis

    2012-01-01

    translator who has to get a cross-linguistic overview of a new area or a new line of business. Relevant internet texts can be compiled ‘on the fly’, but internet data needs to be sorted and analyzed for rational use. Today, such sorting and analysis can be made by a low-tech, analytical software tool....... This article demonstrates how strategic steps of compiling and retrieving linguistic data by means of specific search strategies can be used to make electronic corpora an efficient tool in translators’ daily work with fields that involve new terminology, but where the skills requested to work correspond...

  15. Systemic Assessment as a New Tool for Assessing Students ...

    African Journals Online (AJOL)

    Systemic Assessment as a New Tool for Assessing Students Learning in Chemistry using SATL Methods: Systemic Matching, Systemic Synthesis, Systemic Analysis, Systemic Synthetic – Analytic, as Systemic Question Types.

  16. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    Science.gov (United States)

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…

  17. In Situ Scanning Probe Microscopy and New Perspectives in Analytical Chemistry

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Zhang, Jingdong; Chi, Qijin

    1999-01-01

    The resolution of scanning probe microscopies is unprecedented, but the techniques are fraught with limitations as analytical tools. These limitations and their relationship to the physical mechanisms of image contrast are first discussed. Some new options based on in situ STM, which hold prospect

  18. Platform for Automated Real-Time High Performance Analytics on Medical Image Data.

    Science.gov (United States)

    Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A

    2018-03-01

    Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.
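A job submission of the kind described, handing an MRI analysis task to a remote HPC resource through a REST API, can be sketched as a job description document. The field names below are illustrative assumptions, not the exact Agave schema, and the app identifier and URIs are hypothetical; in practice the document would be POSTed to the jobs endpoint with an OAuth2 bearer token.

```python
import json

# Hypothetical job description for an automated MRI analysis pipeline.
# Field names and values are illustrative, not the exact Agave schema.
job = {
    "name": "grape-mri-analysis",
    "appId": "grape-pipeline-1.0",  # assumed registered app identifier
    "inputs": {"scan": "agave://storage/incoming/scan_001.nii.gz"},
    "parameters": {"quantitative": True},
    "notifications": [
        # Callback so the scanner-side automation learns when the
        # HPC job finishes, enabling same-session quality control.
        {"url": "https://example.org/callback", "event": "FINISHED"}
    ],
}

payload = json.dumps(job, indent=2)
print(payload)
# In practice: POST payload to the jobs endpoint with an OAuth2 bearer
# token, e.g. requests.post(jobs_url, json=job, headers=auth_headers).
```

The same pattern — describe the job as a document, submit it, and subscribe to status notifications — is what lets the platform drive off-site HPC analysis without coupling the scanner to the cluster.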

  19. Dcode.org anthology of comparative genomic tools.

    Science.gov (United States)

    Loots, Gabriela G; Ovcharenko, Ivan

    2005-07-01

    Comparative genomics provides the means to demarcate functional regions in anonymous DNA sequences. The successful application of this method to identifying novel genes is currently shifting to deciphering the non-coding encryption of gene regulation across genomes. To facilitate the practical application of comparative sequence analysis to genetics and genomics, we have developed several analytical and visualization tools for the analysis of arbitrary sequences and whole genomes. These tools include two alignment tools, zPicture and Mulan; a phylogenetic shadowing tool, eShadow for identifying lineage- and species-specific functional elements; two evolutionary conserved transcription factor analysis tools, rVista and multiTF; a tool for extracting cis-regulatory modules governing the expression of co-regulated genes, Creme 2.0; and a dynamic portal to multiple vertebrate and invertebrate genome alignments, the ECR Browser. Here, we briefly describe each one of these tools and provide specific examples on their practical applications. All the tools are publicly available at the http://www.dcode.org/ website.

  20. Rethinking Visual Analytics for Streaming Data Applications

    Energy Technology Data Exchange (ETDEWEB)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    2017-01-01

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are necessary components of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead of dividing the task among a large number of analysts makes simple parallelism intractable.
Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is massive

  1. Coordinated experimental/analytical program for investigating margins to failure of Category I reinforced concrete structures

    International Nuclear Information System (INIS)

    Endebrock, E.; Dove, R.; Anderson, C.A.

    1981-01-01

    The material presented in this paper deals with a coordinated experimental/analytical program designed to provide the information needed for making margins-to-failure assessments of seismic Category I reinforced concrete structures. The experimental program is emphasized, and the background information that led to this particular experimental approach is presented. Analytical tools being developed to supplement the experimental program are discussed. 16 figures

  2. Healthcare Data Analytics on the Cloud

    Directory of Open Access Journals (Sweden)

    Indrajit Bhattacharya

    2012-04-01

    Full Text Available Meaningful analysis of voluminous health information has always been a challenge in most healthcare organizations. The accurate and timely information required by management to lead a healthcare organization through the challenges found in the industry can be obtained using business intelligence (BI) or business analytics tools. However, these require large capital investments to implement and support, and large volumes of data need to be analyzed to identify trends. They also require enormous processing power, which places pressure on business resources, in addition to coping with the dynamic changes in digital technology. This paper evaluates the various nuances of healthcare business analytics hosted in a cloud computing environment. The paper explores BI offered as a Software as a Service (SaaS) solution towards the meaningful use of information for improving functions in the healthcare enterprise. It also attempts to identify the challenges that healthcare enterprises face when making use of a BI SaaS solution.

  3. ATLAS Analytics and Machine Learning Platforms

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Legger, Federica; Gardner, Robert

    2018-01-01

    In 2015 ATLAS Distributed Computing started to migrate its monitoring systems away from Oracle DB and decided to adopt new big data platforms that are open source, horizontally scalable, and offer the flexibility of NoSQL systems. Three years later, the full software stack is in place, the system is considered in production and operating at near maximum capacity (in terms of storage capacity and tightly coupled analysis capability). The new model provides several tools for fast and easy to deploy monitoring and accounting. The main advantages are: ample ways to do complex analytics studies (using technologies such as java, pig, spark, python, jupyter), flexibility in reorganization of data flows, near real time and inline processing. The analytics studies improve our understanding of different computing systems and their interplay, thus enabling whole-system debugging and optimization. In addition, the platform provides services to alarm or warn on anomalous conditions, and several services closing feedback l...

  4. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary, noninvasive and nondestructive approach to the investigation of artworks that can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments that have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can complement the results of high-energy-based techniques.

  5. Comprehension of complex biological processes by analytical methods: how far can we go using mass spectrometry?

    International Nuclear Information System (INIS)

    Gerner, C.

    2013-01-01

    Comprehensive understanding of complex biological processes is the basis for many biomedical issues of great relevance to modern society, including risk assessment, drug development, quality control of industrial products and many more. Screening methods provide means for investigating biological samples without a research hypothesis. However, the first boom of analytical screening efforts has passed, and we again need to ask whether and how to apply screening methods. Mass spectrometry is a modern tool with unrivalled analytical capacities. This applies to all relevant characteristics of analytical methods, such as specificity, sensitivity, accuracy, multiplicity and diversity of applications. Indeed, mass spectrometry is well qualified to deal with complexity. Chronic inflammation is a common feature of almost all relevant diseases challenging our modern society; these diseases are apparently highly diverse and include arteriosclerosis, cancer, back pain, neurodegenerative diseases, depression and others. The complexity of the mechanisms regulating chronic inflammation is the reason it is so challenging to deal with in practice. The presentation gives an overview of the capabilities and limitations of applying this analytical tool to solve critical questions of great relevance to our society. (author)

  6. OPTHYLIC: An Optimised Tool for Hybrid Limits Computation

    Science.gov (United States)

    Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée

    2018-05-01

    A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
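    For a single-bin counting experiment, the hybrid CLs construction reduces to ratios of Poisson tail probabilities. The sketch below (not OPTHYLIC itself; systematic uncertainties are ignored for brevity) illustrates how an observed 95% CL upper limit on a signal rate could be computed from signal, background and observed yields: CLs = CL_{s+b}/CL_b, and the limit is the signal rate at which CLs falls to 0.05.

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for a Poisson variable with mean mu."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def cls(n_obs, s, b):
    """CLs = CL_{s+b} / CL_b for a single-bin counting experiment."""
    return poisson_cdf(n_obs, s + b) / poisson_cdf(n_obs, b)

def upper_limit(n_obs, b, cl=0.95, s_hi=100.0, tol=1e-6):
    """Bisect for the signal rate s at which CLs equals 1 - cl."""
    alpha = 1.0 - cl
    lo, hi = 0.0, s_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cls(n_obs, mid, b) > alpha:   # still allowed, push lo up
            lo = mid
        else:                            # excluded, pull hi down
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative numbers: 3 events observed over an expected background of 2.
print(round(upper_limit(3, 2.0), 2))
```

A full tool additionally smears the yields over nuisance parameters (the Bayesian part of the hybrid method) and combines channels; the structure of the calculation stays the same.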

  7. an assessment of timber trees producing valuable fruits and seeds ...

    African Journals Online (AJOL)

    User

    It is observed that most of the timber trees producing valuable fruits and seeds have low ... sector of the economy by providing major raw materials (saw logs, ... the trees also produce industrial raw materials like latex, ... villagers while avoiding some of the ecological costs of ...

  8. The development of a visualization tool for displaying analysis and test results

    International Nuclear Information System (INIS)

    Uncapher, W.L.; Ammerman, D.J.; Ludwigsen, J.S.; Wix, S.D.

    1995-01-01

    The evaluation and certification of packages for the transportation of radioactive materials is performed by analysis, testing, or a combination of both. Within the last few years, many transport packages have been certified using a combination of analysis and testing. The ability to combine and display both kinds of data with interactive graphical tools allows a faster and more complete understanding of the response of the package to these environments. Sandia National Laboratories has developed an initial version of a visualization tool that allows the comparison and display of test and analytical data as part of a Department of Energy-sponsored program to support advanced analytical techniques and test methodologies. The capability of the tool extends to both mechanical (structural) and thermal data

  9. Helios: Understanding Solar Evolution Through Text Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Randazzese, Lucien [SRI International, Menlo Park, CA (United States)

    2016-12-02

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.

  10. Trends in Process Analytical Technology: Present State in Bioprocessing.

    Science.gov (United States)

    Jenzsch, Marco; Bell, Christian; Buziol, Stefan; Kepert, Felix; Wegele, Harald; Hakemeyer, Christian

    2017-08-04

    Process analytical technology (PAT), the regulatory initiative for incorporating quality into pharmaceutical manufacturing, is an area of intense research and interest. If PAT is effectively applied to bioprocesses, it can increase process understanding and control, and mitigate the risk from substandard drug products to both manufacturer and patient. To optimize the benefits of PAT, the entire PAT framework must be considered and each element of PAT must be carefully selected, including sensor and analytical technology, data analysis techniques, control strategies and algorithms, and process optimization routines. This chapter discusses the current state of PAT in the biopharmaceutical industry, including several case studies demonstrating the degree of maturity of various PAT tools. Graphical Abstract: Hierarchy of QbD components.

  11. From corruption to state capture: A new analytical framework with empirical applications from Hungary

    OpenAIRE

    Fazekas, Mihaly; Tóth, István János

    2016-01-01

    State capture and corruption are widespread phenomena across the globe, but their empirical study still lacks sufficient analytical tools. This paper develops a new conceptual and analytical framework for gauging state capture based on micro-level contractual networks in public procurement. To this end, it establishes a novel measure of corruption risk in government contracting focusing on the behaviour of individual organisations. Then, it identifies clusters of high corruption risk organisa...

  12. Analytical Methodology for the Determination of Radium Isotopes in Environmental Samples

    International Nuclear Information System (INIS)

    2010-01-01

    Reliable, comparable and 'fit for purpose' results are an essential requirement for any decision based on analytical measurements. For the analyst, the availability of tested and validated analytical procedures is an extremely important tool for production of such analytical measurements. For maximum utility, such procedures should be comprehensive, clearly formulated, and readily available to both the analyst and the customer for reference. Since 2004, the environment programme of the IAEA has included activities aimed at the development of a set of procedures for the determination of radionuclides in terrestrial environmental samples. Measurements of radium isotopes are important for radiological and environmental protection, geochemical and geochronological investigations, hydrology, etc. The suite of isotopes creates and stimulates continuing interest in the development of new methods for determination of radium in various media. In this publication, the four most routinely used analytical methods for radium determination in biological and environmental samples, i.e. alpha spectrometry, gamma spectrometry, liquid scintillation spectrometry and mass spectrometry, are reviewed

  13. The Solid* toolset for software visual analytics of program structure and metrics comprehension : From research prototype to product

    NARCIS (Netherlands)

    Reniers, Dennie; Voinea, Lucian; Ersoy, Ozan; Telea, Alexandru

    2014-01-01

    Software visual analytics (SVA) tools combine static program analysis and fact extraction with information visualization to support program comprehension. However, building efficient and effective SVA tools is highly challenging, as it involves extensive software development in program analysis,

  14. A comparison of galaxy group luminosity functions from semi-analytic models

    NARCIS (Netherlands)

    Snaith, Owain N.; Gibson, Brad K.; Brook, Chris B.; Courty, Stéphanie; Sánchez-Blázquez, Patricia; Kawata, Daisuke; Knebe, Alexander; Sales, Laura V.

    Semi-analytic models (SAMs) are currently one of the primary tools with which we model statistically significant ensembles of galaxies. The underlying physical prescriptions inherent to each SAM are, in many cases, different from one another. Several SAMs have been applied to the dark matter merger

  15. An analytical model for the heat generation in friction stir welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper; Wert, John

    2004-01-01

    The objective of this work is to establish an analytical model for heat generation by friction stir welding (FSW), based on different assumptions of the contact condition between the rotating tool surface and the weld piece. The material flow and heat generation are characterized by the contact...
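    The role of the assumed contact condition can be illustrated with a short calculation. Under a full sticking condition on a flat shoulder, each surface element at radius r moves at speed ωr and transmits a shear stress τ, so dQ = τωr·dA; integrating over the shoulder annulus gives Q = (2/3)πτω(R_shoulder³ − R_probe³). The numbers below are invented for illustration and are not taken from the paper:

```python
import math

def shoulder_heat(omega, tau, r_shoulder, r_probe):
    """Heat (W) generated by a flat tool shoulder under full sticking.

    Each surface element dA = r dr dtheta moves at speed omega*r and
    transmits shear stress tau, so dQ = tau * omega * r * dA; integrating
    over the annulus gives Q = (2/3) * pi * tau * omega * (Rs^3 - Rp^3).
    """
    return (2.0 / 3.0) * math.pi * tau * omega * (r_shoulder**3 - r_probe**3)

# Illustrative (assumed) values: 400 rpm, 40 MPa shear flow stress,
# 9 mm shoulder radius, 3 mm probe radius.
omega = 400 * 2 * math.pi / 60          # spindle speed in rad/s
q = shoulder_heat(omega, 40e6, 9e-3, 3e-3)
print(f"{q:.0f} W")
```

Under a sliding condition the shear stress would instead be the friction stress μp, which changes the prefactor but not the r³ dependence on tool radius.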

  16. WEB ANALYTICS COMBINED WITH EYE TRACKING FOR SUCCESSFUL USER EXPERIENCE DESIGN: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Magdalena BORYS

    2016-12-01

    Full Text Available The authors propose a new approach to the mobile user experience design process by means of web analytics and eye tracking. The proposed method was applied to design the LUT mobile website. In the method, data on various users and their behaviour were gathered and analysed using a web analytics tool in order to inform the mobile website design. Next, based on the findings from web analytics, a mobile prototype of the website was created and validated in eye-tracking usability testing. The analysis of participants’ behaviour during the eye-tracking sessions allowed further improvements to the prototype.

  17. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    Science.gov (United States)

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome the challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used to predict different outcomes, they can provide pharmacists with a better understanding of the risks for the specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  18. Biofluid infrared spectro-diagnostics: pre-analytical considerations for clinical applications.

    Science.gov (United States)

    Lovergne, L; Bouzy, P; Untereiner, V; Garnotel, R; Baker, M J; Thiéfin, G; Sockalingum, G D

    2016-06-23

    Several proof-of-concept studies on the vibrational spectroscopy of biofluids have demonstrated that the methodology has promising potential as a clinical diagnostic tool. However, these studies also show that there is a lack of a standardised protocol for sample handling and preparation prior to spectroscopic analysis. One of the most important sources of analytical error is the pre-analytical phase. For the technique to be translated into the clinic, it is clear that a very strict protocol needs to be established for such biological samples. This study focuses on some aspects of the pre-analytical phase in the development of high-throughput Fourier transform infrared (FTIR) spectroscopy of some of the most common biofluids, such as serum, plasma and bile. Pre-analytical considerations that can impact the samples (solvents, anti-coagulants, freeze-thaw cycles…) and/or the spectroscopic analysis (sample preparation such as drying, deposit methods, volumes, substrates, operator dependence…), and consequently the quality and reproducibility of spectral data, are discussed in this report.

  19. Developing automated analytical methods for scientific environments using LabVIEW.

    Science.gov (United States)

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program are demonstrated by using it to control both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
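    The core idea, one generic program driven by simple script commands so that a new analytical setup only needs a new script rather than new code, can be sketched outside LabVIEW as well. The Python toy below (all device names and commands are hypothetical, not the paper's actual command set) shows a dispatcher that maps script lines to instrument handlers:

```python
# Minimal sketch of a script-driven instrument controller: one generic
# loop dispatches text commands to per-device handlers, which here only
# record their actions in a log.
def make_controller():
    log = []
    handlers = {
        "VALVE":  lambda pos:  log.append(f"valve -> {pos}"),
        "PUMP":   lambda ul:   log.append(f"aspirate {ul} uL"),
        "DETECT": lambda mode: log.append(f"acquire ({mode})"),
        "WAIT":   lambda s:    log.append(f"wait {s} s"),
    }
    def run(script):
        for line in script.strip().splitlines():
            cmd, *args = line.split()
            handlers[cmd](*args)   # an unknown command raises KeyError
        return log
    return run

run = make_controller()
trace = run("""
VALVE sample
PUMP 50
VALVE detector
DETECT UV
""")
print(trace)
```

Coupling a different detector then amounts to registering one more handler and editing the script, which is the flexibility the abstract describes.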

  20. Primary culture of glial cells from mouse sympathetic cervical ganglion: a valuable tool for studying glial cell biology.

    Science.gov (United States)

    de Almeida-Leite, Camila Megale; Arantes, Rosa Maria Esteves

    2010-12-15

    Central nervous system glial cells such as astrocytes and microglia have been investigated in vitro, and many intracellular pathways have been clarified upon various stimuli. Peripheral glial cells, however, are not as deeply investigated in vitro, despite their important role in inflammatory and neurodegenerative diseases. Based on our previous experience of culturing neuronal cells, our objective was to standardize and morphologically characterize a primary culture of mouse superior cervical ganglion glial cells in order to obtain a useful tool for studying peripheral glial cell biology. Superior cervical ganglia from neonatal C57BL6 mice were enzymatically and mechanically dissociated, and cells were plated on diluted Matrigel-coated wells at a final concentration of 10,000 cells/well. Five to 8 days post plating, glial cell cultures were fixed for morphological and immunocytochemical characterization. Glial cells showed a flat and irregular shape, two or three long cytoplasmic processes, and round, oval or elongated nuclei with a regular outline. Cell proliferation and mitosis were detected both qualitatively and quantitatively. Glial cells were able to maintain their phenotype in our culture model, including immunoreactivity against the glial cell marker GFAP. This is the first description of the immunocytochemical characterization of mouse sympathetic cervical ganglion glial cells in primary culture. This work discusses the uses and limitations of our model as a tool to study many aspects of peripheral glial cell biology. Copyright © 2010 Elsevier B.V. All rights reserved.

  1. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    Science.gov (United States)

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

    In this short communication, UV-Vis was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical method that could be used as a Process Analytical Tool (PAT) in biorefineries employing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
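    The quantification step behind such a correlation is an ordinary linear (Beer-Lambert) calibration: fit absorbance at a fixed wavelength against known lignin concentrations, then invert the fit for an unknown sample. A minimal sketch with invented calibration data (not the paper's measurements):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Hypothetical calibration standards: lignin concentration (g/L)
# vs absorbance at 280 nm (values invented for illustration).
conc = [0.0, 0.2, 0.4, 0.6, 0.8]
absb = [0.01, 0.25, 0.49, 0.73, 0.97]
a, b = fit_line(conc, absb)

# Invert the calibration to quantify an unknown sample.
unknown_abs = 0.61
print(round((unknown_abs - b) / a, 3))   # → 0.5
```

In practice the calibration would be checked per biomass type and wavelength, since lignins from different feedstocks have different absorptivities.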

  2. EnergiTools. A methodology for performance monitoring and diagnosis

    International Nuclear Information System (INIS)

    Ancion, P.; Bastien, R.; Ringdahl, K.

    2000-01-01

    EnergiTools is a performance monitoring and diagnostic tool that combines the power of on-line process data acquisition with advanced diagnosis methodologies. Analytical models based on thermodynamic principles are combined with neural networks to validate sensor data and to estimate missing or faulty measurements. Advanced diagnostic technologies are then applied to point out potential faults and areas to be investigated further. The diagnosis methodologies are based on Bayesian belief networks. Expert knowledge is captured in the form of fault-symptom relationships and includes historical information such as the likelihood of faults and symptoms. The methodology produces the likelihood of component failure root causes using the expert knowledge base. EnergiTools is used at the Ringhals nuclear power plants. It has led to the diagnosis of various performance issues. Three case studies based on plant data and models are presented and illustrate the diagnosis support methodologies implemented in EnergiTools. In the first case, the analytical data qualification technique points out several faulty measurements. The application of a neural network for the estimation of the nuclear reactor power by interpreting several plant indicators is then illustrated. The use of the Bayesian belief networks is finally described. (author)
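    The fault-symptom reasoning of a Bayesian belief network can be illustrated in miniature with a single fault node and a single symptom updated by Bayes' rule. All names and probabilities below are invented for illustration and do not come from EnergiTools:

```python
# Toy fault-symptom update: a fault with a prior probability, a symptom
# with known likelihoods under "fault" and "no fault", combined by
# Bayes' rule into a posterior fault probability.
def posterior(prior, p_sym_given_fault, p_sym_given_ok):
    """P(fault | symptom observed)."""
    evidence = prior * p_sym_given_fault + (1 - prior) * p_sym_given_ok
    return prior * p_sym_given_fault / evidence

# Hypothetical fault "condenser fouling" with a 2% prior; the symptom
# "condenser pressure above expectation" appears 90% of the time when
# fouled and 5% of the time otherwise.
p = posterior(0.02, 0.90, 0.05)
print(round(p, 3))   # → 0.269
```

A full belief network chains many such updates over interdependent faults and symptoms, which is why the expert-elicited fault-symptom likelihoods are central to the method.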

  3. Nonlinear Growth Curves in Developmental Research

    Science.gov (United States)

    Grimm, Kevin J.; Ram, Nilam; Hamagami, Fumiaki

    2011-01-01

    Developmentalists are often interested in understanding change processes, and growth models are the most common analytic tool for examining such processes. Nonlinear growth curves are especially valuable to developmentalists because the defining characteristics of the growth process such as initial levels, rates of change during growth spurts, and…

  4. Advantages of Integrative Data Analysis for Developmental Research

    Science.gov (United States)

    Bainter, Sierra A.; Curran, Patrick J.

    2015-01-01

    Amid recent progress in cognitive development research, high-quality data resources are accumulating, and data sharing and secondary data analysis are becoming increasingly valuable tools. Integrative data analysis (IDA) is an exciting analytical framework that can enhance secondary data analysis in powerful ways. IDA pools item-level data across…

  5. The analytic hierarchy process as a support for decision making

    Directory of Open Access Journals (Sweden)

    Filipović Milanka

    2007-01-01

    Full Text Available The first part of this text deals with convention site selection, one of the most lucrative areas of the tourism industry. The second part gives a further description of a method for decision making, the analytic hierarchy process. Its basic characteristics, hierarchy construction and pairwise comparison at a given level of the hierarchy, are outlined. The third part offers an example of application. This example is solved using the Super Decisions software, which was developed as computer support for the analytic hierarchy process. This indicates that the AHP approach is a useful tool to help support a convention site selection decision.
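    In the AHP, the priority weights at each level of the hierarchy are commonly obtained as the principal eigenvector of the pairwise-comparison matrix. A minimal sketch using power iteration, with invented judgments for three hypothetical convention sites compared on one criterion:

```python
# AHP priority weights as the normalized principal eigenvector of a
# pairwise-comparison matrix, computed by power iteration.
def ahp_priorities(m, iters=100):
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [wi / s for wi in w]   # renormalize each iteration
    return w

# Invented judgments (Saaty's 1-9 scale): site A is moderately preferred
# (3) to B and strongly preferred (5) to C; B is slightly preferred (2)
# to C. The lower triangle holds the reciprocals.
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
w = ahp_priorities(matrix)
print([round(x, 3) for x in w])
```

A complete AHP model repeats this at every level and aggregates the weights down the hierarchy; a consistency ratio check on each matrix guards against contradictory judgments.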

  6. A Visual Analytics Technique for Identifying Heat Spots in Transportation Networks

    Directory of Open Access Journals (Sweden)

    Marian Sorin Nistor

    2016-12-01

    Full Text Available Decision makers in public transportation systems, which are part of urban critical infrastructure, need to increase system resilience. To do so, we identified analysis tools for biological networks as an adequate basis for visual analytics in this domain. In the paper at hand we therefore translate such methods to transportation systems and show the benefits by applying them to the Munich subway network. Here, visual analytics is used to identify vulnerable stations from different perspectives. The applied technique is presented step by step. Furthermore, the key challenges in applying this technique to transportation systems are identified. Finally, we propose the implementation of the presented features in a management cockpit to integrate the visual analytics mantra into adequate decision support for transportation systems.

  7. About new software and hardware tools in the education of 'Semiconductor Devices'

    International Nuclear Information System (INIS)

    Taneva, Ljudmila; Basheva, Bistra

    2009-01-01

    This paper describes the new tools used in the education of “Semiconductor Devices”, developed at the Technological School “Electronic Systems”, Department of the Technical University, Sofia. The software and hardware tools give the opportunity to achieve the right balance between theory and practice, and the students are given the chance to accumulate valuable “hands-on” skills. The main purpose of the developed lab exercises is to demonstrate the use of some electronic components and practice with them. Keywords: semiconductors, media software tool, hardware, education

  8. Pellet manufacturing by extrusion-spheronization using process analytical technology

    DEFF Research Database (Denmark)

    Sandler, Niklas; Rantanen, Jukka; Heinämäki, Jyrki

    2005-01-01

    The aim of this study was to investigate the phase transitions occurring in nitrofurantoin and theophylline formulations during pelletization by extrusion-spheronization. An at-line process analytical technology (PAT) approach was used to increase the understanding of the solid-state behavior of the active pharmaceutical ingredients (APIs) during pelletization. Raman spectroscopy, near-infrared (NIR) spectroscopy, and X-ray powder diffraction (XRPD) were used in the characterization of polymorphic changes during the process. Samples were collected at the end of each processing stage (blending, granulation, extrusion, spheronization, and drying). Batches were dried at 3 temperature levels (60 degrees C, 100 degrees C, and 135 degrees C). Water induced a hydrate formation in both model formulations during processing. NIR spectroscopy gave valuable real-time data about the state of water in the system...

  9. Analytical and finite element modeling of grounding systems

    Energy Technology Data Exchange (ETDEWEB)

    Luz, Mauricio Valencia Ferreira da [University of Santa Catarina (UFSC), Florianopolis, SC (Brazil)], E-mail: mauricio@grucad.ufsc.br; Dular, Patrick [University of Liege (Belgium). Institut Montefiore], E-mail: Patrick.Dular@ulg.ac.be

    2007-07-01

    Grounding is the art of making an electrical connection to the earth. This paper deals with the analytical and finite element modeling of grounding systems. An electrokinetic formulation using a scalar potential can benefit from floating potentials to define global quantities such as electric voltages and currents. The application concerns a single vertical grounding rod with one-, two- and three-layer soil, where the upper end of the rod lies at the soil surface. This problem has been modeled using a 2D axisymmetric electrokinetic formulation. The grounding resistance obtained by the finite element method is compared with the analytical one for one-layer soil. The results of this paper show that the finite element method is a powerful tool for the analysis of grounding systems at low frequencies. (author)
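
    For the one-layer (homogeneous) soil case, a closed-form expression such as Dwight's formula for a single driven vertical rod is the usual analytical benchmark against which finite element results are compared. The sketch below is illustrative only (not the authors' code; rod dimensions and resistivities are assumed values):

```python
import math

def rod_resistance(rho, length, radius):
    """Dwight's formula for a single vertical ground rod in homogeneous
    soil: R = rho / (2*pi*L) * (ln(4L/a) - 1), with rho in ohm-m,
    rod length L and radius a in meters."""
    return rho / (2 * math.pi * length) * (math.log(4 * length / radius) - 1)

# Illustrative parameters: 3 m rod, 15 mm diameter, typical resistivities.
for rho in (50, 100, 500):  # ohm-meters
    r = rod_resistance(rho, length=3.0, radius=0.0075)
    print(f"rho = {rho:4d} ohm-m  ->  R = {r:7.1f} ohm")
```

A finite element solution of the same axisymmetric problem should converge toward this value as the mesh and outer boundary are refined.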

  10. Structural Analysis of Composite Laminates using Analytical and Numerical Techniques

    Directory of Open Access Journals (Sweden)

    Sanghi Divya

    2016-01-01

    Full Text Available A laminated composite material consists of different layers of matrix and fibres. Its properties can vary greatly with each layer's (ply's) orientation, material properties and the number of layers itself. The present paper focuses on a novel approach of incorporating an analytical method to arrive at a preliminary ply layup order of a composite laminate, which acts as feeder data for the further detailed analysis done with FEA tools. The equations used in our MATLAB code are based on analytical study and supply results that are remarkably close to the final optimized layup found through extensive FEA analysis, with a high degree of probability. This reduces computing time significantly and saves considerable FEA processing to obtain efficient results quickly. The results output by our method also provide the user with conditions that predict the successive failure sequence of the composite plies, an option not available in popular FEM tools. The predicted results are further verified by testing the laminates in the laboratory, and the results are found to be in good agreement.
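
    The analytical backbone of such ply-layup calculations is classical laminate theory. The sketch below (in Python rather than the authors' MATLAB, with assumed carbon/epoxy ply properties) builds the in-plane extensional stiffness matrix A for one candidate layup; a feeder tool of the kind described would sweep many layups this way before handing the best candidates to FEA:

```python
import math

def q_matrix(E1, E2, G12, v12):
    """Reduced stiffnesses (Q11, Q22, Q12, Q66) of a unidirectional
    ply under plane stress."""
    v21 = v12 * E2 / E1
    d = 1 - v12 * v21
    return (E1 / d, E2 / d, v12 * E2 / d, G12)

def qbar(Q, theta_deg):
    """Transformed stiffnesses of a ply rotated by theta degrees."""
    Q11, Q22, Q12, Q66 = Q
    c, s = math.cos(math.radians(theta_deg)), math.sin(math.radians(theta_deg))
    c2, s2 = c * c, s * s
    return {
        "11": Q11*c2*c2 + 2*(Q12 + 2*Q66)*s2*c2 + Q22*s2*s2,
        "22": Q11*s2*s2 + 2*(Q12 + 2*Q66)*s2*c2 + Q22*c2*c2,
        "12": (Q11 + Q22 - 4*Q66)*s2*c2 + Q12*(s2*s2 + c2*c2),
        "66": (Q11 + Q22 - 2*Q12 - 2*Q66)*s2*c2 + Q66*(s2*s2 + c2*c2),
        "16": (Q11 - Q12 - 2*Q66)*s*c2*c + (Q12 - Q22 + 2*Q66)*s2*s*c,
        "26": (Q11 - Q12 - 2*Q66)*s2*s*c + (Q12 - Q22 + 2*Q66)*s*c2*c,
    }

def a_matrix(layup_deg, ply_t, Q):
    """Extensional stiffness A_ij = sum over plies of Qbar_ij * t."""
    A = {k: 0.0 for k in ("11", "22", "12", "66", "16", "26")}
    for theta in layup_deg:
        qb = qbar(Q, theta)
        for k in A:
            A[k] += qb[k] * ply_t
    return A

# Assumed carbon/epoxy ply (Pa) and a symmetric cross-ply layup [0/90]s.
Q = q_matrix(E1=140e9, E2=10e9, G12=5e9, v12=0.3)
A = a_matrix([0, 90, 90, 0], ply_t=0.125e-3, Q=Q)
print({k: round(v / 1e6, 1) for k, v in A.items()})  # MN/m
```

For a balanced cross-ply like this, A11 equals A22 and the shear-coupling terms A16, A26 vanish, which is a quick sanity check on any implementation.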

  11. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    Science.gov (United States)

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and we do not necessarily know which coordinates are the interesting ones. Big data in our labs of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (with high noise level, with/without spectral preprocessing, with wavelength shift, with different spectral resolutions, with missing data). Copyright © 2016 Elsevier B.V. All rights reserved.
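
    To give a flavor of what TDA computes, the toy sketch below (not the authors' pipeline; the point cloud is made up) extracts zero-dimensional persistence — the scales at which connected components of a point cloud merge — using a union-find over edges sorted by length. Two well-separated clusters show up as one merge that persists to a much larger scale than the rest:

```python
import math

def persistence0(points):
    """Death scales of 0-dimensional homology classes: the distances
    at which connected components of the point cloud merge
    (equivalent to single-linkage clustering merge heights)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    # All pairwise edges, sorted by Euclidean length.
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i in range(n) for j in range(i + 1, n)
    )
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:            # this edge merges two components
            parent[ri] = rj
            deaths.append(d)
    return sorted(deaths)

# Two clusters on a line: intra-cluster components die early; the
# single inter-cluster merge persists to a large scale.
pts = [(0.0,), (0.1,), (0.2,), (10.0,), (10.1,)]
print(persistence0(pts))
```

Dedicated TDA libraries add higher-dimensional homology and far better scaling, but the separation of "long-lived" from "short-lived" features is the same idea.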

  12. Data Intensive Architecture for Scalable Cyber Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Bryan K.; Johnson, John R.; Critchlow, Terence J.

    2011-11-15

    Cyber analysts are tasked with the identification and mitigation of network exploits and threats. These compromises are difficult to identify due to the characteristics of cyber communication, the volume of traffic, and the duration of possible attacks. Analytical tools are needed to help analysts identify anomalies that span seconds, days, and weeks. Unfortunately, providing analytical tools effective access to the volumes of underlying data requires novel architectures, which are often overlooked in operational deployments. Our work is focused on a summary record of communication, called a flow. Flow records are intended to summarize a communication session between a source and a destination, providing a level of aggregation over the base data. Despite this aggregation, many enterprise network perimeter sensors store millions of network flow records per day. The volume of data makes analytics difficult, requiring the development of new techniques to efficiently identify temporal patterns and potential threats, and other characteristics of the data compound the problem. Within the billions of records of communication that transact, there are millions of distinct IP addresses involved. Characterizing patterns of entity behavior is very difficult with the vast number of entities that exist in the data. Research has struggled to validate a model for typical network behavior in the hope that it will enable the identification of atypical behavior. Complicating matters further, analysts are typically only able to visualize and interact with fractions of the data and risk missing long-term trends and behaviors. Our analysis approach focuses on aggregate views and visualization techniques to enable flexible and efficient data exploration as well as the capability to view trends over long periods of time. Realizing that interactively exploring summary data allowed analysts to effectively identify

  13. Ball Bearing Analysis with the ORBIS Tool

    Science.gov (United States)

    Halpin, Jacob D.

    2016-01-01

    Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview is provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest that the ORBIS code correlates closely with established predictions of bearing internal load distribution, stiffness, deflection and stresses.

  14. Big Data Analytics as Input for Problem Definition and Idea Generation in Technological Design

    OpenAIRE

    Escandón-Quintanilla , Ma-Lorena; Gardoni , Mickaël; Cohendet , Patrick

    2016-01-01

    Part 10: Big Data Analytics and Business Intelligence; International audience; Big data analytics enables organizations to process massive amounts of data in shorter amounts of time and with more understanding than ever before. Many uses have been found for these tools and techniques, especially in decision making. However, few applications have been found in the first stages of innovation, namely problem definition and idea generation. This paper discusses how big data an...

  15. Quantitative PCR is a Valuable Tool to Monitor the Performance of DNA-Encoded Chemical Library Selections.

    Science.gov (United States)

    Li, Yizhou; Zimmermann, Gunther; Scheuermann, Jörg; Neri, Dario

    2017-05-04

    Phage-display libraries and DNA-encoded chemical libraries (DECLs) represent useful tools for the isolation of specific binding molecules from large combinatorial sets of compounds. With both methods, specific binders are recovered at the end of affinity capture procedures by using target proteins of interest immobilized on a solid support. However, although the efficiency of phage-display selections is routinely quantified by counting the phage titer before and after the affinity capture step, no similar quantification procedures have been reported for the characterization of DECL selections. In this article, we describe the potential and limitations of quantitative PCR (qPCR) methods for the evaluation of selection efficiency by using a combinatorial chemical library with more than 35 million compounds. In the experimental conditions chosen for the selections, a quantification of DNA input/recovery over five orders of magnitude could be performed, revealing a successful enrichment of abundant binders, which could be confirmed by DNA sequencing. qPCR provided rapid information about the performance of selections, thus facilitating the optimization of experimental conditions. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
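
    The five-orders-of-magnitude quantification described above rests on the standard qPCR calibration: Ct is linear in log10 of template copies, and the slope gives the amplification efficiency as 10^(−1/slope) − 1. A minimal sketch with made-up calibration points (not the authors' data):

```python
import math

def fit_standard_curve(copies, ct):
    """Least-squares fit of Ct = intercept + slope * log10(copies)."""
    x = [math.log10(c) for c in copies]
    n = len(x)
    mx, my = sum(x) / n, sum(ct) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, ct)) / \
            sum((xi - mx) ** 2 for xi in x)
    intercept = my - slope * mx
    efficiency = 10 ** (-1 / slope) - 1   # 1.0 means 100 % per cycle
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    """Invert the curve: estimate template copies from an observed Ct."""
    return 10 ** ((ct - intercept) / slope)

# Illustrative dilution series; ideal doubling gives slope near -3.32.
copies = [1e2, 1e3, 1e4, 1e5, 1e6, 1e7]
cts = [35.0 - 3.32 * math.log10(c) for c in copies]
slope, intercept, eff = fit_standard_curve(copies, cts)
print(f"slope = {slope:.2f}, efficiency = {eff:.2%}")
print(f"Ct 20 -> {quantify(20.0, slope, intercept):.3g} copies")
```

The same inversion is what lets DNA input and recovery before and after an affinity capture step be compared quantitatively.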

  16. Clinical audit, a valuable tool to improve quality of care: General methodology and applications in nephrology

    Science.gov (United States)

    Esposito, Pasquale; Dal Canton, Antonio

    2014-01-01

    Evaluation and improvement of the quality of care provided to patients are of crucial importance in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment and clinical audit. The clinical audit consists of measuring a clinical outcome or a process against well-defined standards, set on the principles of evidence-based medicine, in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, the present evidence overall is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principles of clinical audit, focusing on experiences performed in nephrology settings. PMID:25374819

  17. MIDAS (Migraine Disability Assessment): a valuable tool for work-site identification of migraine in workers in Brazil

    Directory of Open Access Journals (Sweden)

    Yara Dadalti Fragoso

    Full Text Available CONTEXT: MIDAS was developed as a fast and efficient method for identification of migraine in need of medical evaluation and treatment. It was necessary to translate MIDAS, originally written in English, so as to apply it in Brazil and make it usable by individuals from a variety of social, economic and cultural backgrounds. OBJECTIVE: To translate and to apply MIDAS in Brazil. DESIGN: Assessment of a sample of workers regularly employed by an oil refinery. SETTING: Refinaria Presidente Bernardes, Cubatão, São Paulo, Brazil. PARTICIPANTS: 404 workers of the company who correctly answered a questionnaire for the identification and evaluation of headache. When the individual considered it to be pertinent to his own needs, there was the option to answer MIDAS as well. METHODS: MIDAS, originally written in English, was translated into Brazilian Portuguese by a neurologist and by a translator specializing in medical texts. The final version of the translation was obtained when, for ten patients to whom it was applied, the text seemed clear and the results were consistent over three sessions. MAIN MEASUREMENTS: Prevalence and types of primary headaches; evaluation of MIDAS as a tool for identification of more severe cases. RESULTS: Of the 419 questionnaires given to the employees, 404 were returned correctly completed. From these, 160 persons were identified as presenting headaches, 44 of whom considered it worthwhile answering MIDAS. Nine of these individuals were identified as severe cases of migraine due to the disability caused by the condition. An interview on a later date confirmed these results. Three were cases of chronic daily headache (transformed migraine) and six were cases of migraine. CONCLUSIONS: MIDAS translated into Brazilian Portuguese was a useful tool for identifying severe cases of migraine and of transformed migraine in a working environment. The workers did not consider MIDAS difficult to answer.

  18. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    Science.gov (United States)

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  19. Accept or Decline? An Analytics-Based Decision Tool for Kidney Offer Evaluation.

    Science.gov (United States)

    Bertsimas, Dimitris; Kung, Jerry; Trichakis, Nikolaos; Wojciechowski, David; Vagefi, Parsia A

    2017-12-01

    When a deceased-donor kidney is offered to a waitlisted candidate, the decision to accept or decline the organ relies primarily upon a practitioner's experience and intuition. Such decisions must achieve a delicate balance between estimating the immediate benefit of transplantation and the potential for future higher-quality offers. However, the current experience-based paradigm lacks scientific rigor and is subject to the inaccuracies that plague anecdotal decision-making. A data-driven analytics-based model was developed to predict whether a patient will receive an offer for a deceased-donor kidney at Kidney Donor Profile Index thresholds of 0.2, 0.4, and 0.6, and at timeframes of 3, 6, and 12 months. The model accounted for Organ Procurement Organization, blood group, wait time, DR antigens, and prior offer history to provide accurate and personalized predictions. Performance was evaluated on data sets spanning various lengths of time to understand the adaptability of the method. Using United Network for Organ Sharing match-run data from March 2007 to June 2013, out-of-sample area under the receiver operating characteristic curve was approximately 0.87 for all Kidney Donor Profile Index thresholds and timeframes considered for the 10 most populous Organ Procurement Organizations. As more data becomes available, area under the receiver operating characteristic curve values increase and subsequently level off. The development of a data-driven analytics-based model may assist transplant practitioners and candidates during the complex decision of whether to accept or forgo a current kidney offer in anticipation of a future high-quality offer. The latter holds promise to facilitate timely transplantation and optimize the efficiency of allocation.
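
    The out-of-sample AUC of roughly 0.87 quoted above is the probability that the model ranks a randomly chosen positive case (candidate who received an offer) above a randomly chosen negative one. A self-contained sketch of that metric, with toy scores rather than the authors' model:

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs ranked correctly,
    counting ties as half a win."""
    pos = [s for lbl, s in zip(labels, scores) if lbl == 1]
    neg = [s for lbl, s in zip(labels, scores) if lbl == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Toy predictions: label 1 = an offer arrived within the timeframe.
labels = [0, 0, 1, 1]
scores = [0.10, 0.40, 0.35, 0.80]
print(auc(labels, scores))  # 0.75: three of the four pairs are ordered correctly
```

An AUC of 0.5 is chance-level ranking; 1.0 is perfect separation of offer and no-offer cases.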

  20. Towards an understanding of jet substructure

    CERN Document Server

    Dasgupta, Mrinal; Marzani, Simone; Salam, Gavin P

    2013-01-01

    We present first analytic, resummed calculations of the rates at which widespread jet substructure tools tag QCD jets. As well as considering trimming, pruning and the mass-drop tagger, we introduce modified tools with improved analytical and phenomenological behaviours. Most taggers have double logarithmic resummed structures. The modified mass-drop tagger is special in that it involves only single logarithms, and is free from a complex class of terms known as non-global logarithms. The modification of pruning brings an improved ability to discriminate between the different colour structures that characterise signal and background. As we outline in an extensive phenomenological discussion, these results provide valuable insight into the performance of existing tools and help lay robust foundations for future substructure studies.

  1. Collidoscope: An Improved Tool for Computing Collisional Cross-Sections with the Trajectory Method

    Science.gov (United States)

    Ewing, Simon A.; Donor, Micah T.; Wilson, Jesse W.; Prell, James S.

    2017-04-01

    Ion mobility-mass spectrometry (IM-MS) can be a powerful tool for determining structural information about ions in the gas phase, from small covalent analytes to large, native-like or denatured proteins and complexes. For large biomolecular ions, which may have a wide variety of possible gas-phase conformations and multiple charge sites, quantitative, physically explicit modeling of collisional cross sections (CCSs) for comparison to IMS data can be challenging and time-consuming. We present a "trajectory method" (TM) based CCS calculator, named "Collidoscope," which utilizes parallel processing and optimized trajectory sampling, and implements both He and N2 as collision gas options. Also included is a charge-placement algorithm for determining probable charge site configurations for protonated protein ions given an input geometry in pdb file format. Results from Collidoscope are compared with those from the current state-of-the-art CCS simulation suite, IMoS. Collidoscope CCSs are within 4% of IMoS values for ions with masses from 18 Da to 800 kDa. Collidoscope CCSs using X-ray crystal geometries are typically within a few percent of IM-MS experimental values for ions with mass up to 3.5 kDa (melittin), and discrepancies for larger ions up to 800 kDa (GroEL) are attributed in large part to changes in ion structure during and after the electrospray process. Due to its physically explicit modeling of scattering, computational efficiency, and accuracy, Collidoscope can be a valuable tool for IM-MS research, especially for large biomolecular ions.
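
    Collidoscope implements the physically explicit trajectory method; a much cruder relative, the projection approximation, still conveys the core idea of an orientation-averaged collision cross-section. The sketch below is illustrative only — hard spheres with assumed radii, no scattering trajectories or collision-gas physics — and Monte-Carlo-averages the projected area of a rigid atom cluster over random orientations:

```python
import math, random

def projected_area(disks, n_samples=20000, rng=random):
    """Monte Carlo area of a union of 2-D disks [(x, y, r), ...]."""
    xmin = min(x - r for x, y, r in disks); xmax = max(x + r for x, y, r in disks)
    ymin = min(y - r for x, y, r in disks); ymax = max(y + r for x, y, r in disks)
    box = (xmax - xmin) * (ymax - ymin)
    hits = 0
    for _ in range(n_samples):
        px, py = rng.uniform(xmin, xmax), rng.uniform(ymin, ymax)
        if any((px - x) ** 2 + (py - y) ** 2 <= r * r for x, y, r in disks):
            hits += 1
    return box * hits / n_samples

def pa_ccs(atoms, n_orient=100, seed=1):
    """Projection-approximation CCS: projected area of hard spheres
    (x, y, z, r) averaged over uniformly random orientations."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_orient):
        # Uniform random rotation via a random unit quaternion.
        u1, u2, u3 = rng.random(), rng.random(), rng.random()
        w = math.sqrt(1 - u1) * math.sin(2 * math.pi * u2)
        x = math.sqrt(1 - u1) * math.cos(2 * math.pi * u2)
        y = math.sqrt(u1) * math.sin(2 * math.pi * u3)
        z = math.sqrt(u1) * math.cos(2 * math.pi * u3)
        rot = ((1 - 2*(y*y + z*z), 2*(x*y - w*z), 2*(x*z + w*y)),
               (2*(x*y + w*z), 1 - 2*(x*x + z*z), 2*(y*z - w*x)))
        disks = []
        for ax, ay, az, r in atoms:
            px = rot[0][0]*ax + rot[0][1]*ay + rot[0][2]*az
            py = rot[1][0]*ax + rot[1][1]*ay + rot[1][2]*az
            disks.append((px, py, r))
        total += projected_area(disks, rng=rng)
    return total / n_orient

# Sanity check: a single sphere of radius 1 must give pi (~3.14).
print(pa_ccs([(0.0, 0.0, 0.0, 1.0)], n_orient=5))
```

The trajectory method replaces this geometric average with momentum-transfer integrals over actual ion-gas scattering trajectories, which is what makes it far more expensive and far more accurate for large, charged biomolecular ions.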

  2. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.

    Science.gov (United States)

    White, B J; Amrine, D E; Larson, R L

    2018-04-14

    Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques used to generate predictions for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools that drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, then creating algorithms, refining algorithms, and finally comparing the accuracy of the created classifiers. Partitioning the datasets allows model building and refining to occur before the model's predictive accuracy is tested on naive data. Many different classification algorithms are available for predictive use, and testing multiple algorithms can lead to optimal results. Applying a systematic process for predictive analytics to data that are currently collected, or that could be collected, on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
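
    The partition-train-refine-test loop described above can be sketched end to end. Below, a made-up one-feature dataset stands in for livestock records, two trivial classifiers are compared on a validation split, and only the winner is scored on the naive test split (everything here is illustrative, not the authors' pipeline):

```python
import random

random.seed(42)

# Synthetic "records": one feature (e.g. a health indicator score),
# label 1 if the target outcome occurred.
data = [(random.gauss(0.0, 1.0), 0) for _ in range(300)] + \
       [(random.gauss(3.0, 1.0), 1) for _ in range(300)]
random.shuffle(data)

# Partition: build / refine / naive test, as in the abstract.
train, valid, test = data[:360], data[360:480], data[480:]

def majority_rule(train):
    """Baseline: always predict the most common training label."""
    lbl = round(sum(l for _, l in train) / len(train))
    return lambda x: lbl

def threshold_rule(train):
    """Cut halfway between the per-class training means."""
    m0 = [x for x, l in train if l == 0]
    m1 = [x for x, l in train if l == 1]
    cut = (sum(m0) / len(m0) + sum(m1) / len(m1)) / 2
    return lambda x: int(x > cut)

def accuracy(model, split):
    return sum(model(x) == l for x, l in split) / len(split)

# Refine: keep the classifier that wins on the validation split...
candidates = {"majority": majority_rule(train), "threshold": threshold_rule(train)}
best_name = max(candidates, key=lambda k: accuracy(candidates[k], valid))
# ...and only then measure accuracy on the naive test data.
test_acc = accuracy(candidates[best_name], test)
print(best_name, round(test_acc, 3))
```

The key discipline is the same at any scale: the test partition is touched exactly once, after all model selection is finished.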

  3. Making advanced analytics work for you.

    Science.gov (United States)

    Barton, Dominic; Court, David

    2012-10-01

    Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberative approach to big data, but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect.

  4. Application of the ReNuMa model in the Sha He river watershed: tools for watershed environmental management.

    Science.gov (United States)

    Sha, Jian; Liu, Min; Wang, Dong; Swaney, Dennis P; Wang, Yuqiu

    2013-07-30

    Models and related analytical methods are critical tools for use in modern watershed management. A modeling approach for quantifying the source apportionment of dissolved nitrogen (DN) and associated tools for examining the sensitivity and uncertainty of the model estimates were assessed for the Sha He River (SHR) watershed in China. The Regional Nutrient Management model (ReNuMa) was used to infer the primary sources of DN in the SHR watershed. This model is based on the Generalized Watershed Loading Functions (GWLF) and the Net Anthropogenic Nutrient Input (NANI) framework, modified to improve the characterization of subsurface hydrology and septic system loads. Hydrochemical processes of the SHR watershed, including streamflow, DN load fluxes, and corresponding DN concentration responses, were simulated following calibrations against observations of streamflow and DN fluxes. Uncertainty analyses were conducted with a Monte Carlo analysis to vary model parameters for assessing the associated variations in model outputs. The model performed accurately at the watershed scale and provided estimates of monthly streamflows and nutrient loads as well as DN source apportionments. The simulations identified the dominant contribution of agricultural land use and significant monthly variations. These results provide valuable support for science-based watershed management decisions and indicate the utility of ReNuMa for such applications. Copyright © 2013 Elsevier Ltd. All rights reserved.
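
    The Monte Carlo uncertainty analysis mentioned above follows a generic recipe: draw the model's parameters from plausible ranges, rerun the model, and summarize the spread of the outputs. The sketch below uses a deliberately simple export-coefficient stand-in for the watershed model (it is NOT the actual ReNuMa formulation; all ranges and numbers are assumed):

```python
import random

def dn_load(runoff_mm, ag_fraction, export_coeff, area_km2=100.0):
    """Toy dissolved-nitrogen load (t/yr): a crude export-coefficient
    stand-in for the watershed model, for illustration only."""
    return area_km2 * ag_fraction * export_coeff * (runoff_mm / 500.0)

def monte_carlo(n=10000, seed=7):
    """Sample uncertain parameters, rerun the model, return the
    5th / 50th / 95th percentiles of the output."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        runoff = rng.uniform(300, 700)   # mm/yr
        ag = rng.uniform(0.4, 0.6)       # agricultural land fraction
        coeff = rng.uniform(1.0, 3.0)    # t N / km2 / yr from ag land
        outputs.append(dn_load(runoff, ag, coeff))
    outputs.sort()
    return outputs[int(0.05 * n)], outputs[n // 2], outputs[int(0.95 * n)]

p5, p50, p95 = monte_carlo()
print(f"DN load 5th/50th/95th percentile: {p5:.0f} / {p50:.0f} / {p95:.0f} t/yr")
```

Reporting a percentile band rather than a single number is what turns a calibrated model into a defensible input for management decisions.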

  5. Assessing Sustainability of Coral Reef Ecosystem Services using a Spatially-Explicit Decision Support Tool

    Science.gov (United States)

    Forecasting and communicating the potential outcomes of decision options requires support tools that aid in evaluating alternative scenarios in a user-friendly context and that highlight variables relevant to the decision options and valuable stakeholders. Envision is a GIS-base...

  6. Reducing Post-Decision Dissonance in International Decisions: The Analytic Hierarchy Process Approach.

    Science.gov (United States)

    DuBois, Frank L.

    1999-01-01

    Describes use of the analytic hierarchy process (AHP) as a teaching tool to illustrate the complexities of decision making in an international environment. The AHP approach uses managerial input to develop pairwise comparisons of relevant decision criteria to efficiently generate an appropriate solution. (DB)
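
    AHP's core computation is small enough to show in full: a pairwise-comparison matrix of the decision criteria is reduced to priority weights via its principal eigenvector, and a consistency ratio guards against contradictory judgments. A sketch with an illustrative three-criterion matrix (the judgments are made up):

```python
def ahp_weights(M, iters=100):
    """Priority weights = principal eigenvector of the pairwise
    comparison matrix M (power iteration), plus the consistency
    ratio CR = ((lambda_max - n) / (n - 1)) / RI."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    # Estimate lambda_max from M w ~= lambda_max * w.
    Mw = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Mw[i] / w[i] for i in range(n)) / n
    RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random index
    ci = (lam - n) / (n - 1)
    cr = ci / RI[n] if RI[n] else 0.0
    return w, cr

# Illustrative judgments for three market-entry criteria:
# cost is 2x as important as risk and 4x as important as speed.
M = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
w, cr = ahp_weights(M)
print([round(x, 3) for x in w], round(cr, 3))
```

This matrix is perfectly consistent, so the weights come out exactly 4/7, 2/7, 1/7 and CR is zero; in practice a CR above about 0.1 signals that the managerial judgments should be revisited.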

  7. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    Science.gov (United States)

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  8. A Performance Analytical Strategy for Network-on-Chip Router with Input Buffer Architecture

    Directory of Open Access Journals (Sweden)

    WANG, J.

    2012-11-01

    Full Text Available In this paper, a performance analytical strategy is proposed for a Network-on-Chip router with input buffer architecture. First, an analytical model is developed based on a semi-Markov process. For a non-work-conserving router with small buffer size, the model can be used to analyze the schedule delay and the average service time for each buffer, given the related parameters. Then, the average packet delay in the router is calculated using the model. Finally, we validate the effectiveness of our strategy by simulation. By comparing our analytical results to simulation results, we show that our strategy successfully captures Network-on-Chip router performance and performs better than the state-of-the-art. Therefore, our strategy can be used as an efficient performance analysis tool for Network-on-Chip design.
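
    Validating an analytical queueing model against simulation, as done above, can be illustrated with a much simpler stand-in than the paper's semi-Markov router model: a plain M/M/1 queue, whose mean waiting time has the closed form Wq = rho / (mu - lambda), checked against a Lindley-recursion simulation (all parameters assumed):

```python
import random

def mm1_wait_analytic(lam, mu):
    """Closed-form mean waiting time in queue for M/M/1."""
    rho = lam / mu
    return rho / (mu - lam)

def mm1_wait_simulated(lam, mu, n=200000, seed=3):
    """Lindley recursion: W_next = max(0, W + service - interarrival)."""
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n):
        total += w
        service = rng.expovariate(mu)
        interarrival = rng.expovariate(lam)
        w = max(0.0, w + service - interarrival)
    return total / n

lam, mu = 0.5, 1.0  # arrival and service rates (packets per cycle)
wa = mm1_wait_analytic(lam, mu)
ws = mm1_wait_simulated(lam, mu)
print(f"analytic Wq = {wa:.3f}, simulated Wq = {ws:.3f}")
```

Agreement between the two columns is the basic evidence that an analytical model "captures" the simulated system; the paper's model plays the analytic role for a far more complex input-buffered router.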

  9. Electrospray ionization and matrix assisted laser desorption/ionization mass spectrometry: powerful analytical tools in recombinant protein chemistry

    DEFF Research Database (Denmark)

    Andersen, Jens S.; Svensson, B; Roepstorff, P

    1996-01-01

    Electrospray ionization and matrix assisted laser desorption/ionization are effective ionization methods for mass spectrometry of biomolecules. Here we describe the capabilities of these methods for peptide and protein characterization in biotechnology. An integrated analytical strategy is presented encompassing protein characterization prior to and after cloning of the corresponding gene.

  10. Information systems project management: methods, tools, and techniques

    OpenAIRE

    Mcmanus, John; Wood-Harper, Trevor

    2004-01-01

    Information Systems Project Management offers a clear and logical exposition of how to plan, organise and monitor projects effectively in order to deliver quality information systems within time, to budget and quality. This new book by John McManus and Trevor Wood-Harper is suitable for upper level undergraduates and postgraduates studying project management and Information Systems. Practising managers will also find it to be a valuable tool in their work. Managing information systems pro...

  11. Chemico-analytical characteristics of molybdenum(6) complex with Lumogallion IREA (Magneson IREA) in the presence of hydroxylamine

    International Nuclear Information System (INIS)

    Ivanov, V.M.; Rybakov, A.V.; Figurovskaya, V.N.; Kochelaeva, G.A.; Prokhorova, G.V.

    1997-01-01

    Complex formation of molybdenum(6) with Lumogallion IREA and Magneson IREA in binary systems and in the presence of hydroxylamine is studied. It is established that the 1:1:1 three-component complexes exhibit more valuable analytical properties. All the complexes are formed within the wide pH range of 1.0-4.5. Reaction selectivity in the presence of hydroxylamine is studied, and a method for determining molybdenum in steel with Magneson IREA, without separation of components, is developed

  12. GBEP pilot Ghana. Very valuable and successful - a follow-up is suggested. Conclusions and recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Hanekamp, E.; Vissers, P.; De Lint, S. [Partners for Innovation, Amsterdam (Netherlands)

    2013-02-15

    The Global Bio-Energy Partnership (GBEP) has developed a set of 24 sustainability indicators applicable to all forms of bio-energy and aimed at voluntary use by national governments. The GBEP indicators enable governments to assess the bio-energy sector and to develop new policies related to sustainable bio-energy production and use. These indicators have been piloted in Ghana. Modern bio-energy is a big opportunity for the region, which is why NL Agency adopted and supported the pilot, together with the Global Bio-Energy Partnership (GBEP). The pilot project also was supported by the ECOWAS Regional Centre for Renewable Energy and Energy Efficiency (ECREEE) and has been coordinated by the Council for Scientific and Industrial Research (CSIR). The Ghana Energy Commission took the responsibility to involve policymakers. Partners for Innovation was commissioned by NL Agency to provide technical assistance for the pilot. The main aims of the project are: (a) Enhancing the capacity of the host country Ghana (and ECOWAS) to use the GBEP indicators as a tool for assessing the sustainability of its bio-energy sector and/or developing sustainable bio-energy policies; (b) Learning lessons on how to apply the indicators and how to enhance their practicality as a tool for policymakers and giving this as feedback to the GBEP community. Three Ghanaian research institutes (CSIR-FORIG, CSIR-IIR and UG-ISSER) have studied 11 out of the 24 GBEP indicators in the pilot. The pilot has been a success: the 24 sustainability criteria appear to be very valuable for Ghana. As such the indicators provide, also for other governments, a practical tool to assess sustainability of biomass sectors and policies. The report also shows important insights on data availability and quality, and on the applicability of the GBEP indicators in Ghana. 
The final report provides concrete recommendations on: (1) How Ghana can proceed with the GBEP sustainability indicators; and (2) The lessons learned for

  13. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    Science.gov (United States)

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage© guides users through the collection of their family health history by relative, generates a pedigree, and completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. Copyright © 2010 S. Karger AG, Basel.

  14. Recovering valuable metals from recycled photovoltaic modules.

    Science.gov (United States)

    Yi, Youn Kyu; Kim, Hyun Soo; Tran, Tam; Hong, Sung Kil; Kim, Myong Jun

    2014-07-01

    Recovering valuable metals such as Si, Ag, Cu, and Al has become a pressing issue as end-of-life photovoltaic modules need to be recycled in the near future to meet legislative requirements in most countries. Of major interest is the recovery and recycling of high-purity silicon (> 99.9%) for the production of wafers and semiconductors. The value of Si in crystalline-type photovoltaic modules is estimated at ~$95/kW at the 2012 metal price. At the current installed capacity of 30 GW/yr, the metal content of PV modules represents a valuable resource that should be recovered in the future. The recycling of end-of-life photovoltaic modules would supply > 88,000 and 207,000 tpa Si by 2040 and 2050, respectively, representing more than 50% of the Si required for module fabrication. Experimental testwork on crystalline Si modules could recover a > 99.98%-grade Si product by HNO3/NaOH leaching to remove Al, Ag, Ti, and other metal ions from the doped Si. Further pyrometallurgical smelting at 1520 °C using a CaO-CaF2-SiO2 slag mixture to scavenge the residual metals after acid leaching could finally produce > 99.998%-grade Si. A process based on HNO3/NaOH leaching and subsequent smelting is proposed for recycling Si from rejected or recycled photovoltaic modules. Implications: The photovoltaic industry is considering options for recycling PV modules to recover metals such as Si, Ag, Cu, and Al used in the manufacturing of PV cells, both to retain its "green" image and to comply with current legislation in several countries. An evaluation of the potential resources made available from PV wastes and the technologies used for processing these materials is therefore of significant importance to the industry. Of interest are the costs of processing and the potential revenues gained from recycling, which should determine the viability of economic recycling of PV modules in the future.

  15. Affordances and limitations of learning analytics for computer-assisted language learning: a case study of the VITAL project

    OpenAIRE

    Gelan, Anouk; Fastré, Greet; Verjans, Martine; Martin, Niels; Janssenswillen, Gert; Creemers, Mathijs; Lieben, Jonas; Depaire, Benoît; Thomas, Michael

    2018-01-01

    Learning analytics (LA) has emerged as a field that offers promising new ways to prevent drop-out and aid retention, and research suggests that large datasets of learner activity can be used to understand online learning behaviour and improve pedagogy. While the use of LA in language learning has received little attention to date, available research suggests that LA could provide valuable insights into task design for instructors and materials designers, as well as help students wi...

  16. 3D Printed Paper-Based Microfluidic Analytical Devices

    Directory of Open Access Journals (Sweden)

    Yong He

    2016-06-01

    Full Text Available As a pump-free and lightweight analytical tool, paper-based microfluidic analytical devices (μPADs) attract more and more interest. If the flow speed of a μPAD can be programmed, analytical sequences can be designed, making such devices even more attractive. This report presents a novel μPAD, driven by the capillary force of cellulose powder and printed by a desktop three-dimensional (3D) printer, which has some promising features, such as easy fabrication and programmable flow speed. First, a suitably sized substrate with open microchannels on its surface is printed. Next, the surface of the substrate is covered with a thin layer of polydimethylsiloxane (PDMS) to seal the micro gaps caused by 3D printing. Then, the microchannels are filled with a mixture of cellulose powder and deionized water in an appropriate proportion. After drying in an oven at 60 °C for 30 min, the device is ready for use. Because different channel depths can be easily printed, the capillary flow speed through the cellulose powder in the microchannels can be programmed. A series of microfluidic analytical experiments, including quantitative analysis of nitrite ions and fabrication of a T-sensor, were used to demonstrate its capability. As the desktop 3D printer (D3DP) is very cheap and accessible, this device can be rapidly printed in the field at low cost and has promising potential in point-of-care (POC) systems or as a lightweight platform for analytical chemistry.
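The depth-programmed flow the abstract describes can be rationalized with the Lucas-Washburn relation for capillary wicking, L(t) = sqrt(γ·r·cosθ·t / (2μ)): a deeper channel packed with cellulose powder behaves like a larger effective capillary radius and wicks faster. The sketch below is illustrative only; the pore radii, contact angle, and fluid properties are assumptions, not measurements from the paper.

```python
import math

# Lucas-Washburn wicking distance: L(t) = sqrt(gamma * r * cos(theta) * t / (2 * mu)).
# All parameter values below are illustrative assumptions, not data from the paper.

def wicking_distance(gamma: float, r_eff: float, theta_deg: float, mu: float, t: float) -> float:
    """Wicked length (m) after time t (s) for an effective capillary radius r_eff (m)."""
    return math.sqrt(gamma * r_eff * math.cos(math.radians(theta_deg)) * t / (2.0 * mu))

gamma = 0.072  # surface tension of water, N/m
mu = 1.0e-3    # viscosity of water, Pa*s
theta = 45.0   # assumed contact angle, degrees

# Two hypothetical effective pore radii for shallow vs. deep printed channels.
shallow = wicking_distance(gamma, 2e-6, theta, mu, t=60.0)
deep = wicking_distance(gamma, 8e-6, theta, mu, t=60.0)
print(f"shallow channel: {shallow * 100:.1f} cm wicked in 60 s")
print(f"deep channel:    {deep * 100:.1f} cm wicked in 60 s")
```

Because the wicked length scales with the square root of the effective radius, quadrupling the radius doubles the distance covered in a given time, which is the lever the programmable channel depth provides.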

  17. Next generation interactive tool as a backbone for universal access to electricity

    DEFF Research Database (Denmark)

    Moner-Girona, Magda; Puig, Daniel; Mulugetta, Yacob

    2018-01-01

    Energy planning in rural areas and in developing countries most often relies on the outputs of specialised analytical tools, of which only a handful have been developed. Over the years these tools have been upgraded, and the newest among them take into consideration, to a greater or lesser extent...

  18. User's and reference guide to the INEL RML/analytical radiochemistry sample tracking database version 1.00

    International Nuclear Information System (INIS)

    Femec, D.A.

    1995-09-01

    This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.
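The CREATE-SCHEMA and BUILD-SCREEN tools mentioned in the report are not publicly documented, so as a hedged analogue only, a minimal schema-to-SQL generator sketches the general pattern such code-generating tools follow (the table and column names below are invented for illustration, not taken from the INEL database):

```python
# Minimal analogue of a schema-driven code generator, in the spirit of the
# report's CREATE-SCHEMA tool (whose actual input format is not public).
# The sample table and columns below are invented for illustration.

def generate_create_table(table: str, columns: dict[str, str]) -> str:
    """Emit a SQL CREATE TABLE statement from a {column_name: sql_type} mapping."""
    cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in columns.items())
    return f"CREATE TABLE {table} (\n  {cols}\n);"

sample_schema = {
    "sample_id": "INTEGER PRIMARY KEY",
    "received_date": "DATE NOT NULL",
    "analysis_type": "VARCHAR(32)",
    "status": "VARCHAR(16)",
}
print(generate_create_table("sample_tracking", sample_schema))
```

Generating the DDL (and, analogously, the UI screens) from one declarative schema keeps database and interface in sync, which is presumably why the report's authors automated both from common input files.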

  19. World's Most Valuable Brand Resonation With Categories of Different Customer Needs

    Directory of Open Access Journals (Sweden)

    Kaspars VIKSNE

    2017-09-01

    Full Text Available One of the key performance indicators of brand success is its value. Brand value is an outcome of a brand's performance in the market and depends largely on the brand's ability to satisfy certain customer needs. For the greatest success in the world market, a brand should resonate with its ability to satisfy some of the most universal customer needs. In this paper the authors seek to find out which needs the world's most successful brands resonate with; the paper's goal is therefore to determine which customer needs the world's most valuable brands primarily satisfy. In the first part of the paper the authors briefly evaluate Maslow's theory of needs. In the second part the authors identify the main challenges of brand valuation and briefly describe today's most valuable brands. In the third part the authors analyze whether resonating with a certain human need makes a brand more valuable. In the last part the authors summarize the main findings and give recommendations for better marketing practices to other brands whose owners have high market ambitions. To attain the paper's goal, the authors use the following research methods: comparative analysis for comparing brands across different brand rankings; content analysis for determining which need satisfaction brand advertisements resonate with; and data analysis to quantify the results gathered from the content analysis.

  20. Applying Google Analytics to Web Metrics Analysis of a Digital Archive Website: A Web Metrics Study on Taiwan Baseball Wiki Using Google Analytics

    Directory of Open Access Journals (Sweden)

    Sinn-Cheng Lin

    2010-04-01

    Full Text Available The ultimate purpose of digital archives is to provide users with easy access to archived data; the assessment of a digital archive website should therefore be made from the users' perspectives, so that users' needs can be better understood, potential design problems can be more easily revealed, and the website can be improved accordingly. This study analyzed Taiwan Baseball Wiki, a very popular website dedicated to baseball archives in Taiwan. Google Analytics, a well-known web metrics tool, was used as the primary research instrument, supplemented by a questionnaire survey conducted to collect the users' profiles. Questions such as the 5W1H (i.e., who, when, where, why, what, and how) regarding the users and the use of the website, as well as the users' satisfaction levels, were investigated. The results of the research provide valuable reference for future improvement of the website, especially in the areas of website management and administration. Furthermore, this study recommends web metrics tools to other websites that provide digital archives; the application of these tools can help with understanding and optimizing web usage.
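Google Analytics itself is a hosted service, but the kind of who/when/what web-metrics questions the study asks can be sketched against a raw access log. A minimal illustration follows; the log lines are invented sample data in Common Log Format, not records from Taiwan Baseball Wiki:

```python
from collections import Counter

# Minimal web-metrics sketch over Common Log Format lines (invented sample data),
# illustrating the who / when / what questions the study answers with Google Analytics.

log_lines = [
    '66.249.66.1 - - [12/Apr/2010:10:01:02 +0800] "GET /wiki/Main_Page HTTP/1.1" 200 5120',
    '140.112.8.9 - - [12/Apr/2010:10:05:40 +0800] "GET /wiki/CPBL HTTP/1.1" 200 7431',
    '140.112.8.9 - - [12/Apr/2010:21:12:11 +0800] "GET /wiki/Main_Page HTTP/1.1" 200 5120',
]

visitors = Counter(line.split()[0] for line in log_lines)             # who: client IPs
hours = Counter(line.split("[")[1][12:14] for line in log_lines)      # when: hour of day
pages = Counter(line.split('"')[1].split()[1] for line in log_lines)  # what: requested paths

print("unique visitors:", len(visitors))           # 2
print("busiest hour:", hours.most_common(1)[0][0]) # 10
print("top page:", pages.most_common(1)[0][0])     # /wiki/Main_Page
```

A hosted analytics tool answers the same questions without log access by instrumenting pages with a tracking snippet, which is why the study could pair it with a questionnaire rather than server-side parsing.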