WorldWideScience

Sample records for integrated tool set

  1. APMS: An Integrated Set of Tools for Measuring Safety

    Science.gov (United States)

    Statler, Irving C.; Reynard, William D. (Technical Monitor)

    1996-01-01

    statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.

  2. Idea: an integrated set of tools for sustainable nuclear decommissioning projects

    International Nuclear Information System (INIS)

    Detilleux, M.; Centner, B.; Vanderperre, S.; Wacquier, W.

    2008-01-01

    Decommissioning of nuclear installations constitutes an important challenge and must prove to the public that the whole nuclear life cycle is fully mastered by the nuclear industry. This could lead to easier public acceptance of the construction of new nuclear power plants. When ceasing operation, nuclear installation owners and operators look for solutions to assess and keep decommissioning costs at a reasonable level, to fully characterise waste streams (in particular the radiological inventories of difficult-to-measure radionuclides) and to reduce personnel exposure during decommissioning activities, taking into account various project-, site- and country-specific constraints. In response to this need, Tractebel Engineering has developed IDEA (Integrated DEcommissioning Application), an integrated set of computer tools to support the engineering activities carried out in the frame of a decommissioning project. IDEA provides optimized solutions from an economic, environmental, social and safety perspective. (authors)

  3. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extent of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information handling (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. It is therefore important to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms combines low-level image processing and geospatial information handling tools with high-level workflows. In particular, two main products are released under the GPL license: developer-oriented source code and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: Given the lack of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models such as the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30 January 2014
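    The SENSUM workflows themselves build on OrfeoToolbox, OpenCV and GDAL; as a hedged illustration of the kind of vulnerability proxy the record describes (built-up density and built-up regularity), the sketch below works on a plain NumPy binary built-up mask. The function name and the cell-based aggregation are illustrative assumptions, not the project's actual API.

```python
import numpy as np

def builtup_vulnerability_proxies(mask: np.ndarray, cell: int = 8):
    """Toy vulnerability proxies from a binary built-up mask.

    mask: 2-D array of 0/1 (1 = built-up pixel).
    cell: size of the square aggregation window in pixels.
    Returns (density, compactness) arrays, one value per cell.
    """
    h = (mask.shape[0] // cell) * cell
    w = (mask.shape[1] // cell) * cell
    m = mask[:h, :w].astype(float)
    blocks = m.reshape(h // cell, cell, w // cell, cell)
    density = blocks.mean(axis=(1, 3))          # built-up fraction per cell

    # Edge pixels: built-up pixels with at least one non-built 4-neighbour.
    padded = np.pad(m, 1, constant_values=0)
    neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
             padded[1:-1, :-2] + padded[1:-1, 2:])
    edges = ((m == 1) & (neigh < 4)).astype(float)
    eb = edges.reshape(h // cell, cell, w // cell, cell).sum(axis=(1, 3))
    area = blocks.sum(axis=(1, 3))
    # Higher edge-to-area ratio = more fragmented / irregular built-up pattern.
    compactness = np.divide(eb, area, out=np.zeros_like(eb), where=area > 0)
    return density, compactness

# A solid block is denser and more compact than a checkerboard of pixels.
solid = np.zeros((8, 8), dtype=int); solid[2:6, 2:6] = 1
checker = np.indices((8, 8)).sum(axis=0) % 2
d1, c1 = builtup_vulnerability_proxies(solid)
d2, c2 = builtup_vulnerability_proxies(checker)
```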

  4. PACOM: A Versatile Tool for Integrating, Filtering, Visualizing, and Comparing Multiple Large Mass Spectrometry Proteomics Data Sets.

    Science.gov (United States)

    Martínez-Bartolomé, Salvador; Medina-Aunon, J Alberto; López-García, Miguel Ángel; González-Tejedo, Carmen; Prieto, Gorka; Navajas, Rosana; Salazar-Donate, Emilio; Fernández-Costa, Carolina; Yates, John R; Albar, Juan Pablo

    2018-04-06

    Mass-spectrometry-based proteomics has evolved into a high-throughput technology in which numerous large-scale data sets are generated from diverse analytical platforms. Furthermore, several scientific journals and funding agencies have emphasized the storage of proteomics data in public repositories to facilitate its evaluation, inspection, and reanalysis. (1) As a consequence, public proteomics data repositories are growing rapidly. However, tools are needed to integrate multiple proteomics data sets to compare different experimental features or to perform quality control analysis. Here, we present a new Java stand-alone tool, Proteomics Assay COMparator (PACOM), that is able to import, combine, and simultaneously compare numerous proteomics experiments to check the integrity of the proteomic data as well as verify data quality. With PACOM, the user can detect sources of error that may have been introduced in any step of a proteomics workflow and that influence the final results. Data sets can be easily compared and integrated, and data quality and reproducibility can be visually assessed through a rich set of graphical representations of proteomics data features as well as a wide variety of data filters. Its flexibility and easy-to-use interface make PACOM a unique tool for daily use in a proteomics laboratory. PACOM is available at https://github.com/smdb21/pacom.
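    PACOM itself is a Java GUI application; as a hedged, minimal illustration of the kind of cross-data-set comparison it performs, the sketch below computes pairwise overlap and Jaccard similarity between protein identification lists. The run names and accessions are made up for the example.

```python
from itertools import combinations

def compare_runs(runs: dict[str, set[str]]):
    """Pairwise overlap statistics for proteomics identification lists.

    runs: mapping of run name -> set of identified protein accessions.
    Returns {(run_a, run_b): (shared_count, jaccard_similarity)}.
    """
    stats = {}
    for a, b in combinations(sorted(runs), 2):
        shared = runs[a] & runs[b]
        union = runs[a] | runs[b]
        stats[(a, b)] = (len(shared), len(shared) / len(union) if union else 0.0)
    return stats

# Two hypothetical replicate runs sharing two of four distinct accessions.
runs = {
    "replicate_1": {"P12345", "P68871", "Q9Y6K9"},
    "replicate_2": {"P12345", "P68871", "P02768"},
}
stats = compare_runs(runs)
```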

  5. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

    large areas with information from an optimum number of new investigation boreholes, existing boreholes, logs and water samples to get an integrated and detailed description of the groundwater resources and their vulnerability. Development of more time efficient and airborne geophysical data acquisition...... platforms (e.g. SkyTEM) have made large-scale mapping attractive and affordable in the planning and administration of groundwater resources. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored...

  6. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  7. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. However, a crucial missing step on the path from "data-to-knowledge-to-innovation" is our limited understanding of the biological and clinical contexts associated with the data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cell database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
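    GeneAnalytics' scoring algorithms are proprietary; a common baseline for the gene set enrichment idea the record describes is the hypergeometric test, sketched here with SciPy. The gene identifiers and universe size are illustrative, not from the tool.

```python
from scipy.stats import hypergeom

def enrichment_p(query: set, gene_set: set, universe_size: int) -> float:
    """P-value that `query` overlaps `gene_set` at least as much as observed,
    under random sampling from a universe of `universe_size` genes."""
    k = len(query & gene_set)           # observed overlap
    # sf(k-1) = P(X >= k) for X ~ Hypergeom(M=universe, n=set size, N=query size)
    return hypergeom.sf(k - 1, universe_size, len(gene_set), len(query))

# Toy example: 5 of 10 query genes fall in a 50-gene pathway, universe 20,000.
query = {f"G{i}" for i in range(10)}
pathway = {f"G{i}" for i in range(5)} | {f"P{i}" for i in range(45)}
p = enrichment_p(query, pathway, 20000)
```

    Such an overlap is far larger than chance, so the p-value is tiny; real tools add multiple-testing correction across many gene sets.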

  8. Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation Electricore, Inc.

    Energy Technology Data Exchange (ETDEWEB)

    Daye, Tony [Green Power Labs (GPL), San Diego, CA (United States)

    2013-09-30

    This project will enable utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real time simulation of distributed power generation within utility grids with the emphasis on potential applications in day ahead (market) and real time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing Load and Generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers but extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate, direct operational cost savings for Energy Marketing for Day Ahead generation commitments, Real Time Operations, Load Forecasting (at an aggregate system level for Day Ahead), Demand Response, Long term Planning (asset management), Distribution Operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.
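    As a hedged sketch of the basic operation such a toolset supports, the snippet below folds a distributed-PV forecast into a day-ahead load forecast to obtain net load. The function name and the hourly numbers are illustrative, not from the project.

```python
def net_load(gross_load_mw: list[float], pv_forecast_mw: list[float]) -> list[float]:
    """Day-ahead net load = gross load minus forecast distributed PV output.

    Both inputs are hourly series of equal length; a negative net load would
    indicate reverse flow and is left as-is for the operator to flag.
    """
    if len(gross_load_mw) != len(pv_forecast_mw):
        raise ValueError("series must cover the same hours")
    return [load - pv for load, pv in zip(gross_load_mw, pv_forecast_mw)]

# Midday PV depresses net load; the evening ramp reappears as PV drops off.
load = [950, 1000, 1100, 1200, 1150]
pv = [0, 120, 300, 180, 10]
print(net_load(load, pv))   # -> [950, 880, 800, 1020, 1140]
```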

  9. From psychotherapy to e-therapy: the integration of traditional techniques and new communication tools in clinical settings.

    Science.gov (United States)

    Castelnuovo, Gianluca; Gaggioli, Andrea; Mantovani, Fabrizia; Riva, Giuseppe

    2003-08-01

    Technology is starting to influence psychological fields. In particular, computer-mediated communication (CMC) is providing new tools that can be fruitfully applied in psychotherapy. These new technologies do not substitute for traditional techniques and approaches, but they can be integrated into the clinical process, enhancing particular steps or making them easier. This paper focuses on the concept of e-therapy as a new modality for helping people resolve life and relationship issues. It uses the power and convenience of the Internet to allow synchronous and asynchronous communication between patient and therapist. It is important to underline that e-therapy is not an alternative treatment, but a resource that can be added to traditional psychotherapy. The paper also discusses how different forms of CMC can be fruitfully applied in psychology and psychotherapy, evaluating their effectiveness in clinical practice. To enhance the diffusion of e-therapy, further research is needed to evaluate all the pros and cons.

  10. A Set of Web-based Tools for Integrating Scientific Research and Decision-Making through Systems Thinking

    Science.gov (United States)

    Currently, many policy and management decisions are made without considering the goods and services humans derive from ecosystems and the costs associated with protecting them. This approach is unlikely to be sustainable. Conceptual frameworks provide a tool for capturing, visual...

  11. Interactive Electronic Decision Trees for the Integrated Primary Care Management of Febrile Children in Low Resource Settings - Review of existing tools.

    Science.gov (United States)

    Keitel, Kristina; D'Acremont, Valérie

    2018-04-20

    The lack of effective, integrated diagnostic tools poses a major challenge to the primary care management of febrile childhood illnesses. These limitations are especially evident in low-resource settings and are often inappropriately compensated by antimicrobial over-prescription. Interactive electronic decision trees (IEDTs) have the potential to close these gaps: guiding antibiotic use and better identifying serious disease. This narrative review summarizes existing IEDTs to provide an overview of their degree of validation, as well as to identify gaps in current knowledge and prospects for future innovation. A structured literature review in PubMed and Embase was complemented by a Google search and contact with developers. Ten integrated IEDTs were identified: three (eIMCI, REC, and Bangladesh digital IMCI) based on Integrated Management of Childhood Illnesses (IMCI); four (SL eCCM, MEDSINC, e-iCCM, and D-Tree eCCM) on Integrated Community Case Management (iCCM); two (ALMANACH, MSFeCARE) with a modified IMCI content; and one (ePOCT) that integrates novel content with biomarker testing. The types of publications and evaluation studies varied greatly: the content and evidence base were published for two (ALMANACH and ePOCT), which were also validated in efficacy studies. Other types of evaluations, such as compliance and acceptability, were available for D-Tree eCCM, eIMCI, and ALMANACH. Several evaluations are still ongoing. Future prospects include conducting effectiveness and impact studies using data gathered through larger studies to adapt the medical content to local epidemiology, improving the software and sensors, and assessing factors that influence compliance and scale-up. IEDTs are valuable tools that have the potential to improve management of febrile children in primary care and increase the rational use of diagnostics and antimicrobials. Next steps in the evidence pathway should be larger effectiveness and impact studies (including cost analysis) and
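    As a hedged sketch of how an IEDT encodes triage logic as data, the fragment below models one branch of an IMCI-style tree. The questions and recommendations are invented for illustration only; real IEDT content is clinically validated and far richer.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One question in an interactive electronic decision tree."""
    question: str
    # Maps an answer to either a follow-up Node or a final recommendation.
    branches: dict = field(default_factory=dict)

    def run(self, answers: dict):
        """Walk the tree using pre-collected answers; return the leaf string."""
        node = self
        while isinstance(node, Node):
            node = node.branches[answers[node.question]]
        return node

# Illustrative fragment only, not clinical guidance.
tree = Node("danger_sign_present", {
    True: "urgent referral",
    False: Node("fever_over_7_days", {
        True: "laboratory workup",
        False: "symptomatic care + follow-up",
    }),
})
```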

  12. VLM Tool for IDS Integration

    Directory of Open Access Journals (Sweden)

    Cǎtǎlin NAE

    2010-03-01

    This paper is dedicated to a very specific type of analysis tool (VLM, the Vortex Lattice Method) to be integrated in an IDS (Integrated Design System) tailored to the needs of the small aircraft industry. The major interest is the possibility of simulating, at very low computational cost, a preliminary set of global aerodynamic characteristics (lift, drag, pitching moment) and the aerodynamic derivatives for longitudinal and lateral-directional stability analysis. This enables fast investigation of the influence of configuration changes in a very efficient computational environment. Using experimental data and/or CFD information for a specific calibration of the VLM, the reliability of the analysis may be increased so that a first (iteration zero) aerodynamic evaluation of the preliminary 3D configuration is possible. The output of this tool is the basic-state aerodynamics and the associated stability and control derivatives, as well as a complete set of information on specific loads on major airframe components. The major interest in using and validating this type of method comes from the possibility of integrating it as a tool in an IDS system for the conceptual design phase, as considered for development in the CESAR project (EU FP6 Integrated Project).
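    The paper's VLM code is not public; as a hedged stand-in for the kind of "iteration zero" lift estimate it describes, the sketch below uses classical Prandtl lifting-line theory, a close relative of VLM for the spanwise lift problem. The function name and the untwisted rectangular-planform assumption are illustrative, not from the paper.

```python
import numpy as np

def lifting_line_CL(AR: float, alpha_rad: float, n_terms: int = 20) -> float:
    """Lift coefficient of an untwisted rectangular wing (thin-airfoil
    sections, zero-lift angle 0) from Prandtl's lifting-line equation.

    Circulation is expanded as Gamma = 2*b*V*sum_n A_n*sin(n*theta); the
    collocation system is solved for the Fourier coefficients A_n, and the
    wing lift coefficient follows as CL = pi * AR * A_1.
    """
    n = np.arange(1, n_terms + 1)
    theta = n * np.pi / (n_terms + 1)        # collocation points in (0, pi)
    S = np.sin(np.outer(theta, n))           # sin(n * theta_j)
    # alpha = (2b/(pi c)) sum A_n sin(n th) + sum n A_n sin(n th)/sin(th)
    M = (2.0 * AR / np.pi) * S + (n[None, :] * S) / np.sin(theta)[:, None]
    A = np.linalg.solve(M, np.full(n_terms, alpha_rad))
    return float(np.pi * AR * A[0])

# Aspect-ratio-8 rectangular wing at 5 degrees: CL should land a little
# below the elliptic-wing estimate 2*pi*alpha/(1 + 2/AR), about 0.44.
CL = lifting_line_CL(AR=8.0, alpha_rad=np.radians(5.0))
```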

  13. Integrated Variable-Fidelity Tool Set For Modeling and Simulation of Aeroservothermoelasticity -Propulsion (ASTE-P) Effects For Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  14. The Overture Initiative Integrating Tools for VDM

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Battle, Nick; Ferreira, Miguel

    2010-01-01

    Overture is a community-based initiative that aims to develop a common open-source platform integrating a range of tools for constructing and analysing formal models of systems using VDM. The mission is both to provide an industrial-strength tool set for VDM and to provide an environment...

  15. Wind Integration Data Sets | Grid Modernization | NREL

    Science.gov (United States)

    NREL's wind integration data sets provide ten-minute time-series wind data for 2004, 2005, and 2006 to help energy professionals perform wind integration studies and estimate power production from hypothetical wind power plants.
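    A typical first use of such ten-minute power series is an energy and capacity-factor estimate, sketched below. The numbers are synthetic, not NREL data.

```python
def capacity_factor(power_mw: list[float], rated_mw: float, step_min: float = 10.0):
    """Energy (MWh) and capacity factor from a fixed-step power series."""
    hours = step_min / 60.0
    energy_mwh = sum(power_mw) * hours
    cf = energy_mwh / (rated_mw * hours * len(power_mw))
    return energy_mwh, cf

# One synthetic hour (six 10-minute samples) from a 100 MW plant.
energy, cf = capacity_factor([20, 35, 50, 80, 65, 50], rated_mw=100)
```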

  16. IDMT, Integrated Decommissioning Management Tools

    International Nuclear Information System (INIS)

    Alemberti, A.; Castagna, P.; Marsiletti, M.; Orlandi, S.; Perasso, L.; Susco, M.

    2005-01-01

    Nuclear power plant decommissioning requires a number of demolition activities related to civil works and systems, as well as the construction of temporary facilities used for treatment and conditioning of the dismantled parts. The presence of a radiological, potentially hazardous environment, due to the specific configuration and history of the plant, requires a professional, expert and qualified approach approved by the national safety authority. Dismantling activities must be designed, planned and analysed in detail during an evaluation phase, taking into account the different scenarios generated by possible dismantling sequences and the specific waste treatments to be implemented. The optimisation process becomes very challenging given the requirement to minimise the radiological impact on exposed workers and the public during normal and accident conditions. While remotely operated equipment and waste treatment and conditioning facilities may be designed with this primary goal in mind, a centralised management system and corresponding software tools also have to be designed and operated to guarantee compliance with the imposed limits as well as the traceability of wastes. Ansaldo Nuclear Division has been strongly involved in the development of a qualified and certified software environment to manage the most critical activities of a decommissioning project. The IDMT system (Integrated Decommissioning Management Tools) provides a set of stand-alone, user-friendly applications able to work in an integrated configuration to guarantee waste identification and traceability during the treatment and conditioning process, as well as location and identification at the final repository site. Additionally, the system can be used to identify, analyse and compare different operating scenarios to be optimised in terms of both economic and radiological considerations. The paper provides an overview of the different phases of
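    The IDMT schema is not public; as a hedged sketch of the traceability idea the record describes, the fragment below models a waste item as a record whose stages must be passed in order, leaving an audit trail. Stage names and fields are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical stage order; a real system would encode the licensed process.
STAGES = ["dismantled", "characterised", "treated", "conditioned", "stored"]

@dataclass
class WasteItem:
    """A dismantled component tracked through the decommissioning chain."""
    item_id: str
    activity_bq: float                      # radiological inventory
    history: list = field(default_factory=list)

    def advance(self, stage: str, note: str = ""):
        # Enforce the stage order so traceability gaps cannot occur silently.
        expected = STAGES[len(self.history)]
        if stage != expected:
            raise ValueError(f"expected stage {expected!r}, got {stage!r}")
        self.history.append((stage, note))

item = WasteItem("RPV-SEG-042", activity_bq=3.2e9)
item.advance("dismantled", "segmented under water")
item.advance("characterised", "gamma spectrometry + scaling factors")
```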

  17. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, M. H.; Hahmann, Andrea N.; Nielsen, T. S.

    This poster describes the status, as of April 2012, of the Public Service Obligation (PSO) funded project PSO 10464, "Integrated Wind Power Planning Tool". The project goal is to integrate a mesoscale numerical weather prediction (NWP) model with a statistical tool in order to better predict short...... term power variation from offshore wind farms, as well as to conduct forecast error assessment studies in preparation for later implementation of such a feature in an existing simulation model. The addition of a forecast error estimation feature will further increase the value of this tool, as it...

  18. Activity-Centred Tool Integration

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius

    2003-01-01

    This paper is concerned with integration of heterogeneous tools for system development. We argue that such tools should support concrete activities (e.g., programming, unit testing, conducting workshops) in contrast to abstract concerns (e.g., analysis, design, implementation). A consequence of this is that tools — or components — that support activities well should be integrated in ad-hoc, dynamic, and heterogeneous ways. We present a peer-to-peer architecture for this based on type-based publish/subscribe and give an example of its use.
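    The paper's architecture uses type-based publish/subscribe; a minimal in-process sketch of that dispatch style is below. The bus class and the event type are hypothetical, invented for illustration.

```python
from collections import defaultdict

class TypeBasedBus:
    """Minimal type-based publish/subscribe: subscribers register for an
    event *type* and receive every published instance of it (or a subtype)."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subs[event_type].append(handler)

    def publish(self, event):
        # Type-based dispatch: match on the event's class, including subclasses.
        for etype, handlers in self._subs.items():
            if isinstance(event, etype):
                for h in handlers:
                    h(event)

class TestRunFinished:                       # hypothetical activity event
    def __init__(self, failures): self.failures = failures

bus = TypeBasedBus()
seen = []
bus.subscribe(TestRunFinished, lambda e: seen.append(e.failures))
bus.publish(TestRunFinished(failures=0))
```

    Because components couple only to event types, tools can join and leave the peer group without the others changing, which is what makes the ad-hoc, dynamic integration the paper argues for possible.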

  19. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, M. H.; Giebel, Gregor; Nielsen, T. S.

    2012-01-01

    This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the working title "Integrated Wind Power Planning Tool". The project commenced October 1, 2011, and the goal is to integrate a numerical weather prediction (NWP) model with purely...... model to be developed in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. This integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting......

  20. Solar Integration Data Sets | Grid Modernization | NREL

    Science.gov (United States)

    NREL provides the energy community with modeled solar data for energy professionals, such as transmission planners, utility planners, project developers, and university researchers, who perform solar integration studies and

  1. Final Technical Report for Contract No. DE-EE0006332, "Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation"

    Energy Technology Data Exchange (ETDEWEB)

    Cormier, Dallas [San Diego Gas & Electric, CA (United States); Edra, Sherwin [San Diego Gas & Electric, CA (United States); Espinoza, Michael [San Diego Gas & Electric, CA (United States); Daye, Tony [Green Power Labs, San Diego, CA (United States); Kostylev, Vladimir [Green Power Labs, San Diego, CA (United States); Pavlovski, Alexandre [Green Power Labs, San Diego, CA (United States); Jelen, Deborah [Electricore, Inc., Valencia, CA (United States)

    2014-12-29

    This project will enable utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real time simulation of distributed power generation within utility grids with the emphasis on potential applications in day ahead (market) and real time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing Load and Generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers but extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate, direct operational cost savings for Energy Marketing for Day Ahead generation commitments, Real Time Operations, Load Forecasting (at an aggregate system level for Day Ahead), Demand Response, Long term Planning (asset management), Distribution Operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  2. EPR design tools. Integrated data processing tools

    International Nuclear Information System (INIS)

    Kern, R.

    1997-01-01

    In all technical areas, planning and design have been supported by electronic data processing for many years. New data processing tools had to be developed for the European Pressurized Water Reactor (EPR). The work to be performed was split between KWU and Framatome and laid down in the Basic Design contract. The entire plant was reduced to a logical data structure; the circuit diagrams and flowsheets of the systems were drafted, the central data pool was established, the outlines of building structures were defined, the layout of plant components was planned, and the electrical systems were documented. Building construction engineering was also supported by data processing. The tasks laid down in the Basic Design were completed as so-called milestones. Additional data processing tools, also based on the central data pool, are required for the phases following the Basic Design phase, i.e. Basic Design Optimization; Detailed Design; Management; Construction; and Commissioning. (orig.) [de]

  3. Integrating the hospital library with patient care, teaching and research: model and Web 2.0 tools to create a social and collaborative community of clinical research in a hospital setting.

    Science.gov (United States)

    Montano, Blanca San José; Garcia Carretero, Rafael; Varela Entrecanales, Manuel; Pozuelo, Paz Martin

    2010-09-01

    Research in hospital settings faces several difficulties. Information technologies and certain Web 2.0 tools may provide new models to tackle these problems, allowing for a collaborative approach and bridging the gap between clinical practice, teaching and research. We aim to gather a community of researchers involved in the development of a network of learning and investigation resources in a hospital setting. A multi-disciplinary work group analysed the needs of the research community, studied the opportunities provided by Web 2.0 tools and finally defined the spaces that would be developed, describing their elements, members and different access levels. WIKINVESTIGACION is a collaborative web space that aims to integrate the management of all the hospital's teaching and research resources. It is composed of five spaces with different access privileges: the Research Group Space (a wiki for each individual research group), the Learning Resources Centre devoted to the library, the News Space, the Forum and the Repositories. The Internet, and most notably the Web 2.0 movement, is introducing overwhelming changes in our society. Research and teaching in the hospital setting will join this trend and take advantage of these tools to socialise and improve knowledge management.

  4. Ontology-based geographic data set integration

    NARCIS (Netherlands)

    Uitermark, H.T.J.A.; Uitermark, Harry T.; Oosterom, Peter J.M.; Mars, Nicolaas; Molenaar, Martien; Molenaar, M.

    1999-01-01

    In order to develop a system to propagate updates we investigate the semantic and spatial relationships between independently produced geographic data sets of the same region (data set integration). The goal of this system is to reduce operator intervention in update operations between corresponding
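    As a hedged sketch of one spatial-relationship primitive such an update-propagation system needs, the snippet below matches corresponding objects across two data sets by bounding-box overlap. The feature names and the IoU threshold are illustrative assumptions, not from the paper.

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (xmin, ymin, xmax, ymax)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def match_features(set_a, set_b, threshold=0.5):
    """Greedy correspondence between two feature dicts {feature_id: bbox}."""
    pairs, used = [], set()
    for ida, ba in set_a.items():
        best = max(((iou(ba, bb), idb) for idb, bb in set_b.items()
                    if idb not in used), default=(0.0, None))
        if best[0] >= threshold:
            pairs.append((ida, best[1]))
            used.add(best[1])
    return pairs

# The same building digitised slightly differently in two data sets.
a = {"bldg_1": (0, 0, 10, 10)}
b = {"house_7": (1, 1, 11, 11), "house_9": (40, 40, 50, 50)}
```

    Real systems combine such geometric evidence with semantic (ontology-level) correspondence between the two data sets' classification schemes.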

  5. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    by proposing potential subsequent design issues. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, these decisions are typically not connected to the models created during...... integration of formerly disconnected tools improves tool usability as well as decision maker productivity....

  6. Analytic tools for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, Vladimir A.

    2012-01-01

    The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice; the reader will be able to apply them in practice, and the book contains numerous examples. This book supersedes the author's previous Springer book "Evaluating Feynman Integrals" and its textbook version "Feynman Integral Calculus". Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added: one on sector decomposition, a second describing a new method by Lee, and a third concerning the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book by the author, "Applied Asymptotic Expansions in Momenta and Masses". This chapter describes, on the basis of papers that appeared after the publication of that book, how to algorithmically discover the regions relevant to a given limit within the strategy of expansion by regions. In addition, the chapters on the method of Mellin-Barnes representation and on the method of integration by parts have been substantially rewritten, with an emphasis on the corresponding algorithms and computer codes.
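    Two standard identities underlying the methods the blurb names, reproduced here from the standard literature for flavor: Feynman parameterization (the starting point for sector decomposition) and the Mellin-Barnes representation.

```latex
% Feynman parameterization of a product of two propagators:
\frac{1}{A_1^{\lambda_1} A_2^{\lambda_2}}
  = \frac{\Gamma(\lambda_1+\lambda_2)}{\Gamma(\lambda_1)\,\Gamma(\lambda_2)}
    \int_0^1 \mathrm{d}x\,
    \frac{x^{\lambda_1-1}\,(1-x)^{\lambda_2-1}}
         {\left[x A_1 + (1-x) A_2\right]^{\lambda_1+\lambda_2}}

% Mellin-Barnes representation, which trades a sum inside a power
% for a contour integral over Gamma functions:
\frac{1}{(A+B)^{\lambda}}
  = \frac{1}{2\pi i\,\Gamma(\lambda)}
    \int_{-i\infty}^{+i\infty} \mathrm{d}z\,
    \Gamma(\lambda+z)\,\Gamma(-z)\,\frac{B^{z}}{A^{\lambda+z}}
```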

  7. Analytic tools for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, Vladimir A. [Moscow State Univ. (Russian Federation). Skobeltsyn Inst. of Nuclear Physics

    2012-07-01

    The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice; the reader will be able to apply them in practice, and the book contains numerous examples. This book supersedes the author's previous Springer book "Evaluating Feynman Integrals" and its textbook version "Feynman Integral Calculus". Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added: one on sector decomposition, a second describing a new method by Lee, and a third concerning the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book by the author, "Applied Asymptotic Expansions in Momenta and Masses". This chapter describes, on the basis of papers that appeared after the publication of that book, how to algorithmically discover the regions relevant to a given limit within the strategy of expansion by regions. In addition, the chapters on the method of Mellin-Barnes representation and on the method of integration by parts have been substantially rewritten, with an emphasis on the corresponding algorithms and computer codes.

  8. Evaluation of integrated data sets: four examples

    International Nuclear Information System (INIS)

    Bolivar, S.L.; Freeman, S.B.; Weaver, T.A.

    1982-01-01

    Several large data sets have been integrated and utilized for rapid evaluation on a reconnaissance scale for the Montrose 1° x 2° quadrangle, Colorado. The data sets include Landsat imagery, hydrogeochemical and stream sediment analyses, airborne geophysical data, known mineral occurrences, and a geologic map. All data sets were registered to a 179 x 119 rectangular grid and projected onto Universal Transverse Mercator coordinates. A grid resolution of 1 km was used. All possible combinations of three, for most data sets, were examined for general geologic correlations by utilizing a color microfilm output. In addition, gray-level pictures of statistical output, e.g., factor analysis, have been employed to aid evaluations. Examples for the data sets dysprosium-calcium, lead-copper-zinc, and equivalent uranium-uranium in water-uranium in sediment are described with respect to geologic applications, base-metal regimes, and geochemical associations.

  9. SIRSALE: integrated video database management tools

    Science.gov (United States)

    Brunie, Lionel; Favory, Loic; Gelas, J. P.; Lefevre, Laurent; Mostefaoui, Ahmed; Nait-Abdesselam, F.

    2002-07-01

    Video databases became an active field of research during the last decade. The main objective of such systems is to let users search, access and play back distributed stored video data as conveniently as they do in traditional distributed databases. Hence, such systems need to deal with hard issues: (a) video documents generate huge volumes of data and are time sensitive (streams must be delivered at a specific bitrate), and (b) the content of video data is very hard to extract automatically and must be annotated by humans. To cope with these issues, many approaches have been proposed in the literature, including data models, query languages, video indexing, etc. In this paper, we present SIRSALE: a set of video database management tools that allow users to manipulate video documents and streams stored in large distributed repositories. All the proposed tools are based on generic models that can be customized for specific applications using ad hoc adaptation modules. More precisely, SIRSALE allows users to: (a) browse video documents by structure (sequences, scenes, shots) and (b) query the video database content by using a graphical tool adapted to the nature of the target video documents. This paper also presents an annotation interface which allows archivists to describe the content of video documents. All these tools are coupled to a video player integrating remote VCR functionalities and are based on active network technology. We present how dedicated active services allow optimized transport of video streams (with Tamanoir active nodes). We then describe experiments using SIRSALE on an archive of news videos and soccer matches. The system has been demonstrated to professionals with positive feedback. Finally, we discuss open issues and present some perspectives.
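The structural browsing the abstract describes (sequences, scenes, shots with human annotations) can be pictured as a tree query. The sketch below is an illustrative assumption about such a data model, not SIRSALE's actual schema or API:

```python
# Illustrative sketch of browsing a video document by structure:
# a document is a tree of sequences, scenes and shots, each carrying
# free-text annotations added by an archivist.
from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str                          # "sequence" | "scene" | "shot"
    label: str
    annotations: list = field(default_factory=list)
    children: list = field(default_factory=list)

def find_shots(node, keyword):
    """Return labels of shots whose annotations mention the keyword."""
    hits = []
    if node.kind == "shot" and any(keyword in a for a in node.annotations):
        hits.append(node.label)
    for child in node.children:
        hits.extend(find_shots(child, keyword))
    return hits

doc = Node("sequence", "match-1", children=[
    Node("scene", "first-half", children=[
        Node("shot", "shot-3", annotations=["goal, penalty kick"]),
        Node("shot", "shot-4", annotations=["crowd reaction"]),
    ]),
])
print(find_shots(doc, "goal"))  # -> ['shot-3']
```

A real system would index annotations rather than walk the tree per query, but the hierarchy-plus-annotation shape is the core idea.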

  10. Analytic Tools for Feynman Integrals

    CERN Document Server

    Smirnov, Vladimir A

    2012-01-01

    The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice.  This book supersedes the author’s previous Springer book “Evaluating Feynman Integrals” and its textbook version “Feynman Integral Calculus.” Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added:  One is on sector decomposition, while the second describes a new method by Lee. The third new chapter concerns the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, “Applied Asymptotic Expansions in Momenta and Masses,” by the author. This chapter describes, on t...

  11. Integrated Variable-Fidelity Tool Set for Modeling and Simulation of Aeroservothermoelasticity-Propulsion (ASTE-P) Effects for Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  12. Evaluation of Oracle Big Data Integration Tools

    OpenAIRE

    Urhan, Harun; Baranowski, Zbigniew

    2015-01-01

    The project's objective is to evaluate Oracle's Big Data Integration Tools. The project covers the evaluation of two of Oracle's tools: Oracle Data Integrator (Application Adapters for Hadoop), used to load data from an Oracle Database into Hadoop, and Oracle SQL Connectors for HDFS, used to query data stored on a Hadoop file system with SQL statements executed on an Oracle Database.

  13. Set-Valued Stochastic Equation with Set-Valued Square Integrable Martingale

    Directory of Open Access Journals (Sweden)

    Li Jun-Gang

    2017-01-01

    In this paper, we shall introduce the stochastic integral of a stochastic process with respect to a set-valued square integrable martingale. We shall then give the Aumann integral measurable theorem, and present the set-valued stochastic Lebesgue integral and the set-valued square integrable martingale integral equation. The existence and uniqueness of the solution to the set-valued stochastic integral equation are proved. The discussion will be useful in optimal control and in mathematical finance with psychological factors.

  14. Integrated Data Visualization and Virtual Reality Tool

    Science.gov (United States)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  15. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
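The unreachable-code detection described above reduces to a reachability check on the program's control-flow graph. The following sketch is a toy illustration of that idea, not the actual Intel 8086/68000 analyser; block names and graph shape are invented for the example:

```python
# Flag basic blocks that have no control-flow path from the entry point.
# cfg maps a block label to the labels of its successor blocks.
def find_unreachable(cfg, entry):
    seen = set()
    stack = [entry]
    while stack:                       # iterative depth-first search
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(cfg.get(node, []))
    return sorted(set(cfg) - seen)

# Toy module: 'dead' is never jumped to, so it is unreachable.
cfg = {
    "start": ["loop"],
    "loop":  ["loop", "exit"],         # self-edge: a loop construct
    "exit":  [],
    "dead":  ["exit"],                 # unreachable block
}
print(find_unreachable(cfg, "start"))  # -> ['dead']
```

Checks such as syntactically infinite loops work on the same graph, e.g. by looking for cycles with no edge leaving them.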

  16. Computational Design Tools for Integrated Design

    DEFF Research Database (Denmark)

    Holst, Malene Kirstine; Kirkegaard, Poul Henning

    2010-01-01

    An architectural conceptual sketching process, in which an architect works with the initial ideas for a design, is characterized by three phases: sketching, evaluation and modification. Basically, the architect needs to address three areas in the conceptual sketching phase: aesthetical, functional and technical requirements. The aim of the present paper is to address the problem of a vague or nonexistent link between the digital conceptual design tools used by architects and designers and engineering analysis and simulation tools. Based on an analysis of the architectural design process, different digital design methods are related to tasks in an integrated design process.

  17. An integrated computational tool for precipitation simulation

    Science.gov (United States)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer aided materials design is of increasing interest because the conventional approach solely relying on experimentation is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  18. Set-Valued Stochastic Lebesgue Integral and Representation Theorems

    Directory of Open Access Journals (Sweden)

    Jungang Li

    2008-06-01

    In this paper, we shall firstly illustrate why we should introduce set-valued stochastic integrals, and then we shall discuss some properties of set-valued stochastic processes and the relation between a set-valued stochastic process and its selection set. After recalling the Aumann type definition of the stochastic integral, we shall introduce a new definition of the Lebesgue integral of a set-valued stochastic process with respect to the time t. Finally we shall prove the representation theorem of the set-valued stochastic integral and discuss further properties that will be useful for studying set-valued stochastic differential equations and their applications.
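For reference, the Aumann-type definition that this abstract recalls is commonly stated as follows (a standard textbook form, not a quotation from the paper): the integral of a set-valued process $F$ over $[0,t]$ is the set of integrals of its integrable selections,

```latex
\left(\int_0^t F_s\,\mathrm{d}s\right)(\omega)
  \;=\; \left\{\, \int_0^t f_s(\omega)\,\mathrm{d}s \;:\; f \in S_F \,\right\},
```

where $S_F$ denotes the set of integrable selections of $F$, i.e. processes $f$ with $f_s(\omega) \in F_s(\omega)$ almost everywhere.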

  19. Evaluating online diagnostic decision support tools for the clinical setting.

    Science.gov (United States)

    Pryor, Marie; White, David; Potter, Bronwyn; Traill, Roger

    2012-01-01

    credibility of the clinical information. The evaluation methodology used here to assess the quality and comprehensiveness of clinical DDS tools was effective in identifying the most appropriate tool for the clinical setting. The use of clinical case scenarios is fundamental in determining the diagnostic accuracy and usability of the tools.

  20. Integration of g4tools in Geant4

    International Nuclear Information System (INIS)

    Hřivnáčová, Ivana

    2014-01-01

    g4tools, originally part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It allows the user to create and manipulate histograms and ntuples, and to write them in the supported file formats (ROOT, AIDA XML, CSV and HBOOK). It is integrated in Geant4 through analysis manager classes, thus providing a uniform interface to the g4tools objects and also hiding the differences between the classes for the different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes following user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example, and it is already used in a majority of the Geant4 examples within the Geant4 9.6 release. In this paper, we give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.
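g4tools itself is C++, but the create/fill/write workflow the abstract describes can be mimicked in a few lines. The sketch below is a deliberately minimal stand-in (a fixed-bin 1D histogram written out as CSV, one of the formats g4tools supports), not the g4tools or Geant4 analysis manager API:

```python
# Minimal 1D histogram: create, fill, and dump as CSV.
def make_h1(nbins, lo, hi):
    return {"edges": [lo + i * (hi - lo) / nbins for i in range(nbins + 1)],
            "counts": [0] * nbins}

def fill(h, value):
    edges, counts = h["edges"], h["counts"]
    if edges[0] <= value < edges[-1]:
        for i in range(len(counts)):       # linear scan is fine for a sketch
            if edges[i] <= value < edges[i + 1]:
                counts[i] += 1
                break

def to_csv(h):
    rows = ["bin_low,bin_high,count"]
    for i, c in enumerate(h["counts"]):
        rows.append(f"{h['edges'][i]},{h['edges'][i + 1]},{c}")
    return "\n".join(rows)

h = make_h1(4, 0.0, 4.0)
for v in [0.5, 1.5, 1.9, 3.2]:
    fill(h, v)
print(to_csv(h))
```

In Geant4 the analogous calls go through the analysis manager classes mentioned above, which also handle the other output formats uniformly.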

  1. Biological data integration: wrapping data and tools.

    Science.gov (United States)

    Lacroix, Zoé

    2002-06-01

    Nowadays scientific data is inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced data accessing, analyzing, and visualization tools. Building a digital library for scientific data requires accessing and manipulating data extracted from flat files or databases, documents retrieved from the Web as well as data generated by software. We present an approach to wrapping web data sources, databases, flat files, or data generated by tools through a database view mechanism. Generally, a wrapper has two tasks: it first sends a query to the source to retrieve data and, second builds the expected output with respect to the virtual structure. Our wrappers are composed of a retrieval component based on an intermediate object view mechanism called search views mapping the source capabilities to attributes, and an eXtensible Markup Language (XML) engine, respectively, to perform these two tasks. The originality of the approach consists of: 1) a generic view mechanism to access seamlessly data sources with limited capabilities and 2) the ability to wrap data sources as well as the useful specific tools they may provide. Our approach has been developed and demonstrated as part of the multidatabase system supporting queries via uniform object protocol model (OPM) interfaces.

  2. Ontology-based integration of topographic data sets

    NARCIS (Netherlands)

    Uitermark, HT; van Oosterom, PJM; Mars, NJI; Molenaar, M

    The integration of topographic data sets is defined as the process of establishing relationships between corresponding object instances in different, autonomously produced, topographic data sets of the same geographic space. The problem of integrating topographic data sets is in finding these

  3. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user friendly software package (MATRIXx) are used to provide a high level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and at selected points, such as in the plotting section, by inputting data. There are five evaluations such as the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.

  4. GAPIT: genome association and prediction integrated tool.

    Science.gov (United States)

    Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu

    2012-09-15

    Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.
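To make the association-scan idea concrete: a heavily simplified sketch follows. GAPIT's actual method (the compressed mixed linear model) additionally accounts for population structure and kinship; this toy version only fits a naive least-squares slope per marker, and the SNP names and data are invented:

```python
# Naive per-marker association scan: least-squares slope of phenotype
# on genotype (0/1/2 minor-allele counts) for each SNP.
def marker_scan(genotypes, phenotype):
    n = len(phenotype)
    ybar = sum(phenotype) / n
    effects = {}
    for snp, g in genotypes.items():
        gbar = sum(g) / n
        sxy = sum((gi - gbar) * (yi - ybar) for gi, yi in zip(g, phenotype))
        sxx = sum((gi - gbar) ** 2 for gi in g)
        effects[snp] = sxy / sxx if sxx else 0.0   # monomorphic SNP -> 0
    return effects

geno = {"snp1": [0, 1, 2, 2, 0, 1],    # varies with the phenotype
        "snp2": [1, 1, 1, 1, 1, 1]}    # monomorphic, uninformative
pheno = [1.0, 2.1, 3.9, 4.1, 0.9, 2.0]
eff = marker_scan(geno, pheno)
print(max(eff, key=lambda s: abs(eff[s])))  # -> snp1
```

A real GWAS additionally needs a significance test per marker and multiple-testing control, which GAPIT provides along with the mixed-model corrections.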

  5. Aumann Type Set-valued Lebesgue Integral and Representation Theorem

    Directory of Open Access Journals (Sweden)

    Jungang Li

    2009-03-01

    In this paper, we shall firstly illustrate why we should discuss the Aumann type set-valued Lebesgue integral of a set-valued stochastic process with respect to time t, under the condition that the set-valued stochastic process takes values in the nonempty compact subsets of d-dimensional Euclidean space. After recalling some basic results about set-valued stochastic processes, we shall secondly prove that the Aumann type set-valued Lebesgue integral of such a set-valued stochastic process is itself a set-valued stochastic process. Finally we shall give the representation theorem, and prove an important inequality for Aumann type set-valued Lebesgue integrals of set-valued stochastic processes with respect to t, which is useful for the study of set-valued stochastic differential inclusions with applications in finance.

  6. Mathematical tools for data mining set theory, partial orders, combinatorics

    CERN Document Server

    Simovici, Dan A

    2014-01-01

    Data mining essentially relies on several mathematical disciplines, many of which are presented in this second edition of the book. Topics include partially ordered sets, combinatorics, general topology, metric spaces, linear spaces and graph theory. To motivate the reader, a significant number of applications of these mathematical tools are included, ranging from association rules and clustering algorithms to classification, data constraints and logical data analysis. The book is intended as a reference for researchers and graduate students. The current edition is a significant expansion of the first

  7. Validation of the TRUST tool in a Greek perioperative setting.

    Science.gov (United States)

    Chatzea, Vasiliki-Eirini; Sifaki-Pistolla, Dimitra; Dey, Nilanjan; Melidoniotis, Evangelos

    2017-06-01

    The aim of this study was to translate, culturally adapt and validate the TRUST questionnaire in a Greek perioperative setting. The TRUST questionnaire assesses the relationship between trust and performance. The study assessed the levels of trust and performance in the surgery and anaesthesiology departments during a very stressful period for Greece (the economic crisis) and offers a user-friendly and robust assessment tool. The study concludes that the Greek version of the TRUST questionnaire is a reliable and valid instrument for measuring team performance among Greek perioperative teams. Copyright the Association for Perioperative Practice.

  8. An Integrative Review of Pediatric Fall Risk Assessment Tools.

    Science.gov (United States)

    DiGerolamo, Kimberly; Davis, Katherine Finn

    Patient fall prevention begins with accurate risk assessment. However, sustained improvements in prevention and quality of care require the use of validated fall risk assessment tools (FRATs). The goal of FRATs is to identify the patients at highest risk. Adult FRATs are often used as the basis for tools for pediatric patients. Though the factors associated with pediatric falls in the hospital setting are similar to those in adults, such as mobility, medication use, and cognitive impairment, adult FRATs and the factors associated with them do not adequately assess risk in children. Articles were limited to the English language, ages 0-21 years, and publication dates 2006-2015. The search yielded 22 articles. Ten were excluded because the population was primarily adult or the article lacked discussion of a FRAT. Critical appraisal and findings were synthesized using the Johns Hopkins Nursing evidence appraisal system. Twelve articles relevant to fall prevention in the pediatric hospital setting that discussed fall risk assessment and the use of a FRAT were reviewed. Comparison between FRATs, and assessment of their accuracy, is challenging when different classifications, definitions, risk stratifications, and inclusion criteria are used. Though there are several pediatric FRATs published in the literature, none have been found to be reliable and valid across institutions and diverse populations. This integrative review highlights the importance of choosing a FRAT based on an institution's identified risk factors and validating the tool for one's own patient population, as well as using the tool in conjunction with nursing clinical judgment to guide interventions. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Data Integration Tool: Permafrost Data Debugging

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Pulsifer, P. L.; Strawhacker, C.; Yarmey, L.; Basak, R.

    2017-12-01

    We developed a Data Integration Tool (DIT) to significantly speed up the manual processing needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the Global Terrestrial Network-Permafrost (GTN-P). The United States National Science Foundation funded this project through the National Snow and Ice Data Center (NSIDC) with the GTN-P to improve permafrost data access and discovery. We leverage this data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets (https://github.com/PermaData/DIT). Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs, incrementally interact with and evolve the widget workflows, and save those workflows for reproducibility. Taking ideas from visual programming found in the art and design domain, debugging and iterative design principles from software engineering, and the scientific data processing and analysis power of Fortran and Python, it was written for interactive, iterative data manipulation, quality control, processing, and analysis of inconsistent data in an easily installable application. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets (270 sites), and is scheduled to translate 10 more datasets (~1000 sites) from the legacy inactive site data holdings of the Frozen Ground Data Center (FGDC). Iterative development has provided the permafrost and wider scientific community with an extendable tool designed specifically for the iterative process of translating unruly data.
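The widget idea described above (small operations composed into an ordered, saveable workflow) can be sketched in a few lines. The widget names and the toy temperature data below are illustrative assumptions, not DIT's actual API:

```python
# A workflow is just an ordered list of widgets, each a data -> data
# function, applied in sequence.
def run_workflow(widgets, data):
    """Apply each widget in order and return the final result."""
    for widget in widgets:
        data = widget(data)
    return data

# Example widgets: parse raw text, drop a -99 missing-value sentinel,
# convert Fahrenheit to Celsius, then sort.
parse = lambda raw: [float(x) for x in raw.split(",")]
drop_missing = lambda xs: [x for x in xs if x != -99.0]
to_celsius = lambda xs: [round((x - 32.0) * 5.0 / 9.0, 2) for x in xs]

result = run_workflow([parse, drop_missing, to_celsius, sorted],
                      "32,-99,212,50")
print(result)  # -> [0.0, 10.0, 100.0]
```

Saving a workflow for reproducibility then amounts to persisting the widget list and its parameters, which is what makes the translate-once, rerun-often pattern practical for legacy datasets.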

  10. Workplace wellness using online learning tools in a healthcare setting.

    Science.gov (United States)

    Blake, Holly; Gartshore, Emily

    2016-09-01

    The aim was to develop and evaluate an online learning tool for use with UK healthcare employees, healthcare educators and healthcare students, to increase knowledge of workplace wellness as an important public health issue. A 'Workplace Wellness' e-learning tool was developed and peer-reviewed by 14 topic experts. It focused on six key areas relating to workplace wellness: work-related stress, musculoskeletal disorders, diet and nutrition, physical activity, smoking and alcohol consumption. Each key area provided current evidence-based information on causes and consequences, access to UK government reports and national statistics, and guidance on actions that could be taken to improve health within a workplace setting. 188 users (93.1% female, age 18-60) completed online knowledge questionnaires before (n = 188) and after (n = 88) exposure to the online learning tool. Baseline knowledge of workplace wellness was poor (n = 188; mean accuracy 47.6%, s.d. 11.94). Knowledge significantly improved from baseline to post-intervention (mean accuracy 77.5%, s.d. 13.71; t(75) = -14.801, p < 0.001), supporting the use of online learning and indicating scope for the development of further online packages relating to other important health parameters. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Integration issues of information engineering based I-CASE tools

    OpenAIRE

    Kurbel, Karl; Schnieder, Thomas

    1994-01-01

    Problems and requirements regarding integration of methods and tools across phases of the software-development life cycle are discussed. Information engineering (IE) methodology and I-CASE (integrated CASE) tools supporting IE claim to have an integrated view across major stages of enterprise-wide information-system development: information strategy planning, business area analysis, system design, and construction. In the main part of this paper, two comprehensive I-CASE tools, ADW (Applicati...

  12. Design Tools for Integrated Asynchronous Electronic Circuits

    National Research Council Canada - National Science Library

    Martin, Alain

    2003-01-01

    ..., simulation, verification, at the logical and physical levels. Situs has developed a business model for the commercialization of the CAD tools, and has designed the prototype of the tool suite based on this business model and the Caltech approach...

  13. Integrating New Technologies and Existing Tools to Promote Programming Learning

    Directory of Open Access Journals (Sweden)

    Álvaro Santos

    2010-04-01

    In recent years, many tools have been proposed to reduce the programming learning difficulties felt by many students. Our group has contributed to this effort through the development of several tools, such as VIP, SICAS, OOP-Anim, SICAS-COL and H-SICAS. Even though we had some positive results, the utilization of these tools does not seem to significantly reduce weaker students' difficulties. These students need stronger support to motivate them to engage in learning activities, inside and outside the classroom. Nowadays, many technologies are available to create contexts that may help to accomplish this goal. We consider that a promising path goes through the integration of solutions. In this paper we analyze the features, strengths and weaknesses of the tools developed by our group. Based on these considerations, we present a new environment integrating different types of pedagogical approaches, resources, tools and technologies for programming learning support. With this environment, currently under development, it will be possible to review contents and lessons based on video and screen captures. Support for collaborative tasks is another key point to improve and stimulate different models of teamwork. The platform will also allow the creation of various alternative models (learning objects) for the same subject, enabling personalized learning paths adapted to each student's knowledge level, needs and preferred learning styles. The learning sequences will work as a study organizer, following a suitable taxonomy according to each student's cognitive skills. Although the main goal of this environment is to support students with more difficulties, it will provide a set of resources supporting the learning of more advanced topics. Software engineering techniques and representations, object orientation and event programming are features that will be available in order to promote the learning progress of students.

  14. Goal setting: an integral component of effective diabetes care.

    Science.gov (United States)

    Miller, Carla K; Bauman, Jennifer

    2014-08-01

    Goal setting is a widely used behavior change tool in diabetes education and training. Prior research found that specific, relatively difficult but attainable goals, set within a specific timeframe, improved performance in sports and at the workplace. However, the impact of goal setting in diabetes self-care has not received extensive attention. This review examined the mechanisms underlying behavioral change according to goal setting theory and evaluated the impact of goal setting in diabetes intervention studies. Eight studies were identified that incorporated goal setting as the primary strategy to promote behavioral change in individual, group-based, and primary care settings among patients with type 2 diabetes. Improvements in diabetes-related self-efficacy, dietary intake, physical activity, and A1c were observed in some but not all studies. More systematic research is needed to determine the conditions and behaviors for which goal setting is most effective. Initial recommendations for using goal setting in diabetes patient encounters are offered.

  15. Integrating and scheduling an open set of static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven

    2006-01-01

    to keep the set of analyses open. We propose an approach to integrating and scheduling an open set of static analyses which decouples the individual analyses and coordinates the analysis executions such that the overall time and space consumption is minimized. The approach has been implemented...... for the Eclipse IDE and has been used to integrate a wide range of analyses such as finding bug patterns, detecting violations of design guidelines, or type system extensions for Java....

  16. Data Center IT Equipment Energy Assessment Tools: Current State of Commercial Tools, Proposal for a Future Set of Assessment Tools

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, Ben D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); National Univ., San Diego, CA (United States). School of Engineering

    2012-06-30

    This research project, conducted during the summer and fall of 2011, investigated commercially available assessment tools, with a focus on IT equipment, to see whether such tools could round out the DC Pro tool suite. In this research, the assessment capabilities of the various tools were compiled to help make "non-biased" information available to the public. This research should not be considered exhaustive of all existing vendor tools, although a number of vendors were contacted. Large IT equipment OEMs like IBM and Dell provide proprietary internal automated software that does not work on other vendors' IT equipment. However, we found two companies with products that showed promise in performing automated assessments of IT equipment from different OEM vendors. This report documents the research and provides a list of the software products reviewed, contacts and websites, product details, discussions with specific companies, a set of recommendations, and next steps. As a result of this research, a simple three-level approach to an IT assessment tool is proposed, along with an example of an assessment using a simple IT equipment data collection tool (Level 1, spreadsheet). The tool has been reviewed with the Green Grid and LBNL staff. The initial feedback has been positive, although further refinement of the tool will be necessary. Proposed next steps include a field trial of at least two vendors' software in two different data centers, with the objective of proving the concept and ascertaining the extent of energy and computational assessment, ease of installation, and opportunities for continuous improvement. Based on the discussions, field trials (or case studies) are proposed with two vendors: JouleX (expected to be completed in 2012) and Sentilla.

  17. Tool set for distributed real-time machine control

    Science.gov (United States)

    Carrott, Andrew J.; Wright, Christopher D.; West, Andrew A.; Harrison, Robert; Weston, Richard H.

    1997-01-01

    Demands for increased control capabilities require next generation manufacturing machines to comprise intelligent building elements, physically located at the point where the control functionality is required. Networks of modular intelligent controllers are increasingly designed into manufacturing machines and usable standards are slowly emerging. To implement a control system using off-the-shelf intelligent devices from multi-vendor sources requires a number of well defined activities, including (a) the specification and selection of interoperable control system components, (b) device independent application programming and (c) device configuration, management, monitoring and control. This paper briefly discusses the support for the above machine lifecycle activities through the development of an integrated computing environment populated with an extendable software toolset. The toolset supports machine builder activities such as initial control logic specification, logic analysis, machine modeling, mechanical verification, application programming, automatic code generation, simulation/test, version control, distributed run-time support and documentation. The environment itself consists of system management tools and a distributed object-oriented database which provides storage for the outputs from machine lifecycle activities and specific target control solutions.

  18. Force feedback facilitates multisensory integration during robotic tool use

    NARCIS (Netherlands)

    Sengül, A.; Rognini, G.; van Elk, M.; Aspell, J.E.; Bleuler, H.; Blanke, O.

    2013-01-01

    The present study investigated the effects of force feedback in relation to tool use on the multisensory integration of visuo-tactile information. Participants learned to control a robotic tool through a surgical robotic interface. Following tool-use training, participants performed a crossmodal

  19. Integrating total quality management in a library setting

    CERN Document Server

    Jurow, Susan

    2013-01-01

    Improve the delivery of library services by implementing total quality management (TQM), a system of continuous improvement employing participative management and centered on the needs of customers. Although TQM was originally designed for and successfully applied in business and manufacturing settings, this groundbreaking volume introduces strategies for translating TQM principles from the profit-based manufacturing sector to the library setting. Integrating Total Quality Management in a Library Setting shows librarians how to improve library services by implementing strategies such as employ

  20. Integrated Tools for Future Distributed Engine Control Technologies

    Science.gov (United States)

    Culley, Dennis; Thomas, Randy; Saus, Joseph

    2013-01-01

    Turbine engines are highly complex mechanical systems that are becoming increasingly dependent on control technologies to achieve system performance and safety metrics. However, the contribution of controls to these measurable system objectives is difficult to quantify due to a lack of tools capable of informing the decision makers. This shortcoming hinders technology insertion in the engine design process. NASA Glenn Research Center is developing a Hardware-in-the-Loop (HIL) platform and analysis tool set that will serve as a focal point for new control technologies, especially those related to the hardware development and integration of distributed engine control. The HIL platform is intended to enable rapid and detailed evaluation of new engine control applications, from conceptual design through hardware development, in order to quantify their impact on engine systems. This paper discusses the complex interactions of the control system, within the context of the larger engine system, and how new control technologies are changing that paradigm. The conceptual design of the new HIL platform is then described as a primary tool to address those interactions and how it will help feed the insertion of new technologies into future engine systems.

  1. Debating Life on Mars: The Knowledge Integration Environment (KIE) in Varied School Settings.

    Science.gov (United States)

    Shear, Linda

    Technology-enabled learning environments are beginning to come of age. Tools and frameworks are now available that have been shown to improve learning and are being deployed more widely in varied school settings. Teachers are now faced with the formidable challenge of integrating these promising new environments with the everyday context in which…

  2. Integrating Risk Analyses and Tools at the DOE Hanford Site

    International Nuclear Information System (INIS)

    LOBER, R.W.

    2002-01-01

    Risk assessment and environmental impact analysis at the U.S. Department of Energy (DOE) Hanford Site in Washington State has made significant progress in refining the strategy for using risk analysis to support closing of several hundred waste sites plus 149 single-shell tanks at the Hanford Site. A Single-Shell Tank System Closure Work Plan outlines the current basis for closing the single-shell tank systems. An analogous site approach has been developed to address closure of aggregated groups of similar waste sites. Because of the complexity, decision time frames, proximity of non-tank farm waste sites to tank farms, scale, and regulatory considerations, various projects are providing integrated assessments to support risk analyses and decision-making. Projects and the tools that are being developed and applied at Hanford to support retrieval and cleanup decisions include: (1) Life Cycle Model (LCM) and Risk Receptor Model (RRM)--A site-level set of tools to support strategic analyses through scoping level risk management to assess different alternatives and options for tank closure. (2) Systems Assessment Capability for Integrated Groundwater/Vadose Zone (SAC) and the Site-Wide Groundwater Model (SWGM)--A site-wide groundwater modeling system coupled with a risk-based uncertainty analysis of inventory, vadose zone, groundwater, and river interactions for evaluating cumulative impacts from individual and aggregate waste sites. (3) Retrieval Performance Evaluation (RPE)--A site-specific, risk-based methodology developed to evaluate performance of waste retrieval, leak detection and closure on a tank-specific basis as a function of past tank leaks, potential leakage during retrieval operations, and remaining residual waste inventories following completion of retrieval operations. 
(4) Field Investigation Report (FIR)--A corrective action program to investigate the nature and extent of past tank leaks through characterization activities and assess future impacts to

  3. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    Science.gov (United States)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  4. Integrated Design Tools for Embedded Control Systems

    NARCIS (Netherlands)

    Jovanovic, D.S.; Hilderink, G.H.; Broenink, Johannes F.; Karelse, F.

    2001-01-01

    Currently, computer-based control systems are still being implemented using the same techniques as 10 years ago. The purpose of this project is the development of a design framework, consisting of tools and libraries, which allows the designer to build highly reliable heterogeneous real-time embedded

  5. Lessons learned from tool integration with OSLC

    NARCIS (Netherlands)

    Leitner, A.; Herbst, B.; Mathijssen, R.

    2016-01-01

    Today’s embedded and cyber-physical systems are getting more connected and complex. One main challenge during development is the often loose coupling between engineering tools, which could lead to inconsistencies and errors due to the manual transfer and duplication of data. Open formats and

  6. Installing and Setting Up Git Software Tool on Windows

    Science.gov (United States)

    Learn how to set up the Git software tool on Windows for use with the Peregrine system. This doc shows how to get Git installed on Windows 7 and how to get things set up on NREL's Peregrine system.

  7. Integrated Design Tools for Embedded Control Systems

    OpenAIRE

    Jovanovic, D.S.; Hilderink, G.H.; Broenink, Johannes F.; Karelse, F.

    2001-01-01

    Currently, computer-based control systems are still being implemented using the same techniques as 10 years ago. The purpose of this project is the development of a design framework, consisting of tools and libraries, which allows the designer to build highly reliable heterogeneous real-time embedded systems in a very short time at a fraction of the present day costs. The ultimate focus of current research is on transforming control laws into efficient concurrent algorithms, with concerns about...

  8. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. 
UCSF Chimera is free for non-commercial use and is

  9. Risk Informed Design Using Integrated Vehicle Rapid Assessment Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — A successful proof of concept was performed in FY 2012 integrating the Envision tool for parametric estimates of vehicle mass and the Rapid Response Risk Assessment...

  10. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  11. Nutrition screening tools: Does one size fit all? A systematic review of screening tools for the hospital setting

    NARCIS (Netherlands)

    van Bokhorst-de van der Schueren, M.A.E.; Guaitoli, P.R.; Jansma, E.P.; de Vet, H.C.W.

    2014-01-01

    Background & aims: Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. Methods: A systematic review of

  12. Integrated Land-Water-Energy assessment using the Foreseer Tool

    Science.gov (United States)

    Allwood, Julian; Konadu, Dennis; Mourao, Zenaida; Lupton, Rick; Richards, Keith; Fenner, Richard; Skelton, Sandy; McMahon, Richard

    2016-04-01

    This study presents an integrated energy and resource modelling and visualisation approach, Foreseer™, which characterises the interdependencies and evaluates the land and water requirements for energy system pathways. The Foreseer Tool maps linked energy, water and land resource futures by outputting a set of Sankey diagrams for energy, water and land, showing the flow from basic resource (e.g. coal, surface water, and forested land) through transformations (e.g. fuel refining and desalination) to final services (e.g. sustenance, hygiene and transportation). By 'mapping' resources in this way, policy-makers can more easily understand a resource's competing uses through the identification of the services it delivers (e.g. food production, landscaping, energy), the potential opportunities for improving its management, and the connections with other resources that are often overlooked in a traditional sector-based management strategy. This paper will present a case study of the UK Carbon Plan, and highlights the need for integrated resource planning and policy development.

  13. A tool to guide the process of integrating health system responses to public health problems

    Directory of Open Access Journals (Sweden)

    Tilahun Nigatu Haregu

    2015-06-01

    An integrated model of health system responses to public health problems is considered to be the most preferable approach. Accordingly, there are several models that stipulate what an integrated architecture should look like. However, tools that can guide the overall process of integration are lacking. This tool is designed to guide the entire process of integration of health system responses to major public health problems. It is developed by taking into account the contexts of health systems of developing countries and the emergence of a double burden of chronic diseases in these settings. Chronic diseases – HIV/AIDS and NCDs – represented the evidence base for the development of the model. System-level horizontal integration of health system responses was considered in the development of this tool.

  14. Social Work Student and Practitioner Roles in Integrated Care Settings.

    Science.gov (United States)

    Fraher, Erin P; Richman, Erica Lynn; Zerden, Lisa de Saxe; Lombardi, Brianna

    2018-06-01

    Social workers are increasingly being deployed in integrated medical and behavioral healthcare settings, but information about the roles they fill in these settings is not well understood. This study sought to identify the functions that social workers perform in integrated settings and identify where they acquired the necessary skills to perform them. Master of social work students (n=21) and their field supervisors (n=21) who were part of a Health Resources and Services Administration-funded program to train and expand the behavioral health workforce in integrated settings were asked how often they engaged in 28 functions, where they learned to perform those functions, and the degree to which their roles overlapped with others on the healthcare team. The most frequent functions included employing cultural competency, documenting in the electronic health record, addressing patient social determinants of health, and participating in team-based care. Respondents were least likely to engage in case conferences; use Screening, Brief Intervention and Referral to Treatment; use stepped care to determine necessary level of treatment; conduct functional assessments of daily living skills; use behavioral activation; and use problem-solving therapy. A total of 80% of respondents reported that their roles occasionally, often, very often, or always overlapped with others on the healthcare team. Students reported learning the majority of skills (76%) in their Master of Social Work programs. Supervisors attributed the majority (65%) of their skill development to on-the-job training. Study findings suggest the need to redesign education, regulatory, and payment policies to better support the deployment of social workers in integrated care settings. This article is part of a supplement entitled The Behavioral Health Workforce: Planning, Practice, and Preparation, which is sponsored by the Substance Abuse and Mental Health Services Administration and the Health Resources and Services Administration.

  15. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  16. Blended Learning Tools in Geosciences: A New Set of Online Tools to Help Students Master Skills

    Science.gov (United States)

    Cull, S.; Spohrer, J.; Natarajan, S.; Chin, M.

    2013-12-01

    In most geoscience courses, students are expected to develop specific skills. To master these skills, students need to practice them repeatedly. Unfortunately, few geosciences courses have enough class time to allow students sufficient in-class practice, nor enough instructor attention and time to provide fast feedback. To address this, we have developed an online tool called an Instant Feedback Practice (IFP). IFPs are low-risk, high-frequency exercises that allow students to practice skills repeatedly throughout a semester, both in class and at home. After class, students log onto a course management system (like Moodle or Blackboard), and click on that day's IFP exercise. The exercise might be visually identifying a set of minerals that they're practicing. After answering each question, the IFP tells them if they got it right or wrong. If they got it wrong, they try again until they get it right. There is no penalty - students receive the full score for finishing. The goal is low-stakes practice. By completing dozens of these practices throughout the semester, students have many, many opportunities to practice mineral identification with quick feedback. Students can also complete IFPs during class in groups and teams, with in-lab hand samples or specimens. IFPs can also be used to gauge student skill levels as the semester progresses, as they can be set up to provide the instructor feedback on specific skills or students. When IFPs were developed for and implemented in a majors-level mineralogy class, students reported that in-class and online IFPs were by far the most useful technique they used to master mineral hand sample identification. Final grades in the course were significantly higher than historical norms, supporting students' anecdotal assessment of the impact of IFPs on their learning.

  17. Cancer survival classification using integrated data sets and intermediate information.

    Science.gov (United States)

    Kim, Shinuk; Park, Taesung; Kon, Mark

    2014-09-01

    Although numerous studies related to cancer survival have been published, increasing the prediction accuracy of survival classes still remains a challenge. Integration of different data sets, such as microRNA (miRNA) and mRNA, might increase the accuracy of survival class prediction. Therefore, we suggested a machine learning (ML) approach to integrate different data sets, and developed a novel method based on feature selection with Cox proportional hazard regression model (FSCOX) to improve the prediction of cancer survival time. FSCOX provides us with intermediate survival information, which is usually discarded when separating survival into 2 groups (short- and long-term), and allows us to perform survival analysis. We used an ML-based protocol for feature selection, integrating information from miRNA and mRNA expression profiles at the feature level. To predict survival phenotypes, we used the following classifiers, first, existing ML methods, support vector machine (SVM) and random forest (RF), second, a new median-based classifier using FSCOX (FSCOX_median), and third, an SVM classifier using FSCOX (FSCOX_SVM). We compared these methods using 3 types of cancer tissue data sets: (i) miRNA expression, (ii) mRNA expression, and (iii) combined miRNA and mRNA expression. The latter data set included features selected either from the combined miRNA/mRNA profile or independently from miRNAs and mRNAs profiles (IFS). In the ovarian data set, the accuracy of survival classification using the combined miRNA/mRNA profiles with IFS was 75% using RF, 86.36% using SVM, 84.09% using FSCOX_median, and 88.64% using FSCOX_SVM with a balanced 22 short-term and 22 long-term survivor data set. These accuracies are higher than those using miRNA alone (70.45%, RF; 75%, SVM; 75%, FSCOX_median; and 75%, FSCOX_SVM) or mRNA alone (65.91%, RF; 63.64%, SVM; 72.73%, FSCOX_median; and 70.45%, FSCOX_SVM). Similarly in the glioblastoma multiforme data, the accuracy of miRNA/mRNA using IFS
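The feature-level integration the abstract describes (select features independently from the miRNA and mRNA profiles, concatenate them, then classify with an SVM) can be sketched roughly as follows. This is an illustrative reconstruction on synthetic data, not the authors' FSCOX code; the sample and feature counts are arbitrary, and selecting features on the full data set before cross-validation (as done here for brevity) would bias a real accuracy estimate.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 44                                    # balanced: 22 short-, 22 long-term survivors
mirna = rng.normal(size=(n, 50))          # miRNA expression profile (synthetic)
mrna = rng.normal(size=(n, 200))          # mRNA expression profile (synthetic)
y = np.repeat([0, 1], n // 2)             # 0 = short-term, 1 = long-term survival

# Independent feature selection (IFS): rank features within each omics
# profile separately, then concatenate the survivors at the feature level.
sel_mi = SelectKBest(f_classif, k=10).fit(mirna, y)
sel_m = SelectKBest(f_classif, k=10).fit(mrna, y)
X = np.hstack([sel_mi.transform(mirna), sel_m.transform(mrna)])

# Survival-class prediction with a linear SVM, one of the classifiers
# compared in the study.
scores = cross_val_score(SVC(kernel="linear"), X, y, cv=5)
print(X.shape, round(scores.mean(), 2))
```

On real paired miRNA/mRNA cohorts the same pipeline shape applies; only the matrices and labels change.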

  18. Electronic Mail in Academic Settings: A Multipurpose Communications Tool.

    Science.gov (United States)

    D'Souza, Patricia Veasey

    1992-01-01

    Explores possible uses of electronic mail in three areas of the academic setting: instruction, research, and administration. Electronic mail is defined, the components needed to get started with electronic mail are discussed, and uses and benefits of electronic mail in diverse educational environments are suggested. (12 references) (DB)

  19. Integration between a sales support system and a simulation tool

    OpenAIRE

    Wahlström, Ola

    2005-01-01

    InstantPlanner is a sales support system for the material handling industry, visualizing and calculating designs faster and more correctly than other tools on the market. AutoMod is a world leading simulation tool used in the material handling industry to optimize and calculate appropriate configuration designs. Both applications are favorable in their own area provide a great platform for integration with the properties of fast designing, correct product calculations, great simulation capabi...

  20. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    System integration is an important element for the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm that it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  1. WINS. Market Simulation Tool for Facilitating Wind Energy Integration

    Energy Technology Data Exchange (ETDEWEB)

    Shahidehpour, Mohammad [Illinois Inst. of Technology, Chicago, IL (United States)

    2012-10-30

    Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting the existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also had the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetrations. The development of WINS represents a major expansion of a very efficient decision tool called POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades. 
    Specifically, WINS provides the following advantages: (1) An integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connection, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) An existing shortcoming of traditional decision tools for wind integration is the limited availability of user interface, i.e., decision

  2. Development of a multilevel health and safety climate survey tool within a mining setting.

    Science.gov (United States)

    Parker, Anthony W; Tones, Megan J; Ritchie, Gabrielle E

    2017-09-01

    This study aimed to design, implement and evaluate the reliability and validity of a multifactorial and multilevel health and safety climate survey (HSCS) tool with utility in the Australian mining setting. An 84-item questionnaire was developed and pilot tested on a sample of 302 Australian miners across two open cut sites. A 67-item, 10 factor solution was obtained via exploratory factor analysis (EFA) representing prioritization and attitudes to health and safety across multiple domains and organizational levels. Each factor demonstrated a high level of internal reliability, and a series of ANOVAs determined a high level of consistency in responses across the workforce, and generally irrespective of age, experience or job category. Participants tended to hold favorable views of occupational health and safety (OH&S) climate at the management, supervisor, workgroup and individual level. The survey tool demonstrated reliability and validity for use within an open cut Australian mining setting and supports a multilevel, industry specific approach to OH&S climate. Findings suggested a need for mining companies to maintain high OH&S standards to minimize risks to employee health and safety. Future research is required to determine the ability of this measure to predict OH&S outcomes and its utility within other mine settings. As this tool integrates health and safety, it may have benefits for assessment, monitoring and evaluation in the industry, and improving the understanding of how health and safety climate interact at multiple levels to influence OH&S outcomes. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.
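The internal-reliability check the abstract reports for each survey factor is conventionally Cronbach's alpha. A minimal sketch on simulated Likert-style responses follows; the 302-respondent sample size mirrors the abstract, but the five-item factor and its data are synthetic illustrations, not the study's survey.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents x items score matrix."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var_sum / total_var)

# Simulate 302 respondents answering 5 items that load on one shared
# attitude factor, plus item-specific noise.
rng = np.random.default_rng(2)
latent = rng.normal(size=(302, 1))
scores = latent + 0.5 * rng.normal(size=(302, 5))
alpha = cronbach_alpha(scores)
print(round(alpha, 2))
```

Values above roughly 0.7 are usually read as acceptable internal consistency for a factor, which is the kind of evidence the abstract summarizes as "a high level of internal reliability".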

  3. A simulator tool set for evaluating HEVC/SHVC streaming

    Science.gov (United States)

    Al Hadhrami, Tawfik; Nightingale, James; Wang, Qi; Grecos, Christos; Kehtarnavaz, Nasser

    2015-02-01

    Video streaming and other multimedia applications account for an ever increasing proportion of all network traffic. The recent adoption of High Efficiency Video Coding (HEVC) as the H.265 standard provides many opportunities for new and improved multimedia services and applications in the consumer domain. Since the delivery of version one of H.265, the Joint Collaborative Team on Video Coding has been working towards standardisation of a scalable extension (SHVC) to the H.265 standard and a series of range extensions and new profiles. As these enhancements are added to the standard, the range of potential applications and research opportunities will expand. For example, the use of video is also growing rapidly in other sectors such as safety, security, defence and health, with real-time high quality video transmission playing an important role in areas like critical infrastructure monitoring and disaster management. Each of these may benefit from the application of enhanced HEVC/H.265 and SHVC capabilities. The majority of existing research into HEVC/H.265 transmission has focussed on the consumer domain, addressing issues such as broadcast transmission and delivery to mobile devices, with the lack of freely available tools widely cited as an obstacle to conducting this type of research. In this paper we present a toolset which facilitates the transmission and evaluation of HEVC/H.265 and SHVC encoded video on the popular open source NCTUns simulator. Our toolset provides researchers with a modular, easy-to-use platform for evaluating video transmission and adaptation proposals on large scale wired, wireless and hybrid architectures. The toolset consists of pre-processing, transmission, SHVC adaptation and post-processing tools to gather and analyse statistics. 
It has been implemented using HM15 and SHM5, the latest versions of the HEVC and SHVC reference software implementations to ensure that currently adopted proposals for scalable and range extensions to
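    The post-processing stage described above ultimately reduces to computing delivery statistics from transmission traces. A minimal, toolset-agnostic sketch (the field names and trace format are hypothetical, not part of the NCTUns toolset):

```python
def delivery_stats(sent_ids, received_ids):
    """Compare the IDs of transmitted and received packets from a
    simulation trace and derive simple delivery statistics."""
    sent, received = set(sent_ids), set(received_ids)
    delivered = sent & received
    return {
        "sent": len(sent),
        "received": len(delivered),
        "loss_rate": len(sent - received) / len(sent) if sent else 0.0,
    }

# 7 of 10 packets arrive, so the loss rate is 0.3
stats = delivery_stats(range(10), [0, 1, 2, 3, 5, 6, 8])
```

    A real evaluation pipeline would additionally map lost packets back to NAL units and frames to quantify the impact on decoded video quality.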

  4. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi Fukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data covering the genome, transcriptome, proteome, and metabolome, combined with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.
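    Pathway over-representation, a staple of the pathway analyses highlighted above, is commonly computed from a hypergeometric tail. A minimal, tool-agnostic sketch (the function and parameter names are illustrative, not from any package mentioned in the abstract):

```python
from math import comb

def overrepresentation_p(universe, annotated, selected, hits):
    """P(X >= hits) for a hypergeometric draw: `selected` genes are drawn
    without replacement from a `universe` of genes, of which `annotated`
    belong to the pathway of interest."""
    return sum(
        comb(annotated, k) * comb(universe - annotated, selected - k)
        for k in range(hits, min(annotated, selected) + 1)
    ) / comb(universe, selected)

# drawing all 5 annotated genes out of a universe of 10 in 5 picks: p = 1/252
p = overrepresentation_p(universe=10, annotated=5, selected=5, hits=5)
```

    Production tools typically add multiple-testing correction (e.g. Benjamini-Hochberg) on top of this per-pathway p-value.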

  5. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, then to look at what can be done to make them more available to architects and design...

  6. Advertising Can Be an Effective Integrated Marketing Tool

    Science.gov (United States)

    Lauer, Larry D.

    2007-01-01

    Advertising will not undermine the critical thinking of consumers when it is combined with other communication media, and when it is truthful. In fact, it can provide clarity about the competitive advantage of individual institutions and aid an individual's ability to choose wisely. Advertising is just one of the tools in the integrated marketing…

  7. Uni- and omnidirectional simulation tools for integrated optics

    NARCIS (Netherlands)

    Stoffer, Remco

    2001-01-01

    This thesis presents several improvements on simulation methods in integrated optics, as well as some new methods. Both uni- and omnidirectional tools are presented; for the unidirectional methods, the emphasis is on higher-order accuracy; for the omnidirectional methods, the boundary conditions are

  8. KAIKObase: An integrated silkworm genome database and data mining tool

    Directory of Open Access Journals (Sweden)

    Nagaraju Javaregowda

    2009-10-01

    Full Text Available Abstract Background The silkworm, Bombyx mori, is one of the most economically important insects in many developing countries owing to its large-scale cultivation for silk production. With the development of genomic and biotechnological tools, B. mori has also become an important bioreactor for production of various recombinant proteins of biomedical interest. In 2004, two genome sequencing projects for B. mori were reported independently by Chinese and Japanese teams; however, the datasets were insufficient for building long genomic scaffolds which are essential for unambiguous annotation of the genome. Now, both the datasets have been merged and assembled through a joint collaboration between the two groups. Description Integration of the two data sets of silkworm whole-genome-shotgun sequencing by the Japanese and Chinese groups together with newly obtained fosmid- and BAC-end sequences produced the best continuity (~3.7 Mb in N50 scaffold size) among the sequenced insect genomes and provided a high degree of nucleotide coverage (88%) of all 28 chromosomes. In addition, a physical map of BAC contigs constructed by fingerprinting BAC clones and a SNP linkage map constructed using BAC-end sequences were available. In parallel, proteomic data from two-dimensional polyacrylamide gel electrophoresis in various tissues and developmental stages were compiled into a silkworm proteome database. Finally, a Bombyx trap database was constructed for documenting insertion positions and expression data of transposon insertion lines. Conclusion For efficient usage of genome information for functional studies, genomic sequences, physical and genetic map information and EST data were compiled into KAIKObase, an integrated silkworm genome database which consists of 4 map viewers, a gene viewer, and sequence, keyword and position search systems to display results and data at the level of nucleotide sequence, gene, scaffold and chromosome. Integration of the

  9. A validated set of tool pictures with matched objects and non-objects for laterality research.

    Science.gov (United States)

    Verma, Ark; Brysbaert, Marc

    2015-01-01

    Neuropsychological and neuroimaging research has established that knowledge related to tool use and tool recognition is lateralized to the left cerebral hemisphere. Recently, behavioural studies with the visual half-field technique have confirmed the lateralization. A limitation of this research was that different sets of stimuli had to be used for the comparison of tools to other objects and objects to non-objects. Therefore, we developed a new set of stimuli containing matched triplets of tools, other objects and non-objects. With the new stimulus set, we successfully replicated the findings of no visual field advantage for objects in an object recognition task combined with a significant right visual field advantage for tools in a tool recognition task. The set of stimuli is available as supplemental data to this article.

  10. THE MANAGEMENT ACCOUNTING TOOLS AND THE INTEGRATED REPORTING

    Directory of Open Access Journals (Sweden)

    Gabriel JINGA

    2015-04-01

    Full Text Available In recent years, stakeholders have been asking for other information to be published along with the financial information, such as risk reporting, intangibles, and social and environmental accounting. The type of corporate reporting which incorporates the elements enumerated above is integrated reporting. In this article, we argue that the information disclosed in integrated reports is prepared by management accounting, not only by financial accounting. Thus, we search for the management accounting tools used by companies which prepare integrated reports. In order to do this, we analytically reviewed all the reports available on the website of a selected company. Our results show that the company uses most of the management accounting tools mentioned in the literature review part.

  11. Integrative set enrichment testing for multiple omics platforms

    Directory of Open Access Journals (Sweden)

    Poisson Laila M

    2011-11-01

    Full Text Available Abstract Background Enrichment testing assesses the overall evidence of differential expression behavior of the elements within a defined set. When we have measured many molecular aspects, e.g. gene expression, metabolites, proteins, it is desirable to assess their differential tendencies jointly across platforms using an integrated set enrichment test. In this work we explore the properties of several methods for performing a combined enrichment test using gene expression and metabolomics as the motivating platforms. Results Using two simulation models we explored the properties of several enrichment methods including two novel methods: the logistic regression 2-degree of freedom Wald test and the 2-dimensional permutation p-value for the sum-of-squared statistics test. In relation to their univariate counterparts we find that the joint tests can improve our ability to detect results that are marginal univariately. We also find that joint tests improve the ranking of associated pathways compared to their univariate counterparts. However, there is a risk of Type I error inflation with some methods and self-contained methods lose specificity when the sets are not representative of underlying association. Conclusions In this work we show that consideration of data from multiple platforms, in conjunction with summarization via a priori pathway information, leads to increased power in detection of genomic associations with phenotypes.
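    The sum-of-squared-statistics idea behind the joint tests above can be sketched in a few lines, assuming per-platform set-level z-scores have already been computed (this is an illustration of the general combination principle, not the authors' implementation):

```python
import math

def joint_enrichment_p(z_gene, z_metabolite):
    """2-degree-of-freedom combination: under the null, the two platform
    z-scores are independent standard normals, so t = z1^2 + z2^2 follows
    a chi-square distribution with 2 df, whose survival function has the
    closed form exp(-t / 2)."""
    t = z_gene ** 2 + z_metabolite ** 2
    return math.exp(-t / 2.0)

def univariate_p(z):
    """Two-sided normal p-value, for comparison with the joint test."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
```

    With z = 1.8 on both platforms, each univariate p-value is about 0.072, while the joint p-value is about 0.039, illustrating the abstract's point that joint tests can detect results that are only marginal univariately.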

  12. Assess the flood resilience tools integration in the landuse projects

    Science.gov (United States)

    Moulin, E.; Deroubaix, J.-F.

    2012-04-01

    Despite a severe regulation concerning the building in flooding areas, 80% of these areas are already built in the Greater Paris (Paris, Val-de-Marne, Hauts-de-Seine and Seine-Saint-Denis). The land use in flooding area is presented as one of the main solutions to solve the ongoing real estate pressure. For instance some of the industrial wastelands located along the river are currently in redevelopment and residential buildings are planned. So the landuse in the flooding areas is currently a key issue in the development of the Greater Paris area. To deal with floods there are some resilience tools, whether structural (such as perimeter barriers or building aperture barriers, etc) or non structural (such as warning systems, etc.). The technical solutions are available and most of the time efficient1. Still, we notice that these tools are not much implemented. The people; stakeholders and inhabitants, literally seems to be not interested. This papers focus on the integration of resilience tools in urban projects. Indeed one of the blockages in the implementation of an efficient flood risk prevention policy is the lack of concern of the landuse stakeholders and the inhabitants for the risk2. We conducted an important number of interviews with stakeholders involved in various urban projects and we assess, in this communication, to what extent the improvement of the resilience to floods is considered as a main issue in the execution of an urban project? How this concern is maintained or could be maintained throughout the project. Is there a dilution of this concern? In order to develop this topic we rely on a case study. The "Ardoines" is a project aiming at redeveloping an industrial site (South-East Paris), into a project including residential and office buildings and other amenities. In order to elaborate the master plan, the urban planning authority brought together some flood risk experts. According to the comments of the experts, the architect in charge of the

  13. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    Science.gov (United States)

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

    Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing the testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge on individual proteins.
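    The core of a keyword-based analysis like the one described above is inverting protein-to-keyword annotations into keyword-defined protein subsets and intersecting them. A toy sketch of that idea (plain Python sets, not PANDORA's actual code; the example annotations are invented):

```python
from collections import defaultdict
from itertools import combinations

def keyword_subsets(annotations):
    """Invert a protein -> {keywords} mapping into keyword -> {proteins}."""
    subsets = defaultdict(set)
    for protein, keywords in annotations.items():
        for kw in keywords:
            subsets[kw].add(protein)
    return dict(subsets)

def keyword_intersections(subsets, min_size=2):
    """Keyword pairs whose protein subsets share at least min_size proteins."""
    return {
        (a, b): subsets[a] & subsets[b]
        for a, b in combinations(sorted(subsets), 2)
        if len(subsets[a] & subsets[b]) >= min_size
    }

ann = {"P1": {"kinase", "membrane"}, "P2": {"kinase", "membrane"}, "P3": {"kinase"}}
subs = keyword_subsets(ann)
# kinase and membrane intersect in {P1, P2}
```

    On real data the subsets come from SwissProt keywords, taxonomy terms or GO annotations, and the intersections are what the graphical diagram displays.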

  14. Laboratory informatics tools integration strategies for drug discovery: integration of LIMS, ELN, CDS, and SDMS.

    Science.gov (United States)

    Machina, Hari K; Wild, David J

    2013-04-01

    There are technologies on the horizon that could dramatically change how informatics organizations design, develop, deliver, and support applications and data infrastructures to deliver maximum value to drug discovery organizations. Effective integration of data and laboratory informatics tools promises to let organizations make better-informed decisions about resource allocation during the drug discovery and development process, and about the market opportunity for compounds. We propose in this article a new integration model called ELN-centric laboratory informatics tools integration.

  15. A database of immunoglobulins with integrated tools: DIGIT.

    KAUST Repository

    Chailyan, Anna; Tramontano, Anna; Marcatili, Paolo

    2011-01-01

    The DIGIT (Database of ImmunoGlobulins with Integrated Tools) database (http://biocomputing.it/digit) is an integrated resource storing sequences of annotated immunoglobulin variable domains and enriched with tools for searching and analyzing them. The annotations in the database include information on the type of antigen, the respective germline sequences and on pairing information between light and heavy chains. Other annotations, such as the identification of the complementarity determining regions, assignment of their structural class and identification of mutations with respect to the germline, are computed on the fly and can also be obtained for user-submitted sequences. The system allows customized BLAST searches and automatic building of 3D models of the domains to be performed.

  16. A database of immunoglobulins with integrated tools: DIGIT.

    KAUST Repository

    Chailyan, Anna

    2011-11-10

    The DIGIT (Database of ImmunoGlobulins with Integrated Tools) database (http://biocomputing.it/digit) is an integrated resource storing sequences of annotated immunoglobulin variable domains and enriched with tools for searching and analyzing them. The annotations in the database include information on the type of antigen, the respective germline sequences and on pairing information between light and heavy chains. Other annotations, such as the identification of the complementarity determining regions, assignment of their structural class and identification of mutations with respect to the germline, are computed on the fly and can also be obtained for user-submitted sequences. The system allows customized BLAST searches and automatic building of 3D models of the domains to be performed.

  17. Modelling Machine Tools using Structure Integrated Sensors for Fast Calibration

    Directory of Open Access Journals (Sweden)

    Benjamin Montavon

    2018-02-01

    Full Text Available Monitoring of the relative deviation between commanded and actual tool tip position, which limits the volumetric performance of the machine tool, enables the use of contemporary methods of compensation to reduce tolerance mismatch and the uncertainties of on-machine measurements. The development of a primarily optical sensor setup capable of being integrated into the machine structure without limiting its operating range is presented. The use of a frequency-modulating interferometer and photosensitive arrays in combination with a Gaussian laser beam allows for fast and automated online measurements of the axes’ motion errors and thermal conditions with comparable accuracy, lower cost, and smaller dimensions than state-of-the-art optical measuring instruments for offline machine tool calibration. The development is tested through simulation of the sensor setup based on raytracing and Monte-Carlo techniques.
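    The Monte-Carlo side of such a simulation can be illustrated very simply: sample ray positions from the Gaussian beam profile, perturb them with detector noise, and estimate the beam centroid on the photosensitive array as a proxy for the lateral motion error. All parameter values below are invented for illustration and are not from the paper's setup:

```python
import random
import statistics

def estimate_beam_centroid(true_offset_um, beam_waist_um=50.0,
                           detector_noise_um=0.5, n_rays=2000, seed=1):
    """Monte-Carlo sketch: sample ray hit positions from the Gaussian
    beam profile, add detector noise, and return the estimated beam
    centroid on the array (in micrometres)."""
    rng = random.Random(seed)
    hits = [
        rng.gauss(true_offset_um, beam_waist_um)
        + rng.gauss(0.0, detector_noise_um)
        for _ in range(n_rays)
    ]
    return statistics.fmean(hits)
```

    Repeating the estimate over many seeds would give the measurement uncertainty; with these numbers the standard error of the centroid is roughly beam_waist / sqrt(n_rays), about 1.1 micrometres.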

  18. Integrating Computational Science Tools into a Thermodynamics Course

    Science.gov (United States)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired to computer simulations to implement these modules have a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  19. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    With continuing human development, water use is increasing in importance, and this worldwide trend is leading to an increasing number of user conflicts, with a strong need for assessment tools to measure the impacts both on the ecosystem and on the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, since impact assessments include different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. The paper concentrates on fish habitat simulation models, with methods and examples from Norway, and includes some ideas on integrated modelling tools for impact assessment studies. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost-effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored in a multi-disciplinary study. Model choice should be based on available data and possible data acquisition, available manpower, computer and software resources, and the needed output and accuracy. 58 refs
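    Habitat simulation models of this kind typically combine cell-wise hydraulics with species suitability curves; the classic weighted-usable-area (WUA) calculation can be sketched as below (the field names and cell values are hypothetical):

```python
def weighted_usable_area(cells):
    """WUA = sum over grid cells of area * composite suitability, where the
    composite index is the product of depth, velocity and substrate
    suitability indices, each scaled to [0, 1]."""
    return sum(
        c["area_m2"] * c["depth_si"] * c["velocity_si"] * c["substrate_si"]
        for c in cells
    )

cells = [
    {"area_m2": 4.0, "depth_si": 1.0, "velocity_si": 0.5, "substrate_si": 1.0},
    {"area_m2": 2.0, "depth_si": 0.0, "velocity_si": 0.9, "substrate_si": 1.0},
]
# only the first cell contributes: 4.0 * 0.5 = 2.0 m^2 of usable habitat
```

    The suitability indices per cell come from the 1D, 2D or 3D hydraulic model's depth and velocity fields, evaluated against the target species' preference curves.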

  20. Nutrition screening tools: does one size fit all? A systematic review of screening tools for the hospital setting.

    Science.gov (United States)

    van Bokhorst-de van der Schueren, Marian A E; Guaitoli, Patrícia Realino; Jansma, Elise P; de Vet, Henrica C W

    2014-02-01

    Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. A systematic review of English, French, German, Spanish, Portuguese and Dutch articles identified via MEDLINE, Cinahl and EMBASE (from inception to the 2nd of February 2012). Additional studies were identified by checking reference lists of identified manuscripts. Search terms included key words for malnutrition, screening or assessment instruments, and terms for hospital setting and adults. Data were extracted independently by 2 authors. Only studies expressing the (construct, criterion or predictive) validity of a tool were included. 83 studies (32 screening tools) were identified: 42 studies on construct or criterion validity versus a reference method and 51 studies on predictive validity on outcome (i.e. length of stay, mortality or complications). None of the tools performed consistently well to establish the patients' nutritional status. For the elderly, MNA performed fair to good, for the adults MUST performed fair to good. SGA, NRS-2002 and MUST performed well in predicting outcome in approximately half of the studies reviewed in adults, but not in older patients. Not one single screening or assessment tool is capable of adequate nutrition screening as well as predicting poor nutrition related outcome. Development of new tools seems redundant and will most probably not lead to new insights. New studies comparing different tools within one patient population are required. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  1. Sharing clinical information across care settings: the birth of an integrated assessment system

    Directory of Open Access Journals (Sweden)

    Henrard Jean-Claude

    2009-04-01

    Full Text Available Abstract Background Population ageing, the emergence of chronic illness, and the shift away from institutional care challenge conventional approaches to assessment systems which traditionally are problem and setting specific. Methods From 2002, the interRAI research collaborative undertook development of a suite of assessment tools to support assessment and care planning of persons with chronic illness, frailty, disability, or mental health problems across care settings. The suite constitutes an early example of a "third generation" assessment system. Results The rationale and development strategy for the suite is described, together with a description of potential applications. To date, ten instruments comprise the suite, each comprising "core" items shared among the majority of instruments and "optional" items that are specific to particular care settings or situations. Conclusion This comprehensive suite offers the opportunity for integrated multi-domain assessment, enabling electronic clinical records, data transfer, ease of interpretation and streamlined training.

  2. Sharing clinical information across care settings: the birth of an integrated assessment system

    Science.gov (United States)

    Gray, Leonard C; Berg, Katherine; Fries, Brant E; Henrard, Jean-Claude; Hirdes, John P; Steel, Knight; Morris, John N

    2009-01-01

    Background Population ageing, the emergence of chronic illness, and the shift away from institutional care challenge conventional approaches to assessment systems which traditionally are problem and setting specific. Methods From 2002, the interRAI research collaborative undertook development of a suite of assessment tools to support assessment and care planning of persons with chronic illness, frailty, disability, or mental health problems across care settings. The suite constitutes an early example of a "third generation" assessment system. Results The rationale and development strategy for the suite is described, together with a description of potential applications. To date, ten instruments comprise the suite, each comprising "core" items shared among the majority of instruments and "optional" items that are specific to particular care settings or situations. Conclusion This comprehensive suite offers the opportunity for integrated multi-domain assessment, enabling electronic clinical records, data transfer, ease of interpretation and streamlined training. PMID:19402891

  3. Indico central - events organisation, ergonomics and collaboration tools integration

    International Nuclear Information System (INIS)

    Gonzalez Lopez, Jose Benito; Ferreira, Jose Pedro; Baron, Thomas

    2010-01-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, the result of a careful maturation process, includes improvements which will set a new reference in its domain. The presentation will focus on the description of the new features of the tool and the user feedback process which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration, e-payment, audiovisual recording, webcast, room booking, and videoconference support)

  4. Indico central - events organisation, ergonomics and collaboration tools integration

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez Lopez, Jose Benito; Ferreira, Jose Pedro; Baron, Thomas, E-mail: jose.benito.gonzalez@cern.c, E-mail: jose.pedro.ferreira@cern.c, E-mail: thomas.baron@cern.c [CERN IT-UDS-AVC, 1211 Geneve 23 (Switzerland)

    2010-04-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, the result of a careful maturation process, includes improvements which will set a new reference in its domain. The presentation will focus on the description of the new features of the tool and the user feedback process which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration, e-payment, audiovisual recording, webcast, room booking, and videoconference support)

  5. Indico Central - Events Organisation, Ergonomics and Collaboration Tools Integration

    CERN Document Server

    Gonzalez Lopez, J B; Baron, T; CERN. Geneva. IT Department

    2010-01-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, the result of a careful maturation process, includes improvements which will set a new reference in its domain. The presentation will focus on the description of the new features of the tool and the user feedback process which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration, e-payment, audiovisual recording, webcast, room booking, and videoconference support)

  6. Automated innovative diagnostic, data management and communication tool, for improving malaria vector control in endemic settings.

    Science.gov (United States)

    Vontas, John; Mitsakakis, Konstantinos; Zengerle, Roland; Yewhalaw, Delenasaw; Sikaala, Chadwick Haadezu; Etang, Josiane; Fallani, Matteo; Carman, Bill; Müller, Pie; Chouaïbou, Mouhamadou; Coleman, Marlize; Coleman, Michael

    2016-01-01

    Malaria is a life-threatening disease that caused more than 400,000 deaths in sub-Saharan Africa in 2015. Mass prevention of the disease is best achieved by vector control, which heavily relies on the use of insecticides. Monitoring mosquito vector populations is an integral component of control programs and a prerequisite for effective interventions. Several individual methods are used for this task; however, there are obstacles to their uptake, as well as challenges in organizing, interpreting and communicating vector population data. The Horizon 2020 "DMC-MALVEC" consortium will develop a fully integrated and automated multiplex vector-diagnostic platform (LabDisk) for characterizing mosquito populations in terms of species composition, Plasmodium infections and biochemical insecticide resistance markers. The LabDisk will be interfaced with a Disease Data Management System (DDMS), a custom-made data management software which will collate and manage data from routine entomological monitoring activities, providing information in a timely fashion based on user needs and in a standardized way. ResistanceSim, a serious game built on a modern ICT platform that uses interactive ways of communicating guidelines and exemplifying good practice in the optimal use of interventions in the health sector, will also be a key element. The use of the tool will teach operational end users the value of quality data (relevant, timely and accurate) to make informed decisions. The integrated system (LabDisk, DDMS & ResistanceSim) will be evaluated in four malaria-endemic countries (Cameroon, Ivory Coast, Ethiopia and Zambia) that are highly representative of the vector control challenges in sub-Saharan Africa and of malaria settings with different levels of endemicity, to support informed decision-making in vector control and disease management.

  7. Knowledge Management tools integration within DLR's concurrent engineering facility

    Science.gov (United States)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

    The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage improved the Concurrent Engineering (CE) process. This paper also presents the lessons learned from the introduction of KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The results of the application of the Knowledge Management tools have shown the potential of merging the three software platforms with their functionalities, as the next step towards the full integration of KM practices into the CE process. VirSat will stay as the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source, and as an access to simulation and calculation models. The use of KM tools in the CEF aims to become a basic practice during the CE process. Establishing this practice will result in a much broader knowledge and experience exchange within the Concurrent Engineering environment and, consequently, the outcome of the studies will be higher-quality designs of space systems.

  8. LEARNING TOOLS INTEROPERABILITY – A NEW STANDARD FOR INTEGRATION OF DISTANCE LEARNING PLATFORMS

    Directory of Open Access Journals (Sweden)

    Oleksandr A. Shcherbyna

    2015-06-01

    Full Text Available In educational information technology there is a perennial issue of reusing electronic educational resources and of transferring them from one virtual learning environment to another. Previously, standardized sets of files, such as SCORM packages, were used to serve this purpose. This article reviews the new Learning Tools Interoperability (LTI) standard, which allows users of one environment to access resources in another environment. This makes it possible to integrate the environments into a single distributed learning environment that is created and shared. The article gives examples of the practical use of the LTI standard in the Moodle learning management system using the External tool and LTI provider plugins.

  9. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    Science.gov (United States)

    Ma, Tianle; Zhang, Aidong

    2017-01-01

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of the omics informatics landscape: the explosion of scattered individual omics informatics tools, each of which focuses on a specific task in single- or multi-omic settings, and the fast-evolving integrated software platforms such as workflow management systems that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels, from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging need for intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in the systematic evaluation of omics informatics tools. We conclude by providing future perspectives on emerging fields and new frontiers in omics informatics.
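    The dependency-driven execution at the heart of such workflow management systems can be sketched in a few lines. This is a toy scheduler, not the API of any WMS named in the survey; all step names and functions are invented for illustration:

    ```python
    def run_workflow(tasks):
        """Minimal workflow-manager sketch: `tasks` maps a step name to
        (prerequisite step names, function). A step runs once all of its
        prerequisites have produced results, which are passed to it."""
        done, pending = {}, dict(tasks)
        while pending:
            ready = [name for name, (deps, _) in pending.items()
                     if all(d in done for d in deps)]
            if not ready:
                raise ValueError("cyclic or unsatisfiable dependencies")
            for name in ready:
                deps, fn = pending.pop(name)
                done[name] = fn(*[done[d] for d in deps])
        return done

    # Toy three-step pipeline: load raw values, normalize, then summarize.
    results = run_workflow({
        "load":      ([], lambda: [4, 8, 15]),
        "normalize": (["load"], lambda xs: [x / max(xs) for x in xs]),
        "summary":   (["normalize"], lambda xs: sum(xs) / len(xs)),
    })
    print(results["summary"])  # 0.6
    ```

    Real WMSs add caching, provenance tracking and parallel execution on top of exactly this kind of dependency resolution.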

  10. Integrated environmental decision support tool based on GIS technology

    International Nuclear Information System (INIS)

    Doctor, P.G.; O'Neil, T.K.; Sackschewsky, M.R.; Becker, J.M.; Rykiel, E.J.; Walters, T.B.; Brandt, C.A.; Hall, J.A.

    1995-01-01

    Environmental restoration and management decisions facing the US Department of Energy require balancing trade-offs between diverse land uses and impacts over multiple spatial and temporal scales. Many types of environmental data have been collected for the Hanford Site and the Columbia River in Washington State over the past fifty years. Pacific Northwest National Laboratory (PNNL) is integrating these data into a Geographic Information System (GIS) based computer decision support tool. This tool provides a comprehensive and concise description of the current environmental landscape that can be used to evaluate the ecological and monetary trade-offs between future land use, restoration and remediation options before action is taken. Ecological impacts evaluated include effects on individual species of concern and habitat loss and fragmentation. Monetary impacts include those associated with habitat mitigation. The tool is organized both as a browsing tool for educational purposes and as a framework that leads a project manager through the steps needed to comply with environmental requirements.

  11. Zebrafish Expression Ontology of Gene Sets (ZEOGS): a tool to analyze enrichment of zebrafish anatomical terms in large gene sets.

    Science.gov (United States)

    Prykhozhij, Sergey V; Marsico, Annalisa; Meijsing, Sebastiaan H

    2013-09-01

    The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in it. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas few or no enriched terms were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene expression
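    The statistic behind this kind of term-enrichment test is typically a hypergeometric tail probability: given N annotated genes in total, K carrying a term, and a query of n genes of which k carry the term, how surprising is observing at least k? A minimal sketch with invented gene IDs and made-up anatomical terms (not ZFIN data, and not the authors' implementation):

    ```python
    from math import comb

    def hypergeom_sf(k, N, K, n):
        """P(X >= k) for a hypergeometric draw: N genes total, K annotated
        to the term, n genes in the query set, k of them annotated."""
        return sum(comb(K, i) * comb(N - K, n - i)
                   for i in range(k, min(K, n) + 1)) / comb(N, n)

    def enrich(gene_set, annotations, universe):
        """Return (term, p-value) pairs, most enriched first."""
        N, n = len(universe), len(gene_set)
        results = []
        for term, genes in annotations.items():
            K = len(genes & universe)
            k = len(genes & gene_set)
            if k:
                results.append((term, hypergeom_sf(k, N, K, n)))
        return sorted(results, key=lambda t: t[1])

    # Toy universe of 100 genes with two hypothetical anatomical terms.
    universe = {f"g{i}" for i in range(100)}
    annotations = {"liver": {f"g{i}" for i in range(10)},
                   "fin":   {f"g{i}" for i in range(50, 90)}}
    hits = {"g0", "g1", "g2", "g3", "g55"}  # 4 of 5 hits are "liver" genes
    print(enrich(hits, annotations, universe))
    ```

    A production tool would additionally correct the p-values for multiple testing across all ontology terms.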

  12. Zebrafish Expression Ontology of Gene Sets (ZEOGS): A Tool to Analyze Enrichment of Zebrafish Anatomical Terms in Large Gene Sets

    Science.gov (United States)

    Marsico, Annalisa

    2013-01-01

    Abstract The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in it. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas few or no enriched terms were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene

  13. A new tool for man/machine integration

    International Nuclear Information System (INIS)

    Sommer, W.C.

    1981-01-01

    A popular term within the nuclear power industry today, as a result of TMI, is man/machine interface. It has been determined that greater acknowledgement of this interface is necessary within the industry to integrate the design and operational aspects of a system. What is required is an operational tool that can be used early in the engineering stages of a project and passed on later to those who will be responsible for operating that particular system. This paper discusses one such fundamental operations tool that is applied to a process system, its display devices, and its operator actions in a methodical fashion to integrate the machine for man's understanding and proper use. This new tool, referred to as an Operational Schematic, is shown and described. Briefly, it unites, in one location, the important operational display devices with the system process devices. An operator can now see the beginning and end of each information and control loop to better understand its function within the system. A method is presented whereby, in designing for operability, the schematic is utilized in three phases. The method results in two basic documents: one describes "what" is to be operated and the other "how" it is to be operated. This integration concept has now considered the hardware spectrum from sensor to display and operated the display (on paper) to confirm its operability. Now that the design aspects are complete, the later-in-time operational aspects need to be addressed for the man using the process system. Training personnel in operating and testing the process system is as important as the original design. To accomplish these activities, documents are prepared instructing personnel how to operate (and test) the system under a variety of circumstances.

  14. Competency-based evaluation tools for integrative medicine training in family medicine residency: a pilot study

    Directory of Open Access Journals (Sweden)

    Schneider Craig

    2007-04-01

    Full Text Available Abstract Background As more integrative medicine educational content is integrated into conventional family medicine teaching, the need for effective evaluation strategies grows. Through the Integrative Family Medicine program, a six-site pilot program of a four-year residency training model combining integrative medicine and family medicine training, we have developed and tested a set of competency-based evaluation tools to assess residents' skills in integrative medicine history-taking and treatment planning. This paper presents the results from the implementation of direct observation and treatment plan evaluation tools, as well as the results of two Objective Structured Clinical Examinations (OSCEs) developed for the program. Methods The direct observation (DO) and treatment plan (TP) evaluation tools developed for the IFM program were implemented by faculty at each of the six sites during the PGY-4 year (n = 11 for DO and n = 8 for TP). The OSCE I was implemented first in 2005 (n = 6), revised, and then implemented with a second class of IFM participants in 2006 (n = 7). OSCE II was implemented in fall 2005 with only one class of IFM participants (n = 6). Data from the initial implementation of these tools are described using descriptive statistics. Results Results from the implementation of these tools at the IFM sites suggest that we need more emphasis in our curriculum on incorporating spirituality into history-taking and treatment planning, and more training for IFM residents on effective assessment of readiness for change and strategies for delivering integrative medicine treatment recommendations. Focusing our OSCE assessment more narrowly on integrative medicine history-taking skills was much more effective in delineating strengths and weaknesses in our residents' performance than using the OSCE for both integrative and more basic communication competencies. Conclusion As these tools are refined further they will be of value both in improving

  15. West-Life, Tools for Integrative Structural Biology

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Structural biology is the part of molecular biology that focuses on determining the structure of macromolecules inside living cells and cell membranes. As macromolecules determine most of the functions of cells, this structural knowledge is very useful for further research in metabolism and physiology, and for applications in pharmacology and elsewhere. As macromolecules are too small to be observed directly with a light microscope, other methods are used to determine their structure, including nuclear magnetic resonance (NMR), X-ray crystallography, cryo-electron microscopy and others. Each method has its advantages and disadvantages in terms of availability, sample preparation and resolution. The West-Life project has the ambition to facilitate an integrative approach using the multiple techniques mentioned above. As there are already a lot of software tools to process data produced by the techniques above, the challenge is to integrate them in a way that lets them be used by experts in one technique who are not experts in the other techniques. One product ...

  16. Intervene: a tool for intersection and visualization of multiple gene or genomic region sets.

    Science.gov (United States)

    Khan, Aziz; Mathelier, Anthony

    2017-05-31

    A common task for scientists involves comparing lists of genes or genomic regions derived from high-throughput sequencing experiments. While several tools exist to intersect and visualize sets of genes, similar tools dedicated to the visualization of genomic region sets are currently limited. To address this gap, we have developed the Intervene tool, which provides an easy and automated interface for the effective intersection and visualization of genomic region or list sets, thus facilitating their analysis and interpretation. Intervene contains three modules: venn to generate Venn diagrams of up to six sets, upset to generate UpSet plots of multiple sets, and pairwise to compute and visualize intersections of multiple sets as clustered heat maps. Intervene, and its interactive web ShinyApp companion, generate publication-quality figures for the interpretation of genomic region and list sets. Intervene and its web application companion provide an easy command line and an interactive web interface to compute intersections of multiple genomic and list sets. They have the capacity to plot intersections using easy-to-interpret visual approaches. Intervene is developed and designed to meet the needs of both computer scientists and biologists. The source code is freely available at https://bitbucket.org/CBGR/intervene , with the web application available at https://asntech.shinyapps.io/intervene .
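    The pairwise intersection counts that feed such a clustered heat map are straightforward to compute. A sketch with made-up set names (Intervene itself additionally handles BED-format genomic intervals, which require interval overlap rather than exact-match intersection):

    ```python
    from itertools import combinations

    def pairwise_intersections(named_sets):
        """Count shared elements for every unordered pair of named sets."""
        matrix = {}
        for (a, sa), (b, sb) in combinations(named_sets.items(), 2):
            matrix[(a, b)] = len(sa & sb)
        return matrix

    # Hypothetical gene lists from three experiments.
    sets = {"ChIP_A": {"TP53", "MYC", "GATA1"},
            "ChIP_B": {"MYC", "GATA1", "SOX2"},
            "RNA_up": {"MYC", "NANOG"}}
    print(pairwise_intersections(sets))
    ```

    The resulting pair-to-count mapping is exactly what a heat map renderer needs; an UpSet plot instead enumerates every exclusive combination of set memberships.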

  17. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

    In this grant, we have systematically investigated the integrated networks which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open-source; they are available for download from GitHub and can be incorporated into the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial for understanding the biology of a system in a compact manner due to the conflation of multiple layers of information. Therefore, for Aim 2 of this grant, we developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we organized the E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed

  18. miRQuest: integration of tools on a Web server for microRNA research.

    Science.gov (United States)

    Aguiar, R R; Ambrosio, L A; Sepúlveda-Hermosilla, G; Maracaja-Coutinho, V; Paschoal, A R

    2016-03-28

    This report describes miRQuest, a novel middleware available on a Web server that allows the end user to conduct miRNA research in a user-friendly way. Many prediction tools for microRNA (miRNA) identification exist, using different programming languages and methods, and it is difficult to understand each tool and apply it to the diverse datasets and organisms available for miRNA analysis. miRQuest can easily be used by biologists and researchers with limited experience in bioinformatics. We built it using a middleware architecture on a Web platform for miRNA research that performs two main functions: i) integration of different miRNA prediction tools for miRNA identification in a user-friendly environment; and ii) comparison of these prediction tools. In both cases, the user provides sequences (in FASTA format) as the input set for the analyses and comparisons. All the tools were selected on the basis of a survey of the literature on available tools for miRNA prediction. As results, three different use cases of the tools are also described, one of which is miRNA identification analysis in 30 different species. Finally, miRQuest seems to be a novel and useful tool; it is freely available for both benchmarking and miRNA identification at http://mirquest.integrativebioinformatics.me/.

  19. Integration of distributed system simulation tools for a holistic approach to integrated building and system design

    NARCIS (Netherlands)

    Radosevic, M.; Hensen, J.L.M.; Wijsman, A.J.T.M.; Hensen, J.L.M.; Lain, M.

    2004-01-01

    Advanced architectural developments require an integrated approach to design, whereas the simulation tools available today deal only with a small subset of the overall problem. The aim of this study is to enable run-time exchange of the necessary data, at a suitable frequency, between different simulation

  20. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    Science.gov (United States)

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method had not been assessed, nor had it been implemented as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
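    The principal angles that give PAEA its name are a standard linear-algebra construction: orthonormalize each matrix and take the singular values of the product of the resulting bases. A sketch using random matrices as stand-ins for gene-expression subspaces (this illustrates the geometry only, not the authors' implementation):

    ```python
    import numpy as np

    def principal_angles(A, B):
        """Principal angles (radians) between the column spaces of A and B."""
        Qa, _ = np.linalg.qr(A)                       # orthonormal basis of span(A)
        Qb, _ = np.linalg.qr(B)                       # orthonormal basis of span(B)
        s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
        return np.arccos(np.clip(s, -1.0, 1.0))       # clip guards rounding error

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 3))                  # 50 "genes", 3-dim subspace
    angles_self = principal_angles(A, A)              # identical subspaces -> ~0
    angles_rand = principal_angles(A, rng.standard_normal((50, 3)))
    print(angles_self, angles_rand)
    ```

    Small principal angles indicate closely aligned subspaces; an enrichment method built on them ranks gene sets by how well their subspace aligns with the differential-expression signal.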

  1. Picante: R tools for integrating phylogenies and ecology.

    Science.gov (United States)

    Kembel, Steven W; Cowan, Peter D; Helmus, Matthew R; Cornwell, William K; Morlon, Helene; Ackerly, David D; Blomberg, Simon P; Webb, Campbell O

    2010-06-01

    Picante is a software package that provides a comprehensive set of tools for analyzing the phylogenetic and trait diversity of ecological communities. The package calculates phylogenetic diversity metrics, performs trait comparative analyses, manipulates phenotypic and phylogenetic data, and performs tests for phylogenetic signal in trait distributions, community structure and species interactions. Picante is a package for the R statistical language and environment written in R and C, released under a GPL v2 open-source license, and freely available on the web (http://picante.r-forge.r-project.org) and from CRAN (http://cran.r-project.org).
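    One of the metrics Picante computes, Faith's phylogenetic diversity (PD), is simply the total branch length spanned by a community's tips. A minimal Python sketch on a hypothetical four-tip tree (Picante itself is an R package operating on phylo objects, so this is an illustration of the metric, not of Picante's API):

    ```python
    def faith_pd(tree, community):
        """Faith's PD: total branch length spanned by the community's tips.
        `tree` maps each non-root node to (parent, branch_length)."""
        counted = set()
        total = 0.0
        for tip in community:
            node = tip
            while node in tree:              # walk up toward the root
                if node in counted:
                    break                    # shared path already counted
                counted.add(node)
                parent, length = tree[node]
                total += length
                node = parent
        return total

    # Hypothetical tree in Newick form: ((A:1,B:1):2,(C:1,D:1):2);
    tree = {"A": ("n1", 1.0), "B": ("n1", 1.0),
            "C": ("n2", 1.0), "D": ("n2", 1.0),
            "n1": ("root", 2.0), "n2": ("root", 2.0)}
    print(faith_pd(tree, {"A", "B"}))   # 1 + 1 + 2 = 4.0
    print(faith_pd(tree, {"A", "C"}))   # 1 + 2 + 1 + 2 = 6.0
    ```

    Note that {A, C} scores higher than {A, B} despite equal size: PD rewards communities whose members are phylogenetically distant.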

  2. Evaluating the Utility of Web-Based Consumer Support Tools Using Rough Sets

    Science.gov (United States)

    Maciag, Timothy; Hepting, Daryl H.; Slezak, Dominik; Hilderman, Robert J.

    On the Web, many popular e-commerce sites provide consumers with decision support tools to assist them in their commerce-related decision-making. Many consumers rank the utility of these tools quite highly. Data obtained from web usage mining analyses, which may provide knowledge about a user's online experiences, could help indicate the utility of these tools. This type of analysis could provide insight into whether the provided tools adequately assist consumers in conducting their online shopping activities or whether new or additional enhancements need consideration. Although some research in this regard has been described in previous literature, there is still much that can be done. The authors of this paper hypothesize that a measurement of consumer decision accuracy, i.e. a measurement of preferences, could help indicate the utility of these tools. This paper describes a procedure developed towards this goal using elements of rough set theory. The authors evaluated the procedure using two support tools, one based on a tool developed by the US-EPA and the other, called cogito, developed by one of the authors. Results from the evaluation provided interesting insights into the utility of both support tools. Although the cogito tool obtained slightly higher decision accuracy, both tools could be improved with additional enhancements. Details of the procedure developed and the results obtained from the evaluation are provided. Opportunities for future work are also discussed.
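    The rough-set machinery such a procedure rests on is the pair of lower and upper approximations of a target set with respect to an indiscernibility partition. A sketch with invented user IDs (the paper's actual attributes and decision classes are not reproduced here):

    ```python
    def approximations(partition, target):
        """Rough-set lower and upper approximations of `target`, where
        `partition` is a list of disjoint equivalence classes (blocks).
        Lower: blocks certainly inside the target.
        Upper: blocks possibly overlapping the target."""
        lower, upper = set(), set()
        for block in partition:
            if block <= target:
                lower |= block
            if block & target:
                upper |= block
        return lower, upper

    # Hypothetical users grouped into blocks by indistinguishable click behaviour.
    partition = [{"u1", "u2"}, {"u3"}, {"u4", "u5"}]
    accurate = {"u1", "u2", "u4"}   # users whose final choice matched preferences
    lower, upper = approximations(partition, accurate)
    print(lower, upper)
    ```

    The ratio |lower| / |upper| (here 2/4 = 0.5) is the classical rough-set accuracy measure: how sharply the observed behaviour lets us delimit the "accurate decision" group.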

  3. Bond graphs : an integrating tool for design of mechatronic systems

    International Nuclear Information System (INIS)

    Ould Bouamama, B.

    2011-01-01

    The bond graph is a powerful tool, well known for dynamic modelling of multiphysical systems: it is the only modelling technique that automatically generates state-space or non-linear models using dedicated software tools (CAMP-G, 20-Sim, Symbols, Dymola...). Recently, several fundamental theories have been developed for using a bond graph model not only for modelling but as a truly integrated tool carrying a design from conceptual ideas to the optimal practical realization of a mechatronic system. This keynote presents a synthesis of these new theories, which exploit particular properties (causal, structural and behavioural) of this graphical methodology. Based on a pedagogical example, it is shown how, starting from a physical system (not a transfer function or state equation) and using only one representation (the bond graph), the following results can be obtained: modelling (formal generation of state equations), control analysis (observability, controllability, structural I/O decouplability, dynamic decoupling, ...), diagnosis analysis (automatic generation of robust fault indicators, sensor placement, structural diagnosability) and, finally, sizing of actuators. The presentation is illustrated by real industrial applications. Limits and perspectives of bond graph theory conclude the keynote.

  4. Setting the right path and pace for integration.

    Science.gov (United States)

    Cwiek, Katherine A; Inniger, Meredith C; Zismer, Daniel K

    2014-04-01

    Far from being a monolithic trend, integration in health care today is progressing in various forms, and at different rates in different markets within and across the range of healthcare organizations. Each organization should develop a tailored strategy that delineates the level and type of integration it will pursue and at what pace to pursue it. This effort will require evaluation of external market conditions with respect to integration and competition and a candid assessment of intraorganizational integration. The compared results of the two analyses will provide the basis for formulating strategy.

  5. Tools and approaches for simplifying serious games development in educational settings

    OpenAIRE

    Calvo, Antonio; Rotaru, Dan C.; Freire, Manuel; Fernandez-Manjon, Baltasar

    2016-01-01

    Serious Games can benefit from the commercial video games industry by taking advantage of current development tools. However, the economics and requirements of serious games and commercial games are very different. In this paper, we describe the factors that impact the total cost of ownership of serious games used in educational settings, review the specific requirements of games used as learning material, and analyze the different development tools available in the industry highlighting thei...

  6. Integration of oncology and palliative care: setting a benchmark.

    Science.gov (United States)

    Vayne-Bossert, P; Richard, E; Good, P; Sullivan, K; Hardy, J R

    2017-10-01

    Integration of oncology and palliative care (PC) should be the standard model of care for patients with advanced cancer. An expert panel has developed criteria that constitute integration. This study determined whether the PC service within this Health Service, which is considered to be fully "integrated", could be benchmarked against these criteria. A survey was undertaken to determine the perceived level of integration of oncology and palliative care among all health care professionals (HCPs) within our cancer centre. An objective determination of integration was obtained from chart reviews of deceased patients. Integration was defined as >70% of all respondents answering "agree" or "strongly agree" to each indicator and >70% of patient charts supporting each criterion. Thirty-four HCPs participated in the survey (response rate 69%). Over 90% were aware of the outpatient PC clinic, the interdisciplinary and consultation team, PC senior leadership, and the acceptance of concurrent anticancer therapy. None of the other criteria met the 70% agreement mark, but many respondents lacked the necessary knowledge to respond. The chart review included 67 patients, 92% of whom were seen by the PC team prior to death. The median time from referral to death was 103 days (range 0-1347). The level of agreement across all criteria was below our predefined definition of integration. The integration criteria relating to service delivery are medically focused and do not lend themselves to interdisciplinary review. The objective criteria can be audited and serve both as a benchmark and as a basis for improvement activities.

  7. Setting Up Decision-Making Tools toward a Quality-Oriented Participatory Maize Breeding Program

    Science.gov (United States)

    Alves, Mara L.; Brites, Cláudia; Paulo, Manuel; Carbas, Bruna; Belo, Maria; Mendes-Moreira, Pedro M. R.; Brites, Carla; Bronze, Maria do Rosário; Gunjača, Jerko; Šatović, Zlatko; Vaz Patto, Maria C.

    2017-01-01

    Previous studies have reported promising differences in the quality of kernels from farmers' maize populations collected in a Portuguese region known to produce maize-based bread. However, several limitations have been identified in the previous characterizations of those populations, such as the limited set of quality traits assessed and the lack of an accurate agronomic performance evaluation. The objectives of this study were to perform a more detailed quality characterization of Portuguese farmers' maize populations; to estimate their agronomic performance in a broader range of environments; and to integrate quality, agronomic, and molecular data in the setting up of decision-making tools for the establishment of a quality-oriented participatory maize breeding program. Sixteen farmers' maize populations, together with 10 other maize populations chosen for comparison purposes, were multiplied in a common-garden experiment for quality evaluation. Flour obtained from each population was used to study kernel composition (protein, fat, fiber), the flour's pasting behavior, and bioactive compound levels (carotenoids, tocopherols, phenolic compounds). These maize populations were evaluated for grain yield and ear weight in nine locations across Portugal; the populations' adaptability and stability were evaluated using additive main effects and multiplicative interaction (AMMI) model analysis. The phenotypic characterization of each population was complemented with a molecular characterization, in which 30 individuals per population were genotyped with 20 microsatellites. Almost all farmers' populations were clustered into the same quality-group, characterized by high levels of protein and fiber, low levels of carotenoids, volatile aldehydes, α- and δ-tocopherols, and breakdown viscosity. Within this quality-group, variability in particular quality traits (color and some bioactive compounds) could still be found. Regarding the agronomic performance, farmers' maize populations
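    The AMMI analysis mentioned above combines an additive ANOVA for genotype and environment main effects with an SVD of the interaction residuals. A compact sketch on synthetic yield data (the real analysis also involves significance testing to choose how many multiplicative axes to keep):

    ```python
    import numpy as np

    def ammi(Y, k=1):
        """AMMI sketch: Y is a genotype-by-environment mean-yield matrix.
        Returns the additive fit and the first k multiplicative
        (interaction) terms from an SVD of the residuals."""
        grand = Y.mean()
        g = Y.mean(axis=1, keepdims=True) - grand   # genotype main effects
        e = Y.mean(axis=0, keepdims=True) - grand   # environment main effects
        additive = grand + g + e
        U, s, Vt = np.linalg.svd(Y - additive, full_matrices=False)
        interaction = (U[:, :k] * s[:k]) @ Vt[:k]
        return additive, interaction

    rng = np.random.default_rng(1)
    Y = rng.normal(5.0, 1.0, size=(6, 4))           # 6 populations, 4 locations
    additive, interaction = ammi(Y, k=1)
    residual = Y - additive - interaction
    print(np.abs(residual).max())
    ```

    The genotype scores on the leading interaction axes (columns of U scaled by s) are what AMMI biplots display to judge each population's adaptability and stability across environments.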

  8. Setting Up Decision-Making Tools toward a Quality-Oriented Participatory Maize Breeding Program

    Directory of Open Access Journals (Sweden)

    Mara L. Alves

    2017-12-01

    Full Text Available Previous studies have reported promising differences in the quality of kernels from farmers' maize populations collected in a Portuguese region known to produce maize-based bread. However, several limitations have been identified in the previous characterizations of those populations, such as the limited set of quality traits assessed and the lack of an accurate agronomic performance evaluation. The objectives of this study were to perform a more detailed quality characterization of Portuguese farmers' maize populations; to estimate their agronomic performance in a broader range of environments; and to integrate quality, agronomic, and molecular data in the setting up of decision-making tools for the establishment of a quality-oriented participatory maize breeding program. Sixteen farmers' maize populations, together with 10 other maize populations chosen for comparison purposes, were multiplied in a common-garden experiment for quality evaluation. Flour obtained from each population was used to study kernel composition (protein, fat, fiber), the flour's pasting behavior, and bioactive compound levels (carotenoids, tocopherols, phenolic compounds). These maize populations were evaluated for grain yield and ear weight in nine locations across Portugal; the populations' adaptability and stability were evaluated using additive main effects and multiplicative interaction (AMMI) model analysis. The phenotypic characterization of each population was complemented with a molecular characterization, in which 30 individuals per population were genotyped with 20 microsatellites. Almost all farmers' populations were clustered into the same quality-group, characterized by high levels of protein and fiber, low levels of carotenoids, volatile aldehydes, α- and δ-tocopherols, and breakdown viscosity. Within this quality-group, variability in particular quality traits (color and some bioactive compounds) could still be found. Regarding the agronomic performance, farmers

  9. Validating the WHO maternal near miss tool: comparing high- and low-resource settings.

    Science.gov (United States)

    Witteveen, Tom; Bezstarosti, Hans; de Koning, Ilona; Nelissen, Ellen; Bloemenkamp, Kitty W; van Roosmalen, Jos; van den Akker, Thomas

    2017-06-19

    WHO proposed the WHO Maternal Near Miss (MNM) tool, classifying women according to several (potentially) life-threatening conditions, to monitor and improve quality of obstetric care. The objective of this study is to analyse merged data of one high- and two low-resource settings where this tool was applied and test whether the tool may be suitable for comparing severe maternal outcome (SMO) between these settings. Using three cohort studies that included SMO cases during two-year time frames in the Netherlands, Tanzania and Malawi, we reassessed all SMO cases (as defined by the original studies) with the WHO MNM tool (five disease-, four intervention- and seven organ dysfunction-based criteria). Main outcome measures were prevalence of MNM criteria and case fatality rates (CFR). A total of 3172 women were studied; 2538 (80.0%) from the Netherlands, 248 (7.8%) from Tanzania and 386 (12.2%) from Malawi. Total SMO detection was 2767 (87.2%) for disease-based criteria, 2504 (78.9%) for intervention-based criteria and 1211 (38.2%) for organ dysfunction-based criteria. Including every woman who received ≥1 unit of blood in low-resource settings as life-threatening, as defined by organ dysfunction criteria, led to more equally distributed populations. In one third of all Dutch and Malawian maternal death cases, organ dysfunction criteria could not be identified from medical records. Applying solely organ dysfunction-based criteria may lead to underreporting of SMO. Therefore, a tool based on defining MNM only upon establishing organ failure is of limited use for comparing settings with varying resources. In low-resource settings, lowering the threshold of transfused units of blood leads to a higher detection rate of MNM. We recommend refined disease-based criteria, accompanied by a limited set of intervention- and organ dysfunction-based criteria to set a measure of severity.

  10. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    Science.gov (United States)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly, with limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy to perform changes, and centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the 'middleware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.

  11. Introduction to Integrative Medicine in the Primary Care Setting.

    Science.gov (United States)

    Ring, Melinda; Mahadevan, Rupa

    2017-06-01

    Integrative Medicine has been described as "healing oriented medicine that takes account of the whole person (body, mind, and spirit) including all aspects of lifestyle. It emphasizes therapeutic relationships and makes use of all appropriate therapies, both conventional and alternative." National surveys consistently report that approximately one-third of adults and 12% of children use complementary and integrative medicine approaches. Although there are barriers to primary care professionals engaging in discussions about lifestyle change and complementary and integrative medicine options, there is also great potential to impact patient well-being. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. An integrated audio-visual impact tool for wind turbine installations

    International Nuclear Information System (INIS)

    Lymberopoulos, N.; Belessis, M.; Wood, M.; Voutsinas, S.

    1996-01-01

    An integrated software tool was developed for the design of wind parks that takes into account their visual and audio impact. The application is built on a powerful hardware platform and is fully operated through a graphic user interface. The topography, the wind turbines and the daylight conditions are realised digitally. The wind park can be animated in real time and the user can take virtual walks in it while the set-up of the park can be altered interactively. In parallel, the wind speed levels on the terrain, the emitted noise intensity, the annual energy output and the cash flow can be estimated at any stage of the session and prompt the user for rearrangements. The tool has been used to visually simulate existing wind parks in St. Breok, UK and Andros Island, Greece. The results lead to the conclusion that such a tool can assist the public acceptance and licensing procedures of wind parks. (author)

  13. Integration of educational methods and physical settings: Design ...

    African Journals Online (AJOL)

    ... setting without having an architectural background. The theoretical framework of the research allows designers to consider key features and users' possible activities in High/ Scope settings and shape their designs accordingly. Keywords: daily activity; design; High/Scope education; interior space; teaching method ...

  14. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large-scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change in the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...
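The record above describes re-verifying every requirement whenever the PLC program changes, since a single edit can have collateral effects. A toy Python sketch of that idea follows; the hash-based change detection and the string-matching "checks" are assumptions for illustration, not PLCverif's model checking or Jenkins' actual trigger mechanism.

```python
# Toy illustration: when the program text changes, rerun every requirement
# check, because one edit can violate any requirement.
import hashlib

def digest(text):
    # Fingerprint of the program source; a new digest means "re-verify".
    return hashlib.sha256(text.encode()).hexdigest()

def verify_all(program, requirements):
    # Run every requirement check against the current program version.
    return {name: check(program) for name, check in requirements.items()}

requirements = {
    "has_estop": lambda p: "ESTOP" in p,
    "has_interlock": lambda p: "INTERLOCK" in p,
}

program_v1 = "IF ESTOP THEN ... INTERLOCK ..."
program_v2 = "IF ESTOP THEN ..."  # developer removed the interlock logic

results = verify_all(program_v1, requirements)
if digest(program_v2) != digest(program_v1):
    # Change detected: rerun the whole suite, not just one verification case.
    results = verify_all(program_v2, requirements)
print(results)  # the edit broke the interlock requirement
```

In a real CI setup the commit hook, rather than a hash comparison, would trigger the run.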

  15. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, as well as enabling experimental measurements after compiling to configurable systems, in the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the high-level block description by the user to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list for the resulting configurable analog–digital system. The resulting tool uses an analog and mixed-signal library of components, enabling users and future researchers access to the basic analog operations/computations that are possible.

  16. DR-Integrator: a new analytic tool for integrating DNA copy number and gene expression data.

    Science.gov (United States)

    Salari, Keyan; Tibshirani, Robert; Pollack, Jonathan R

    2010-02-01

    DNA copy number alterations (CNA) frequently underlie gene expression changes by increasing or decreasing gene dosage. However, only a subset of genes with altered dosage exhibit concordant changes in gene expression. This subset is likely to be enriched for oncogenes and tumor suppressor genes, and can be identified by integrating these two layers of genome-scale data. We introduce DNA/RNA-Integrator (DR-Integrator), a statistical software tool to perform integrative analyses on paired DNA copy number and gene expression data. DR-Integrator identifies genes with significant correlations between DNA copy number and gene expression, and implements a supervised analysis that captures genes with significant alterations in both DNA copy number and gene expression between two sample classes. DR-Integrator is freely available for non-commercial use from the Pollack Lab at http://pollacklab.stanford.edu/ and can be downloaded as a plug-in application to Microsoft Excel and as a package for the R statistical computing environment. The R package is available under the name 'DRI' at http://cran.r-project.org/. An example analysis using DR-Integrator is included as supplemental material. Supplementary data are available at Bioinformatics online.
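The integrative step DR-Integrator performs can be illustrated with a small, self-contained sketch: per gene, correlate paired copy-number and expression profiles across samples and keep the concordant genes. This is not the tool's code; the gene names, values, and the 0.8 cutoff are made up for illustration.

```python
# Minimal sketch: genes whose expression tracks their DNA dosage.
from math import sqrt

def pearson(x, y):
    # Pearson correlation of two equal-length numeric sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def concordant_genes(copy_number, expression, r_min=0.8):
    # Genes whose copy number and expression correlate at or above r_min.
    return sorted(g for g in copy_number
                  if pearson(copy_number[g], expression[g]) >= r_min)

copy_number = {"MYC": [2, 3, 5, 6], "TP53": [2, 2, 1, 2]}   # per-sample values
expression = {"MYC": [1.0, 1.4, 2.1, 2.6], "TP53": [1.2, 0.3, 1.1, 0.4]}
print(concordant_genes(copy_number, expression))  # only MYC tracks its dosage
```

The actual tool additionally supports a supervised two-class analysis and significance testing, which this sketch omits.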

  17. Integration of tools for binding archetypes to SNOMED CT.

    Science.gov (United States)

    Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Ahlfeldt, Hans; Rector, Alan

    2008-10-27

    The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail.

  18. Using Plickers as an Assessment Tool in Health and Physical Education Settings

    Science.gov (United States)

    Chng, Lena; Gurvitch, Rachel

    2018-01-01

    Written tests are one of the most common assessment tools classroom teachers use today. Despite its popularity, administering written tests or surveys, especially in health and physical education settings, is time consuming. In addition to the time taken to type and print out the tests or surveys, health and physical education teachers must grade…

  19. Am I getting an accurate picture: a tool to assess clinical handover in remote settings?

    Directory of Open Access Journals (Sweden)

    Malcolm Moore

    2017-11-01

    Full Text Available Abstract Background: Good clinical handover is critical to safe medical care. Little research has investigated handover in rural settings. In a remote setting where nurses and medical students give telephone handover to an aeromedical retrieval service, we developed a tool by which the receiving clinician might assess the handover, and investigated factors impacting on the reliability and validity of that assessment. Methods: Researchers consulted with clinicians to develop an assessment tool, based on the ISBAR handover framework, combining validity evidence and the existing literature. The tool was applied 'live' by receiving clinicians and from recorded handovers by academic assessors. The tool's performance was analysed using generalisability theory. Receiving clinicians and assessors provided feedback. Results: Reliability for assessing a call was good (G = 0.73 with 4 assessments). The scale had a single-factor structure with good internal consistency (Cronbach's alpha = 0.8). The group mean for the global score for nurses and students was 2.30 (SD 0.85) out of a maximum of 3.0, with no difference between these sub-groups. Conclusions: We have developed and evaluated a tool to assess high-stakes handover in a remote setting. It showed good reliability and was easy for working clinicians to use. Further investigation and use is warranted beyond this setting.
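The internal-consistency figure this record quotes (Cronbach's alpha = 0.8) comes from the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A minimal sketch follows; the four scale items and five scored handovers are hypothetical data, not the study's, and merely show how an alpha near 0.8 arises.

```python
# Cronbach's alpha from per-item score lists (toy data).
def variance(xs):
    # Unbiased sample variance.
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    # items: one list of scores per scale item, aligned by respondent.
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - sum(variance(it) for it in items) / variance(totals))

items = [
    [3, 2, 3, 1, 2],  # hypothetical ISBAR-style item scored over 5 handovers
    [3, 2, 2, 1, 2],
    [2, 3, 3, 1, 1],
    [3, 2, 3, 2, 2],
]
print(round(cronbach_alpha(items), 2))  # about 0.84 for this toy data
```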

  20. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    Science.gov (United States)

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  1. A set of tools for determining the LAT performance in specific applications

    International Nuclear Information System (INIS)

    Lott, B.; Ballet, J.; Chiang, J.; Lonjou, V.; Funk, S.

    2007-01-01

    The poster presents a set of simple tools being developed to predict GLAST's performance for specific cases, like the accumulation time needed to reach a given significance or statistical accuracy for a particular source. Different examples are given, like the generation of a full-sky sensitivity map

  2. Level Set Structure of an Integrable Cellular Automaton

    Directory of Open Access Journals (Sweden)

    Taichiro Takagi

    2010-03-01

    Full Text Available Based on a group theoretical setting a sort of discrete dynamical system is constructed and applied to a combinatorial dynamical system defined on the set of certain Bethe ansatz related objects known as the rigged configurations. This system is then used to study a one-dimensional periodic cellular automaton related to discrete Toda lattice. It is shown for the first time that the level set of this cellular automaton is decomposed into connected components and every such component is a torus.

  3. COMSY- A Software Tool For Aging And Plant Life Management With An Integrated Documentation Tool

    International Nuclear Information System (INIS)

    Baier, Roman; Zander, Andre

    2008-01-01

    For aging and plant life management, the integrity of the mechanical components and structures is one of the key objectives. In order to ensure this integrity it is essential to implement comprehensive aging management. This should be applied to all safety-relevant mechanical systems or components, civil structures, electrical systems as well as instrumentation and control (I and C). The following aspects should be covered: - Identification and assessment of relevant degradation mechanisms; - Verification and evaluation of the quality status of all safety-relevant systems, structures and components (SSCs); - Verification and modernization of I and C and electrical systems; - Reliable and up-to-date documentation. To support this, AREVA NP GmbH has developed the computer program COMSY, which utilizes more than 30 years of experience resulting from research activities and operational experience. The program provides the option to perform a plant-wide screening to identify system areas that are sensitive to specific degradation mechanisms. Another objective is the administration and evaluation of NDE measurements from different techniques. An integrated documentation tool makes document management and maintenance fast, reliable and independent of staff service. (authors)

  4. National Integration in Multicultural School Setting in Malaysia

    Science.gov (United States)

    Nordin, Abu Bakar; Alias, Norlidah; Siraj, Saedah

    2013-01-01

    Malaysia is a multicultural country constituting three major ethno-cultural groups, Malay and Bumiputera, Chinese and Indian. Owing to its diverse cultures attempts through a number of channels, politics, economics and social were made to bring about national integration. School is thought to be the most effective platform to bring about national…

  5. Wari Construction Set Integrating Technology with Multicultural Mathematics.

    Science.gov (United States)

    Fowler, David

    1996-01-01

    Describes a Hypercard stack for playing one of many versions of the African game wari. Students can design their own variations of the game by determining the initial number of pieces and the number of pieces required for a capture. A list of activities related to the program and some recommendations about the integration of technology into…

  6. Setting up measuring campaigns for integrated wastewater modelling

    NARCIS (Netherlands)

    Vanrolleghem, P.A.; Schilling, W.; Rauch, W.; Krebs, P.; Aalderink, R.H.

    1999-01-01

    The steps of calibration/confirmation of models in a suggested 11-step procedure for analysis, planning and implementation of integrated urban wastewater management systems are focused upon in this paper. Based on ample experience obtained in comprehensive investigations throughout Europe

  7. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

    The United States National Science Foundation funded PermaData project led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P) aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly speed up the time of manual processing needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage this data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets. Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative, data manipulation and quality control process of visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open source tool in GitHub packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager.
DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets
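The widget-chaining idea this record describes can be sketched in a few lines: each widget performs one operation and the user orders them freely. The widget names and the pipeline runner below are assumptions for illustration, not the actual DIT API.

```python
# Minimal widget pipeline: each widget is a function from data to data.
def drop_missing(sentinel=-999.0):
    # Remove sentinel-coded missing values (a common raw-data cleanup step).
    return lambda data: [x for x in data if x != sentinel]

def multiply_by(c):
    # Scale every value by a constant, e.g. a unit conversion.
    return lambda data: [x * c for x in data]

def sort_ascending():
    return lambda data: sorted(data)

def run_pipeline(data, widgets):
    # Feed the output of each widget into the next one, in user-chosen order.
    for widget in widgets:
        data = widget(data)
    return data

# Clean a raw series, convert tenths of a degree to degrees, then sort.
raw = [-999.0, 120.0, -31.0, 7.5]
clean = run_pipeline(raw, [drop_missing(), multiply_by(0.1), sort_ascending()])
print(clean)  # smallest value first; the -999.0 sentinel is gone
```

Reordering or swapping the widget list is all it takes to adapt the pipeline to a new dataset, which is the flexibility the tool aims for.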

  8. Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program

    Science.gov (United States)

    Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas

    2016-04-01

    The recently adopted Sustainable Development Goals (SDGs) - a set of 17 measurable and time-bound goals with 169 associated targets for 2030 - are highly inclusive challenges before the world community, ranging from eliminating poverty to human rights, inequality, a secure world and protection of the environment. Each individual goal or target by itself presents an enormous task; taken together they are overwhelming. There are strong and weak interlinkages, and hence trade-offs and complementarities, among goals and targets. Some targets may affect several goals while other goals and targets may conflict or be mutually exclusive. Meeting each of these requires the judicious exploitation of resources, with energy playing an important role. Such complexity demands to be addressed in an integrated way using systems analysis tools to support informed policy formulation, planning, allocation of scarce resources, monitoring progress, effectiveness and review at different scales. There is no one-size-fits-all methodology that could conceivably include all goals and targets simultaneously. But there are methodologies encapsulating critical subsets of the goals and targets with strong interlinkages, with a 'soft' reflection on the weak interlinkages. Universal food security or sustainable energy for all inherently supports goals and targets on human rights and equality, but possibly at the cost of biodiversity or desertification. Integrated analysis and planning tools are not yet commonplace at national universities - or indeed in many policy-making organs. What is needed is a fundamental realignment of institutions and integration of their planning processes and decision making. We introduce a series of open source tools to support the SDG planning and implementation process. The Global User-friendly CLEW Open Source (GLUCOSE) tool optimizes resource interactions and constraints; the Global Electrification Toolkit (GETit) provides the first global spatially explicit

  9. Tools for integrating environmental objectives into policy and practice: What works where?

    Energy Technology Data Exchange (ETDEWEB)

    Runhaar, Hens

    2016-07-15

    An abundance of approaches, strategies, and instruments – in short: tools – have been developed that intend to stimulate or facilitate the integration of a variety of environmental objectives into development planning, national or regional sectoral policies, international agreements, business strategies, etc. These tools include legally mandatory procedures, such as Environmental Impact Assessment and Strategic Environmental Assessment; more voluntary tools such as environmental indicators developed by scientists and planning tools; green budgeting, etc. A relatively underexplored question is what integration tool fits what particular purposes and contexts, in short: “what works where?”. This paper intends to contribute to answering this question, by first providing conceptual clarity about what integration entails, by suggesting and illustrating a classification of integration tools, and finally by summarising some of the lessons learned about how and why integration tools are (not) used and with what outcomes, particularly in terms of promoting the integration of environmental objectives.

  10. Tools for integrating environmental objectives into policy and practice: What works where?

    International Nuclear Information System (INIS)

    Runhaar, Hens

    2016-01-01

    An abundance of approaches, strategies, and instruments – in short: tools – have been developed that intend to stimulate or facilitate the integration of a variety of environmental objectives into development planning, national or regional sectoral policies, international agreements, business strategies, etc. These tools include legally mandatory procedures, such as Environmental Impact Assessment and Strategic Environmental Assessment; more voluntary tools such as environmental indicators developed by scientists and planning tools; green budgeting, etc. A relatively underexplored question is what integration tool fits what particular purposes and contexts, in short: “what works where?”. This paper intends to contribute to answering this question, by first providing conceptual clarity about what integration entails, by suggesting and illustrating a classification of integration tools, and finally by summarising some of the lessons learned about how and why integration tools are (not) used and with what outcomes, particularly in terms of promoting the integration of environmental objectives.

  11. Setting up measuring campaigns for integrated wastewater modelling

    DEFF Research Database (Denmark)

    Vanrolleghem, P.A.; Schilling, W.; Rauch, Wolfgang

    1999-01-01

    The steps of calibration/confirmation of models in a suggested 11-step procedure for analysis, planning and implementation of integrated urban wastewater management systems are focused upon in this paper. Based on ample experience obtained in comprehensive investigations throughout Europe recommen... problems related to suspended solids, specific contaminants, hygienic hazards and total pollutant loss illustrate the recommendations presented. (C) 1999 IAWQ Published by Elsevier Science Ltd. All rights reserved....

  12. Developing an integration tool for soil contamination assessment

    Science.gov (United States)

    Anaya-Romero, Maria; Zingg, Felix; Pérez-Álvarez, José Miguel; Madejón, Paula; Kotb Abd-Elmabod, Sameh

    2015-04-01

    In the last decades, huge soil areas have been negatively influenced or altered in multiple forms. Soils and, consequently, underground water have been contaminated by accumulation of contaminants from agricultural activities (fertilizers and pesticides), industrial activities (harmful material dumping, sludge, fly ash) and urban activities (hydrocarbons, metals from vehicle traffic, urban waste dumping). In the framework of the RECARE project, local partners across Europe are focusing on a wide range of soil threats, such as soil contamination, and aiming to develop effective prevention, remediation and restoration measures by designing and applying targeted land management strategies (van Lynden et al., 2013). In this context, the Guadiamar Green Corridor (Southern Spain) was used as a case study, aiming to obtain soil data and new information in order to assess soil contamination. The main threat in the Guadiamar valley is soil contamination after a mine spill that occurred in April 1998. About four hm3 of acid water and two hm3 of mud, rich in heavy metals, were released into the Agrio and Guadiamar rivers, affecting more than 4,600 ha of agricultural and pasture land. The main trace elements contaminating soil and water were As, Cd, Cu, Pb, Tl and Zn. The objective of the present research is to develop informatics tools that integrate soil databases, models and interactive platforms for soil contamination assessment. Preliminary results were obtained related to the compilation of harmonized databases including geographical, hydro-meteorological, soil and socio-economic variables based on spatial analysis and stakeholders' consultation. Further research will address modelling and upscaling at the European level, in order to obtain a scientific-technical predictive tool for the assessment of soil contamination.

  13. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and using antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings are collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured into a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  14. Tool life and surface integrity aspects when drilling nickel alloy

    Science.gov (United States)

    Kannan, S.; Pervaiz, S.; Vincent, S.; Karthikeyan, R.

    2018-04-01

    ... Overall, the results indicate that the effect of drilling and milling parameters is most marked in terms of surface quality in the circumferential direction. Material removal rates and tool flank wear must be maintained within the control limits to maintain hole integrity.

  15. Taming the data wilderness with the VHO: Integrating heliospheric data sets

    Science.gov (United States)

    Schroeder, P.; Szabo, A.; Narock, T.

    Currently, space physicists are faced with a bewildering array of heliospheric missions, experiments and data sets available at archives distributed around the world. Daunting even for those most familiar with the field, physicists in other concentrations (solar physics, magnetospheric physics, etc.) find locating the heliospheric data that they need extremely challenging, if not impossible. The Virtual Heliospheric Observatory (VHO) will help to solve this problem by creating an Application Programming Interface (API) and web portal that integrates these data sets to find the highest quality data for a given task. The VHO will locate the best available data, often found only at PI institutions rather than at national archives like the NSSDC. The VHO will therefore facilitate a dynamic data environment where improved data products are made available immediately. In order to accomplish this, the VHO will enforce a metadata standard on participating data providers with sufficient depth to allow for meaningful scientific evaluation of similar data products. The VHO will provide an automated way for secondary sites to keep mirrors of data archives up to date, encouraging the generation of secondary or added-value data products. The VHO will interact seamlessly with the Virtual Solar Observatory (VSO) and other Virtual Observatories (VxOs) to allow for inter-disciplinary data searching. Software tools for these data sets will also be available through the VHO. Finally, the VHO will provide linkages to the modeling community and will develop metadata standards for the

  16. MiniWall Tool for Analyzing CFD and Wind Tunnel Large Data Sets

    Science.gov (United States)

    Schuh, Michael J.; Melton, John E.; Stremel, Paul M.

    2017-01-01

    It is challenging to review and assimilate large data sets created by Computational Fluid Dynamics (CFD) simulations and wind tunnel tests. Over the past 10 years, NASA Ames Research Center has developed and refined a software tool dubbed the MiniWall to increase productivity in reviewing and understanding large CFD-generated data sets. Under the recent NASA ERA project, the application of the tool expanded to enable rapid comparison of experimental and computational data. The MiniWall software is browser-based, so it runs on any computer or device that can display a web page. It can also be used remotely and securely by using web server software such as the Apache HTTP server. The MiniWall software has recently been rewritten and enhanced to make it even easier for analysts to review large data sets and extract knowledge and understanding from these data sets. This paper describes the MiniWall software and demonstrates how the different features are used to review and assimilate large data sets.
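
    A MiniWall-style page is, at its core, a grid of linked plot thumbnails: one row per case, one column per condition. A minimal generator in Python might look like the following sketch; the directory layout and file names are illustrative assumptions, not the actual NASA implementation.

```python
import html

def miniwall_page(cases, columns, image_name="cp_contour.png"):
    """Render a MiniWall-style HTML grid: one row per case, one cell per
    condition, each cell a thumbnail linking to the full-size image.
    The <case>/<condition>/<image> file layout is a hypothetical example."""
    rows = []
    for case in cases:
        cells = "".join(
            f'<td><a href="{case}/{col}/{image_name}">'
            f'<img src="{case}/{col}/thumb_{image_name}" alt="{html.escape(case + " " + col)}"></a></td>'
            for col in columns
        )
        rows.append(f"<tr><th>{html.escape(case)}</th>{cells}</tr>")
    return "<table>\n" + "\n".join(rows) + "\n</table>"
```

    Serving the generated file through any web server (e.g. Apache, as the abstract notes) makes the wall available to remote analysts.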

  17. Implementation of a competency assessment tool for agency nurses working in an acute paediatric setting.

    LENUS (Irish Health Repository)

    Hennerby, Cathy

    2012-02-01

    AIM: This paper reports on the implementation of a competency assessment tool for registered general agency nurses working in an acute paediatric setting, using a change management framework. BACKGROUND: The increased number of registered general agency nurses working in an acute children's hospital raised concerns about their competency in working with children. These concerns were initially raised via informal complaints about 'near misses'

  18. AORN Ergonomic Tool 4: Solutions for Prolonged Standing in Perioperative Settings.

    Science.gov (United States)

    Hughes, Nancy L; Nelson, Audrey; Matz, Mary W; Lloyd, John

    2011-06-01

    Prolonged standing during surgical procedures poses a high risk of causing musculoskeletal disorders, including back, leg, and foot pain, which can be chronic or acute in nature. Ergonomic Tool 4: Solutions for Prolonged Standing in Perioperative Settings provides recommendations for relieving the strain of prolonged standing, including the use of antifatigue mats, supportive footwear, and sit/stand stools, that are based on well-accepted ergonomic safety concepts, current research, and access to new and emerging technology. Published by Elsevier Inc.

  19. Setting and solving several factorization problems for integral operators

    International Nuclear Information System (INIS)

    Engibaryan, N B

    2000-01-01

    The problem of factorization I − K = (I − U⁻)(I − U⁺) is considered. Here I is the identity operator and K is a fixed integral operator of Fredholm type: (Kf)(x) = ∫ₐᵇ k(x,t)f(t) dt, −∞ ≤ a < b ≤ +∞, while U⁺ and U⁻ are unknown upper and lower Volterra operators. Classes of generalized Volterra operators U± are introduced such that I − U± are not necessarily invertible operators in the spaces of functions on (a,b) under consideration. A combination of the method of non-linear factorization equations and a priori estimates brings forth new results on the existence and properties of the solution to this problem for k ≥ 0, both in the subcritical case μ < 1 and in the critical case μ = 1. The operators U⁺ and U⁻ vanish on some parts S⁻ and S⁺ of the domain S = (a,b)² such that S⁺ ∪ S⁻ = S.
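
    In a finite-dimensional analogue, discretizing I − K turns this factorization into an ordinary LU decomposition: the unit-lower and upper triangular factors play the roles of the Volterra factors I − U⁻ and I − U⁺. The sketch below is illustrative only (pure Python, toy midpoint-rule discretization, constant kernel, no pivoting), not the operator-theoretic construction of the paper.

```python
def lu_factor(A):
    """Doolittle factorization A = L*U with L unit lower triangular and U
    upper triangular -- the matrix analogue of (I - U-)(I - U+)."""
    n = len(A)
    L = [[float(i == j) for j in range(n)] for i in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):  # row i of U
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        for j in range(i + 1, n):  # column i of L (below the diagonal)
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U

def discretize(kernel, a, b, n):
    """Midpoint-rule (Nystrom-style) discretization of I - K on [a, b]."""
    h = (b - a) / n
    xs = [a + (i + 0.5) * h for i in range(n)]
    return [[(1.0 if i == j else 0.0) - h * kernel(xs[i], xs[j])
             for j in range(n)] for i in range(n)]
```

    For a nonnegative kernel whose integral operator has norm below one (the subcritical case), the discretized matrix is diagonally dominant and the factorization exists without pivoting.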

  20. Tav4SB: integrating tools for analysis of kinetic models of biological systems.

    Science.gov (United States)

    Rybiński, Mikołaj; Lula, Michał; Banasik, Paweł; Lasota, Sławomir; Gambin, Anna

    2012-04-05

    Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases integration of software tools for life science research and provides a common workflow-based framework for computational experiments in Biology. The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations, which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow users to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project, we apply multi-parameter sensitivity analysis. To visualize the results of model analysis, a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as the accessibility and usability of remote services.

  1. Utilizing 4-H in Afterschool Settings: Two Approaches for Integration

    Directory of Open Access Journals (Sweden)

    Rachel Rudd

    2013-03-01

    As our communities grow and change, afterschool programs represent an avenue to bring resources to populations which would otherwise not be available to them. Combining 4-H with the afterschool environment can be beneficial in supporting and raising the quality of afterschool programs being offered. This article explores the benefits and challenges of two approaches of implementing 4-H programming in afterschool settings: the 4-H managed program that is created and run solely by 4-H faculty and staff and the 4-H afterschool partnerships which are facilitated in partnership with existing afterschool programs. Regardless of the approach, combining 4-H with afterschool programs can strengthen well established programs and can enhance the quality of all afterschool programs.

  2. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. It enables knowledge engineers to perform their tasks consistently, from knowledge acquisition through knowledge verification.

  3. Surgical Technology Integration with Tools for Cognitive Human Factors (STITCH)

    Science.gov (United States)

    2010-10-01

    We conducted another round of data collection using the daVinci Surgical System at the University of Kentucky Hospital in May. The project considers cognitive and environmental factors such as mental workload, stress, situation awareness, and level of comfort with complex tools.

  4. Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis

    Directory of Open Access Journals (Sweden)

    Esther Suter

    2017-11-01

    Background: Despite far reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. Methods: We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. Findings: From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Discussion: Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, “overall integration” tools may be useful for a broad assessment of the overall state of a system. Conclusions: Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and context. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework.

  5. Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis

    Science.gov (United States)

    Oelke, Nelly D.; da Silva Lima, Maria Alice Dias; Stiphout, Michelle; Janke, Robert; Witt, Regina Rigatto; Van Vliet-Brown, Cheryl; Schill, Kaela; Rostami, Mahnoush; Hepp, Shelanne; Birney, Arden; Al-Roubaiai, Fatima; Marques, Giselda Quintana

    2017-01-01

    Background: Despite far reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. Methods: We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. Findings: From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Discussion: Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, “overall integration” tools may be useful for a broad assessment of the overall state of a system. Conclusions: Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and context. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework. PMID:29588637

  6. Indicators and measurement tools for health system integration: a knowledge synthesis protocol.

    Science.gov (United States)

    Oelke, Nelly D; Suter, Esther; da Silva Lima, Maria Alice Dias; Van Vliet-Brown, Cheryl

    2015-07-29

    accessible set of indicators and tools to measure health system integration across different contexts and cultures. Being able to evaluate the success of integration strategies and initiatives will lead to better health system design and improved health outcomes for patients.

  7. Integrating neuroinformatics tools in TheVirtualBrain.

    Science.gov (United States)

    Woodman, M Marmaduke; Pezard, Laurent; Domide, Lia; Knock, Stuart A; Sanz-Leon, Paula; Mersmann, Jochen; McIntosh, Anthony R; Jirsa, Viktor

    2014-01-01

    TheVirtualBrain (TVB) is a neuroinformatics Python package representing the convergence of clinical, systems, and theoretical neuroscience in the analysis, visualization and modeling of neural and neuroimaging dynamics. TVB is composed of a flexible simulator for neural dynamics measured across scales from local populations to large-scale dynamics measured by electroencephalography (EEG), magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI), and core analytic and visualization functions, all accessible through a web browser user interface. A datatype system modeling neuroscientific data ties together these pieces with persistent data storage, based on a combination of SQL and HDF5. These datatypes combine with adapters allowing TVB to integrate other algorithms or computational systems. TVB provides infrastructure for multiple projects and multiple users, possibly participating under multiple roles. For example, a clinician might import patient data to identify several potential lesion points in the patient's connectome. A modeler, working on the same project, tests these points for viability through whole brain simulation, based on the patient's connectome, and subsequent analysis of dynamical features. TVB also drives research forward: the simulator itself represents the culmination of several simulation frameworks in the modeling literature. The availability of the numerical methods, set of neural mass models and forward solutions allows for the construction of a wide range of brain-scale simulation scenarios. This paper briefly outlines the history and motivation for TVB, describing the framework and simulator, giving usage examples in the web UI and Python scripting.

  8. Integrating neuroinformatics tools in TheVirtualBrain

    Directory of Open Access Journals (Sweden)

    M Marmaduke Woodman

    2014-04-01

    TheVirtualBrain (TVB) is a neuroinformatics Python package representing the convergence of clinical, systems, and theoretical neuroscience in the analysis, visualization and modeling of neural and neuroimaging dynamics. TVB is composed of a flexible simulator for neural dynamics measured across scales from local populations to large-scale dynamics measured by electroencephalography (EEG), magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI), and core analytic and visualization functions, all accessible through a web browser user interface. A datatype system modeling neuroscientific data ties together these pieces with persistent data storage, based on a combination of SQL and HDF5. These datatypes combine with adapters allowing TVB to integrate other algorithms or computational systems. TVB provides infrastructure for multiple projects and multiple users, possibly participating under multiple roles. For example, a clinician might import patient data to identify several potential lesion points in the patient's connectome. A modeler, working on the same project, tests these points for viability through whole brain simulation, based on the patient's connectome, and subsequent analysis of dynamical features. TVB also drives research forward: the simulator itself represents the culmination of several simulation frameworks in the modeling literature. The availability of the numerical methods, set of neural mass models and forward solutions allows for the construction of a wide range of brain-scale simulation scenarios. This paper briefly outlines the history and motivation for TVB, describing the framework and simulator, giving usage examples in the web UI and Python scripting.

  9. A Python tool to set up relative free energy calculations in GROMACS.

    Science.gov (United States)

    Klimovich, Pavel V; Mobley, David L

    2015-11-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen a tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of the present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper (LOMAP; Liu et al. in J Comput Aided Mol Des 27(9):755-770, 2013), recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge (Mobley et al. in J Comput Aided Mol Des 28(4):135-150, 2014). Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations.
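
    Tools of this kind ultimately emit per-window simulation input files. The helper below sketches how a minimal GROMACS .mdp fragment for one alchemical window could be generated; the lambda schedule and parameter values are illustrative placeholders, and this function is not part of alchemical-setup.py itself.

```python
def write_fep_mdp(lambda_values, state_index, nsteps=50000):
    """Render a minimal GROMACS .mdp fragment for one alchemical window.
    lambda_values is the full coupling schedule (0.0 -> 1.0); state_index
    selects which window this input file simulates. Illustrative only."""
    if not 0 <= state_index < len(lambda_values):
        raise ValueError("state_index outside the lambda schedule")
    lines = [
        "integrator        = sd",
        f"nsteps            = {nsteps}",
        "free-energy       = yes",
        f"init-lambda-state = {state_index}",
        "fep-lambdas       = " + " ".join(f"{lam:.2f}" for lam in lambda_values),
        "nstdhdl           = 100",  # how often dH/dlambda is written for analysis
    ]
    return "\n".join(lines) + "\n"
```

    One such file per lambda state, plus the matching topology and coordinates, is essentially what an automated setup tool has to produce for every edge in the LOMAP graph.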

  10. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well, pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
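
    The simple mixing models mentioned above reduce to a flow-weighted average of the concentrations entering the wellbore from each screened zone. A minimal sketch follows (illustrative only, not AnalyzeHOLE's actual code):

```python
def wellbore_mixture(zone_flows, zone_conc):
    """Flow-weighted mixing of water quality across screened zones:
    the concentration pumped from the well is sum(q_i * c_i) / sum(q_i)."""
    if len(zone_flows) != len(zone_conc):
        raise ValueError("one concentration per flow zone")
    q_total = sum(zone_flows)
    if q_total <= 0:
        raise ValueError("well must produce net inflow")
    return sum(q * c for q, c in zip(zone_flows, zone_conc)) / q_total
```

    The zone inflows q_i would come from the calibrated axisymmetric flow model, so the mixed concentration can be compared directly against water-quality samples from the pumping well.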

  11. Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools

    Science.gov (United States)

    Tonini, F.; Liu, J.

    2016-12-01

    Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.

  12. TACT: A Set of MSC/PATRAN- and MSC/NASTRAN- based Modal Correlation Tools

    Science.gov (United States)

    Marlowe, Jill M.; Dixon, Genevieve D.

    1998-01-01

    This paper describes the functionality and demonstrates the utility of the Test Analysis Correlation Tools (TACT), a suite of MSC/PATRAN Command Language (PCL) tools which automate the process of correlating finite element models to modal survey test data. The initial release of TACT provides a basic yet complete set of tools for performing correlation totally inside the PATRAN/NASTRAN environment. Features include a step-by-step menu structure, pre-test accelerometer set evaluation and selection, analysis and test result export/import in Universal File Format, calculation of frequency percent difference and cross-orthogonality correlation results using NASTRAN, creation and manipulation of mode pairs, and five different ways of viewing synchronized animations of analysis and test modal results. For the PATRAN-based analyst, TACT eliminates the repetitive, time-consuming and error-prone steps associated with transferring finite element data to a third-party modal correlation package, which allows the analyst to spend more time on the more challenging task of model updating. The usefulness of this software is presented using a case history, the correlation for a NASA Langley Research Center (LaRC) low aspect ratio research wind tunnel model. To demonstrate the improvements that TACT offers the MSC/PATRAN- and MSC/NASTRAN-based structural analysis community, a comparison of the modal correlation process using TACT within PATRAN versus external third-party modal correlation packages is presented.
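
    The two correlation measures named above are straightforward to compute. A pure-Python sketch using list-of-rows matrices is given below; it illustrates the standard formulas, not the TACT PCL implementation:

```python
def freq_percent_diff(f_analysis, f_test):
    """Frequency percent difference, mode by mode: 100 * (fa - ft) / ft."""
    return [100.0 * (fa - ft) / ft for fa, ft in zip(f_analysis, f_test)]

def cross_orthogonality(phi_a, mass, phi_t):
    """Cross-orthogonality XOR = phi_a^T * M * phi_t for mode-shape matrices
    stored DOFs x modes. Values near 1 on the diagonal and near 0 elsewhere
    indicate well-paired, mass-orthogonal analysis/test modes."""
    def matmul(A, B):
        return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
                for row in A]
    phi_a_T = [list(col) for col in zip(*phi_a)]
    return matmul(matmul(phi_a_T, mass), phi_t)
```

    In practice the mass matrix comes from the NASTRAN model reduced to the accelerometer degrees of freedom, which is why the abstract notes that NASTRAN itself performs this calculation.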

  13. Application of fuzzy set theory for integral assessment of agricultural products quality

    Science.gov (United States)

    Derkanosova, N. M.; Ponomareva, I. N.; Shurshikova, G. V.; Vasilenko, O. A.

    2018-05-01

    The methodology of integrated assessment of quality and safety of agricultural products was developed and tested using wheat grain indicators relevant to the consumer properties of bakery products. Determining the quality level of raw ingredients allows agricultural raw materials to be used directly for food production, taking into account the technology employed and product types, and thus supports rational use of the agricultural sector's resource potential. The mathematical tool of the proposed method is fuzzy set theory. A fuzzy classifier for evaluating grain properties is formed. A set of six indicators normalized by the national standard is determined; their values are ordered and represented by linguistic variables with a trapezoidal membership function, and the rules for calculating the membership functions are presented. Specific criteria values for individual indicators in shaping the quality of the finished products are considered. For one sample of wheat grain, the values of the membership functions of the linguistic variable "level" for all indicators and of the linguistic variable "level of quality" were calculated. It is established that the studied sample of grain attains quality level 2 (average). Accordingly, it can be recommended for the production of bakery products with higher requirements for structural-mechanical properties: bakery and puff pastry products, hearth bread, and flour confectionery products of the hard-dough cookie and cracker group.
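
    The trapezoidal membership function at the heart of such a fuzzy classifier can be sketched in a few lines; the break-point values used below are illustrative, not those of the wheat-grain classifier:

```python
def trapmf(x, a, b, c, d):
    """Trapezoidal membership function: 0 outside (a, d), 1 on [b, c],
    linear ramps on (a, b) and (c, d). Requires a < b <= c < d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)  # rising edge
    return (d - x) / (d - c)      # falling edge
```

    A fuzzy classifier evaluates one such function per linguistic term of each indicator and then aggregates the memberships into a quality-level score.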

  14. Developing health science students into integrated health professionals: a practical tool for learning

    Directory of Open Access Journals (Sweden)

    Duncan Madeleine

    2007-11-01

    Background: An integrated sense of professionalism enables health professionals to draw on relevant knowledge in context and to apply a set of professional responsibilities and ethical principles in the midst of changing work environments [1,2]. Inculcating professionalism is therefore a critical goal of health professional education. Two multi-professional courses for first year Health Science students at the University of Cape Town, South Africa aim to lay the foundation for becoming an integrated health professional [3]. In these courses a diagram depicting the domains of the integrated health professional is used to focus the content of small group experiential exercises towards an appreciation of professionalism. The diagram serves as an organising framework for conceptualising an emerging professional identity and for directing learning towards the domains of 'self as professional' [4,5]. Objective: This paper describes how a diagrammatic representation of the core elements of an integrated health professional is used as a template for framing course content and for organising student learning. Based on the assumption that all health care professionals should be knowledgeable, empathic and reflective, the diagram provides students and educators with a visual tool for investigating the subjective and objective dimensions of professionalism. The use of the diagram as an integrating point of reference for individual and small group learning is described and substantiated with relevant literature. Conclusion: The authors have applied the diagram with positive impact for the past six years, with students and educators reporting that "it just makes sense". The article includes plans for formal evaluation. Evaluation to date is based on preliminary, informal feedback on the value of the diagram as a tool for capturing the domains of professionalism at an early stage in the undergraduate education of health professional students.

  15. Rapid HIS, RIS, PACS Integration Using Graphical CASE Tools

    Science.gov (United States)

    Taira, Ricky K.; Breant, Claudine M.; Stepczyk, Frank M.; Kho, Hwa T.; Valentino, Daniel J.; Tashima, Gregory H.; Materna, Anthony T.

    1994-05-01

    We describe the clinical requirements of the integrated federation of databases and present our client-mediator-server design. The main body of the paper describes five important aspects of integrating information systems: (1) global schema design, (2) establishing sessions with remote database servers, (3) development of schema translators, (4) integration of global system triggers, and (5) development of job workflow scripts.

  16. Freiburg RNA Tools: a web server integrating INTARNA, EXPARNA and LOCARNA.

    Science.gov (United States)

    Smith, Cameron; Heyne, Steffen; Richter, Andreas S; Will, Sebastian; Backofen, Rolf

    2010-07-01

    The Freiburg RNA tools web server integrates three tools for the advanced analysis of RNA in a common web-based user interface. The tools IntaRNA, ExpaRNA and LocARNA support the prediction of RNA-RNA interaction, exact RNA matching and alignment of RNA, respectively. The Freiburg RNA tools web server and the software packages of the stand-alone tools are freely accessible at http://rna.informatik.uni-freiburg.de.

  17. Integrating best evidence into patient care: a process facilitated by a seamless integration with informatics tools.

    Science.gov (United States)

    Giuse, Nunzia B; Williams, Annette M; Giuse, Dario A

    2010-07-01

    The Vanderbilt University paper discusses how the Eskind Biomedical Library at Vanderbilt University Medical Center transitioned from a simplistic approach that linked resources to the institutional electronic medical record system, StarPanel, to a value-added service designed to deliver highly relevant information. Clinical teams formulate complex patient-specific questions via an evidence-based medicine literature request basket linked to individual patient records. The paper then discusses how the StarPanel approach acted as a springboard for two additional projects that use highly trained knowledge management librarians with informatics expertise to integrate evidence into both order sets and a patient portal, MyHealth@Vanderbilt.

  18. Instructor's Perceptions towards the Use of an Online Instructional Tool in an Academic English Setting in Kuwait

    Science.gov (United States)

    Erguvan, Deniz

    2014-01-01

    This study sets out to explore the faculty members' perceptions of a specific web-based instruction tool (Achieve3000) in a private higher education institute in Kuwait. The online tool provides highly differentiated instruction, which is initiated with a level set at the beginning of the term. The program is used in two consecutive courses as…

  19. Modeling and evaluation of the influence of micro-EDM sparking state settings on the tool electrode wear behavior

    DEFF Research Database (Denmark)

    Puthumana, Govindan

    2017-01-01

    materials characterized by considerable wear of the tool used for material removal. This paper presents an investigation involving modeling and estimation of the effect of settings for generation of discharges in stable conditions of micro-EDM on the phenomenon of tool electrode wear. A stable sparking ... a condition for the minimum tool wear for this micro-EDM process configuration.

  20. Engineering a mobile health tool for resource-poor settings to assess and manage cardiovascular disease risk: SMARThealth study.

    Science.gov (United States)

    Raghu, Arvind; Praveen, Devarsetty; Peiris, David; Tarassenko, Lionel; Clifford, Gari

    2015-04-29

    The incidence of chronic diseases in low- and middle-income countries is rapidly increasing both in urban and rural regions. A major challenge for health systems globally is to develop innovative solutions for the prevention and control of these diseases. This paper discusses the development and pilot testing of SMARTHealth, a mobile-based, point-of-care Clinical Decision Support (CDS) tool to assess and manage cardiovascular disease (CVD) risk in resource-constrained settings. Through pilot testing, the preliminary acceptability, utility, and efficiency of the CDS tool was obtained. The CDS tool was part of an mHealth system comprising a mobile application that consisted of an evidence-based risk prediction and management algorithm, and a server-side electronic medical record system. Through an agile development process and user-centred design approach, key features of the mobile application that fitted the requirements of the end users and environment were obtained. A comprehensive analytics framework facilitated a data-driven approach to investigate four areas, namely, system efficiency, end-user variability, manual data entry errors, and usefulness of point-of-care management recommendations to the healthcare worker. A four-point Likert scale was used at the end of every risk assessment to gauge ease-of-use of the system. The system was field-tested with eleven village healthcare workers and three Primary Health Centre doctors, who screened a total of 292 adults aged 40 years and above. 34% of participants screened by health workers were identified by the CDS tool to be high CVD risk and referred to a doctor. In-depth analysis of user interactions found the CDS tool feasible for use and easily integrable into the workflow of healthcare workers. Following completion of the pilot, further technical enhancements were implemented to improve uptake of the mHealth platform. It will then be evaluated for effectiveness and cost-effectiveness in a cluster randomized
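
    The core of such a point-of-care CDS rule is a threshold decision that combines absolute risk with clinical measurements. A deliberately simplified sketch follows; the threshold values are illustrative placeholders, not the guideline values encoded in SMARThealth:

```python
def cds_referral(risk_percent, systolic_bp, risk_threshold=30.0, sbp_limit=160):
    """Toy point-of-care referral rule: refer when the predicted CVD risk or
    the measured systolic blood pressure exceeds its threshold. Thresholds
    here are hypothetical examples, not the study's actual cut-offs."""
    if risk_percent >= risk_threshold or systolic_bp >= sbp_limit:
        return "refer to doctor"
    return "lifestyle advice and routine follow-up"
```

    In the field test described above, a rule of this shape flagged 34% of screened adults as high CVD risk for referral to a Primary Health Centre doctor.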

  1. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    … but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models … of formerly disconnected tools could improve tool usability as well as decision maker productivity.

  2. A Nonparametric, Multiple Imputation-Based Method for the Retrospective Integration of Data Sets

    Science.gov (United States)

    Carrig, Madeline M.; Manrique-Vallier, Daniel; Ranby, Krista W.; Reiter, Jerome P.; Hoyle, Rick H.

    2015-01-01

    Complex research questions often cannot be addressed adequately with a single data set. One sensible alternative to the high cost and effort associated with the creation of large new data sets is to combine existing data sets containing variables related to the constructs of interest. The goal of the present research was to develop a flexible, broadly applicable approach to the integration of disparate data sets that is based on nonparametric multiple imputation and the collection of data from a convenient, de novo calibration sample. We demonstrate proof of concept for the approach by integrating three existing data sets containing items related to the extent of problematic alcohol use and associations with deviant peers. We discuss both necessary conditions for the approach to work well and potential strengths and weaknesses of the method compared to other data set integration approaches. PMID:26257437

  3. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases. .
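
    Among the analysis functions listed above, over-representation analysis is the most self-contained to illustrate. The core computation in such tools is typically a hypergeometric upper-tail test of whether a network module contains more genes from a given pathway than expected by chance. The sketch below is a generic illustration of that test, not MONGKIE's actual implementation:

    ```python
    from math import comb

    def ora_pvalue(N, K, n, k):
        """Hypergeometric upper-tail P(X >= k): the probability that a random
        module of n genes (drawn from N total) contains at least k of the
        K genes annotated to the pathway."""
        return sum(comb(K, i) * comb(N - K, n - i)
                   for i in range(k, min(K, n) + 1)) / comb(N, n)

    # Toy numbers: 20 genes in the network, 5 annotated to the pathway,
    # and a clustered module of 6 genes containing 3 pathway genes.
    p = ora_pvalue(N=20, K=5, n=6, k=3)
    ```

    In practice the p-values would be corrected for multiple testing across all modules and pathways.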

  4. Peac – A set of tools to quickly enable Proof on a cluster

    International Nuclear Information System (INIS)

    Ganis, G; Vala, M

    2012-01-01

    With the advent of the analysis phase of LHC data processing, interest in PROOF technology has considerably increased. While setting up a simple PROOF cluster for basic usage is reasonably straightforward, exploiting the several new functionalities added in recent times may be complicated. PEAC, standing for PROOF-Enabled Analysis Cluster, is a set of tools aiming to facilitate the setup and management of a PROOF cluster. PEAC is based on the experience gained by setting up PROOF for the ALICE analysis facilities. It allows one to easily build and configure ROOT and the additional software needed on the cluster, and may serve as a distributor of binaries via XRootD. PEAC uses PROOF-on-Demand (PoD) for resource management (start and stop of daemons). Finally, PEAC sets up and configures dataset management (using the afdsmgrd daemon), as well as cluster monitoring (machine status and PROOF query summaries) using MonALISA. In this respect, a MonALISA page has been dedicated to PEAC users, so that a cluster managed by PEAC can be automatically monitored. In this paper we present and describe the status and main components of PEAC and show details about its usage.

  5. An integrated environment for developing object-oriented CAE tools

    Energy Technology Data Exchange (ETDEWEB)

    Hofmann, P.; Ryba, M.; Baitinger, U.G. [Integrated System Engeneering, Stuttgart (Germany)

    1996-12-31

    This paper presents how object-oriented techniques can be applied to improve the development of CAE tools. For the design of modular and reusable software systems we use predefined and well-tested building blocks. These building blocks are reusable software components based on object-oriented technology, which allows the assembling of software systems. Today's CAE tools are typically very complex and computationally intensive. Therefore we need a concept that joins the advantages of the object-oriented paradigm with the advantages of parallel and distributed programming. We present such a design environment for the development of concurrent object-oriented CAE tools, called CoDO.

  6. Teaching Students How to Integrate and Assess Social Networking Tools in Marketing Communications

    Science.gov (United States)

    Schlee, Regina Pefanis; Harich, Katrin R.

    2013-01-01

    This research is based on two studies that focus on teaching students how to integrate and assess social networking tools in marketing communications. Study 1 examines how students in marketing classes utilize social networking tools and explores their attitudes regarding the use of such tools for marketing communications. Study 2 focuses on an…

  7. A review of computer tools for analysing the integration of renewable energy into various energy systems

    DEFF Research Database (Denmark)

    Connolly, D.; Lund, Henrik; Mathiesen, Brian Vad

    2010-01-01

    This paper includes a review of the different computer tools that can be used to analyse the integration of renewable energy. Initially 68 tools were considered, but 37 were included in the final analysis, which was carried out in collaboration with the tool developers or recommended points of contact. The results in this paper provide the information necessary to identify a suitable energy tool for analysing the integration of renewable energy into various energy-systems under different objectives. It is evident from this paper that there is no energy tool that addresses all issues related to integrating renewable energy; instead the ‘ideal’ energy tool is highly dependent on the specific objectives that must be fulfilled. The typical applications for the 37 tools reviewed (from analysing single-building systems to national energy-systems), combined with numerous other factors…

  8. Integrating Technology Tools for Students Struggling with Written Language

    Science.gov (United States)

    Fedora, Pledger

    2015-01-01

    This exploratory study was designed to assess the experience of preservice teachers when integrating written language technology and their likelihood of applying that technology in their future classrooms. Results suggest that after experiencing technology integration, preservice teachers are more likely to use it in their future teaching.

  9. Evaluating quality of patient care communication in integrated care settings: a mixed methods approach

    NARCIS (Netherlands)

    Gulmans, J.; Gulmans, J.; Vollenbroek-Hutten, Miriam Marie Rosé; van Gemert-Pijnen, Julia E.W.C.; van Harten, Willem H.

    2007-01-01

    Background. Owing to the involvement of multiple professionals from various institutions, integrated care settings are prone to suboptimal patient care communication. To assure continuity, communication gaps should be identified for targeted improvement initiatives. However, available assessment

  10. Parallel analysis tools and new visualization techniques for ultra-large climate data sets

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, Don [National Center for Atmospheric Research, Boulder, CO (United States); Haley, Mary [National Center for Atmospheric Research, Boulder, CO (United States)

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  11. Implementation of a competency assessment tool for agency nurses working in an acute paediatric setting.

    Science.gov (United States)

    Hennerby, Cathy; Joyce, Pauline

    2011-03-01

    This paper reports on the implementation of a competency assessment tool for registered general agency nurses working in an acute paediatric setting, using a change management framework. The increased number of registered general agency nurses working in an acute children's hospital raised concerns about their competency in working with children. These concerns were initially raised via informal complaints about 'near misses', parental dissatisfaction, perceived competency weaknesses and rising cost associated with their use. Young's (2009, Journal of Organisational Change, 22, 524-548) nine-stage change framework was used to guide the implementation of the competency assessment tool within a paediatric acute care setting. The ongoing success of the initiative, from a nurse manager's perspective, relies on structured communication with the agency provider before employing competent agency nurses. Sustainability of the change will depend on nurse managers' persistence in attending to the concerns of those resisting the change while simultaneously supporting those championing the change. These key communication and supporting roles highlight the pivotal role held by nurse managers, as gatekeepers, in safeguarding children while in hospital. Leadership qualities of nurse managers will also be challenged in continuing to manage and drive the change where resistance might prevail. © 2011 The Authors. Journal compilation © 2011 Blackwell Publishing Ltd.

  12. Data integration through brain atlasing: Human Brain Project tools and strategies.

    Science.gov (United States)

    Bjerke, Ingvild E; Øvsthus, Martin; Papp, Eszter A; Yates, Sharon C; Silvestri, Ludovico; Fiorilli, Julien; Pennartz, Cyriel M A; Pavone, Francesco S; Puchades, Maja A; Leergaard, Trygve B; Bjaalie, Jan G

    2018-04-01

    The Human Brain Project (HBP), an EU Flagship Initiative, is currently building an infrastructure that will allow integration of large amounts of heterogeneous neuroscience data. The ultimate goal of the project is to develop a unified multi-level understanding of the brain and its diseases, and beyond this to emulate the computational capabilities of the brain. Reference atlases of the brain are one of the key components in this infrastructure. Based on a new generation of three-dimensional (3D) reference atlases, new solutions for analyzing and integrating brain data are being developed. HBP will build services for spatial query and analysis of brain data comparable to current online services for geospatial data. The services will provide interactive access to a wide range of data types that have information about anatomical location tied to them. The 3D volumetric nature of the brain, however, introduces a new level of complexity that requires a range of tools for making use of and interacting with the atlases. With such new tools, neuroscience research groups will be able to connect their data to atlas space, share their data through online data systems, and search and find other relevant data through the same systems. This new approach partly replaces earlier attempts to organize research data based only on a set of semantic terminologies describing the brain and its subdivisions. Copyright © 2018 The Authors. Published by Elsevier Masson SAS. All rights reserved.

  13. Innovative R.E.A. tools for integrated bathymetric survey

    Science.gov (United States)

    Demarte, Maurizio; Ivaldi, Roberta; Sinapi, Luigi; Bruzzone, Gabriele; Caccia, Massimo; Odetti, Angelo; Fontanelli, Giacomo; Masini, Andrea; Simeone, Emilio

    2017-04-01

    The REA (Rapid Environmental Assessment) concept is a methodology for acquiring environmental information, processing it, and returning it in standard paper-chart or standard digital format. Acquired data thus become available for ingestion and valorization by the Civilian Protection Emergency Organization or the Rapid Response Forces. The use of Remotely Piloted Aircraft Systems (RPAS) with miniaturized multispectral or hyperspectral cameras gives the operator the capability to react in a short time, jointly with the capacity to collect a large amount of different data and to deliver a very large number of products. The proposed methodology incorporates data collected from remote and autonomous sensors that acquire data over areas in a cost-effective manner. The hyperspectral sensors are able to map seafloor morphology, seabed structure, depth of the bottom surface, and an estimate of sediment development. The relevant spectral portions are selected using an appropriate configuration of hyperspectral cameras to maximize the spectral resolution. Data acquired by the hyperspectral camera are geo-referenced synchronously with an Attitude and Heading Reference System (AHRS) sensor. The data can be subjected to first-step on-board processing on the unmanned vehicle before being transferred through the Ground Control Station (GCS) to a Processing Exploitation Dissemination (PED) system. The recent introduction of Data Distribution System (DDS) capabilities in PED allows a cooperative distributed approach to modern decision making. Two platforms are used in our project, a Remotely Piloted Aircraft System (RPAS) and an Unmanned Surface Vehicle (USV). The two platforms mutually interact to cover a surveyed area wider than the ones that could be covered by the single vehicles. The USV, especially designed to work in very shallow water, has a modular structure and an open hardware and software architecture allowing for an easy installation and integration of various

  14. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis.

    Science.gov (United States)

    Lal, Aparna

    2016-02-02

    Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change.

  15. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis

    Directory of Open Access Journals (Sweden)

    Aparna Lal

    2016-02-01

    Full Text Available Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change.

  16. Improving beam set-up using an online beam optics tool

    International Nuclear Information System (INIS)

    Richter, S.; Barth, W.; Franczak, B.; Scheeler, U.; Wilms, D.

    2004-01-01

    The GSI accelerator facility [1] consists of the Universal Linear Accelerator (Unilac), the heavy-ion synchrotron SIS, and the Experimental Storage Ring (ESR). Two Unilac injectors with three ion-source terminals provide ion species from the lightest, such as hydrogen, up to uranium. The High Current Injector (HSI) for low-charge-state ion beams provides mostly highly intense but short pulses, whereas the High Charge State Injector (HLI) supplies long pulses with a high duty factor of up to 27%. Before entering the Alvarez section of the Unilac, the ion beam from the HSI is stripped in a supersonic gas jet. Up to three different ion species can be accelerated for up to five experiments in a time-sharing mode. Frequent changes of beam energy and intensity during a single beam-time period may result in time-consuming set-up and tuning, especially of the beam transport lines. To shorten these changeover times, an online optics tool (MIRKO EXPERT) has been developed. Based on online emittance measurements at well-defined locations, the beam envelopes are calculated using the actual magnet settings. With this input, improved calculated magnet settings can be sent directly to the magnet power supplies. The program reads profile-grid measurements, so that an automated beam alignment is established and steering times are minimized. Experience with this tool will be reported. At the Unilac a special focus is put on high-current operation with short but intense beam pulses. Limitations like missing non-destructive beam diagnostics, insufficient longitudinal beam diagnostics, insufficient longitudinal beam matching, and the influence of the hard-edge model for magnetic fields will be discussed. Special attention will be paid to the limits due to high-current effects with bunched beams. (author)

  17. Data, models, and views: towards integration of diverse numerical model components and data sets for scientific and public dissemination

    Science.gov (United States)

    Hofmeister, Richard; Lemmen, Carsten; Nasermoaddeli, Hassan; Klingbeil, Knut; Wirtz, Kai

    2015-04-01

    Data and models for describing coastal systems span a diversity of disciplines, communities, ecosystems, regions and techniques. Previous attempts at unifying data exchange, coupling interfaces, or metadata information have not been successful. We introduce the new Modular System for Shelves and Coasts (MOSSCO, http://www.mossco.de), a novel coupling framework that enables the integration of a diverse array of models and data from different disciplines relating to coastal research. In the MOSSCO concept, the integrating framework imposes very few restrictions on contributed data or models; in fact, there is no distinction made between data and models. The few requirements are: (1) principle coupleability, i.e. access to I/O and timing information in submodels, which has recently been referred to as the Basic Model Interface (BMI); (2) open-source/open-data access and licensing; and (3) communication of metadata, such as spatiotemporal information, naming conventions, and physical units. These requirements suffice to integrate different models and data sets into the MOSSCO infrastructure and subsequently build a modular integrated modeling tool that can span a diversity of processes and domains. We demonstrate how diverse coastal system constituents were integrated into this modular framework and how we deal with the diverging development of constituent data sets and models at external institutions. Finally, we show results from simulations with the fully coupled system using OGC WebServices in the WiMo geoportal (http://kofserver3.hzg.de/wimo), from where stakeholders can view the simulation results for further dissemination.
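
    The "principle coupleability" requirement above (access to I/O and timing information in submodels) can be sketched minimally. The class below is a toy exponential-decay "submodel" whose method names follow the general initialize/update/finalize-plus-accessors convention of BMI; the model itself and its variable name are invented for illustration and are not MOSSCO's actual API:

    ```python
    class BmiDecayModel:
        """Toy submodel exposing a BMI-style interface that a coupling
        framework could drive without knowing the model internals."""

        def initialize(self, config):
            self.time = 0.0
            self.dt = config.get("dt", 1.0)          # time step
            self.rate = config.get("rate", 0.1)      # decay rate per step
            self.state = config.get("initial_value", 100.0)

        def update(self):
            # advance the model by one internal time step
            self.state -= self.rate * self.state * self.dt
            self.time += self.dt

        def finalize(self):
            self.state = None

        # timing and I/O accessors: what the coupler actually relies on
        def get_current_time(self):
            return self.time

        def get_value(self, name):
            assert name == "concentration"  # this toy exposes one variable
            return self.state

    # A coupler-style driver loop: step the submodel to t = 10
    m = BmiDecayModel()
    m.initialize({"dt": 1.0, "rate": 0.1, "initial_value": 100.0})
    while m.get_current_time() < 10.0:
        m.update()
    ```

    The point of the convention is that the driver loop above works identically for any submodel implementing the same small interface.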

  18. Metadata and Tools for Integration and Preservation of Cultural Heritage 3D Information

    Directory of Open Access Journals (Sweden)

    Achille Felicetti

    2011-12-01

    Full Text Available In this paper we investigate many of the storage, portability and interoperability issues arising among archaeologists and cultural heritage professionals when dealing with 3D technologies. On the one side, the available digital repositories often look unable to guarantee adequate features for the management of 3D models and their metadata; on the other side, the nature of most of the available data formats for 3D encoding seems unsatisfactory for the portability required nowadays of 3D information across different systems. We propose a set of possible solutions to show how integration can be achieved through the use of well-known and widely accepted standards for data encoding and data storage. Using a set of 3D models acquired during various archaeological campaigns and a number of open-source tools, we have implemented a straightforward encoding process to generate meaningful semantic data and metadata. We also present the interoperability process carried out to integrate the encoded 3D models and the geographic features produced by the archaeologists. Finally we report the preliminary (rather encouraging) development of a semantic-enabled and persistent digital repository, where 3D models (but also any kind of digital data) and metadata can easily be stored, retrieved and shared with the content of other digital archives.

  19. Development of tools for integrated monitoring and assessment of hazardous substances and their biological effects in the Baltic Sea.

    Science.gov (United States)

    Lehtonen, Kari K; Sundelin, Brita; Lang, Thomas; Strand, Jakob

    2014-02-01

    The need to develop biological effects monitoring to facilitate a reliable assessment of hazardous substances has been emphasized in the Baltic Sea Action Plan of the Helsinki Commission. An integrated chemical-biological approach is vitally important for the understanding and proper assessment of anthropogenic pressures and their effects on the Baltic Sea. Such an approach is also necessary for prudent management aiming at safeguarding the sustainable use of ecosystem goods and services. The BEAST project (Biological Effects of Anthropogenic Chemical Stress: Tools for the Assessment of Ecosystem Health) set out to address this topic within the BONUS Programme. BEAST generated a large amount of quality-assured data on several biological effects parameters (biomarkers) in various marine species in different sub-regions of the Baltic Sea. New indicators (biological response measurement methods) and management tools (integrated indices) with regard to the integrated monitoring approach were suggested.

  20. Interaction Between the Environment and Animals in Urban Settings: Integrated and Participatory Planning

    Science.gov (United States)

    Tarsitano, Elvira

    2006-11-01

    In urban ecosystems, the ecological system has become completely unbalanced; this, in turn, has led to an increase in well-known problems such as air pollution, ground pollution, and water pollution. This imbalance has also led to the growth and spread of pathogens harmful to man, animals, and plants. Urban sustainability indicators, both global and local, also “indicate” the percentage of population, but these refer only to the human population, not the animal population. Cities need good waste, water, and air management, effective traffic planning, and good zoning of businesses, crafts, and services; over and above these activities, cities also need planning that takes into account the existence of pets (dogs, cats, etc.) and nonpet animals (insects, birds, mice, etc.). Cities tend to be designed around humans and “on a human scale,” without taking into account the fact that a huge animal population is living side by side with people. That explains why overcrowding tends to go hand in hand with urbanization; all these populations, including humans, need to adapt to new spaces and often need to drastically change their behavior. This is a fact that must be included when drafting sustainable city plans. The proposed strategy is “integrated-participatory” control of the interactions between the environment and animals in the cities. The strategy will focus on the development of integrated approaches and tools for environment and animal management in the context of urban settings. This will require such specific methods as ecological balance sheets and ecoplans for the planning, management, and control of the interrelation among environment, animals, and public health. The objective is to develop a better understanding of urban biodiversity and of urban ecosystem functioning, in order to understand and minimize the negative impacts of human activities on them. The research will focus on assessing and forecasting changes in urban biodiversity

  1. Effect of different machining processes on the tool surface integrity and fatigue life

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Chuan Liang [College of Mechanical and Electrical Engineering, Nanchang University, Nanchang (China); Zhang, Xianglin [School of Materials Science and Engineering, Huazhong University of Science and Technology, Wuhan (China)

    2016-08-15

    Ultra-precision grinding, wire-cut electro-discharge machining, and lapping are often used to machine tools in the fine-blanking industry, and the surface integrity resulting from these machining processes causes great concern in the research field. To study the effect of the machined surface integrity on fine-blanking tool life, the surface integrity of different tool materials under different processing conditions and its influence on fatigue life were thoroughly analyzed in the present study. The results show that the surface integrity of different materials was quite different under the same processing condition. For the same tool material, the surface integrity under varying processing conditions was also quite different and deeply influenced the fatigue life.

  2. On Models with Uncountable Set of Spin Values on a Cayley Tree: Integral Equations

    International Nuclear Information System (INIS)

    Rozikov, Utkir A.; Eshkobilov, Yusup Kh.

    2010-01-01

    We consider models with nearest-neighbor interactions and with the set [0, 1] of spin values, on a Cayley tree of order k ≥ 1. We reduce the problem of describing the 'splitting Gibbs measures' of the model to the description of the solutions of some nonlinear integral equation. For k = 1 we show that the integral equation has a unique solution. In the case k ≥ 2, some models (with the set [0, 1] of spin values) which have a unique splitting Gibbs measure are constructed. Also, for the Potts model with an uncountable set of spin values it is proven that there is a unique splitting Gibbs measure.
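
    Schematically, and only as an illustration (the exact kernel depends on the Hamiltonian of the model), the reduction described above leads to a fixed-point equation of Hammerstein type for a density f ≥ 0 on [0, 1]:

    ```latex
    f(t) \;=\; \frac{\left( \int_0^1 K(t,u)\, f(u)\, \mathrm{d}u \right)^{k}}
                    {\int_0^1 \left( \int_0^1 K(s,u)\, f(u)\, \mathrm{d}u \right)^{k} \mathrm{d}s},
    \qquad t \in [0,1],
    ```

    where the kernel K(t, u) encodes the nearest-neighbor interaction (for example, K(t, u) = exp(Jβ ξ(t, u)) for a coupling function ξ). For k = 1 the equation is linear up to normalization, which is consistent with the uniqueness result stated above; for k ≥ 2 the power nonlinearity can admit multiple fixed points, i.e. multiple splitting Gibbs measures.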

  3. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    Science.gov (United States)

    1989-09-01

    An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System. Thesis by Caroline L. Hanson, Major, USAF. Report AFIT/GCA/LSQ/89S-5, Ohio. Department of Defense.

  4. Weight Estimation Tool for Children Aged 6 to 59 Months in Limited-Resource Settings.

    Science.gov (United States)

    Ralston, Mark E; Myatt, Mark A

    2016-01-01

    A simple, reliable anthropometric tool for rapid estimation of weight in children would be useful in limited-resource settings, where current weight estimation tools are not uniformly reliable, nearly all global under-five mortality occurs, severe acute malnutrition is a significant contributor in approximately one-third of under-five mortality, and a weight scale may not be immediately available in emergencies to first-response providers. The aim was to determine the accuracy and precision of mid-upper arm circumference (MUAC) and height as weight estimation tools in children under five years of age in low- to middle-income countries. This was a retrospective observational study. Data were collected in 560 nutritional surveys during 1992-2006 using a modified Expanded Program of Immunization two-stage cluster sample design, in locations with a high prevalence of acute and chronic malnutrition. A total of 453,990 children met inclusion criteria (age 6-59 months; weight ≤ 25 kg; MUAC 80-200 mm) and exclusion criteria (bilateral pitting edema; biologically implausible weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ) values). Weight was estimated using the Broselow Tape, the Hong Kong formula, and, from the database, MUAC alone, height alone, and height and MUAC combined. Mean percentage difference between true and estimated weight, proportion of estimates accurate to within ± 25% and ± 10% of true weight, weighted Kappa statistic, and Bland-Altman bias were reported as measures of tool accuracy. Standard deviation of mean percentage difference and Bland-Altman 95% limits of agreement were reported as measures of tool precision. Database height was a more accurate and precise predictor of weight compared to Broselow Tape 2007 [B], Broselow Tape 2011 [A], and MUAC. Mean percentage difference between true and estimated weight was +0.49% (SD = 10.33%); proportion of estimates accurate to within ± 25% of true weight was 97.36% (95% CI 97.40%, 97.46%); and
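
    The accuracy and precision measures used in the study (mean percentage difference with its SD, and Bland-Altman bias with 95% limits of agreement) are straightforward to compute. A minimal sketch on invented toy weights, not the study's data:

    ```python
    from statistics import mean, stdev

    def weight_agreement(true_w, est_w):
        """Mean percentage difference (accuracy), its SD (precision), and
        Bland-Altman bias with 95% limits of agreement."""
        pct_diff = [100.0 * (e - t) / t for t, e in zip(true_w, est_w)]
        diffs = [e - t for t, e in zip(true_w, est_w)]
        bias = mean(diffs)
        sd = stdev(diffs)
        return {
            "mean_pct_diff": mean(pct_diff),
            "sd_pct_diff": stdev(pct_diff),
            "bias": bias,
            # 95% limits of agreement: bias +/- 1.96 SD of the differences
            "loa": (bias - 1.96 * sd, bias + 1.96 * sd),
        }

    # invented example (kg): true weights vs. estimates from some proxy
    true_w = [8.0, 10.0, 12.5, 15.0, 18.0]
    est_w = [8.4, 9.7, 12.9, 14.6, 18.5]
    stats = weight_agreement(true_w, est_w)
    ```

    With 453,990 children the study's confidence intervals around these quantities become very narrow, which is why even small mean percentage differences are interpretable.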

  5. An Integrated Simulation Tool for Modeling the Human Circulatory System

    Science.gov (United States)

    Asami, Ken'ichi; Kitamura, Tadashi

    This paper presents an integrated simulation of the circulatory system in physiological movement. The large circulatory system model includes principal organs and functional units in modules in which comprehensive physiological changes such as nerve reflexes, temperature regulation, acid/base balance, O2/CO2 balance, and exercise are simulated. A beat-by-beat heart model, in which the corresponding electrical circuit problems are solved by a numerical analytic method, enables calculation of pulsatile blood flow to the major organs. The integration of different perspectives on physiological changes makes this simulation model applicable for the microscopic evaluation of blood flow under various conditions in the human body.

  6. Decoding the genome with an integrative analysis tool: combinatorial CRM Decoder.

    Science.gov (United States)

    Kang, Keunsoo; Kim, Joomyeong; Chung, Jae Hoon; Lee, Daeyoup

    2011-09-01

    The identification of genome-wide cis-regulatory modules (CRMs) and characterization of their associated epigenetic features are fundamental steps toward the understanding of gene regulatory networks. Although integrative analysis of available genome-wide information can provide new biological insights, the lack of novel methodologies has become a major bottleneck. Here, we present a comprehensive analysis tool called combinatorial CRM decoder (CCD), which utilizes the publicly available information to identify and characterize genome-wide CRMs in a species of interest. CCD first defines a set of the epigenetic features which is significantly associated with a set of known CRMs as a code called 'trace code', and subsequently uses the trace code to pinpoint putative CRMs throughout the genome. Using 61 genome-wide data sets obtained from 17 independent mouse studies, CCD successfully catalogued ∼12 600 CRMs (five distinct classes) including polycomb repressive complex 2 target sites as well as imprinting control regions. Interestingly, we discovered that ∼4% of the identified CRMs belong to at least two different classes named 'multi-functional CRM', suggesting their functional importance for regulating spatiotemporal gene expression. From these examples, we show that CCD can be applied to any potential genome-wide datasets and therefore will shed light on unveiling genome-wide CRMs in various species.
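    The 'trace code' idea can be sketched in a few lines; the feature names, the 2x enrichment threshold, and the scoring rule here are illustrative assumptions for this sketch, not CCD's published statistics:

```python
# Hypothetical sketch of a trace-code workflow: epigenetic features
# over-represented in known CRMs relative to background form a code, and
# candidate genomic windows are scored by how many code features they carry.

def derive_trace_code(known_crms, background, enrichment=2.0):
    """Features whose rate in known CRMs is >= `enrichment` x background."""
    code = set()
    for f in known_crms[0]:
        crm_rate = sum(r[f] for r in known_crms) / len(known_crms)
        bg_rate = sum(r[f] for r in background) / len(background)
        if crm_rate > 0 and (bg_rate == 0 or crm_rate / bg_rate >= enrichment):
            code.add(f)
    return code

def score_window(window, trace_code):
    """Fraction of trace-code features present in a candidate window."""
    return sum(window[f] for f in trace_code) / len(trace_code)

known = [{"H3K4me1": 1, "H3K27ac": 1, "input": 1},
         {"H3K4me1": 1, "H3K27ac": 1, "input": 0}]
background = [{"H3K4me1": 0, "H3K27ac": 1, "input": 1},
              {"H3K4me1": 0, "H3K27ac": 0, "input": 1}]
code = derive_trace_code(known, background)
```

    A window matching half of the derived code scores 0.5; a real pipeline would replace the rate ratio with a proper enrichment test and scan windows genome-wide.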

  7. Integrating Human Terrain reasoning and tooling in C2 systems

    NARCIS (Netherlands)

    Reus, N. de; Grand, N. le; Kwint, M.; Reniers, F.; Lieburg, A. van

    2010-01-01

    Within an operational staff the ‘core business’ of the Intelligence Cell is to initiate, collect, process, analyze and disseminate relevant information. This Intelligence Preparation of the Environment addresses the environmental evaluation, threat evaluation and results in an integrated overview of

  8. An Integrated Pest Management Tool for Evaluating Schools

    Science.gov (United States)

    Bennett, Blake; Hurley, Janet; Merchant, Mike

    2016-01-01

    Having the ability to assess pest problems in schools is essential for a successful integrated pest management (IPM) program. However, such expertise can be costly and is not available to all school districts across the United States. The web-based IPM Calculator was developed to address this problem. By answering questions about the condition of…

  9. Google Sets, Google Suggest, and Google Search History: Three More Tools for the Reference Librarian's Bag of Tricks

    OpenAIRE

    Cirasella, Jill

    2008-01-01

    This article examines the features, quirks, and uses of Google Sets, Google Suggest, and Google Search History and argues that these three lesser-known Google tools warrant inclusion in the resourceful reference librarian’s bag of tricks.

  10. Topology and boundary shape optimization as an integrated design tool

    Science.gov (United States)

    Bendsoe, Martin Philip; Rodrigues, Helder Carrico

    1990-01-01

    The optimal topology of a two dimensional linear elastic body can be computed by regarding the body as a domain of the plane with a high density of material. Such an optimal topology can then be used as the basis for a shape optimization method that computes the optimal form of the boundary curves of the body. This results in an efficient and reliable design tool, which can be implemented via a common FEM mesh generator and CAD-type input-output facilities.
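    In density-based form, the topology step described above is usually posed as a minimum-compliance problem; the following is a standard textbook statement, not necessarily the authors' exact formulation:

```latex
\min_{\rho}\; c(\rho) = \mathbf{f}^{\mathsf{T}}\mathbf{u}(\rho)
\quad\text{subject to}\quad
\mathbf{K}(\rho)\,\mathbf{u}(\rho) = \mathbf{f},\qquad
\int_{\Omega}\rho\,\mathrm{d}\Omega \le V_{\max},\qquad
0 < \rho_{\min} \le \rho \le 1,
```

    where $\rho$ is the material density over the design domain $\Omega$, $\mathbf{K}$ the stiffness matrix, and $\mathbf{f}$ the load vector; the optimized density field yields the topology that seeds the subsequent boundary shape optimization.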

  11. Learning Asset Technology Integration Support Tool Design Document

    Science.gov (United States)

    2010-05-11

    language known as Hypertext Preprocessor (PHP) and by MySQL – a relational database management system that can also be used for content management. It... Requirements: The LATIST tool will be implemented utilizing a WordPress platform with MySQL as the database. Also, the LATIST system must effectively work... MySQL. When designing the LATIST system there are several considerations which must be accounted for in the working prototype. These include: • DAU

  12. Approaches, tools and methods used for setting priorities in health research in the 21(st) century.

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-06-01

    Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (priorities were set. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure - such as the CHNRI method, James Lind Alliance Method and Combined Approach Matrix - it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more
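    As an illustration of how a structured method differs from an open consultation, a CHNRI-style priority score can be sketched roughly as follows; the criteria names, weights, and scores are illustrative assumptions, not the published protocol:

```python
# Rough sketch of CHNRI-style scoring: each expert scores a research option
# 0, 0.5 or 1 against several criteria; per-criterion means are combined
# (optionally weighted) into a single priority score used for ranking.
# Criteria, weights and scores below are invented for demonstration.

CRITERIA = ["answerability", "effectiveness", "deliverability", "equity", "impact"]

def priority_score(expert_scores, weights=None):
    """expert_scores: list of dicts mapping criterion -> 0, 0.5 or 1."""
    weights = weights or {c: 1.0 for c in CRITERIA}
    total_w = sum(weights[c] for c in CRITERIA)
    means = {c: sum(s[c] for s in expert_scores) / len(expert_scores)
             for c in CRITERIA}
    return sum(means[c] * weights[c] for c in CRITERIA) / total_w

scores = [
    {"answerability": 1, "effectiveness": 1, "deliverability": 0.5,
     "equity": 1, "impact": 0.5},
    {"answerability": 1, "effectiveness": 0.5, "deliverability": 0.5,
     "equity": 1, "impact": 1},
]
rps = priority_score(scores)
```

    Ranking a list of research options by this score is what makes the exercise replicable, in contrast to the unstructured consultation processes mentioned above.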

  13. Approaches, tools and methods used for setting priorities in health research in the 21st century

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-01-01

    Background Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. Methods To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001–2014. Results A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (priorities were set. A further 19% used a combination of expert panel interview and focus group discussion (“consultation process”) but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face–to–face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. Conclusion The number of priority setting exercises in health research published in PubMed–indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well–defined structure – such as the CHNRI method, James Lind Alliance Method and Combined Approach Matrix – it is likely that the Delphi method and non–replicable consultation processes will gradually be

  14. A Pragmatic Guide to the Setting up of Integrated Hypnotherapy Services in Primary Care and Clinical Settings.

    Science.gov (United States)

    Entwistle, Paul Andrew

    2017-01-01

    Despite the continued debate and lack of a clear consensus about the true nature of the hypnotic phenomenon, hypnosis is increasingly being utilized successfully in many medical, health, and psychological spheres as a research method, motivational tool, and therapeutic modality. Significantly, however, although hypnotherapy is widely advertised, advocated, and employed in the private medical arena for the management and treatment of many physical and emotional disorders, too little is being done to integrate hypnosis into primary care and national health medical services. This article discusses some of the reasons for the apparent reluctance of medical and scientific health professionals to consider incorporating hypnosis into their medical practice, including the practical problems inherent in using hypnosis in a medical context and some possible solutions.

  15. Identification and Management of Eating Disorders in Integrated Primary Care: Recommendations for Psychologists in Integrated Care Settings.

    Science.gov (United States)

    Buchholz, Laura J; King, Paul R; Wray, Laura O

    2017-06-01

    Eating disorders are associated with deleterious health consequences, increased risk of mortality, and psychosocial impairment. Although individuals with eating disorders are likely to seek treatment in general medical settings such as primary care (PC), these conditions are often under-detected by PC providers. However, psychologists in integrated PC settings are likely to see patients with eating disorders because of the mental health comorbidities associated with these conditions. Further, due to their training in identifying risk factors associated with eating disorders (i.e., comorbid mental health and medical disorders) and opportunities for collaboration with PC providers, psychologists are well-positioned to improve the detection and management of eating disorders in PC. This paper provides a brief overview of eating disorders and practical guidance for psychologists working in integrated PC settings to facilitate the identification and management of these conditions.

  16. Property Integration: Componentless Design Techniques and Visualization Tools

    DEFF Research Database (Denmark)

    El-Halwagi, Mahmoud M; Glasgow, I.M.; Eden, Mario Richard

    2004-01-01

    integration is defined as a functionality-based, holistic approach to the allocation and manipulation of streams and processing units, which is based on tracking, adjusting, assigning, and matching functionalities throughout the process. Revised lever arm rules are devised to allow optimal allocation while...... maintaining intra- and interstream conservation of the property-based clusters. The property integration problem is mapped into the cluster domain. This dual problem is solved in terms of clusters and then mapped to the primal problem in the property domain. Several new rules are derived for graphical...... techniques. Particularly, systematic rules and visualization techniques for the identification of optimal mixing of streams and their allocation to units. Furthermore, a derivation of the correspondence between clustering arms and fractional contribution of streams is presented. This correspondence...

  17. Advanced ion trap structures with integrated tools for qubit manipulation

    Science.gov (United States)

    Sterk, J. D.; Benito, F.; Clark, C. R.; Haltli, R.; Highstrete, C.; Nordquist, C. D.; Scott, S.; Stevens, J. E.; Tabakov, B. P.; Tigges, C. P.; Moehring, D. L.; Stick, D.; Blain, M. G.

    2012-06-01

    We survey the ion trap fabrication technologies available at Sandia National Laboratories. These include four metal layers, precision backside etching, and low profile wirebonds. We demonstrate loading of ions in a variety of ion traps that utilize these technologies. Additionally, we present progress towards integration of on-board filtering with trench capacitors, photon collection via an optical cavity, and integrated microwave electrodes for localized hyperfine qubit control and magnetic field gradient quantum gates. [4pt] This work was supported by Sandia's Laboratory Directed Research and Development (LDRD) Program and the Intelligence Advanced Research Projects Activity (IARPA). Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the US Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  18. Offshore Wind Farm Clusters - Towards new integrated Design Tool

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Réthoré, Pierre-Elouan; Peña, Alfredo

    In EERA DTOC testing of existing wind farm wake models against four validation data test sets from large offshore wind farms is carried out. This includes Horns Rev-1 in the North Sea, Lillgrund in the Baltic Sea, Roedsand-2 in the Baltic Sea and from 10 large offshore wind farms in Northern Euro...

  19. Integrating Thermal Tools Into the Mechanical Design Process

    Science.gov (United States)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  20. Gsflow-py: An integrated hydrologic model development tool

    Science.gov (United States)

    Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.

    2017-12-01

    Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHM) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorologic parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHM models has required extensive GIS and computer programming expertise which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
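    One of the laborious parameterization steps such a toolkit automates is distributing meteorological forcing over the model grid. A minimal sketch of that idea, assuming a fixed environmental lapse rate and an invented DEM (none of these values or names come from Gsflow-py itself):

```python
# Illustrative sketch: adjust a station temperature to every cell of a
# DEM grid using a constant lapse rate. The -6.5 degC/km rate and the DEM
# values are assumptions for demonstration only.

LAPSE_RATE = -6.5 / 1000.0  # degC per metre of elevation gain

def distribute_temperature(station_temp_c, station_elev_m, dem):
    """Return a grid of temperatures, one per DEM cell (list of rows)."""
    return [[station_temp_c + LAPSE_RATE * (z - station_elev_m) for z in row]
            for row in dem]

dem = [[1000.0, 1500.0],
       [2000.0, 1000.0]]
grid = distribute_temperature(10.0, 1000.0, dem)
```

    Real toolkits replace the constant rate with station regressions or gridded climate products, but the pattern of mapping point observations onto the model-resolution grid is the same.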

  1. The integration of FMEA with other problem solving tools: A review of enhancement opportunities

    Science.gov (United States)

    Ng, W. C.; Teh, S. Y.; Low, H. C.; Teoh, P. C.

    2017-09-01

    Failure Mode Effect Analysis (FMEA) is one of the most effective and widely accepted problem solving (PS) tools used by companies worldwide. Since FMEA was first introduced in 1949, practitioners have implemented FMEA in various industries for their quality improvement initiatives. However, studies have shown that there are drawbacks that hinder the effectiveness of FMEA for continuous quality improvement from product design to manufacturing. Therefore, FMEA is integrated with other PS tools such as the inventive problem solving methodology (TRIZ), Quality Function Deployment (QFD), Root Cause Analysis (RCA) and the seven basic tools of quality to address these drawbacks. This study begins by identifying the drawbacks of FMEA. A comprehensive literature review on the integration of FMEA with other tools is carried out to categorise the integrations based on the drawbacks identified. The three categories are inefficiency of failure analysis, psychological inertia and neglect of customers' perspective. This study concludes by discussing the gaps and opportunities in the integration for future research.

  2. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    National Research Council Canada - National Science Library

    Petre, P; Visher, J; Shringarpure, R; Valley, F; Swaminathan, M

    2005-01-01

    Automated design tools and integrated design flow methodologies were developed that demonstrated more than an order-of-magnitude reduction in cycle time and cost for mixed-signal (digital/analog/RF...

  3. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing, consequentially there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used....... In this paper we analyze requirements for a tool that support integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling......, an important part of this technique is attaching of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiate the whole integration...

  4. Process Improvement Through Tool Integration in Aero-Mechanical Design

    Science.gov (United States)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.

  5. Integrated simulation tools for collimation cleaning in HL-LHC

    CERN Document Server

    Bruce, R; Cerutti, F; Ferrari, A; Lechner, A; Marsili, A; Mirarchi, D; Ortega, P G; Redaelli, S; Rossi, A; Salvachua, B; Sinuela, D P; Tambasco, C; Vlachoudis, V; Mereghetti, A; Assmann, R; Lari, L; Gibson, S M; Nevay, LJ; Appleby, R B; Molson, J; Serluca, M; Barlow, R J; Rafique, H; Toader, A

    2014-01-01

    The Large Hadron Collider is designed to accommodate an unprecedented stored beam energy of 362 MJ in the nominal configuration and about the double in the high-luminosity upgrade HL-LHC that is presently under study. This requires an efficient collimation system to protect the superconducting magnets from quenches. During the design, it is therefore very important to accurately predict the expected beam loss distributions and cleaning efficiency. For this purpose, there are several ongoing efforts in improving the existing simulation tools or developing new ones. This paper gives a brief overview and status of the different available codes.

  6. Integrating information technologies as tools for surgical research.

    Science.gov (United States)

    Schell, Scott R

    2005-10-01

    Surgical research is dependent upon information technologies. Selection of the computer, operating system, and software tool that best support the surgical investigator's needs requires careful planning before research commences. This manuscript presents a brief tutorial on how surgical investigators can best select these information technologies, with comparisons and recommendations between existing systems, software, and solutions. Privacy concerns, based upon HIPAA and other regulations, now require careful proactive attention to avoid legal penalties, civil litigation, and financial loss. Security issues are included as part of the discussions related to selection and application of information technology. This material was derived from a segment of the Association for Academic Surgery's Fundamentals of Surgical Research course.

  7. A vacuum microgripping tool with integrated vibration releasing capability

    Energy Technology Data Exchange (ETDEWEB)

    Rong, Weibin; Fan, Zenghua, E-mail: zenghua-fan@163.com; Wang, Lefeng; Xie, Hui; Sun, Lining [State Key Laboratory of Robotics and System, Harbin Institute of Technology, Harbin, Heilongjiang (China)

    2014-08-01

    Pick-and-place of micro-objects is a basic task in various micromanipulation demands. Reliable releasing of micro-objects is usually disturbed due to strong scale effects. This paper focuses on a vacuum micro-gripper with vibration releasing functionality, which was designed and assembled for reliable micromanipulation tasks. Accordingly, a vibration releasing strategy of implementing a piezoelectric actuator on the vacuum microgripping tool is presented to address the releasing problem. The releasing mechanism was illustrated using a dynamic micro contact model. This model was developed via theoretical analysis, simulations and pull-off force measurement using atomic force microscopy. Micromanipulation experiments were conducted to verify the performance of the vacuum micro-gripper. The results show that, with the assistance of the vibration releasing, the vacuum microgripping tool can achieve reliable release of micro-objects. A releasing location accuracy of 4.5±0.5 μm and a successful releasing rate of around 100% (which is based on 110 trials) were achieved for manipulating polystyrene microspheres with radius of 35–100 μm.

  8. A vacuum microgripping tool with integrated vibration releasing capability

    International Nuclear Information System (INIS)

    Rong, Weibin; Fan, Zenghua; Wang, Lefeng; Xie, Hui; Sun, Lining

    2014-01-01

    Pick-and-place of micro-objects is a basic task in various micromanipulation demands. Reliable releasing of micro-objects is usually disturbed due to strong scale effects. This paper focuses on a vacuum micro-gripper with vibration releasing functionality, which was designed and assembled for reliable micromanipulation tasks. Accordingly, a vibration releasing strategy of implementing a piezoelectric actuator on the vacuum microgripping tool is presented to address the releasing problem. The releasing mechanism was illustrated using a dynamic micro contact model. This model was developed via theoretical analysis, simulations and pull-off force measurement using atomic force microscopy. Micromanipulation experiments were conducted to verify the performance of the vacuum micro-gripper. The results show that, with the assistance of the vibration releasing, the vacuum microgripping tool can achieve reliable release of micro-objects. A releasing location accuracy of 4.5±0.5 μm and a successful releasing rate of around 100% (which is based on 110 trials) were achieved for manipulating polystyrene microspheres with radius of 35–100 μm

  9. The integrable case of Adler-van Moerbeke. Discriminant set and bifurcation diagram

    Science.gov (United States)

    Ryabov, Pavel E.; Oshemkov, Andrej A.; Sokolov, Sergei V.

    2016-09-01

    The Adler-van Moerbeke integrable case of the Euler equations on the Lie algebra so(4) is investigated. For the L–A pair found by Reyman and Semenov-Tian-Shansky for this system, we explicitly present a spectral curve and construct the corresponding discriminant set. The singularities of the Adler-van Moerbeke integrable case and its bifurcation diagram are discussed. We explicitly describe singular points of rank 0, determine their types, and show that the momentum mapping takes them to self-intersection points of the real part of the discriminant set. In particular, the described structure of singularities of the Adler-van Moerbeke integrable case shows that it is topologically different from the other known integrable cases on so(4).

  10. Upgrade and integration of the configuration and monitoring tools for the ATLAS Online farm

    CERN Document Server

    Ballestrero, S; The ATLAS collaboration; Darlea, G L; Dumitru, I; Scannicchio, DA; Twomey, M S; Valsan, M L; Zaytsev, A

    2012-01-01

    The ATLAS Online farm is a non-homogeneous cluster of nearly 3000 PCs which run the data acquisition, trigger and control of the ATLAS detector. The systems are configured and monitored by a combination of open-source tools, such as Quattor and Nagios, and tools developed in-house, such as ConfDB. We report on the ongoing introduction of new provisioning and configuration tools, Puppet and ConfDB v2 which are more flexible and allow automation for previously uncovered needs, and on the upgrade and integration of the monitoring and alerting tools, including the interfacing of these with the TDAQ Shifter Assistant software and their integration with configuration tools. We discuss the selection of the tools and the assessment of their functionality and performance, and how they enabled the introduction of virtualization for selected services.

  11. Upgrade and integration of the configuration and monitoring tools for the ATLAS Online farm

    International Nuclear Information System (INIS)

    Ballestrero, S; Darlea, G–L; Twomey, M S; Brasolin, F; Dumitru, I; Valsan, M L; Scannicchio, D A; Zaytsev, A

    2012-01-01

    The ATLAS Online farm is a non-homogeneous cluster of nearly 3000 systems which run the data acquisition, trigger and control of the ATLAS detector. The systems are configured and monitored by a combination of open-source tools, such as Quattor and Nagios, and tools developed in-house, such as ConfDB. We report on the ongoing introduction of new provisioning and configuration tools, Puppet and ConfDB v2, which are more flexible and allow automation for previously uncovered needs, and on the upgrade and integration of the monitoring and alerting tools, including the interfacing of these with the TDAQ Shifter Assistant software and their integration with configuration tools. We discuss the selection of the tools and the assessment of their functionality and performance, and how they enabled the introduction of virtualization for selected services.

  12. Integration of Web 2.0 Tools in Learning a Programming Course

    Science.gov (United States)

    Majid, Nazatul Aini Abd

    2014-01-01

    Web 2.0 tools are expected to assist students to acquire knowledge effectively in their university environment. However, the lack of effort from lecturers in planning the learning process can make it difficult for the students to optimize their learning experiences. The aim of this paper is to integrate Web 2.0 tools with learning strategy in…

  13. Integrating the nursing management minimum data set into the logical observation identifier names and codes system.

    Science.gov (United States)

    Subramanian, Amarnath; Westra, Bonnie; Matney, Susan; Wilson, Patricia S; Delaney, Connie W; Huff, Stan; Huff, Stanley M; Huber, Diane

    2008-11-06

    This poster describes the process used to integrate the Nursing Management Minimum Data Set (NMMDS), an instrument to measure the nursing context of care, into the Logical Observation Identifier Names and Codes (LOINC) system to facilitate contextualization of quality measures. Integration of the first three of 18 elements resulted in 48 new codes including five panels. The LOINC Clinical Committee has approved the presented mapping for their next release.

  14. Application of a faith-based integration tool to assess mental and physical health interventions.

    Science.gov (United States)

    Saunders, Donna M; Leak, Jean; Carver, Monique E; Smith, Selina A

    2017-01-01

    To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed.

  15. Integrating and analyzing medical and environmental data using ETL and Business Intelligence tools

    Science.gov (United States)

    Villar, Alejandro; Zarrabeitia, María T.; Fdez-Arroyabe, Pablo; Santurtún, Ana

    2018-06-01

    Processing data that originates from different sources (such as environmental and medical data) can prove to be a difficult task, due to the heterogeneity of variables, storage systems, and file formats that can be used. Moreover, once the amount of data reaches a certain threshold, conventional mining methods (based on spreadsheets or statistical software) become cumbersome or even impossible to apply. Data Extract, Transform, and Load (ETL) solutions provide a framework to normalize and integrate heterogeneous data into a local data store. Additionally, the application of Online Analytical Processing (OLAP), a set of Business Intelligence (BI) methodologies and practices for multidimensional data analysis, can be an invaluable tool for its examination and mining. In this article, we describe a solution based on an ETL + OLAP tandem used for the on-the-fly analysis of tens of millions of individual medical, meteorological, and air quality observations from 16 provinces in Spain provided by 20 different national and regional entities in a diverse array of file types and formats, with the intention of evaluating the effect of several environmental variables on human health in future studies. Our work shows how a sizable amount of data, spread across a wide range of file formats and structures, and originating from a number of different sources belonging to various business domains, can be integrated in a single system that researchers can use for global data analysis and mining.
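
    The ETL + OLAP tandem described in this record can be sketched in miniature with standard-library Python: heterogeneous records are extracted, normalized into a single local store, and then rolled up along shared dimensions (province, month) the way an OLAP cube query would be. All table names, field layouts, and values below are illustrative assumptions, not the authors' actual schema or data.

```python
import sqlite3

# Toy "Extract" step: two heterogeneous sources (hypothetical formats).
medical = [  # admissions: (province, ISO date, respiratory admissions)
    ("Cantabria", "2015-01-10", 12),
    ("Cantabria", "2015-01-22", 9),
    ("Madrid", "2015-01-05", 40),
]
air_quality = [  # NO2 readings as "station;province;DD/MM/YYYY;ug/m3" strings
    "S01;Cantabria;10/01/2015;41.5",
    "S02;Madrid;05/01/2015;63.0",
]

# "Transform" (normalize dates and fields) and "Load" into one local store.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fact_admissions (province TEXT, month TEXT, n INT)")
con.execute("CREATE TABLE fact_no2 (province TEXT, month TEXT, ug_m3 REAL)")
for prov, date, n in medical:
    con.execute("INSERT INTO fact_admissions VALUES (?,?,?)", (prov, date[:7], n))
for row in air_quality:
    _, prov, date, val = row.split(";")
    month = f"{date[6:10]}-{date[3:5]}"  # DD/MM/YYYY -> YYYY-MM
    con.execute("INSERT INTO fact_no2 VALUES (?,?,?)", (prov, month, float(val)))

# Minimal "OLAP" roll-up: aggregate both fact tables along shared dimensions.
cube = con.execute("""
    SELECT a.province, a.month, SUM(a.n) AS admissions, AVG(q.ug_m3) AS no2
    FROM fact_admissions a JOIN fact_no2 q
      ON a.province = q.province AND a.month = q.month
    GROUP BY a.province, a.month
""").fetchall()
print(sorted(cube))
```

    A real deployment would replace the in-memory lists with file readers and the single database with a dimensional (star-schema) store, but the extract-normalize-aggregate pipeline is the same shape.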

  16. Integrating and analyzing medical and environmental data using ETL and Business Intelligence tools.

    Science.gov (United States)

    Villar, Alejandro; Zarrabeitia, María T; Fdez-Arroyabe, Pablo; Santurtún, Ana

    2018-03-07

    Processing data that originates from different sources (such as environmental and medical data) can prove to be a difficult task, due to the heterogeneity of variables, storage systems, and file formats that can be used. Moreover, once the amount of data reaches a certain threshold, conventional mining methods (based on spreadsheets or statistical software) become cumbersome or even impossible to apply. Data Extract, Transform, and Load (ETL) solutions provide a framework to normalize and integrate heterogeneous data into a local data store. Additionally, the application of Online Analytical Processing (OLAP), a set of Business Intelligence (BI) methodologies and practices for multidimensional data analysis, can be an invaluable tool for its examination and mining. In this article, we describe a solution based on an ETL + OLAP tandem used for the on-the-fly analysis of tens of millions of individual medical, meteorological, and air quality observations from 16 provinces in Spain provided by 20 different national and regional entities in a diverse array of file types and formats, with the intention of evaluating the effect of several environmental variables on human health in future studies. Our work shows how a sizable amount of data, spread across a wide range of file formats and structures, and originating from a number of different sources belonging to various business domains, can be integrated in a single system that researchers can use for global data analysis and mining.

  17. Geoinformation Systems as a Tool of the Integrated Tourist Spaces Management

    Directory of Open Access Journals (Sweden)

    Kolesnikovich Victor

    2014-09-01

    Full Text Available Introduction. Tourist activity management currently requires integrated management tools built on a shared information and analytical base. Material and methods. The architecture and content of the geoinformation and hybrid information systems are oriented toward Integrated Tourist Spaces Management (ITSM), which places specific demands on the features of the management model. The authors develop the concept of tourist space, and the information and analytical system is used to create an information model of that space. The information support for the ITSM system is a hybrid system: an expert system built on the basis of a GIS. Results and conclusions. The GIS provides collection, storage, analysis and graphic visualization of spatial data and of the related information on the objects represented in the expert system. The proposed approach yields an information system and analytical support not only for human decision-making, but also for the creation of new tourist products based on increasingly differentiated client requests or on the ratio of price to quality (from the point of view of satisfying those requests).

  19. Simulation modelling as a tool for knowledge mobilisation in health policy settings: a case study protocol.

    Science.gov (United States)

    Freebairn, L; Atkinson, J; Kelly, P; McDonnell, G; Rychetnik, L

    2016-09-21

    Evidence-informed decision-making is essential to ensure that health programs and services are effective and offer value for money; however, barriers to the use of evidence persist. Emerging systems science approaches and advances in technology are providing new methods and tools to facilitate evidence-based decision-making. Simulation modelling offers a unique tool for synthesising and leveraging existing evidence, data and expert local knowledge to examine, in a robust, low risk and low cost way, the likely impact of alternative policy and service provision scenarios. This case study will evaluate participatory simulation modelling to inform the prevention and management of gestational diabetes mellitus (GDM). The risks associated with GDM are well recognised; however, debate remains regarding diagnostic thresholds and whether screening and treatment to reduce maternal glucose levels reduce the associated risks. A diagnosis of GDM may provide a leverage point for multidisciplinary lifestyle modification interventions. This research will apply and evaluate a simulation modelling approach to understand the complex interrelation of factors that drive GDM rates, test options for screening and interventions, and optimise the use of evidence to inform policy and program decision-making. The study design will use mixed methods to achieve the objectives. Policy, clinical practice and research experts will work collaboratively to develop, test and validate a simulation model of GDM in the Australian Capital Territory (ACT). The model will be applied to support evidence-informed policy dialogues with diverse stakeholders for the management of GDM in the ACT. Qualitative methods will be used to evaluate simulation modelling as an evidence synthesis tool to support evidence-based decision-making. Interviews and analysis of workshop recordings will focus on the participants' engagement in the modelling process and the perceived value of the participatory process…

  20. The Distraction in Action Tool©: Feasibility and Usability in Clinical Settings.

    Science.gov (United States)

    Hanrahan, Kirsten; Kleiber, Charmaine; Miller, Ben J; Davis, Heather; McCarthy, Ann Marie

    2017-11-10

    Distraction is a relatively simple, evidence-based intervention to minimize child distress during medical procedures. Timely on-site interventions that instruct parents on distraction coaching are needed. The purpose of this study was to test the feasibility and usability of the Distraction in Action Tool© (DAT©), which 1) predicts child risk for distress with a needle stick and 2) provides individualized instructions for parents on how to be a distraction coach for their child in clinical settings. A mixed-methods descriptive design was used to test feasibility and usability of DAT in the Emergency Department and a Phlebotomy Lab at a large Midwest Academic Medical Center. Twenty parents of children ages 4-10 years requiring venipuncture and clinicians performing 13 of those procedures participated. Participants completed an evaluation and participated in a brief interview. The average age of the children was 6.8 years, and 80% of parent participants were mothers. Most parents reported the DAT was not difficult to use (84.2%), understandable (100%), and they had a positive experience (89.5%). Clinicians thought DAT was helpful (100%) and did not cause a meaningful delay in workflow (92%). DAT can be used by parents and clinicians to assess their children's risk for procedure related distress and learn distraction techniques to help their children during needle stick procedures. DAT for parents is being disseminated via social media and an open-access website. Further research is needed to disseminate and implement DAT in community healthcare settings. Copyright © 2017. Published by Elsevier Inc.

  1. Constructions of contents and measures satisfying a prescribed set of integral inequalities

    DEFF Research Database (Denmark)

    Hoffmann-Jørgensen, Jørgen

    2006-01-01

    Let Ψ be a given set of real-valued functions on the set T and let β:Ψ→R be a given functional with values in the extended real line. The paper gives sufficient conditions for the existence of a content (or a measure) μ with good regularity and smoothness properties and with the property that β(ψ) is less than the outer μ-integral of ψ for all ψ in Ψ.
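
    In the notation of this record, the constructed object can be stated compactly (a restatement of the abstract, with the outer integral written explicitly):

```latex
\exists\,\mu \ \text{(a content or measure on } T\text{)}:\qquad
\beta(\psi) \;\le\; \int_T^{*} \psi \, d\mu \qquad \text{for all } \psi \in \Psi .
```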

  2. Value-based integrated (renal) care: setting a development agenda for research and implementation strategies.

    Science.gov (United States)

    Valentijn, Pim P; Biermann, Claus; Bruijnzeels, Marc A

    2016-08-02

    Integrated care services are considered a vital strategy for improving the Triple Aim values for people with chronic kidney disease. However, a solid scholarly explanation of how to develop, implement and evaluate such value-based integrated renal care services is limited. The aim of this study was to develop a framework to identify the strategies and outcomes for the implementation of value-based integrated renal care. First, the theoretical foundations of the Rainbow Model of Integrated Care and the Triple Aim were united into one overarching framework through an iterative process of key-informant consultations. Second, a rapid review approach was conducted to identify the published research on integrated renal care, and the Cochrane Library, Medline, Scopus, and Business Source Premier databases were searched for pertinent articles published between 2000 and 2015. Based on the framework, a coding schema was developed to synthesise the included articles. The overarching framework distinguishes the integrated care domains: 1) type of integration, 2) enablers of integration and the interrelated outcome domains, 3) experience of care, 4) population health and 5) costs. The literature synthesis indicated that integrated renal care implementation strategies have particularly focused on micro clinical processes and physical outcomes, while little emphasis has been placed on meso organisational as well as macro system integration processes. In addition, evidence regarding patients' perceived outcomes and economic outcomes has been weak. These results underscore that the future challenge for researchers is to explore which integrated care implementation strategies achieve better health and improved experience of care at a lower cost within a specific context. For this purpose, this study's framework and evidence synthesis have set a developmental agenda for both integrated renal care practice and research. Accordingly, we plan further work to develop an implementation…

  3. The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2015-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be targeted for…

  4. Extending the Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2016-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power of the model for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be…

  5. The Webinar Integration Tool: A Framework for Promoting Active Learning in Blended Environments

    Science.gov (United States)

    Lieser, Ping; Taf, Steven D.; Murphy-Hagan, Anne

    2018-01-01

    This paper describes a three-stage process of developing a webinar integration tool to enhance the interaction of teaching and learning in blended environments. In the context of medical education, we emphasize three factors of effective webinar integration in blended learning: fostering better solutions for faculty and students to interact…

  6. Integrating Philips' extreme UV source in the alpha-tools

    Science.gov (United States)

    Pankert, Joseph; Apetz, Rolf; Bergmann, Klaus; Derra, Guenther; Janssen, Maurice; Jonkers, Jeroen; Klein, Jurgen; Kruecken, Thomas; List, Andreas; Loeken, Michael; Metzmacher, Christof; Neff, Willi; Probst, Sven; Prummer, Ralph; Rosier, Oliver; Seiwert, Stefan; Siemons, Guido; Vaudrevange, Dominik; Wagemann, Dirk; Weber, Achim; Zink, Peter; Zitzen, Oliver

    2005-05-01

    The paper describes recent progress in the development of the Philips EUV source. Progress has been realized on many fronts. Integration studies of the source into a scanner have primarily been carried out on the Xe source because of its high degree of maturity. We report on integration with a collector, the associated collector lifetime and optical characteristics. Collector lifetime in excess of one billion shots could be demonstrated. Next, an active dose control system was developed and tested on the Xe lamp. The resulting dose stability is better than 0.2% for an exposure window of 100 pulses. The second part of the paper reports on progress in the development of the Philips Sn source. First, the details of the concept are described. It is based on a laser-triggered vacuum arc, an extension of previous designs. The source is equipped with rotating electrodes covered with a Sn film that is constantly regenerated; hence, by the very design of the source, it is scalable to very high power levels and has fundamentally solved the notorious problem of electrode erosion. Power values of 260 W in 2π sr are reported, along with stable, long-life operation of the lamp. The paper also addresses the problem of debris generation and mitigation for the Sn source. The problem is attacked by a combined strategy of protecting the collector by traditional means (e.g. fields, foil traps...) and of designing the gas atmosphere according to the principles of the well-known halogen cycles in incandescent lamps. These principles have been studied in the lighting industry for decades and rely on the high vapor pressures of metal halides. Transferred to the Sn source, they allow pumping away tin residues that would otherwise deposit irreversibly on the collector.

  7. Older adult mistreatment risk screening: contribution to the validation of a screening tool in a domestic setting.

    Science.gov (United States)

    Lindenbach, Jeannette M; Larocque, Sylvie; Lavoie, Anne-Marise; Garceau, Marie-Luce

    2012-06-01

    The hidden nature of older adult mistreatment renders its detection in the domestic setting particularly challenging. A validated screening instrument that can provide a systematic assessment of risk factors can facilitate this detection. One such instrument, the "expanded Indicators of Abuse" tool, has been previously validated in the Hebrew language in a hospital setting. The present study has contributed to the validation of the "e-IOA" in an English-speaking community setting in Ontario, Canada. It consisted of two phases: (a) a content validity review and adaptation of the instrument by experts throughout Ontario, and (b) an inter-rater reliability assessment by home visiting nurses. The adaptation, the "Mistreatment of Older Adult Risk Factors" tool, offers a comprehensive tool for screening in the home setting. This instrument is significant to professional practice as practitioners working with older adults will be better equipped to assess for risk of mistreatment.

  8. QFD: a methodological tool for integration of ergonomics at the design stage.

    Science.gov (United States)

    Marsot, Jacques

    2005-03-01

    As a marked increase in the number of musculoskeletal disorders was noted in many industrialized countries and more specifically in companies that require the use of hand tools, the French National Research and Safety Institute launched in 1999 a research program on the topic of integrating ergonomics into hand tool design. After a brief review of the problems of integrating ergonomics at the design stage, the paper shows how the "Quality Function Deployment" method has been applied to the design of a boning knife and it highlights the difficulties encountered. Then, it demonstrates how this method can be a methodological tool geared to greater ergonomics consideration in product design.

  9. Laughter Filled the Classroom: Outcomes of Professional Development in Arts Integration for Elementary Teachers in Inclusion Settings

    Science.gov (United States)

    Koch, Katherine A.; Thompson, Janna Chevon

    2017-01-01

    This qualitative study examined teachers' experiences with an arts integration curriculum. This study considered the teachers' perceptions of arts integrations before and after being introduced to the concepts of arts integration. The teachers were provided with knowledge and tools to integrate the arts into general education curriculum and…

  10. Useful tools for non-linear systems: Several non-linear integral inequalities

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mohammadpour, A.; Mesiar, Radko; Vaezpour, M. S.

    2013-01-01

    Roč. 49, č. 1 (2013), s. 73-80 ISSN 0950-7051 R&D Projects: GA ČR GAP402/11/0378 Institutional support: RVO:67985556 Keywords : Monotone measure * Comonotone functions * Integral inequalities * Universal integral Subject RIV: BA - General Mathematics Impact factor: 3.058, year: 2013 http://library.utia.cas.cz/separaty/2013/E/mesiar-useful tools for non-linear systems several non-linear integral inequalities.pdf

  11. An Integrated Tool for Calculating and Reducing Institution Carbon and Nitrogen Footprints

    Science.gov (United States)

    Galloway, James N.; Castner, Elizabeth A.; Andrews, Jennifer; Leary, Neil; Aber, John D.

    2017-01-01

    The development of nitrogen footprint tools has allowed a range of entities to calculate and reduce their contribution to nitrogen pollution, but these tools represent just one aspect of environmental pollution. For example, institutions have been calculating their carbon footprints to track and manage their greenhouse gas emissions for over a decade. This article introduces an integrated tool that institutions can use to calculate, track, and manage their nitrogen and carbon footprints together. It presents the methodology for the combined tool, describes several metrics for comparing institution nitrogen and carbon footprint results, and discusses management strategies that reduce both the nitrogen and carbon footprints. The data requirements for the two tools overlap substantially, although integrating the two tools does necessitate the calculation of the carbon footprint of food. Comparison results for five institutions suggest that the institution nitrogen and carbon footprints correlate strongly, especially in the utilities and food sectors. Scenario analyses indicate benefits to both footprints from a range of utilities and food footprint reduction strategies. Integrating these two footprints into a single tool will account for a broader range of environmental impacts, reduce data entry and analysis, and promote integrated management of institutional sustainability. PMID:29350217
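
    The overlap in data requirements that this record describes can be illustrated with a toy calculation: one shared set of institutional activity data drives both footprints, each through its own set of emission factors. The factor values below are made up for illustration and are not the calculator's published coefficients.

```python
# Illustrative emission factors (NOT the tool's published coefficients):
# kg CO2e and kg reactive N released per unit of each activity.
FACTORS = {
    "electricity_kwh":   {"co2e": 0.4,  "n": 0.0003},
    "natural_gas_therm": {"co2e": 5.3,  "n": 0.002},
    "beef_kg":           {"co2e": 27.0, "n": 0.15},  # food contributes to both
}

def footprints(activity_data):
    """Return (carbon in kg CO2e, nitrogen in kg N) from one shared data set."""
    c = sum(qty * FACTORS[a]["co2e"] for a, qty in activity_data.items())
    n = sum(qty * FACTORS[a]["n"] for a, qty in activity_data.items())
    return round(c, 2), round(n, 2)

# One data-entry pass serves both footprints (hypothetical campus year).
campus = {"electricity_kwh": 10000, "natural_gas_therm": 500, "beef_kg": 100}
print(footprints(campus))  # (9350.0, 19.0)
```

    The design point mirrors the article's argument: because both sums iterate over the same activity data, a combined tool removes duplicate data entry and lets one reduction scenario (e.g. cutting the beef quantity) be evaluated against both footprints at once.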

  12. Integrated tool for NPP lifetime management in Spain

    Energy Technology Data Exchange (ETDEWEB)

    Francia, L. [UNESA, Madrid (Spain); Lopez de Santa Maria, J. [ASCO-Vandellos 2 NPPs l' Hospitalet de l' Infant, Tarragona (Spain); Cardoso, A. [Tecnatom SA, Madrid (Spain)

    2001-07-01

    The project for the Integrated Nuclear Power Plant Lifetime Management System SIGEVI (Sistema Integrado de GEstion de VIda de Centrales Nucleares) was initiated in April 1998 and finalized in December 2000, the main objective of the project being to develop a computer application facilitating the assessment of the condition and lifetime of nuclear power plant components. This constituted the second phase of a further-reaching project on NPP Lifetime Management. During the first phase of this project, carried out between 1992 and 1995, the methodology and strategy for the lifetime management of the Spanish NPP's were developed. Among others, degradation phenomena were assessed and the most adequate methods for their monitoring were defined. The SIGEVI Project has been performed under the management of UNESA (Spanish Electricity Association) and with the collaboration of different engineering firms and research institutes (Tecnatom, Empresarios Agrupados, Ufisa, Initec and IIT), with Vandellos II as the pilot plant. The rest of the Spanish NPP's have also actively participated through the Project Steering Committee. The following sections describe the scope, the structure and the main functionalities of the system SIGEVI. (authors)

  13. Integrated tool for NPP lifetime management in Spain

    International Nuclear Information System (INIS)

    Francia, L.; Lopez de Santa Maria, J.; Cardoso, A.

    2001-01-01

    The project for the Integrated Nuclear Power Plant Lifetime Management System SIGEVI (Sistema Integrado de GEstion de VIda de Centrales Nucleares) was initiated in April 1998 and finalized in December 2000, the main objective of the project being to develop a computer application facilitating the assessment of the condition and lifetime of nuclear power plant components. This constituted the second phase of a further-reaching project on NPP Lifetime Management. During the first phase of this project, carried out between 1992 and 1995, the methodology and strategy for the lifetime management of the Spanish NPP's were developed. Among others, degradation phenomena were assessed and the most adequate methods for their monitoring were defined. The SIGEVI Project has been performed under the management of UNESA (Spanish Electricity Association) and with the collaboration of different engineering firms and research institutes (Tecnatom, Empresarios Agrupados, Ufisa, Initec and IIT), with Vandellos II as the pilot plant. The rest of the Spanish NPP's have also actively participated through the Project Steering Committee. The following sections describe the scope, the structure and the main functionalities of the system SIGEVI. (authors)

  14. Critical chain project management and drum-buffer-rope tools integration in construction industry - case study

    Directory of Open Access Journals (Sweden)

    Piotr Cyplik

    2012-03-01

    Full Text Available Background: The concept of integrating theory of constraints tools into the reorganization of the management system in a mechanical engineering company is presented in this article. The main aim of the concept is to enable the enterprise to satisfy customers' expectations at reasonable costs, which allows it to make a profit and to become an agile enterprise in the long run. Methods: Due to the individual character of the production and service processes in the analyzed company, the described concept uses theory of constraints project management (CCPM) and manufacturing (DBR) tools. The authors use a performance-levels conception to build an integration tool focused on interaction and collaboration between different departments. The integration tool has been developed and verified in a Polish manufacturing company. Results: In the described model, a tool compatible with CCPM operates at the level of the customer service process, while the shop floor is controlled with the DBR method. The authors hold that integration between the TOC tools is of key importance: integrating the tools dedicated to managing customer service with those for shop floor scheduling and control requires a mechanism for repeatedly transmitting information between them, and such a mechanism has been developed. Conclusions: The conducted research showed that the developed tool integrating CCPM and DBR had a positive impact on enterprise performance. It improves the company's ability to meet target group requirements by enhancing the efficiency of the processes running in the company and of the tasks processed at particular workstations. The described model has been successfully implemented in one of the Polish mechanical engineering companies.
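
    The DBR half of the integration can be sketched in a few lines: the drum is the constraint-resource schedule, and the rope releases material a fixed time buffer ahead of each drum slot. This is a generic illustration of the drum-buffer-rope rule, not the specific mechanism developed in the paper; the job names and buffer length are assumptions.

```python
from datetime import datetime, timedelta

def release_schedule(drum_schedule, buffer_hours=24):
    """Rope: release each job 'buffer_hours' ahead of its drum (constraint) slot."""
    return {job: start - timedelta(hours=buffer_hours)
            for job, start in drum_schedule.items()}

# Hypothetical drum: when the constraint resource will process each order.
drum = {
    "order-A": datetime(2012, 3, 5, 8, 0),
    "order-B": datetime(2012, 3, 5, 14, 0),
}
releases = release_schedule(drum, buffer_hours=24)
print(releases["order-A"])  # 2012-03-04 08:00:00
```

    The buffer absorbs variability upstream of the constraint; in an integrated CCPM + DBR system, the same buffer-status information is what gets passed back and forth with the project-level schedule.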

  15. An Innovative Model of Integrated Behavioral Health: School Psychologists in Pediatric Primary Care Settings

    Science.gov (United States)

    Adams, Carolyn D.; Hinojosa, Sara; Armstrong, Kathleen; Takagishi, Jennifer; Dabrow, Sharon

    2016-01-01

    This article discusses an innovative example of integrated care in which doctoral level school psychology interns and residents worked alongside pediatric residents and pediatricians in the primary care settings to jointly provide services to patients. School psychologists specializing in pediatric health are uniquely trained to recognize and…

  16. Integration: valuing stakeholder input in setting priorities for socially sustainable egg production.

    Science.gov (United States)

    Swanson, J C; Lee, Y; Thompson, P B; Bawden, R; Mench, J A

    2011-09-01

    Setting directions and goals for animal production systems requires the integration of information achieved through internal and external processes. The importance of stakeholder input in setting goals for sustainable animal production systems should not be overlooked by the agricultural animal industries. Stakeholders play an integral role in setting the course for many aspects of animal production, from influencing consumer preferences to setting public policy. The Socially Sustainable Egg Production Project (SSEP) involved the development of white papers on various aspects of egg production, followed by a stakeholder workshop to help frame the issues for the future of sustainable egg production. Representatives from the environmental, food safety, food retail, consumer, animal welfare, and the general farm and egg production sectors participated with members of the SSEP coordination team in a 1.5-d workshop to explore socially sustainable egg production. This paper reviews the published literature on values integration methodologies and the lessons learned from animal welfare assessment models. The integration method used for the SSEP stakeholder workshop and its outcome are then summarized. The method used for the SSEP stakeholder workshop can be used to obtain stakeholder input on sustainable production in other farm animal industries.

  17. Barriers to the Integration of Computers in Early Childhood Settings: Teachers' Perceptions

    Science.gov (United States)

    Nikolopoulou, Kleopatra; Gialamas, Vasilis

    2015-01-01

    This study investigated teachers' perceptions of barriers to using - integrating computers in early childhood settings. A 26-item questionnaire was administered to 134 early childhood teachers in Greece. Lack of funding, lack of technical and administrative support, as well as inadequate training opportunities were among the major perceived…

  18. Servicing HEP experiments with a complete set of ready-integrated and configured common software components

    International Nuclear Information System (INIS)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves; Kruzelecki, Karol

    2010-01-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments such as ROOT, POOL, COOL which are developed in house and also a set of 'external' software packages (70) which are needed in addition such as Python, Boost, Qt, CLHEP, etc. These packages target many different areas of HEP computing such as data persistency, math, simulation, grid computing, databases, graphics, etc. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner on different compilers, architectures and operating systems. The Software Process and Infrastructure project (SPI) [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI are the needs of the LHC experiments, but other HEP experiments could also profit from the set of consistent libraries provided and receive a stable and well-tested foundation to build their experiment software frameworks. This presentation will first provide a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages and then focus on a second set of tools provided for outside LHC experiments to deploy a stable set of HEP related software packages both as binary distribution or from source.

  19. Servicing HEP experiments with a complete set of ready-integrated and configured common software components

    Energy Technology Data Exchange (ETDEWEB)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves [CERN, CH-1211 Geneva 23, PH Department, SFT Group (Switzerland); Kruzelecki, Karol, E-mail: stefan.roiser@cern.c, E-mail: ana.gaspar@cern.c, E-mail: yves.perrin@cern.c, E-mail: karol.kruzelecki@cern.c [CERN, CH-1211 Geneva 23, PH Department, LBC Group (Switzerland)

    2010-04-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments, such as ROOT, POOL and COOL, which are developed in-house, as well as a set of about 70 'external' software packages needed in addition, such as Python, Boost, Qt and CLHEP. These packages target many different areas of HEP computing, such as data persistency, math, simulation, grid computing, databases and graphics. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner across different compilers, architectures and operating systems. The Software Process and Infrastructure (SPI) project [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI are the needs of the LHC experiments, but other HEP experiments could also profit from the consistent set of libraries provided and receive a stable and well-tested foundation on which to build their experiment software frameworks. This presentation will first provide a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages, and then focus on a second set of tools provided for experiments outside the LHC to deploy a stable set of HEP-related software packages, both as binary distributions and from source.

  20. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single one. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. We have then defined a number of activities and associated guidelines that prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  1. Water footprint as a tool for integrated water resources management

    Science.gov (United States)

    Aldaya, Maite; Hoekstra, Arjen

    2010-05-01

    together with the water footprint concept could thus provide an appropriate framework to support more optimal water management practices by informing production and trade decisions and the development and adoption of water-efficient technology. In order to move towards better water governance, however, a further integration of water-related concerns into sectoral policies is paramount. This will require a concerted effort by all stakeholders, the willingness to adopt a total-resource view in which water is seen as a key, cross-sectoral input for development and growth, a mix of technical approaches, and the courage to undertake and fund water sector reforms. We are convinced that water footprint analysis can provide a sufficiently robust fact base for meaningful stakeholder dialogue and action towards solutions.

  2. Lung ultrasound as a diagnostic tool for radiographically-confirmed pneumonia in low resource settings.

    Science.gov (United States)

    Ellington, Laura E; Gilman, Robert H; Chavez, Miguel A; Pervaiz, Farhan; Marin-Concha, Julio; Compen-Chang, Patricia; Riedel, Stefan; Rodriguez, Shalim J; Gaydos, Charlotte; Hardick, Justin; Tielsch, James M; Steinhoff, Mark; Benson, Jane; May, Evelyn A; Figueroa-Quintanilla, Dante; Checkley, William

    2017-07-01

    Pneumonia is a leading cause of morbidity and mortality in children worldwide; however, its diagnosis can be challenging, especially in settings where skilled clinicians or standard imaging are unavailable. We sought to determine the diagnostic accuracy of lung ultrasound when compared to radiographically-confirmed clinical pediatric pneumonia. Between January 2012 and September 2013, we consecutively enrolled children aged 2-59 months with primary respiratory complaints at the outpatient clinics, emergency department, and inpatient wards of the Instituto Nacional de Salud del Niño in Lima, Peru. All participants underwent clinical evaluation by a pediatrician and lung ultrasonography by one of three general practitioners. We also consecutively enrolled children without respiratory symptoms. Children with respiratory symptoms had a chest radiograph, and we obtained ancillary laboratory testing in a subset. Final clinical diagnoses included 453 children with pneumonia, 133 with asthma, 103 with bronchiolitis, and 143 with upper respiratory infections. In total, CXR confirmed the diagnosis in 191 (42%) of 453 children with clinical pneumonia. A consolidation on lung ultrasound, our primary endpoint for pneumonia, had a sensitivity of 88.5%, a specificity of 100%, and an area under the curve of 0.94 (95% CI 0.92-0.97) when compared to radiographically-confirmed clinical pneumonia. When any abnormality on lung ultrasound was compared to radiographically-confirmed clinical pneumonia, the sensitivity increased to 92.2% and the specificity decreased to 95.2%, with an area under the curve of 0.94 (95% CI 0.91-0.96). Lung ultrasound had high diagnostic accuracy for the diagnosis of radiographically-confirmed pneumonia. Added benefits of lung ultrasound include rapid testing and high inter-rater agreement. Lung ultrasound may serve as an alternative tool for the diagnosis of pediatric pneumonia.

  3. Is Google Trends a reliable tool for digital epidemiology? Insights from different clinical settings.

    Science.gov (United States)

    Cervellin, Gianfranco; Comelli, Ivan; Lippi, Giuseppe

    2017-09-01

    Internet-derived information has recently been recognized as a valuable tool for epidemiological investigation. Google Trends, a Google Inc. portal, generates data on geographical and temporal search patterns according to specified keywords. The aim of this study was to compare the reliability of Google Trends in different clinical settings, both for common diseases with lower media coverage and for less common diseases attracting major media coverage. We carried out a search in Google Trends using the keywords "renal colic", "epistaxis", and "mushroom poisoning", selected on the basis of available and reliable epidemiological data. Besides this search, we carried out a second search for three clinical conditions (i.e., "meningitis", "Legionella pneumophila pneumonia", and "Ebola fever") which recently received major focus from the Italian media. In our analysis, no correlation was found between data captured from Google Trends and the epidemiology of renal colic, epistaxis or mushroom poisoning. Only when searching for the term "mushroom" alone did Google Trends generate a seasonal pattern that almost overlaps with the epidemiological profile, but this was probably due mostly to searches related to harvesting and cooking rather than to poisoning. The Google Trends data also failed to reflect the geographical and temporal patterns of disease for meningitis, Legionella pneumophila pneumonia and Ebola fever. The results of our study confirm that Google Trends has modest reliability for defining the epidemiology of relatively common diseases with minor media coverage, or of relatively rare diseases with wider audiences. Overall, Google Trends seems to be influenced more by media clamor than by the true epidemiological burden.

  4. A Prospective Validation Study of a Rainbow Model of Integrated Care Measurement Tool in Singapore.

    Science.gov (United States)

    Nurjono, Milawaty; Valentijn, Pim P; Bautista, Mary Ann C; Wei, Lim Yee; Vrijhoef, Hubertus Johannes Maria

    2016-04-08

    The conceptual ambiguity of the integrated care concept precludes a full understanding of what constitutes a well-integrated health system and poses a significant challenge to measuring the level of integrated care. Most available measures have been developed from a disease-specific perspective and capture only certain aspects of integrated care. Based on the Rainbow Model of Integrated Care, which provides a detailed description of the complex concept of integrated care, a measurement tool has been developed to assess integrated care within a care system as a whole, from the perspectives of healthcare providers and managers. This paper describes the methodology of a study seeking to validate the Rainbow Model of Integrated Care measurement tool within and across the Singapore Regional Health System. The Singapore Regional Health System is a recent national strategy developed to provide a better-integrated health system delivering seamless and person-focused care to patients through a network of providers within a specified geographical region. The validation process includes the assessment of the content of the measure and of its psychometric properties. If the measure is deemed valid, the study will provide the first opportunity to measure integrated care within the Singapore Regional Health System, with results that allow recommendations for improving the Regional Health System and support international comparison.

  5. Set-valued and fuzzy stochastic integral equations driven by semimartingales under Osgood condition

    Directory of Open Access Journals (Sweden)

    Malinowski Marek T.

    2015-01-01

    We analyze set-valued stochastic integral equations driven by continuous semimartingales and prove the existence and uniqueness of solutions to such equations in the framework of the hyperspace of nonempty, bounded, convex and closed subsets of the Hilbert space L2 (consisting of square integrable random vectors). The coefficients of the equations are assumed to satisfy an Osgood-type condition, a generalization of the Lipschitz condition. Continuous dependence of solutions on the data of the equation is also presented. We consider equations driven by a semimartingale Z and equations driven by the processes A and M from the decomposition of Z, where A is a process of finite variation and M is a local martingale. These equations are not equivalent. Finally, we show that the analysis of set-valued stochastic integral equations can be extended to fuzzy stochastic integral equations driven by semimartingales under the Osgood-type condition. To obtain our results we use set-valued and fuzzy Maruyama-type approximations and Bihari's inequality.
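    The Osgood-type condition invoked in this abstract is a standard one and can be sketched as follows (generic notation, not taken from the paper itself):

```latex
% A coefficient f satisfies an Osgood-type condition if
%   |f(t,x) - f(t,y)| \le g(|x - y|)
% for some continuous, nondecreasing g with g(0) = 0, g(u) > 0 for u > 0, and
\int_{0^+} \frac{\mathrm{d}u}{g(u)} = \infty .
% The Lipschitz condition is the special case g(u) = L u, for which the
% integral \int_{0^+} \mathrm{d}u/(Lu) indeed diverges.
```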

  6. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    The advancement of the computational biology field hinges on progress in three fundamental directions: the development of new computational algorithms, the availability of informatics resource management infrastructures, and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources: data, software tools and web services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community, and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existing computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources: the first is based on an ontology of computational biology resources, and the second is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project in terms of both its source code development and its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  7. SQL Server 2012 data integration recipes solutions for integration services and other ETL tools

    CERN Document Server

    Aspin, Adam

    2012-01-01

    SQL Server 2012 Data Integration Recipes provides focused and practical solutions to real world problems of data integration. Need to import data into SQL Server from an outside source? Need to export data and send it to another system? SQL Server 2012 Data Integration Recipes has your back. You'll find solutions for importing from Microsoft Office data stores such as Excel and Access, from text files such as CSV files, from XML, from other database brands such as Oracle and MySQL, and even from other SQL Server databases. You'll learn techniques for managing metadata, transforming data to mee

  8. Integrated Decision Tools for Sustainable Watershed/Ground Water and Crop Health using Predictive Weather, Remote Sensing, and Irrigation Decision Tools

    Science.gov (United States)

    Jones, A. S.; Andales, A.; McGovern, C.; Smith, G. E. B.; David, O.; Fletcher, S. J.

    2017-12-01

    US agricultural and government lands have a unique co-dependent relationship, particularly in the Western US. More than 30% of all irrigated US agricultural output comes from lands sustained by the Ogallala Aquifer in the western Great Plains. Six US Forest Service National Grasslands reside within the aquifer region, consisting of over 375,000 ha (3,750 km²) of USFS-managed lands. Likewise, National Forest lands are the headwaters of many intensive agricultural regions. Our Ogallala Aquifer team is enhancing crop irrigation decision tools with predictive weather and remote sensing data to better manage water for irrigated crops within these regions. An integrated multi-model software framework is used to link irrigation decision tools, resulting in positive management benefits for natural water resources. Teams and teams-of-teams can build upon these multi-disciplinary, multi-faceted modeling capabilities. For example, the CSU Catalyst for Innovative Partnerships program has formed a new multidisciplinary team that will address "Rural Wealth Creation", focusing on the many integrated links between economic factors, agricultural production and management, natural resource availability, and key social aspects of government policy recommendations. By enhancing tools like these with predictive weather and other related data (such as in situ measurements, hydrologic models, remotely sensed data sets, and, in the near future, links to agro-economic and life cycle assessment models), this work demonstrates an integrated, data-driven future vision of inter-meshed dynamic systems that can address challenging multi-system problems. We will present the current state of the work and opportunities for future involvement.

  9. INTEGRATED SFM TECHNIQUES USING DATA SET FROM GOOGLE EARTH 3D MODEL AND FROM STREET LEVEL

    Directory of Open Access Journals (Sweden)

    L. Inzerillo

    2017-08-01

    Structure from motion (SfM) is a widespread photogrammetric method that applies photogrammetric principles to build a 3D model from a collection of photographs. For some complex historic buildings, such as cathedrals, theatres or castles, the street-level data set must be complemented with a UAV data set in order to reconstruct the roof in 3D. Nevertheless, the use of UAVs is strongly limited by government rules. In recent years, Google Earth (GE) has been enriched with 3D models of sites around the earth. For this reason, it seemed worthwhile to test the potential offered by GE for extracting a data set that replaces the UAV function and completes the aerial building data set, using screen images of high-resolution 3D models. Users can take unlimited "aerial photos" of a scene while flying around in GE at any viewing angle and altitude. The challenge is to verify the metric reliability of the SfM model carried out with an integrated data set (one from street level and one from GE) aimed at replacing UAV use in an urban context. This model is called the integrated GE SfM model (i-GESfM). In this paper, a case study will be presented: the Cathedral of Palermo.

  10. Integrated Data Collection Analysis (IDCA) Program - RDX Standard Data Set 2

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Air Force Research Lab. (AFRL), Tyndall Air Force Base, FL (United States); Shelley, Timothy J. [Applied Research Associates, Tyndall Air Force Base, FL (United States); Reyes, Jose A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-02-20

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results of impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard, from the second round of testing in the Proficiency Test. Compared with the first round (Set 1), this RDX testing (Set 2) was found to have about the same impact sensitivity, greater BAM friction sensitivity, lower ABL friction sensitivity, similar ESD sensitivity, and the same DSC sensitivity.

  11. Integrated Berth Allocation and Quay Crane Assignment Problem: Set partitioning models and computational results

    DEFF Research Database (Denmark)

    Iris, Cagatay; Pacino, Dario; Røpke, Stefan

    2015-01-01

    Most of the operational problems in container terminals are strongly interconnected. In this paper, we study the integrated Berth Allocation and Quay Crane Assignment Problem in seaport container terminals. We extend the current state of the art by proposing novel set partitioning models. To improve the performance of the set partitioning formulations, a number of variable reduction techniques are proposed. Furthermore, we analyze the effects of different discretization schemes and the impact of using a time-variant/invariant quay crane allocation policy. Computational experiments show
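    The set partitioning structure mentioned in this abstract can be illustrated with a toy instance: each "column" is one feasible (vessel, berth, time-window) assignment with a cost, every vessel must be covered by exactly one chosen column, and chosen columns must not overlap in time on the same berth. The data and brute-force search below are purely hypothetical illustrations, not the models or instances from the paper:

```python
from itertools import combinations

# Toy set partitioning instance for berth allocation (hypothetical data).
# Each column: (vessel, berth, (start, end), cost).
columns = [
    ("V1", "B1", (0, 4), 10),
    ("V1", "B2", (0, 5), 12),
    ("V2", "B1", (4, 7), 8),
    ("V2", "B2", (0, 3), 9),
    ("V3", "B1", (0, 3), 7),
    ("V3", "B2", (5, 8), 11),
]

def feasible(selection):
    """Exact cover of vessels, with no time overlap on any berth."""
    vessels = sorted(c[0] for c in selection)
    if vessels != ["V1", "V2", "V3"]:
        return False
    for i, a in enumerate(selection):
        for b in selection[i + 1:]:
            same_berth = a[1] == b[1]
            overlap = not (a[2][1] <= b[2][0] or b[2][1] <= a[2][0])
            if same_berth and overlap:
                return False
    return True

def solve(columns):
    """Brute-force over all 3-column selections; real models use MIP solvers."""
    best, best_cost = None, float("inf")
    for sel in combinations(columns, 3):
        if feasible(sel):
            cost = sum(c[3] for c in sel)
            if cost < best_cost:
                best, best_cost = sel, cost
    return best, best_cost

best, cost = solve(columns)
print(cost, [(c[0], c[1]) for c in best])
```

    In practice such models are solved by column generation or branch-and-price rather than enumeration; the toy above only shows the exact-cover constraint that defines a set partitioning formulation.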

  12. Follow up: Compound data sets and software tools for chemoinformatics and medicinal chemistry applications: update and data transfer

    Science.gov (United States)

    Hu, Ye; Bajorath, Jürgen

    2014-01-01

    In 2012, we reported 30 compound data sets and/or programs developed in our laboratory in a data article and made them freely available to the scientific community to support chemoinformatics and computational medicinal chemistry applications. These data sets and computational tools were provided for download from our website. Since publication of this data article, we have generated 13 new data sets with which we further extend our collection of publicly available data and tools. Due to changes in web servers and website architectures, data accessibility has recently been limited at times. Therefore, we have also transferred our data sets and tools to a public repository to ensure full and stable accessibility. To aid in data selection, we have classified the data sets according to scientific subject areas. Herein, we describe new data sets, introduce the data organization scheme, summarize the database content and provide detailed access information in ZENODO (doi: 10.5281/zenodo.8451 and doi:10.5281/zenodo.8455). PMID:25520777

  13. An approach and a tool for setting sustainable energy retrofitting strategies referring to the 2010 EP

    Directory of Open Access Journals (Sweden)

    Charlot-Valdieu, C.

    2011-10-01

    The 2010 EPBD calls for an economic and social analysis in order to preserve social equity and to promote innovation and building productivity. This is possible with a life-cycle energy cost (LCEC) analysis, such as the SEC (Sustainable Energy Cost) model, whose bottom-up approach begins with a building typology that includes inhabitants. The analysis of representative buildings then covers the identification of a technico-economic optimum and energy retrofitting scenarios for each retrofitting programme, followed by extrapolation to the whole building stock. This extrapolation makes it possible to set up the strategy and to identify the means needed to reach the objectives. SEC is a decision-aid tool for optimising sustainable energy retrofitting strategies for buildings at territorial and patrimonial scales, within a sustainable development approach towards Factor 4. Versions of the SEC model are now available for housing and for tertiary buildings.

    The 2010 European directive on the energy performance of buildings requires an economic and social analysis aimed at preserving social equity, promoting innovation and strengthening productivity in construction. This is possible through an extended life-cycle cost analysis, in particular with the SEC model. The bottom-up analysis performed with SEC is based on a building/user typology and on the analysis of representative buildings: identification of the technico-economic optimum and development of scenarios before extrapolating to the whole building stock. SEC is a decision-aid tool for developing territorial or patrimonial energy retrofitting strategies. Several versions of the model exist: for residential buildings (single-family and multi-family, public and private) and for tertiary buildings.

  14. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    Directory of Open Access Journals (Sweden)

    Huu-Tho Nguyen

    Globalization of business and competition in manufacturing have forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision based on imprecise and vague information, and it plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach to decision-making in machine tool selection. This paper focuses on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic preference relation is integrated into the AHP to handle imprecise and vague information and to simplify data collection for the pair-wise comparison matrix of the AHP, which determines the weights of the attributes. The output of the fuzzy AHP is fed into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated by a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool providing effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.
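    The COPRAS ranking stage described in this abstract can be sketched in its crisp form. The decision matrix, criteria and weights below are hypothetical, and the crisp formulas stand in for the paper's fuzzy variant; in the integrated approach the weights would come from the (fuzzy) AHP step:

```python
import numpy as np

# Crisp COPRAS sketch (hypothetical machine-tool data, not from the paper).
X = np.array([
    [30.0, 7.0, 5.0],   # alternative A1: cost, precision, flexibility
    [25.0, 6.0, 7.0],   # alternative A2
    [35.0, 9.0, 6.0],   # alternative A3
])
w = np.array([0.5, 0.3, 0.2])            # criteria weights (e.g. from AHP)
benefit = np.array([False, True, True])  # first criterion is cost-type

D = w * (X / X.sum(axis=0))              # sum-normalize columns, then weight
S_plus = D[:, benefit].sum(axis=1)       # weighted sums over benefit criteria
S_minus = D[:, ~benefit].sum(axis=1)     # weighted sums over cost criteria

# Relative significance of each alternative: higher is better.
Q = S_plus + S_minus.sum() / (S_minus * (1.0 / S_minus).sum())
utility = 100.0 * Q / Q.max()            # utility degree, best alternative = 100
ranking = np.argsort(-Q)                 # indices from best to worst
print(ranking, np.round(utility, 1))
```

    On this toy data A2 wins because its low cost outweighs its modest benefit scores; a fuzzy COPRAS would carry triangular fuzzy numbers through the same steps before defuzzifying.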

  15. NEON's Mobile Deployment Platform: A research tool for integrating ecological processes across scales

    Science.gov (United States)

    Sanclements, M.

    2016-12-01

    Here we provide an update on the construction of the five NEON Mobile Deployment Platforms (MDPs), as well as a description of the infrastructure and sensors that will be available to researchers in the near future. Additionally, we include information (i.e., timelines and procedures) on requesting MDPs for PI-led projects. The MDPs will provide the means to observe stochastic or spatially important events, gradients, or quantities that cannot be reliably observed using fixed-location sampling (e.g., fires and floods). Due to the transient temporal and spatial nature of such events, the MDPs are designed to accommodate rapid deployment for periods of up to one year. Broadly, the MDPs comprise infrastructure and instrumentation capable of functioning individually or in conjunction with one another to support observations of ecological change, as well as education, training and outreach. More specifically, the MDPs include the capability to make tower-based measurements of ecosystem exchange, radiation, and precipitation in conjunction with baseline soils data such as CO2 flux and soil temperature and moisture. An aquatics module is also available with the MDP to facilitate research integrating terrestrial and aquatic processes. Ultimately, the NEON MDPs provide a tool for linking PI-led research to the continental-scale data sets collected by NEON.

  16. BACHSCORE. A tool for evaluating efficiently and reliably the quality of large sets of protein structures

    Science.gov (United States)

    Sarti, E.; Zamuner, S.; Cossio, P.; Laio, A.; Seno, F.; Trovato, A.

    2013-12-01

    In protein structure prediction it is of crucial importance, especially at the refinement stage, to score large sets of models efficiently by selecting the ones that are closest to the native state. We here present a new computational tool, BACHSCORE, that allows its users to rank different structural models of the same protein according to their quality, evaluated using the BACH++ (Bayesian Analysis Conformation Hunt) scoring function. The original BACH statistical potential was already shown to discriminate with very good reliability the protein native state in large sets of misfolded models of the same protein. BACH++ features a novel upgrade in the solvation potential of the scoring function, now computed by adapting the LCPO (Linear Combination of Pairwise Orbitals) algorithm. This change further enhances the already good performance of the scoring function. BACHSCORE can be accessed directly through the web server: bachserver.pd.infn.it.
    Catalogue identifier: AEQD_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEQD_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License version 3. No. of lines in distributed program, including test data, etc.: 130159. No. of bytes in distributed program, including test data, etc.: 24 687 455. Distribution format: tar.gz. Programming language: C++. Computer: any computer capable of running an executable produced by a g++ compiler (version 4.6.3). Operating system: Linux, Unix. RAM: 1 073 741 824 bytes. Classification: 3.
    Nature of problem: evaluate the quality of a protein structural model, taking into account the possible "a priori" knowledge of a reference primary sequence that may differ from the amino-acid sequence of the model; the native protein structure should be recognized as the best model.
    Solution method: the contact potential scores the occurrence of any given type of residue pair in 5 possible

  17. Integration at the round table: marine spatial planning in multi-stakeholder settings.

    Directory of Open Access Journals (Sweden)

    Erik Olsen

    Marine spatial planning (MSP) is often considered as a pragmatic approach to implement an ecosystem based management in order to manage marine space in a sustainable way. This requires the involvement of multiple actors and stakeholders at various governmental and societal levels. Several factors affect how well the integrated management of marine waters will be achieved, such as different governance settings (division of power between central and local governments), economic activities (and related priorities), external drivers, spatial scales, incentives and objectives, varying approaches to legislation and political will. We compared MSP in Belgium, Norway and the US to illustrate how the integration of stakeholders and governmental levels differs among these countries along the factors mentioned above. Horizontal integration (between sectors) is successful in all three countries, achieved through the use of neutral 'round-table' meeting places for all actors. Vertical integration between government levels varies, with Belgium and Norway having achieved full integration while the US lacks integration of the legislature due to sharp disagreements among stakeholders and unsuccessful partisan leadership. Success factors include political will and leadership, process transparency and stakeholder participation, and should be considered in all MSP development processes.

  18. Integration at the Round Table: Marine Spatial Planning in Multi-Stakeholder Settings

    Science.gov (United States)

    Olsen, Erik; Fluharty, David; Hoel, Alf Håkon; Hostens, Kristian; Maes, Frank; Pecceu, Ellen

    2014-01-01

    Marine spatial planning (MSP) is often considered a pragmatic approach to implementing ecosystem-based management in order to manage marine space in a sustainable way. This requires the involvement of multiple actors and stakeholders at various governmental and societal levels. Several factors affect how well the integrated management of marine waters will be achieved, such as different governance settings (division of power between central and local governments), economic activities (and related priorities), external drivers, spatial scales, incentives and objectives, varying approaches to legislation, and political will. We compared MSP in Belgium, Norway and the US to illustrate how the integration of stakeholders and governmental levels differs among these countries along the factors mentioned above. Horizontal integration (between sectors) is successful in all three countries, achieved through the use of neutral ‘round-table’ meeting places for all actors. Vertical integration between government levels varies, with Belgium and Norway having achieved full integration while the US lacks integration of the legislature due to sharp disagreements among stakeholders and unsuccessful partisan leadership. Success factors include political will and leadership, process transparency and stakeholder participation, and should be considered in all MSP development processes. PMID:25299595

  19. Integration at the round table: marine spatial planning in multi-stakeholder settings.

    Science.gov (United States)

    Olsen, Erik; Fluharty, David; Hoel, Alf Håkon; Hostens, Kristian; Maes, Frank; Pecceu, Ellen

    2014-01-01

    Marine spatial planning (MSP) is often considered a pragmatic approach to implementing ecosystem-based management in order to manage marine space in a sustainable way. This requires the involvement of multiple actors and stakeholders at various governmental and societal levels. Several factors affect how well the integrated management of marine waters will be achieved, such as different governance settings (division of power between central and local governments), economic activities (and related priorities), external drivers, spatial scales, incentives and objectives, varying approaches to legislation, and political will. We compared MSP in Belgium, Norway and the US to illustrate how the integration of stakeholders and governmental levels differs among these countries along the factors mentioned above. Horizontal integration (between sectors) is successful in all three countries, achieved through the use of neutral 'round-table' meeting places for all actors. Vertical integration between government levels varies, with Belgium and Norway having achieved full integration while the US lacks integration of the legislature due to sharp disagreements among stakeholders and unsuccessful partisan leadership. Success factors include political will and leadership, process transparency and stakeholder participation, and should be considered in all MSP development processes.

  20. Development and initial feasibility of an organizational measure of behavioral health integration in medical care settings.

    Science.gov (United States)

    McGovern, Mark P; Urada, Darren; Lambert-Harris, Chantal; Sullivan, Steven T; Mazade, Noel A

    2012-12-01

    With the advent of health care reform, models are sought to integrate behavioral health and routine medical care services. Historically, the behavioral health specialty has not itself been integrated, but instead bifurcated into substance use and mental health across treatment systems, care providers and even research. With the present opportunity to transform the health care delivery system, it is incumbent upon policymakers, researchers and clinicians to avoid repeating this historical error, and to provide integrated behavioral health services in medical contexts. An organizational measure designed to assess this capacity is described: the Dual Diagnosis Capability in Health Care Settings (DDCHCS). The DDCHCS was used to assess a sample of federally-qualified health centers (N=13) on the degree of behavioral health integration. The measure was found to be feasible and sensitive in detecting variation in integrated behavioral health services capacity. Three of the 13 agencies were dual diagnosis capable, with significant variation in the DDCHCS dimensions measuring staffing, treatment practices and program milieu. In general, mental health services were more integrated than substance use services. Future research should consider a revised version of the measure, a larger and more representative sample, and linking organizational capacity with patient outcomes.

  1. Laccase-13 Regulates Seed Setting Rate by Affecting Hydrogen Peroxide Dynamics and Mitochondrial Integrity in Rice

    Directory of Open Access Journals (Sweden)

    Yang Yu

    2017-07-01

    Full Text Available Seed setting rate is one of the most important components of rice grain yield. To date, only a few genes regulating setting rate have been identified in plants. In this study, we showed that laccase-13 (OsLAC13), a member of the laccase family of genes known for their roles in modulating the phenylpropanoid pathway and secondary lignification of the cell wall, exerts a regulatory function on rice seed setting rate. OsLAC13 is expressed in anthers and promotes hydrogen peroxide production both in vitro and in the filaments and anther connectives. Knock-out of OsLAC13 significantly increased the seed setting rate, while overexpression of this gene induced mitochondrial damage and suppressed sugar transportation in anthers, which in turn affected the seed setting rate. OsLAC13 also induced H2O2 production and mitochondrial damage in root tip cells, which caused a lethal phenotype. We also showed that high abundance of OsmiR397, the suppressor of OsLAC13 mRNA, increased the seed setting rate of rice plants and restrained H2O2 accumulation in roots during oxidative stress. Our results suggest a novel regulatory role of the OsLAC13 gene in regulating seed setting rate by affecting H2O2 dynamics and mitochondrial integrity in rice.

  2. Integrating cultural community psychology: activity settings and the shared meanings of intersubjectivity.

    Science.gov (United States)

    O'Donnell, Clifford R; Tharp, Roland G

    2012-03-01

    Cultural and community psychology share a common emphasis on context, yet their leading journals rarely cite each other's articles. Greater integration of the concepts of culture and community within and across their disciplines would enrich and facilitate the viability of cultural community psychology. The contextual theory of activity settings is proposed as one means to integrate the concepts of culture and community in cultural community psychology. Through shared activities, participants develop common experiences that affect their psychological being, including their cognitions, emotions, and behavioral development. The psychological result of these experiences is intersubjectivity. Culture is defined as the shared meanings that people develop through their common historic, linguistic, social, economic, and political experiences. The shared meanings of culture arise through the intersubjectivity developed in activity settings. Cultural community psychology presents formidable epistemological challenges, but overcoming these challenges could contribute to the transformation and advancement of community psychology.

  3. Early wound infection identification using the WIRE tool in community health care settings: An audit report.

    Science.gov (United States)

    Siaw-Sakyi, Vincent

    2017-12-01

    Wound infection is proving to be a challenge for health care professionals. The complications and costs associated with wound infection are immense and can lead to death in extreme cases. Current management of wound infection is largely subjective and relies on the knowledge of the health care professional to identify and initiate treatment. In response, we have developed an infection prediction and assessment tool. The Wound Infection Risk-Assessment and Evaluation (WIRE) tool and its management strategy aim to bring objectivity to infection prediction, assessment and management. A local audit indicated a high infection prediction rate. More work is being done to improve its effectiveness.

  4. Using registries to integrate bioinformatics tools and services into workbench environments

    DEFF Research Database (Denmark)

    Ménager, Hervé; Kalaš, Matúš; Rapacki, Kristoffer

    2016-01-01

    The diversity and complexity of bioinformatics resources presents significant challenges to their localisation, deployment and use, creating a need for reliable systems that address these issues. Meanwhile, users demand increasingly usable and integrated ways to access and analyse data, especially......, a software component that will ease the integration of bioinformatics resources in a workbench environment, using their description provided by the existing ELIXIR Tools and Data Services Registry....

  5. Integration at the Round Table: Marine Spatial Planning in Multi-Stakeholder Settings

    OpenAIRE

    Olsen, Erik; Fluharty, David; Hoel, Alf Håkon; Hostens, Kristian; Maes, Frank; Pecceu, Ellen

    2014-01-01

    Marine spatial planning (MSP) is often considered a pragmatic approach to implementing ecosystem-based management in order to manage marine space in a sustainable way. This requires the involvement of multiple actors and stakeholders at various governmental and societal levels. Several factors affect how well the integrated management of marine waters will be achieved, such as different governance settings (division of power between central and local governments), economic activities (and ...

  6. Set up of a method for the adjustment of resonance parameters on integral experiments

    International Nuclear Information System (INIS)

    Blaise, P.

    1996-01-01

    Resonance parameters for actinides play a significant role in the neutronic characteristics of all reactor types. All the major integral parameters strongly depend on the nuclear data of the isotopes in the resonance-energy regions. The author sets up a method for the adjustment of resonance parameters that takes into account self-shielding effects and restricts the cross-section deconvolution problem to a limited energy region. (N.T.)

  7. From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool

    Science.gov (United States)

    Scheibler, Thorsten; Leymann, Frank

    One of the predominant problems IT companies face today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns are gaining importance for non-technical business users as a way to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way. One can therefore use these patterns to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually; they are not artefacts that can be executed immediately. This paper presents a tool supporting a method by which EAI patterns can be used to automatically generate executable artefacts for various target platforms using a model-driven development approach, hence turning patterns into something executable. We introduce a continuous tool chain beginning at the design phase and ending with the execution of an integration solution in a completely automatic manner. For evaluation purposes we present a scenario demonstrating how the tool is used for modelling and actually executing an integration scenario.

  8. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  9. On set-valued functionals: Multivariate risk measures and Aumann integrals

    Science.gov (United States)

    Ararat, Cagin

    In this dissertation, multivariate risk measures for random vectors and Aumann integrals of set-valued functions are studied. Both are set-valued functionals with values in a complete lattice of subsets of Rm. Multivariate risk measures are considered in a general d-asset financial market with trading opportunities in discrete time. Specifically, the following features of the market are incorporated in the evaluation of multivariate risk: convex transaction costs modeled by solvency regions, intermediate trading constraints modeled by convex random sets, and the requirement of liquidation into the first m ≤ d of the assets. It is assumed that the investor has a "pure" multivariate risk measure R on the space of m-dimensional random vectors which represents her risk attitude towards the assets but does not take into account the frictions of the market. Then, the investor with a d-dimensional position minimizes the set-valued functional R over all m-dimensional positions that she can reach by trading in the market subject to the frictions described above. The resulting functional Rmar on the space of d-dimensional random vectors is another multivariate risk measure, called the market-extension of R. A dual representation for Rmar that decomposes the effects of R and the frictions of the market is proved. Next, multivariate risk measures are studied in a utility-based framework. It is assumed that the investor has a complete risk preference towards each individual asset, which can be represented by a von Neumann-Morgenstern utility function. Then, an incomplete preference is considered for multivariate positions which is represented by the vector of the individual utility functions. Under this structure, multivariate shortfall and divergence risk measures are defined as the optimal values of set minimization problems. The dual relationship between the two classes of multivariate risk measures is constructed via a recent Lagrange duality for set optimization. In
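
    As an illustrative sketch (the notation here is assumed, not taken from the dissertation), the market-extension described above can be written as a set minimization over reachable positions:

    ```latex
    R^{\mathrm{mar}}(X) \;=\; \inf_{Y \in \mathcal{A}(X)} R(Y),
    ```

    where $\mathcal{A}(X)$ denotes the set of m-dimensional positions reachable from the d-dimensional position $X$ by trading subject to the market frictions, and the infimum is taken in the complete lattice of subsets of $\mathbb{R}^m$ in which $R$ takes its values.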

  10. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    Science.gov (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations in the construction of the corresponding reference sequence databases are also common. In addition, different tools give good results on different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms across different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available on the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics .
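
    The co-occurrence principle the abstract describes can be sketched as follows. This is an illustrative toy, not MetaMeta's actual implementation: taxa reported by at least `min_support` tools are kept, their abundances averaged, and the merged profile renormalised (all names and thresholds here are invented for the example).

    ```python
    # Toy sketch of co-occurrence-based profile integration (not MetaMeta code).
    def merge_profiles(profiles, min_support=2):
        """profiles: list of dicts mapping taxon name -> relative abundance."""
        support, totals = {}, {}
        for profile in profiles:
            for taxon, abundance in profile.items():
                support[taxon] = support.get(taxon, 0) + 1
                totals[taxon] = totals.get(taxon, 0.0) + abundance
        # keep only taxa reported by enough tools; average their abundances
        merged = {
            taxon: totals[taxon] / support[taxon]
            for taxon in support
            if support[taxon] >= min_support
        }
        # renormalise so the merged profile sums to 1
        norm = sum(merged.values())
        return {t: a / norm for t, a in merged.items()} if norm else {}

    # Hypothetical profiles from three different profiling tools:
    tool_a = {"E. coli": 0.6, "B. subtilis": 0.4}
    tool_b = {"E. coli": 0.5, "S. aureus": 0.5}
    tool_c = {"E. coli": 0.7, "B. subtilis": 0.3}
    merged = merge_profiles([tool_a, tool_b, tool_c])
    # "S. aureus" is dropped: only one tool supports it
    ```

    A false positive reported by a single tool is filtered out, while organisms seen by several tools survive with a consensus abundance, which is the sensitivity/reliability trade-off the abstract reports.
    
    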

  11. The youth sports club as a health-promoting setting: An integrative review of research

    Science.gov (United States)

    Quennerstedt, Mikael; Eriksson, Charli

    2013-01-01

    Aims: The aims of this review are to compile and identify key issues in international research about youth sports clubs as health-promoting settings, and then to discuss the results of the review in terms of a framework for the youth sports club as a health-promoting setting. Methods: The framework guiding this review of research is the health-promoting settings approach introduced by the World Health Organization (WHO). The method used is the integrated review. Inclusion criteria were, first, that the studies concerned sports clubs for young people, not professional clubs; second, that it be a question of voluntary participation in some sort of ongoing organized athletics outside of the regular school curricula; third, that the studies consider issues about youth sports clubs in terms of health-promoting settings as described by WHO. The final sample for the review consists of 44 publications. Results: The review shows that youth sports clubs have plentiful opportunities to be or become health-promoting settings; however this is not something that happens automatically. To do so, the club needs to include an emphasis on certain important elements in its strategies and daily practices. The youth sports club needs to be a supportive and healthy environment with activities designed for and adapted to the specific age-group or stage of development of the youth. Conclusions: To become a health-promoting setting, a youth sports club needs to take a comprehensive approach to its activities, aims, and purposes. PMID:23349167

  12. epsilon : A tool to find a canonical basis of master integrals

    Science.gov (United States)

    Prausa, Mario

    2017-10-01

    In 2013, Henn proposed a special basis for a certain class of master integrals, which are expressible in terms of iterated integrals. In this basis, the master integrals obey a differential equation whose right hand side is proportional to ɛ in d = 4 - 2 ɛ space-time dimensions. An algorithmic approach to finding such a basis was devised by Lee. We present the tool epsilon, an efficient implementation of Lee's algorithm based on the Fermat computer algebra system as computational back end.
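
    For reference, the canonical (ɛ-)form described above can be stated compactly; this is the standard form from the literature, not a formula quoted from the epsilon manual:

    ```latex
    \frac{\partial}{\partial x}\,\vec{f}(x,\epsilon) \;=\; \epsilon\, A(x)\, \vec{f}(x,\epsilon),
    \qquad d = 4 - 2\epsilon,
    ```

    where the matrix $A(x)$ is independent of $\epsilon$, so that $\vec{f}$ can be solved order by order in $\epsilon$ in terms of iterated integrals. The tool searches for the basis transformation that brings a given system of master integrals into this form.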

  13. Developing Indicators for a Classroom Observation Tool on Pedagogy and Technology Integration: A Delphi Study

    Science.gov (United States)

    Elmendorf, Douglas C.; Song, Liyan

    2015-01-01

    Rapid advances in technology and increased access to technology tools have created new instructional demands and expectations on teachers. Due to the ubiquitous presence of technology in K-12 schools, teachers are being observed on both their pedagogical and technology integration practices. Applying the technological pedagogical and content…

  14. Integrating Wikis as Educational Tools for the Development of a Community of Inquiry

    Science.gov (United States)

    Eteokleous, Nikleia; Ktoridou, Despo; Orphanou, Maria

    2014-01-01

    This article describes a study that attempted to evaluate the integration of wikis as an educational tool in successfully achieving the learning objectives of a fifth-grade linguistics and literature course. A mixed-method approach was employed--data were collected via questionnaires, reflective journals, observations, and interviews. The results…

  15. Six sigma tools in integrating internal operations of a retail pharmacy: a case study.

    Science.gov (United States)

    Kumar, Sameer; Kwong, Anthony M

    2011-01-01

    This study was initiated to integrate information and enterprise-wide healthcare delivery system issues, specifically within an inpatient retail pharmacy operation in a U.S. community hospital. Six Sigma tools were used to examine the effects on an inpatient retail pharmacy service process. The tools used include service blueprints, cause-and-effect diagrams, gap analysis derived from customer and employee surveys, and mistake proofing; these were applied in various business situations, and the results were analyzed to identify and propose process improvements and integration. The research indicates that the Six Sigma tools in this discussion are very applicable and quite effective in helping to streamline and integrate the pharmacy process flow. Additionally, gap analysis derived from two different surveys was used to estimate the primary areas of focus to increase customer and employee satisfaction. The results of this analysis were useful in initiating discussions of how to effectively narrow these service gaps. This retail pharmaceutical service study serves as a framework for the process that should occur for successful process improvement tool evaluation and implementation. Pharmaceutical service operations in the U.S. that use this integration framework must tailor it to their individual situations to maximize their chances for success.
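
    The gap analysis mentioned above can be sketched numerically. The following is a hedged illustration in the spirit of SERVQUAL-style gap analysis, not the study's actual method or data; the dimensions and scores are invented. A service gap per dimension is mean perception minus mean expectation, and the most negative gaps mark where to focus first.

    ```python
    # Illustrative gap analysis (invented survey data, 1-7 Likert scores).
    def service_gaps(expectations, perceptions):
        """Both args: dict mapping service dimension -> list of survey scores."""
        gaps = {}
        for dim in expectations:
            e = sum(expectations[dim]) / len(expectations[dim])
            p = sum(perceptions[dim]) / len(perceptions[dim])
            gaps[dim] = round(p - e, 2)  # negative gap = service falls short
        # worst (most negative) gap first: the priority for improvement
        return sorted(gaps.items(), key=lambda kv: kv[1])

    expectations = {"timeliness": [7, 6, 7], "accuracy": [7, 7, 7]}
    perceptions = {"timeliness": [4, 5, 4], "accuracy": [6, 7, 6]}
    ranked = service_gaps(expectations, perceptions)
    # "timeliness" ranks first here: its gap is the most negative
    ```

    Ranking dimensions by gap size is what lets a team decide, from two surveys, which service gap to narrow first.
    
    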

  16. The Integration of Digital Tools during Strategic and Interactive Writing Instruction

    Science.gov (United States)

    Kilpatrick, Jennifer Renée; Saulsburry, Rachel; Dostal, Hannah M.; Wolbers, Kimberly A.; Graham, Steve

    2014-01-01

    The purpose of this chapter is to gain insight from the ways a group of elementary teachers of the deaf and hard of hearing chose to integrate digital tools into evidence-based writing instruction and the ways these technologies were used to support student learning. After professional development that exposed these teachers to twelve new digital…

  17. Tool Integration: Experiences and Issues in Using XMI and Component Technology

    DEFF Research Database (Denmark)

    Damm, Christian Heide; Hansen, Klaus Marius; Thomsen, Michael

    2000-01-01

    of conflicting data models, and provide an architecture for doing so, based on component technology and XML Metadata Interchange. As an example, we discuss the implementation of an electronic whiteboard tool, Knight, which adds support for creative and collaborative object-oriented modeling to existing Computer-Aided Software Engineering through integration using our proposed architecture.

  18. Integrating Social Networking Tools into ESL Writing Classroom: Strengths and Weaknesses

    Science.gov (United States)

    Yunus, Melor Md; Salehi, Hadi; Chenzi, Chen

    2012-01-01

    With the rapid development of the world and of technology, English learning has become more important. Teachers frequently use teacher-centered pedagogy, which leads to a lack of interaction with students. This paper aims to investigate the advantages and disadvantages of integrating social networking tools into the ESL writing classroom and discuss the ways to…

  19. Collaborative Digital Games as Mediation Tool to Foster Intercultural Integration in Primary Dutch Schools

    NARCIS (Netherlands)

    A. Paz Alencar (Amanda); T. de la Hera Conde-Pumpido (Teresa)

    2015-01-01

    In the Netherlands, the growing presence of immigrant children in schools has fueled scholarly interest in and concerns about examining the process of integration in school environments. The use of digital games has been found to be an effective tool to reinforce teaching/learning practices.

  20. From data to the decision: A software architecture to integrate predictive modelling in clinical settings.

    Science.gov (United States)

    Martinez-Millana, A; Fernandez-Llatas, C; Sacchi, L; Segagni, D; Guillen, S; Bellazzi, R; Traver, V

    2015-08-01

    The application of statistics and mathematics over large amounts of data is providing healthcare systems with new tools for screening and managing multiple diseases. Nonetheless, these tools have many technical and clinical limitations, as they are based on datasets with specific characteristics. This proposition paper describes a novel architecture focused on providing a validation framework for discrimination and prediction models in the screening of Type 2 diabetes. To that end, the architecture has been designed to gather different data sources under a common data structure and, furthermore, to be controlled by a centralized component (Orchestrator) in charge of directing the interaction flows among data sources, models and graphical user interfaces. This innovative approach aims to overcome the data-dependency of the models by providing a validation framework for the models as they are used within clinical settings.
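
    The Orchestrator idea described above is essentially a mediator between data sources, models, and user interfaces. The sketch below is a minimal, hypothetical illustration of that pattern (the paper publishes no code; all names, data fields, and the screening rule are invented), showing how a central component directs the fetch-then-predict flow instead of letting components call each other directly.

    ```python
    # Hypothetical mediator sketch of a centralized Orchestrator component.
    class Orchestrator:
        def __init__(self):
            self._sources = {}  # name -> callable(patient_id) -> patient data
            self._models = {}   # name -> callable(data) -> risk score

        def register_source(self, name, fetch):
            self._sources[name] = fetch

        def register_model(self, name, predict):
            self._models[name] = predict

        def screen(self, source_name, model_name, patient_id):
            # the Orchestrator directs the interaction flow: fetch, then predict
            data = self._sources[source_name](patient_id)
            return self._models[model_name](data)

    orchestrator = Orchestrator()
    # Invented data source and screening rule, for illustration only:
    orchestrator.register_source("ehr", lambda pid: {"age": 54, "bmi": 31.0})
    orchestrator.register_model(
        "t2dm_screen", lambda d: 1.0 if d["bmi"] >= 30 and d["age"] > 45 else 0.0
    )
    risk = orchestrator.screen("ehr", "t2dm_screen", patient_id="p-001")
    ```

    Because models only ever see the common data structure handed to them by the Orchestrator, a new data source or a new screening model can be registered and validated without changing the others, which is the data-independence the architecture aims for.
    
    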

  1. Selection of the optimal set of revenue management tools in hotels

    OpenAIRE

    Korzh, Nataliia; Onyshchuk, Natalia

    2017-01-01

    The object of research is the scientific category of «revenue management» and its tools, which, with the growth in the number of on-line sales channels for hotel services, become decisive in the struggle for survival. The existence of a large number of profit management tools associated with the on-line booking regime works as Small Data and gives quite scattered information about the state of the market. One of the most problematic areas is the formation of perspective analytics using existing too...

  2. An Integrated Development Tool for a safety application using FBD language

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jun; Lee, Jang Soo; Lee, Dong Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    With the digitalization of nuclear instrumentation and control systems, the application programs responsible for the safety functions of nuclear I and C systems shall ensure the robustness of the safety function through development, testing, and validation roles over the life cycle of software development. The importance of software in nuclear systems increases continuously. The integrated engineering tools used to develop, test, and validate safety application programs must handle increasingly complex parts among the many components within nuclear digital I and C systems. This paper introduces the integrated engineering tool (SafeCASE-PLC) developed by our project. SafeCASE-PLC is a software engineering tool to develop, test, and validate nuclear application programs executed in an automatic controller

  3. Integrated waste management and the tool of life cycle inventory : a route to sustainable waste management

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, F.R.; White, P.R. [Procter and Gamble Newcastle Technical Centre, Newcastle (United Kingdom). Corporate Sustainable Development

    2000-07-01

    An overall approach to municipal waste management which integrates sustainable development principles was discussed. The three elements of sustainability which have to be balanced are environmental effectiveness, economic affordability and social acceptability. An integrated waste management (IWM) system considers different treatment options and deals with the entire waste stream. Life cycle inventory (LCI) and life cycle assessment (LCA) are used to determine the environmental burdens associated with IWM systems. LCIs for waste management are currently available for use in Europe, the United States, Canada and elsewhere. LCI is being used by waste management companies to assess the environmental attributes of future contract tenders. The models are used as benchmarking tools to assess the current environmental profile of a waste management system. They are also a comparative planning and communication tool. The authors are currently looking into publishing, at a future date, the experience of users of this LCI environmental management tool. 12 refs., 3 figs.

  4. Tools of integration of innovation-oriented machine-building enterprises in industrial park environment

    Directory of Open Access Journals (Sweden)

    К.О. Boiarynova

    2017-08-01

    Full Text Available The research is devoted to the development of tools for the integration of innovation-oriented mechanical engineering enterprises into the environment of an industrial park as functional economic systems, capable, on the basis of their own development, of supporting the development of resident enterprises. The article analyzes the opportunities for the development of mechanical engineering enterprises. The proposed mechanism for integrating mechanical engineering enterprises as functional economic systems into the industrial park environment is based on: 1) the development of programs for participation in the industrial park by the mechanical engineering enterprise as an innovation-oriented partner, which foresees the development of the enterprise itself and of the other residents; 2) the provision of high-tech equipment to resident enterprises of industrial parks; 3) the creation of subsidiary spin-out enterprises of large mechanical engineering enterprises for high-tech production in the industrial park. The author proposes a road map that sets out the procedures for integrating and operating the investigated enterprises through interaction, both within the ecosystem of the industrial park and in the general ecosystem of their operation, together with tools for providing economic functionality through economic and organizational measures at the preventive, partner and resident phases of integration. These tools allow innovation-oriented mechanical engineering enterprises to integrate into territorial structures such as industrial parks, which in turn allows them to fulfil their purpose in the development of the real sector of the economy.

  5. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    Science.gov (United States)

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  6. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  7. Processing: A Python Framework for the Seamless Integration of Geoprocessing Tools in QGIS

    Directory of Open Access Journals (Sweden)

    Anita Graser

    2015-10-01

    Processing is an object-oriented Python framework for the popular open source Geographic Information System QGIS, which provides a seamless integration of geoprocessing tools from a variety of different software libraries. In this paper, we present the development history, software architecture and features of the Processing framework, which make it a versatile tool for the development of geoprocessing algorithms and workflows, as well as an efficient integration platform for algorithms from different sources. Using real-world application examples, we furthermore illustrate how the Processing architecture enables typical geoprocessing use cases in research and development, such as automating and documenting workflows, combining algorithms from different software libraries, as well as developing and integrating custom algorithms. Finally, we discuss how Processing can facilitate reproducible research and provide an outlook towards future development goals.

  8. The VI-Suite: a set of environmental analysis tools with geospatial data applications

    NARCIS (Netherlands)

    Southall, Ryan; Biljecki, F.

    2017-01-01

    Background: The VI-Suite is a free and open-source addon for the 3D content creation application Blender, developed primarily as a tool for the contextual and performative analysis of buildings. Its functionality has grown from simple, static lighting analysis to fully parametric lighting,

  9. Developing a free and easy to use digital goal setting tool for busy mums

    Directory of Open Access Journals (Sweden)

    Babs Evans

    2015-09-01

    Using data, research and the expertise of commercial and charity partners was an effective way to design a digital product to support behavioural change. By understanding the target audience from the beginning and involving them in the planning stages, the organisations were able to develop a tool the users want with a strong focus on user experience.

  10. Ssecrett and neuroTrace: Interactive visualization and analysis tools for large-scale neuroscience data sets

    KAUST Repository

    Jeong, Wonki; Beyer, Johanna; Hadwiger, Markus; Blue, Rusty; Law, Charles; Vázquez Reina, Amelio; Reid, Rollie Clay; Lichtman, Jeff W M D; Pfister, Hanspeter

    2010-01-01

    Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range between tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at reasonably interactive rates. To provide neuroscientists flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system. © 2010 IEEE.

  11. Ssecrett and neuroTrace: Interactive visualization and analysis tools for large-scale neuroscience data sets

    KAUST Repository

    Jeong, Wonki

    2010-05-01

    Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range between tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at reasonably interactive rates. To provide neuroscientists flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system. © 2010 IEEE.

  12. Developmental screening tools: feasibility of use at primary healthcare level in low- and middle-income settings.

    Science.gov (United States)

    Fischer, Vinicius Jobim; Morris, Jodi; Martines, José

    2014-06-01

    An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization to allow action to reduce impairments through the mental health Gap Action Programme (mhGAP). The study identified the feasibility of using developmental screening and monitoring tools for children aged 0-3 year(s) by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducted to identify the tools, assess their psychometric properties, and examine the feasibility of their use in low- and middle-income countries (LMICs). Key indicators to examine feasibility in LMICs were derived from a consultation with 23 international experts. We identified 426 studies, from which 14 tools used in LMICs were extracted for further examination. Three tools reported adequate psychometric properties and met most of the feasibility criteria. These three tools appear promising for use in identifying and monitoring young children with disabilities at primary healthcare level in LMICs. Further research and development are needed to optimize these tools.

  13. Integrating mHealth at point of care in low- and middle-income settings: the system perspective.

    Science.gov (United States)

    Wallis, Lee; Blessing, Paul; Dalwai, Mohammed; Shin, Sang Do

    2017-06-01

    While the field represents a wide spectrum of products and services, many aspects of mHealth have great promise within resource-poor settings: there is an extensive range of cheap, widely available tools which can be used at the point of care delivery. However, there are a number of conditions which need to be met if such solutions are to be adequately integrated into existing health systems; we consider these from regulatory, technological and user perspectives. We explore the need for an appropriate legislative and regulatory framework, to avoid 'work around' solutions, which threaten patient confidentiality (such as the extensive use of instant messaging services to deliver sensitive clinical information and seek diagnostic and management advice). In addition, we will look at other confidentiality issues such as the need for applications to remove identifiable information (such as photos) from users' devices. Integration is dependent upon multiple technological factors, and we illustrate these using examples such as products made available specifically for adoption in low- and middle-income countries. Issues such as usability of the application, signal loss, data volume utilization, need to enter passwords, and the availability of automated or in-app context-relevant clinical advice will be discussed. From a user perspective, there are three groups to consider: experts, front-line clinicians, and patients. Each will accept, to different degrees, the use of technology in care - often with cultural or regional variation - and this is central to integration and uptake. For clinicians, ease of integration into daily work flow is critical, as are familiarity and acceptability of other technology in the workplace. Front-line staff tend to work in areas with more challenges around cell phone signal coverage and data availability than 'back-end' experts, and the effect of this is discussed.

  14. Integrated Authoring Tool for Mobile Augmented Reality-Based E-Learning Applications

    Science.gov (United States)

    Lobo, Marcos Fermin; Álvarez García, Víctor Manuel; del Puerto Paule Ruiz, María

    2013-01-01

    Learning management systems are increasingly being used to complement classroom teaching and learning and in some instances even replace traditional classroom settings with online educational tools. Mobile augmented reality is an innovative trend in e-learning that is creating new opportunities for teaching and learning. This article proposes a…

  15. Self-organising maps and correlation analysis as a tool to explore patterns in excitation-emission matrix data sets and to discriminate dissolved organic matter fluorescence components.

    Science.gov (United States)

    Ejarque-Gonzalez, Elisabet; Butturini, Andrea

    2014-01-01

    Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) has been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters input EEMs and reduces their dimensionality without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
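The SOM workflow the abstract describes, clustering high-dimensional samples onto a low-dimensional grid without assumptions about the data structure, can be sketched in a few lines. Everything below is illustrative: the short three-component "EEM" vectors, the 2x2 grid and the training schedule are invented stand-ins, not the authors' actual configuration.

```python
import math
import random

random.seed(0)

# Toy stand-ins for EEM samples: in reality each sample would be a
# flattened excitation-emission intensity matrix; here we draw short
# synthetic vectors around two "fluorescence component" centres.
def make_sample(centre):
    return [c + random.gauss(0, 0.05) for c in centre]

data = ([make_sample([0.9, 0.1, 0.1]) for _ in range(30)]
        + [make_sample([0.1, 0.9, 0.1]) for _ in range(30)])

# A small 2x2 SOM grid with randomly initialised weight vectors.
grid = [(i, j) for i in range(2) for j in range(2)]
weights = {g: [random.random() for _ in range(3)] for g in grid}

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def bmu(x):
    """Best-matching unit: the grid node whose weights are closest to x."""
    return min(grid, key=lambda g: dist2(weights[g], x))

def quantization_error():
    return sum(math.sqrt(dist2(weights[bmu(x)], x)) for x in data) / len(data)

qe_before = quantization_error()

# Standard SOM training: pull the BMU and, more weakly, its grid
# neighbours toward each presented sample, with a decaying learning
# rate and neighbourhood radius.
T = 400
for t in range(T):
    lr = 0.5 * (1 - t / T)
    sigma = 1.0 * (1 - t / T) + 0.1
    x = random.choice(data)
    b = bmu(x)
    for g in grid:
        gd2 = (g[0] - b[0]) ** 2 + (g[1] - b[1]) ** 2
        h = lr * math.exp(-gd2 / (2 * sigma ** 2))
        weights[g] = [w + h * (xi - w) for w, xi in zip(weights[g], x)]

qe_after = quantization_error()
print(qe_before, qe_after)  # the map fits the data better after training
```

After training, the component planes (one weight component per grid node) could then be correlated against each other, which is the step the paper uses to discriminate fluorescence components.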

  16. The process of care in integrative health care settings - a qualitative study of US practices.

    Science.gov (United States)

    Grant, Suzanne J; Bensoussan, Alan

    2014-10-23

    There is a lack of research on the organisational operations of integrative healthcare (IHC) practices. IHC is a therapeutic strategy integrating conventional and complementary medicine in a shared context to administer individualized treatment. To better understand the process of care in IHC - the way in which patients are triaged and treatment plans are constructed - interviews were conducted with integrative health care leaders and practitioners in the US. Semi-structured interviews were conducted with a pragmatic group of fourteen leaders and practitioners from nine different IHC settings. All interviews were conducted face-to-face with the exception of one phone interview. Questions focussed on understanding the "process of care" in an integrative healthcare setting. Deductive categories were formed from the aims of the study, focusing on: organisational structure, processes of care (subcategories: patient intake, treatment and charting, use of guidelines or protocols), prevalent diseases or conditions treated, and the role of research in the organisation. The similarities and differences of the IHC entities emerged from this process. On an organisational level, conventional and CM services and therapies were co-located in all nine settings. For patients, this means there is more opportunity for 'seamless care'. Shared information systems enabled easy communication using internal messaging or email systems, and shared patient intake information. But beyond this infrastructure, alignment for integrative health care was less supported. There was no use of protocols or guidelines within any centre, and no patient monitoring mechanism beyond that which occurred within one-on-one appointments. Joint planning for a patient's treatment was typically ad hoc through informal mechanisms. Additional duties typically come at a direct financial cost to fee-for-service practitioners. 
In contrast, service delivery and the process of care within hospital inpatient services followed

  17. Multi-particle phase space integration with arbitrary set of singularities in CompHEP

    International Nuclear Information System (INIS)

    Kovalenko, D.N.; Pukhov, A.E.

    1997-01-01

    We describe an algorithm of multi-particle phase space integration for collision and decay processes realized in CompHEP package version 3.2. In the framework of this algorithm it is possible to regularize an arbitrary set of singularities caused by virtual particle propagators. The algorithm is based on the method of the recursive representation of kinematics and on the multichannel Monte Carlo approach. CompHEP package is available by WWW: http://theory.npi.msu.su/pukhov/comphep.html (orig.)
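The multichannel Monte Carlo idea mentioned in the abstract - one sampling channel per propagator-like singularity, combined into a mixture density that flattens the importance-sampling weights - can be illustrated with a one-dimensional toy integrand. The peak positions and widths below are invented for the illustration and are unrelated to CompHEP's actual kinematics.

```python
import math
import random

random.seed(1)

# Two "propagator-like" peaks on [0, 1]; (position a, width w) are
# made-up values for illustration only.
peaks = [(0.2, 0.01), (0.8, 0.02)]

def f(x):
    # integrand with two sharp singular structures
    return sum(1.0 / ((x - a) ** 2 + w ** 2) for a, w in peaks)

def bw_norm(a, w):
    # exact integral of 1/((x-a)^2 + w^2) over [0, 1]
    return (math.atan((1 - a) / w) - math.atan(-a / w)) / w

def sample_channel(a, w):
    # inverse-CDF sampling of the normalized Breit-Wigner shape on [0, 1]
    lo, hi = math.atan(-a / w), math.atan((1 - a) / w)
    t = lo + random.random() * (hi - lo)
    return a + w * math.tan(t)

# Mixture density g(x): equal-weight combination of the two channels,
# each channel flattening out one singularity of f.
norms = [bw_norm(a, w) for a, w in peaks]

def g(x):
    return sum(0.5 / (n * ((x - a) ** 2 + w ** 2))
               for (a, w), n in zip(peaks, norms))

n_events = 20000
total = 0.0
for _ in range(n_events):
    a, w = random.choice(peaks)   # pick one channel with probability 1/2
    x = sample_channel(a, w)
    total += f(x) / g(x)          # importance-sampling weight
estimate = total / n_events

exact = sum(norms)  # f integrates to the sum of the two peak integrals
print(estimate, exact)
```

Because f/g stays bounded near both peaks, the estimator converges quickly, whereas uniform sampling of this integrand would suffer from very large weight fluctuations near the singularities.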

  18. Set of CAMAC modules based on large integrated circuits for an accelerator synchronization system

    International Nuclear Information System (INIS)

    Glejbman, Eh.M.; Pilyar, N.V.

    1986-01-01

    Parameters of functional modules in the CAMAC standard developed for an accelerator synchronization system are presented. They comprise BZN-8K and BZ-8K digital delay circuits, a timing circuit and a pulse selection circuit. Each module uses three large-scale integrated circuits of the KR580VI53 programmable-timer type, circuits interfacing the system bus with the crate bus, data-recording control circuits, two peripheral storage devices, initial-state setting circuits, input and output shapers, and circuits for setting and removing blocking in the channels.

  19. Imprecision and uncertainty in information representation and processing new tools based on intuitionistic fuzzy sets and generalized nets

    CERN Document Server

    Sotirov, Sotir

    2016-01-01

    The book offers a comprehensive and timely overview of advanced mathematical tools for both uncertainty analysis and modeling of parallel processes, with a special emphasis on intuitionistic fuzzy sets and generalized nets. The different chapters, written by active researchers in their respective areas, are structured to provide a coherent picture of this interdisciplinary yet still evolving field of science. They describe key tools and give practical insights into and research perspectives on the use of Atanassov's intuitionistic fuzzy sets and logic, and generalized nets for describing and dealing with uncertainty in different areas of science, technology and business, in a single, to date unique book. Here, readers find theoretical chapters, dealing with intuitionistic fuzzy operators, membership functions and algorithms, among other topics, as well as application-oriented chapters, reporting on the implementation of methods and relevant case studies in management science, the IT industry, medicine and/or ...

  20. Integrated Data Collection Analysis (IDCA) Program — RDX Standard Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms, Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-04

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard, measured for a third and fourth time in the proficiency test and averaged with the analysis results from the first and second rounds. The results from averaging all four data sets (1, 2, 3 and 4) suggest a material with slightly more impact sensitivity, more BAM friction sensitivity, less ABL friction sensitivity, similar ESD sensitivity, and the same DSC sensitivity compared with the results from Set 1, which was previously used as the reference values for the RDX standard in IDCA analysis reports.

  1. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  2. Developmental Screening Tools: Feasibility of Use at Primary Healthcare Level in Low- and Middle-income Settings

    OpenAIRE

    Fischer, Vinicius Jobim; Morris, Jodi; Martines, José

    2014-01-01

    ABSTRACT An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization to allow action to reduce impairments through Gap Action Program on mental health. The study identified the feasibility of using the developmental screening and monitoring tools for children aged 0-3 year(s) by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducte...

  3. Structural Integrity Analysis considering Load Combinations for the Conceptual Design of the Korean HCCR TBM-set

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won; Jin, Hyung Gon; Lee, Eo Hwak; Yoon, Jae Sung; Kim, Suk Kwon [KAERI, Daejeon (Korea, Republic of); Shin, Kyu In [Gentec Tech, Daejeon (Korea, Republic of); Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    The HCCR TBM (Test Blanket Module) set consists of four TBM sub-modules, one blanket manifold (BM), a shield, and four keys, which connect the BM and the shield. It will be installed in equatorial port No. 18 of ITER, inside the vacuum vessel directly facing the plasma, and will be cooled by a high-temperature helium coolant. In addition, the safety classification of the HCCR TBM-set follows the ITER (International Thermonuclear Experimental Reactor) safety importance class (SIC) criteria and satisfies the design requirements according to RCC-MRx. In this study, load combination (LC) analyses for the structural integrity of the TBM-set were carried out based on the reference; the material properties of the TBM-set were taken from the reference, and RCC-MRx was used for the stress analysis. Most of the load combination results met the design requirements, but some load combination cases gave maximum stress values above the design limit; for these cases a stress breakdown analysis according to RCC-MRx was performed, and the results satisfied the design requirements.
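The combine-and-compare step of a load-combination check of this kind can be sketched schematically: sum the stress contributions for each service level and compare the result against an allowable limit, flagging cases that need a more detailed stress breakdown. All numbers, load names and level labels below are invented placeholders, not values from the HCCR TBM analysis or from RCC-MRx.

```python
# Hypothetical membrane-stress contributions (MPa) of individual loads.
loads = {"dead_weight": 12.0, "coolant_pressure": 85.0, "thermal": 60.0}

# Assumed allowable stress per service level (MPa) - placeholder values.
allowable = {"level_A": 150.0, "level_C": 225.0}

# Each load combination: (service level, loads combined by superposition).
combinations = [
    ("level_A", ("dead_weight", "coolant_pressure")),
    ("level_C", ("dead_weight", "coolant_pressure", "thermal")),
]

results = {}
for level, parts in combinations:
    combined = sum(loads[p] for p in parts)  # linear superposition
    ok = combined <= allowable[level]
    results[level] = (combined, ok)
    status = "OK" if ok else "needs stress breakdown"
    print(f"{level}: {combined:.1f} MPa vs {allowable[level]:.1f} MPa -> {status}")
```

A case exceeding its limit would be the analogue of the abstract's "stress breakdown analysis" path, where the combined stress is decomposed into categories before re-checking against code limits.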

  4. Reachable Sets of Hidden CPS Sensor Attacks: Analysis and Synthesis Tools

    NARCIS (Netherlands)

    Murguia, Carlos; van de Wouw, N.; Ruths, Justin; Dochain, Denis; Henrion, Didier; Peaucelle, Dimitri

    2017-01-01

    For given system dynamics, control structure, and fault/attack detection procedure, we provide mathematical tools–in terms of Linear Matrix Inequalities (LMIs)–for characterizing and minimizing the set of states that sensor attacks can induce in the system while keeping the alarm rate of the

  5. Using Multiattribute Utility Theory as a Priority-Setting Tool in Human Services Planning.

    Science.gov (United States)

    Camasso, Michael J.; Dick, Janet

    1993-01-01

    The feasibility of applying multiattribute utility theory to the needs assessment and priority-setting activities of human services planning councils was studied in Essex County (New Jersey). Decision-making and information filtering processes are explored in the context of community planning. (SLD)
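A minimal additive multiattribute utility calculation of the kind used for priority setting looks like the sketch below. The attributes, weights and 0-100 scores are made-up examples, not data from the Essex County study.

```python
# Hypothetical attribute weights elicited from a planning council
# (must sum to 1 in the additive model).
weights = {"unmet_need": 0.5, "severity": 0.3, "cost_effectiveness": 0.2}

# Hypothetical programs scored 0-100 on each attribute.
programs = {
    "home_care":      {"unmet_need": 80, "severity": 60, "cost_effectiveness": 70},
    "job_training":   {"unmet_need": 60, "severity": 40, "cost_effectiveness": 90},
    "crisis_shelter": {"unmet_need": 90, "severity": 95, "cost_effectiveness": 40},
}

def utility(scores):
    # additive multiattribute utility: weighted sum of single-attribute
    # utilities (here the raw 0-100 scores rescaled to 0-1)
    return sum(weights[a] * scores[a] / 100 for a in weights)

# Rank candidate programs by overall utility for priority setting.
ranking = sorted(programs, key=lambda p: utility(programs[p]), reverse=True)
print(ranking)
```

In practice the single-attribute utility functions need not be linear and the weights are elicited from stakeholders, but the aggregation step stays this simple.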

  6. Integrating data from the Investigational Medicinal Product Dossier/investigator's brochure. A new tool for translational integration of preclinical effects.

    Science.gov (United States)

    van Gerven, Joop; Cohen, Adam

    2018-01-30

    The first administration of a new compound in humans is an important milestone. A major source of information for the researcher is the investigator's brochure (IB). Such a document has a size of several hundred pages. The IB should enable investigators or regulators to independently assess the risk-benefit of the proposed trial, but its size and complexity make this difficult. This article offers a practical tool for the integration and subsequent communication of the complex information from the IB or other relevant data sources. This paper is accompanied by an accessible software tool to construct a single-page colour-coded overview of preclinical and clinical data. © 2018 The British Pharmacological Society.

  7. Setting up a decontamination and dismantling (D and D) scenario - methodology and tools developed: LEOPARD

    International Nuclear Information System (INIS)

    Pradoura, F.

    2009-01-01

    At the AREVA NC La Hague site, the former nuclear spent fuel reprocessing plant UP2-400 was shut down on December 30, 2003. Since then, the cleanup and dismantling activities have been carried out by the DV/PRO project, the program management organization set up by AREVA NC for valorization projects. SGN, part of the AREVA NC Engineering Business Unit, operates as the main contractor of the DV/PRO project and provides project management services related to decommissioning and waste management. Hence, SGN is in charge of building D and D scenarios for all the facilities of the UP2-400 plant, in compliance with safety, technical and financial requirements. The main outputs are logic diagrams, block flow diagrams, and waste and effluent throughputs. To meet AREVA NC's requirements and expectations, SGN developed specific processes, methods and tools adapted to the scale and complexity of decommissioning a plant comprising several facilities with different kinds of processes (chemical, mechanical), some of which are in operation while others are being dismantled. Considering the number of technical data and inputs to be managed, this methodology leads to complex outputs such as schedules, throughputs, work packages... The development, maintenance and modification of these outputs become more and more difficult with the complexity and size of the plant considered. To cope with these issues, the SGN CDE/DEM UP2-400 project team has developed a dedicated tool to assist in elaborating and optimizing D and D scenarios. This tool is named LEOPARD (Logiciel d'Elaboration et d'Optimisation des Programmes d'Assainissement Radiologique et de Demantelement; in English, Software for the Development and Optimization of Radiological Clean-up and Dismantling Programs). The availability of this tool allowed the rapid construction of a test case (demonstrator) that convinced DV/PRO of its numerous advantages and of its potential for further development. 
Presentations of LEOPARD

  8. Integrated Reporting as a Tool for Communicating with Stakeholders - Advantages and Disadvantages

    Science.gov (United States)

    Matuszyk, Iwona; Rymkiewicz, Bartosz

    2018-03-01

    Financial and non-financial reporting has, from the beginning of its existence, been the primary source of communication between a company and a wide range of its stakeholders. Over the decades it has adapted to the needs of a rapidly changing business and social environment. Currently, the final link in the evolution of organizational reporting, integrated reporting, assumes the integration and mutual connectivity of both financial and non-financial data. The main interest in the concept of integrated reporting comes from the value it contributes to the organization. Undoubtedly, the concept of integrated reporting is a milestone in the evolution of organizational reporting. It is, however, important to consider whether it adequately addresses the information needs of a wide range of stakeholders, and whether it is a universal tool for communication between the company and its stakeholders. The aim of the paper is to discuss the advantages and disadvantages of the concept of integrated reporting as a tool for communication with stakeholders and to indicate further directions for its development. The article uses research methods such as literature analysis, content analysis of corporate publications and comparative analysis.

  9. Decision support tool to evaluate alternative policies regulating wind integration into autonomous energy systems

    International Nuclear Information System (INIS)

    Zouros, N.; Contaxis, G.C.; Kabouris, J.

    2005-01-01

    Integration of wind power into autonomous electricity systems strongly depends on the specific technical characteristics of these systems; the regulations applied should take into account physical system constraints. The introduction of market rules makes the issue even more complicated, since the interests of the market participants often conflict with each other. In this paper, an integrated tool for the comparative assessment of alternative regulatory policies is presented, along with a methodology for decision-making based on the analysis of alternative scenarios. The social welfare concept is followed instead of traditional Least Cost Planning.

  10. Risk Jyouhou Navi (risk information navigator). Web tool for fostering of risk literacy. Set of data

    International Nuclear Information System (INIS)

    Mitsui, Seiichiro

    2003-06-01

    In addition to its conventional public understanding activities, the Risk Communication Study Team of the Japan Nuclear Cycle Development Institute's (JNC) Tokai Works has started practical studies to promote risk communication with its local communities. Since its establishment in 2001, the team has conducted analyses of already available results of public attitude surveys, case studies of domestic and overseas risk communication activities, and development of risk communication tools. A web tool for fostering risk literacy, 'Risk Jyouhou Navi' (risk information navigator in English), was developed as web content for the official home page of 'Techno Kouryuu Kan Ricotti' (Techno Community Square Ricotti in English). The objectives of this content are to provide risk information for the public and to provide an electronic platform for promoting risk communication with the local community. The following concepts guided the development of 'Risk Jyouhou Navi': 1) To create public interest in risks in daily lives and in global risks. 2) To provide risk knowledge and information. 3) To support risk communication activities in Techno Community Square Ricotti. (author)

  11. TAM 2.0: tool for MicroRNA set analysis.

    Science.gov (United States)

    Li, Jianwei; Han, Xiaofen; Wan, Yanping; Zhang, Shan; Zhao, Yingshu; Fan, Rui; Cui, Qinghua; Zhou, Yuan

    2018-06-06

    With the rapid accumulation of high-throughput microRNA (miRNA) expression profiles, up-to-date resources for analyzing the functional and disease associations of miRNAs are increasingly in demand. We here describe the updated server TAM 2.0 for miRNA set enrichment analysis. Through manual curation of over 9000 papers, a more than two-fold growth of reference miRNA sets has been achieved in comparison with the previous TAM, covering 9945 and 1584 newly collected miRNA-disease and miRNA-function associations, respectively. Moreover, TAM 2.0 allows users not only to test the functional and disease annotations of miRNAs by overrepresentation analysis, but also to compare the input de-regulated miRNAs with those de-regulated in other disease conditions via correlation analysis. Finally, functions for miRNA set query and result visualization are also enabled in the TAM 2.0 server to facilitate the community. The TAM 2.0 web server is freely accessible at http://www.scse.hebut.edu.cn/tam/ or http://www.lirmed.com/tam2/.
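Overrepresentation analysis of a miRNA set, as offered by TAM 2.0, typically reduces to a hypergeometric upper-tail test: given N annotated miRNAs of which K belong to a reference set, how surprising is an overlap of k between that set and an input list of n miRNAs? The sketch below uses invented toy counts and makes no claim that TAM 2.0 implements exactly this formula.

```python
from math import comb

def overrep_p(N, K, n, k):
    """Hypergeometric upper tail P(X >= k): probability that at least
    k of n input miRNAs fall in a reference set of size K, when the n
    miRNAs are drawn without replacement from N annotated miRNAs."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

# Toy numbers (not from TAM): 300 annotated miRNAs, 20 linked to a
# disease, 15 input miRNAs of which 5 overlap the disease set.
p = overrep_p(N=300, K=20, n=15, k=5)
print(p)
```

In a real enrichment server this p-value would be computed for every reference set and then corrected for multiple testing across all sets.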

  12. The Anne Frank Haven: A case of an alternative educational program in an integrative Kibbutz setting

    Science.gov (United States)

    Ben-Peretz, Miriam; Giladi, Moshe; Dror, Yuval

    1992-01-01

    The essential features of the programme of the Anne Frank Haven are the complete integration of children from low SES and different cultural backgrounds with Kibbutz children; a holistic approach to education; and the involvement of the whole community in an "open" residential school. After 33 years, it is argued that the experiment has proved successful in absorbing city-born youth in the Kibbutz, enabling at-risk populations to reach significant academic achievements, and ensuring their continued participation in the dominant culture. The basic integration model consists of "layers" of concentric circles, in dynamic interaction. The innermost circle is the class, the learning community. The Kibbutz community and the foster parents form a supportive, enveloping circle, which enables students to become part of the outer community and to intervene in it. A kind of meta-environment, the inter-Kibbutz partnership and the Israeli educational system, influence the program through decision making and guidance. Some of the principles of the Haven — integration, community involvement, a year's induction for all new students, and open residential settings — could be useful for cultures and societies outside the Kibbutz. The real "secret" of success of an alternative educational program is the dedicated, motivated and highly trained staff.

  13. Integrating declarative knowledge programming styles and tools for building expert systems

    Energy Technology Data Exchange (ETDEWEB)

    Barbuceanu, M; Trausan-Matu, S; Molnar, B

    1987-01-01

    The XRL system reported in this paper is an integrated knowledge programming environment whose major research theme is the investigation of declarative knowledge programming styles and features, and of the way they can be effectively integrated and used to support AI programming. This investigation is carried out in the context of the structured-object representation paradigm, which provides the glue keeping XRL components together. The paper describes several declarative programming styles and associated support tools available in XRL. These include: an instantiation system supporting a generalized view of the ubiquitous frame-instantiation process; a description-based programming system, which provides a novel declarative programming style by embedding a mathematically oriented description language in the structured-object environment, together with a transformational interpreter for using it; a semantics-oriented programming framework, which offers a semantic-construct-based approach supporting maintenance and evolution; and a self-description and self-generation tool, which applies the latter approach to XRL itself. 29 refs., 16 figs.

  14. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.

    1994-01-01

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed

  15. INSIGHT: an integrated scoping analysis tool for in-core fuel management of PWR

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Noda, Hidefumi; Ito, Nobuaki; Maruyama, Taiji.

    1997-01-01

    An integrated software tool for scoping analysis of in-core fuel management, INSIGHT, has been developed to automate scoping analysis and to improve the fuel cycle cost using advanced optimization techniques. INSIGHT is an interactive software tool executed on UNIX-based workstations equipped with the X Window System. INSIGHT incorporates the GALLOP loading pattern (LP) optimization module, which utilizes hybrid genetic algorithms; the PATMAKER interactive LP design module; the MCA multicycle analysis module; an integrated database; and other utilities. Two benchmark problems were analyzed to confirm the key capabilities of INSIGHT: LP optimization and multicycle analysis. The first was a single-cycle LP optimization problem that included various constraints. The second was a multicycle LP optimization problem that included the assembly burnup limitation at rod cluster control (RCC) positions. The results for these problems showed the feasibility of INSIGHT for practical scoping analysis, which consists almost entirely of LP generation and multicycle analysis. (author)
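
    GALLOP's hybrid genetic algorithm itself is not described here, but the elitist GA loop such LP optimizers build on can be sketched with a toy stand-in for the core-physics evaluation. Everything below (the objective, the values, the weights) is illustrative, not INSIGHT's:

```python
import random

def toy_power_peak(pattern, weights):
    # Hypothetical stand-in for a core-physics evaluation: penalize placing
    # high-reactivity assemblies (large value) at important positions (large weight).
    return sum(v * w for v, w in zip(pattern, weights))

def ga_loading_pattern(values, weights, pop=30, gens=150, seed=1):
    """Elitist GA over assembly orderings with swap mutation."""
    rng = random.Random(seed)

    def mutate(p):
        q = list(p)
        i, j = rng.sample(range(len(q)), 2)
        q[i], q[j] = q[j], q[i]          # swap two assembly positions
        return q

    population = [rng.sample(values, len(values)) for _ in range(pop)]
    best = min(population, key=lambda p: toy_power_peak(p, weights))
    for _ in range(gens):
        population.sort(key=lambda p: toy_power_peak(p, weights))
        population = population[: pop // 2]                 # keep the elites
        population += [mutate(rng.choice(population))
                       for _ in range(pop - len(population))]
        cand = min(population, key=lambda p: toy_power_peak(p, weights))
        if toy_power_peak(cand, weights) < toy_power_peak(best, weights):
            best = cand
    return best

values = [1.0, 1.2, 1.4, 1.6, 1.8, 2.0]   # toy assembly "reactivities"
weights = [6, 5, 4, 3, 2, 1]              # toy position importances
best = ga_loading_pattern(values, weights)
```

    By the rearrangement inequality the optimum here pairs the largest reactivity with the least important position (objective 28.0), which the toy GA normally finds; real LP optimization replaces the one-line objective with expensive neutronics calculations and adds constraint handling.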

  16. Multisource data set integration and characterization of uranium mineralization for the Montrose Quadrangle, Colorado

    International Nuclear Information System (INIS)

    Bolivar, S.L.; Balog, S.H.; Campbell, K.; Fugelso, L.E.; Weaver, T.A.; Wecksung, G.W.

    1981-04-01

    Several data-classification schemes were developed by the Los Alamos National Laboratory to detect potential uranium mineralization in the Montrose 1° x 2° quadrangle, Colorado. A first step was to develop and refine the techniques necessary to digitize, integrate, and register various large geological, geochemical, and geophysical data sets, including Landsat 2 imagery, for the Montrose quadrangle, Colorado, using a grid resolution of 1 km. All data sets for the Montrose quadrangle were registered to the Universal Transverse Mercator projection. The data sets include hydrogeochemical and stream sediment analyses for 23 elements, uranium-to-thorium ratios, airborne geophysical survey data, the locations of 90 uranium occurrences, a geologic map and Landsat 2 (bands 4 through 7) imagery. Geochemical samples were collected from 3965 locations in the 19 200 km² quadrangle; aerial data were collected on flight lines flown with 3 to 5 km spacings. These data sets were smoothed by universal kriging and interpolated to a 179 x 119 rectangular grid. A mylar transparency of the geologic map was prepared and digitized. Locations for the known uranium occurrences were also digitized. The Landsat 2 imagery was digitally manipulated and rubber-sheet transformed to quadrangle boundaries, and bands 4 through 7 were resampled to both 1-km and 100-m resolution. All possible combinations of three, for all data sets, were examined for general geologic correlations by utilizing a color microfilm output. Subsets of data were further examined for selected test areas. Two classification schemes for uranium mineralization, based on selected test areas in both the Cochetopa and Marshall Pass uranium districts, are presented. Areas favorable for uranium mineralization, based on these schemes, were identified and are discussed.
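
    The kriging used in the record requires a fitted variogram; as a far simpler illustration of the same step, interpolating scattered geochemical samples onto a regular grid, here is an inverse-distance-weighting sketch with hypothetical data (not the Los Alamos procedure):

```python
def idw_grid(samples, nx, ny, power=2.0):
    """Interpolate scattered (x, y, value) samples onto an nx-by-ny grid of
    unit-spaced nodes, using inverse-distance weighting."""
    grid = [[0.0] * nx for _ in range(ny)]
    for gy in range(ny):
        for gx in range(nx):
            num = den = 0.0
            for x, y, v in samples:
                d2 = (x - gx) ** 2 + (y - gy) ** 2
                if d2 == 0.0:                 # node coincides with a sample
                    num, den = v, 1.0
                    break
                w = 1.0 / d2 ** (power / 2.0)
                num += w * v
                den += w
            grid[gy][gx] = num / den
    return grid

# Toy uranium-concentration samples (x km, y km, ppm) gridded at 1-km spacing
samples = [(0, 0, 2.0), (3, 0, 4.0), (0, 3, 6.0), (3, 3, 8.0)]
grid = idw_grid(samples, 4, 4)
```

    Unlike kriging, IDW provides no error estimate and ignores spatial correlation structure, but every interpolated value stays within the range of the sample values.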

  17. Laparohysteroscopy in female infertility: A diagnostic cum therapeutic tool in Indian setting.

    Science.gov (United States)

    Puri, Suman; Jain, Dinesh; Puri, Sandeep; Kaushal, Sandeep; Deol, Satjeet Kaur

    2015-01-01

    To evaluate the role of laparohysteroscopy in female infertility and to study the effect of therapeutic procedures in achieving fertility. Patients with female infertility presenting to the outpatient Department of Obstetrics and Gynecology were evaluated over a period of 18 months. Fifty consenting subjects, excluding male factor infertility, with normal hormonal profile and no contraindication to laparoscopy were subjected to diagnostic laparoscopy and hysteroscopy. Statistical analysis used: t-test. We studied 50 patients, comprising 24 (48%) cases of primary infertility and 26 (52%) of secondary infertility. The average duration of active married life for the 50 patients was between 8 and 9 years. In our study, the most commonly found pathologies were PCOD, endometriosis and tubal blockage. Eleven (28.2%) patients conceived after laparohysteroscopy followed by artificial reproductive techniques. This study demonstrates the benefit of laparohysteroscopy for diagnosis and as a therapeutic tool in patients with primary and secondary infertility. We were able to achieve a higher conception rate of 28.2%.

  18. High Resolution Manometry - an underappreciated tool for examination of dysphagia in a surgical setting

    DEFF Research Database (Denmark)

    Jensen, Jonas Sanberg

    Introduction Examination of dysphagia in Danish surgical departments relies primarily on upper gastrointestinal endoscopy. When no visible or histological cause can be detected, esophageal motility disorders are an important differential diagnosis. In examining these disorders and in evaluating...... gastroesophageal reflux disorder (GERD), High Resolution Esophageal Manometry (HRM) provides valuable insights. The purpose of this study was to examine referrals and final diagnoses from HRM in a surgical center specializing in esophageal disorders. Methods and Procedures All patients referred to HRM at our.......1% based on 10419 endoscopies. Conclusion HRM is an important diagnostic tool and supplements upper gastrointestinal endoscopy in examination of dysphagia as well as GERD, with significant differences in patterns of motility disorders. Knowledge and availability of HRM increases use at a surgical center...

  19. Development of a Physical Environmental Observational Tool for Dining Environments in Long-Term Care Settings.

    Science.gov (United States)

    Chaudhury, Habib; Keller, Heather; Pfisterer, Kaylen; Hung, Lillian

    2017-11-10

    This paper presents the first standardized physical environmental assessment tool, titled Dining Environment Audit Protocol (DEAP), specifically designed for dining spaces in care homes, and reports the results of its psychometric properties. Items rated include: adequacy of lighting, glare, personal control, clutter, staff supervision support, restraint use, and seating arrangement options for social interaction. Two scales summarize the prior items and rate the overall homelikeness and functionality of the space. Ten dining rooms in three long-term care homes were selected for assessment. Data were collected over 11 days across 5 weeks. Two trained assessors completed DEAP independently on the same day. Interrater reliability was assessed for lighting, glare, space, homelike aspects, seating arrangements and the two summary scales, homelikeness and functionality of the space. For categorical measures, responses were dichotomized at logical points and Cohen's kappa and concordance on ratings were determined. The two overall rating scales on homelikeness and functionality of space were found to be reliable (intraclass correlation coefficient [ICC] ~0.7). The mean rating for homelikeness for Assessor 1 was 3.5 (SD 1.35) and for functionality of the room was 5.3 (SD 0.82; median 5.5). The findings indicate that the tool's interrater reliability scores are promising. The high concordance on the overall scores for homelikeness and functionality is indicative of the strength of the individual items in generating a reliable global assessment score on these two important aspects of the dining space. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
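
    The Cohen's kappa reported for the dichotomized items corrects raw agreement between the two assessors for the agreement expected by chance. A self-contained sketch with hypothetical ratings (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[c] * c2[c] for c in c1) / n ** 2   # chance agreement
    return (observed - expected) / (1 - expected)

# Two assessors rating one dichotomized item across ten dining rooms
a1 = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no", "yes", "no"]
a2 = ["yes", "yes", "no", "yes", "yes", "yes", "yes", "no", "yes", "no"]
kappa = cohens_kappa(a1, a2)
```

    Here raw agreement is 0.9 but kappa is about 0.78, because "yes" is the majority answer for both raters and some agreement would occur by chance alone.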

  20. An integrated risk assessment tool for team-based periodontal disease management.

    Science.gov (United States)

    Thyvalikakath, Thankam P; Padman, Rema; Gupta, Sugandh

    2013-01-01

    Mounting evidence suggests a potential association of periodontal disease with systemic diseases such as diabetes, cardiovascular disease, cancer and stroke. The objective of this study is to develop an integrated risk assessment tool that displays a patient's risk for periodontal disease in the context of their systemic disease, social habits and oral health. Such a tool would be used not just by dental professionals but also by care providers who participate in team-based care for chronic disease management. Displaying relationships between risk factors and their influence on the patient's general health could be a powerful educational and disease management tool for patients and clinicians. It may also improve the coordination of care provided by the provider-members of a chronic care team.

  1. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with relevant analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it (1) does not require any licensing fees, and (2) provides an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK® contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
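
    Brent's method, one of the items improved above, combines a fast open method with the safety of a maintained bracket. A simplified Dekker/Brent-style hybrid (secant step, with bisection as the fallback; an illustration, not ITACA's implementation):

```python
def hybrid_root(f, a, b, tol=1e-12, max_iter=200):
    """Find a root of f on [a, b] with f(a) and f(b) of opposite sign:
    take a secant step when it lands inside the bracket, bisect otherwise."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("root not bracketed")
    for _ in range(max_iter):
        if abs(b - a) < tol:
            break
        m = b - fb * (b - a) / (fb - fa) if fb != fa else 0.5 * (a + b)
        if not (a < m < b):
            m = 0.5 * (a + b)              # secant left the bracket: bisect
        fm = f(m)
        if fa * fm <= 0:                   # root is in [a, m]
            b, fb = m, fm
        else:                              # root is in [m, b]
            a, fa = m, fm
    return a if abs(fa) < abs(fb) else b   # endpoint with smaller residual
```

    Full Brent adds inverse quadratic interpolation and stricter step-acceptance rules, which is what gives it its convergence guarantees at superlinear speed in the typical case.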

  2. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  3. A Novel Computational Tool for Mining Real-Life Data: Application in the Metastatic Colorectal Cancer Care Setting.

    Science.gov (United States)

    Siegelmann-Danieli, Nava; Farkash, Ariel; Katzir, Itzhak; Vesterman Landes, Janet; Rotem Rabinovich, Hadas; Lomnicky, Yossef; Carmeli, Boaz; Parush-Shear-Yashuv, Naama

    2016-01-01

    Randomized clinical trials constitute the gold standard for evaluating new anti-cancer therapies; however, real-life data are key in complementing clinically useful information. We developed a computational tool for real-life data analysis and applied it to the metastatic colorectal cancer (mCRC) setting. This tool addressed the impact of oncology/non-oncology parameters on treatment patterns and clinical outcomes. The developed tool enables extraction of any computerized information, including comorbidities and use of drugs (oncological/non-oncological), per individual HMO member. The study in which we evaluated this tool was a retrospective cohort study that included Maccabi Healthcare Services members with mCRC receiving bevacizumab with fluoropyrimidines (FP), FP plus oxaliplatin (FP-O), or FP plus irinotecan (FP-I) in the first line between 9/2006 and 12/2013. The analysis included 753 patients, of whom 15.4% underwent subsequent metastasectomy (the Surgery group). For the entire cohort, median overall survival (OS) was 20.5 months; in the Surgery group, median duration of bevacizumab-containing therapy (DOT) pre-surgery was 6.1 months and median OS was not reached. In the Non-surgery group, median OS and DOT were 18.7 and 11.4 months, respectively; no significant OS differences were noted between FP-O and FP-I, whereas FP use was associated with shorter OS (12.3 months). Analyses controlling for age and gender identified several non-oncology parameters associated with poorer clinical outcomes, including concurrent use of diuretics and proton-pump inhibitors. Our tool provided insights that confirmed/complemented information gained from randomized clinical trials. Prospective tool implementation is warranted.

  4. saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings

    Directory of Open Access Journals (Sweden)

    Nicholas M. Myers

    2016-03-01

    Full Text Available We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low resource settings, the South African Medical Research Council and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n=1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production.
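
    The headline figure, the share of readings correctly classified, comes straight from confusion-matrix counts, and sensitivity and specificity follow from the same four numbers. A sketch with hypothetical counts (the per-cell counts are not given in the record):

```python
def classification_metrics(tp, tn, fp, fn):
    """Accuracy, sensitivity and specificity from confusion-matrix counts."""
    total = tp + tn + fp + fn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Hypothetical split of n=1704 card readings; "positive" means the card
# flags the salt sample as inadequately iodized.
m = classification_metrics(tp=510, tn=1040, fp=84, fn=70)
```

    With these illustrative counts, 1550 of 1704 readings are correct, matching the "more than 90% correctly classified" shape of the reported result.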

  5. The L3+C detector, a unique tool-set to study cosmic rays

    International Nuclear Information System (INIS)

    Adriani, O.; Akker, M. van den; Banerjee, S.; Baehr, J.; Betev, B.; Bourilkov, D.; Bottai, S.; Bobbink, G.; Cartacci, A.; Chemarin, M.; Chen, G.; Chen, H.S.; Chiarusi, T.; Dai, C.J.; Ding, L.K.; Duran, I.; Faber, G.; Fay, J.; Grabosch, H.J.; Groenstege, H.; Guo, Y.N.; Gupta, S.; Haller, Ch.; Hayashi, Y.; He, Z.X.; Hebbeker, T.; Hofer, H.; Hoferjun, H.; Huo, A.X.; Ito, N.; Jing, C.L.; Jones, L.; Kantserov, V.; Kawakami, S.; Kittel, W.; Koenig, A.C.; Kok, E.; Korn, A.; Kuang, H.H.; Kuijpers, J.; Ladron de Guevara, P.; Le Coultre, P.; Lei, Y.; Leich, H.; Leiste, R.; Li, D.; Li, L.; Li, Z.C.; Liu, Z.A.; Liu, H.T.; Lohmann, W.; Lu, Y.S.; Ma, X.H.; Ma, Y.Q.; Mil, A. van; Monteleoni, B.; Nahnhauer, R.; Pauss, F.; Parriaud, J.-F.; Petersen, B.; Pohl, M.; Qing, C.R.; Ramelli, R.; Ravindran, K.C.; Rewiersma, P.; Rojkov, A.; Saidi, R.; Schmitt, V.; Schoeneich, B.; Schotanus, D.J.; Shen, C.Q.; Sulanke, H.; Tang, X.W.; Timmermans, C.; Tonwar, S.; Trowitzsch, G.; Unger, M.; Verkooijen, H.; Wang, X.L.; Wang, X.W.; Wang, Z.M.; Wijk, R. van; Wijnen, Th.A.M.; Wilkens, H.; Xu, Y.P.; Xu, Z.Z.; Yang, C.G.; Yang, X.F.; Yao, Z.G.; Yu, Z.Q.; Zhang, S.; Zhu, G.Y.; Zhu, Q.Q.; Zhuang, H.L.; Zwart, A.N.M.

    2002-01-01

    The L3 detector at the CERN electron-positron collider, LEP, has been employed for the study of cosmic ray muons. The muon spectrometer of L3 consists of a set of high-precision drift chambers installed inside a magnet with a volume of about 1000 m³ and a field of 0.5 T. Muon momenta are measured with a resolution of a few percent at 50 GeV. The detector is located under 30 m of overburden. A scintillator air shower array of 54 m by 30 m is installed on the roof of the surface hall above L3 in order to estimate the energy and the core position of the shower associated with a sample of detected muons. Thanks to the unique properties of the L3+C detector, muon research topics relevant to various current problems in cosmic ray and particle astrophysics can be studied.

  6. The L3+C detector, a unique tool-set to study cosmic rays

    CERN Document Server

    Adriani, O; Banerjee, S; Bähr, J; Betev, B L; Bourilkov, D; Bottai, S; Bobbink, Gerjan J; Cartacci, A M; Chemarin, M; Chen, G; Chen He Sheng; Chiarusi, T; Dai Chang Jiang; Ding, L K

    2002-01-01

    The L3 detector at the CERN electron-positron collider, LEP, has been employed for the study of cosmic ray muons. The muon spectrometer of L3 consists of a set of high-precision drift chambers installed inside a magnet with a volume of about 1000 m**3 and a field of 0.5 T. Muon momenta are measured with a resolution of a few percent at 50 GeV. The detector is located under 30 m of overburden. A scintillator air shower array of 54 m by 30 m is installed on the roof of the surface hall above L3 in order to estimate the energy and the core position of the shower associated with a sample of detected muons. Thanks to the unique properties of the L3+C detector, muon research topics relevant to various current problems in cosmic ray and particle astrophysics can be studied.

  7. Capnography as a tool to detect metabolic changes in patients cared for in the emergency setting

    Directory of Open Access Journals (Sweden)

    Francisco José Cereceda-Sánchez

    Full Text Available ABSTRACT Objective: to evaluate the usefulness of capnography for the detection of metabolic changes in spontaneously breathing patients, in the emergency and intensive care settings. Methods: in-depth and structured bibliographical search in the databases EBSCOhost, Virtual Health Library, PubMed, Cochrane Library, among others, identifying studies that assessed the relationship between capnography values and the variables involved in blood acid-base balance. Results: 19 studies were found; two were reviews and 17 were observational studies. In nine studies, capnography values were correlated with carbon dioxide (CO2), in eight with bicarbonate (HCO3), in three with lactate, and in four with blood pH. Conclusions: most studies have found a good correlation between capnography values and blood biomarkers, suggesting the usefulness of this parameter to detect patients at risk of severe metabolic change, in a fast, economical and accurate way.

  8. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate considerable amounts of data which often require bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merging, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility in terms of input file handling provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by currently existing applications and data formats. HTDP is available as Open Source software (https://github.com/pmadanecki/htdp).
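
    HTDP's core operations, filtering and merging character-delimited column data, are easy to sketch with the standard csv module (illustrative only; HTDP itself is a Java GUI application):

```python
import csv
import io

def filter_rows(delimited_text, column, predicate, delimiter="\t"):
    """Stream rows of character-delimited text with a header line,
    keeping the rows where predicate(row[column]) holds."""
    reader = csv.DictReader(io.StringIO(delimited_text), delimiter=delimiter)
    return [row for row in reader if predicate(row[column])]

# A small BED-like table (tab-delimited, with a header row for illustration)
bed_like = (
    "chrom\tstart\tend\tscore\n"
    "chr1\t100\t200\t5\n"
    "chr2\t150\t300\t12\n"
    "chr1\t400\t450\t9\n"
)

high_score = filter_rows(bed_like, "score", lambda v: int(v) >= 9)
```

    Merging several files reduces to concatenating the row lists from multiple readers before filtering; HTDP additionally reads the filter conditions themselves from external criteria files.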

  9. Analysis of metabolomic data: tools, current strategies and future challenges for omics data integration.

    Science.gov (United States)

    Cambiaghi, Alice; Ferrario, Manuela; Masseroli, Marco

    2017-05-01

    Metabolomics is a rapidly growing field consisting of the analysis of a large number of metabolites at a system scale. The two major goals of metabolomics are the identification of the metabolites characterizing each organism state and the measurement of their dynamics under different situations (e.g. pathological conditions, environmental factors). Knowledge about metabolites is crucial for the understanding of most cellular phenomena, but this information alone is not sufficient to gain a comprehensive view of all the biological processes involved. Integrated approaches combining metabolomics with transcriptomics and proteomics are thus required to obtain much deeper insights than any of these techniques alone. Although this information is available, multilevel integration of different 'omics' data is still a challenge. The handling, processing, analysis and integration of these data require specialized mathematical, statistical and bioinformatics tools, and several technical problems hampering rapid progress in the field exist. Here, out of the several available for metabolomic data analysis and integration with other 'omics' data, we review four main tools, selected on the basis of their number of users or provided features (MetaCore™, MetaboAnalyst, InCroMAP and 3Omics), highlighting their strong and weak aspects; a number of related issues affecting data analysis and integration are also identified and discussed. Overall, we provide an objective description of how some of the main currently available software packages work, which may help the experimental practitioner in the choice of a robust pipeline for metabolomic data analysis and integration. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. U.S. Geological Survey community for data integration: data upload, registry, and access tool

    Science.gov (United States)

    ,

    2012-01-01

    As a leading science and information agency and in fulfillment of its mission to provide reliable scientific information to describe and understand the Earth, the U.S. Geological Survey (USGS) ensures that all scientific data are effectively hosted, adequately described, and appropriately accessible to scientists, collaborators, and the general public. To succeed in this task, the USGS established the Community for Data Integration (CDI) to address data and information management issues affecting the proficiency of earth science research. Through the CDI, the USGS is providing data and metadata management tools, cyber infrastructure, collaboration tools, and training in support of scientists and technology specialists throughout the project life cycle. One of the significant tools recently created to contribute to this mission is the Uploader tool. This tool allows scientists with limited data management resources to address many of the key aspects of the data life cycle: the ability to protect, preserve, publish and share data. By implementing this application inside ScienceBase, scientists also can take advantage of other collaboration capabilities provided by the ScienceBase platform.

  11. Identification and implementation of end-user needs during development of a state-of-the-art modeling tool-set - 59069

    International Nuclear Information System (INIS)

    Seitz, Roger; Williamson, Mark; Gerdes, Kurt; Freshley, Mark; Dixon, Paul; Collazo, Yvette T.; Hubbard, Susan

    2012-01-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management, Technology Innovation and Development is supporting a multi-National Laboratory effort to develop the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is an emerging state-of-the-art scientific approach and software infrastructure for understanding and predicting contaminant fate and transport in natural and engineered systems. These modular and open-source high performance computing tools and user interfaces will facilitate integrated approaches that enable standardized assessments of performance and risk for EM cleanup and closure decisions. The ASCEM team recognized that engaging end-users in the ASCEM development process would lead to enhanced development and implementation of the ASCEM tool-sets in the user community. End-user involvement in ASCEM covers a broad spectrum of perspectives, including: performance assessment (PA) and risk assessment practitioners, research scientists, decision-makers, oversight personnel, and regulators engaged in the US DOE cleanup mission. End-users are primarily engaged in ASCEM via the ASCEM User Steering Committee (USC) and the 'user needs interface' task. Future plans also include user involvement in demonstrations of the ASCEM tools. This paper will describe the details of how end users have been engaged in the ASCEM program and will demonstrate how this involvement has strengthened both the tool development and community confidence. ASCEM tools requested by end-users specifically target modeling challenges associated with US DOE cleanup activities. The demonstration activities involve application of ASCEM tools and capabilities to representative problems at DOE sites. Selected results from the ASCEM Phase 1 demonstrations are discussed to illustrate how capabilities requested by end-users were implemented in prototype versions of the ASCEM tool. 
The ASCEM team engaged a variety of interested parties early in the development process.

  12. Integrated model for pricing, delivery time setting, and scheduling in make-to-order environments

    Science.gov (United States)

    Garmdare, Hamid Sattari; Lotfi, M. M.; Honarvar, Mahboobeh

    2018-03-01

    Usually, in make-to-order environments, which operate only in response to customers' orders, a manufacturer seeking to maximize profit should offer the best price and delivery time for an order, considering the existing capacity and the customer's sensitivity to both factors. In this paper, an integrated approach for pricing, delivery time setting and scheduling of newly arriving orders is proposed, based on the existing capacity and the orders already accepted in the system. In the model, the acquired market demand depends on the price and delivery time of both the manufacturer and its competitors. A mixed-integer non-linear programming model is presented for the problem. After conversion to a pure non-linear model, it is validated through a case study. The efficiency of the proposed model is confirmed by comparing it to both the literature and current practice. Finally, sensitivity analysis for the key parameters is carried out.
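
    The coupling the paper models can be illustrated with a toy version: demand falls linearly in both the quoted price and the quoted delivery time, and the best quote is found by exhaustive search. All parameters below are hypothetical and the model is far simpler than the paper's mixed-integer non-linear program:

```python
def profit(price, lead_time, unit_cost=4.0,
           base=100.0, price_sens=8.0, time_sens=5.0):
    """Toy make-to-order profit: margin times a demand that decreases
    linearly in both the quoted price and the quoted lead time."""
    demand = max(0.0, base - price_sens * price - time_sens * lead_time)
    return (price - unit_cost) * demand

# Exhaustive search over a small grid of candidate quotes
candidates = [(p / 2, t) for p in range(8, 25) for t in range(1, 6)]
best_price, best_time = max(candidates, key=lambda q: profit(*q))
```

    With no capacity term, the shortest lead time always wins; the paper's problem is harder precisely because scheduling the already-accepted orders limits which delivery times are actually feasible.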

  13. FunGeneNet: a web tool to estimate enrichment of functional interactions in experimental gene sets.

    Science.gov (United States)

    Tiys, Evgeny S; Ivanisenko, Timofey V; Demenkov, Pavel S; Ivanisenko, Vladimir A

    2018-02-09

Estimation of functional connectivity in gene sets derived from genome-wide or other biological experiments is one of the essential tasks of bioinformatics. A promising approach for solving this problem is to compare gene networks built using experimental gene sets with random networks. One of the resources that makes such an analysis possible is CrossTalkZ, which uses the FunCoup database. However, existing methods, including CrossTalkZ, do not take into account individual types of interactions, such as protein/protein interactions, expression regulation, transport regulation, catalytic reactions, etc., but rather work with generalized types characterizing the existence of any connection between network members. We developed the online tool FunGeneNet, which utilizes the ANDSystem and STRING to reconstruct gene networks using experimental gene sets and to estimate their difference from random networks. To compare the reconstructed networks with random ones, the node permutation algorithm implemented in CrossTalkZ was taken as a basis. To study the applicability of FunGeneNet, a functional connectivity analysis was conducted for networks constructed from gene sets involved in Gene Ontology biological processes. We showed that the method's sensitivity exceeds 0.8 at a specificity of 0.95. We found that the significance level of the difference between gene networks of biological processes and random networks is determined by the type of connections considered between objects. At the same time, the highest reliability is achieved for the generalized form of connections that takes into account all the individual types of connections. Using the thyroid cancer networks and the apoptosis network as examples, it is demonstrated that key participants in these processes are involved in interactions of exactly those types by which these networks differ from random ones. FunGeneNet is a web tool aimed at proving the functionality of networks in a wide range of sizes of

  14. SEPHYDRO: An Integrated Multi-Filter Web-Based Tool for Baseflow Separation

    Science.gov (United States)

    Serban, D.; MacQuarrie, K. T. B.; Popa, A.

    2017-12-01

    Knowledge of baseflow contributions to streamflow is important for understanding watershed scale hydrology, including groundwater-surface water interactions, impact of geology and landforms on baseflow, estimation of groundwater recharge rates, etc. Baseflow (or hydrograph) separation methods can be used as supporting tools in many areas of environmental research, such as the assessment of the impact of agricultural practices, urbanization and climate change on surface water and groundwater. Over the past few decades various digital filtering and graphically-based methods have been developed in an attempt to improve the assessment of the dynamics of the various sources of streamflow (e.g. groundwater, surface runoff, subsurface flow); however, these methods are not available under an integrated platform and, individually, often require significant effort for implementation. Here we introduce SEPHYDRO, an open access, customizable web-based tool, which integrates 11 algorithms allowing for separation of streamflow hydrographs. The streamlined interface incorporates a reference guide as well as additional information that allows users to import their own data, customize the algorithms, and compare, visualise and export results. The tool includes one-, two- and three-parameter digital filters as well as graphical separation methods and has been successfully applied in Atlantic Canada, in studies dealing with nutrient loading to fresh water and coastal water ecosystems. Future developments include integration of additional separation algorithms as well as incorporation of geochemical separation methods. SEPHYDRO has been developed through a collaborative research effort between the Canadian Rivers Institute, University of New Brunswick (Fredericton, New Brunswick, Canada), Agriculture and Agri-Food Canada and Environment and Climate Change Canada and is currently available at http://canadianriversinstitute.com/tool/
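As context for the one-, two- and three-parameter digital filters the tool integrates, here is a minimal sketch of the classic Lyne-Hollick one-parameter recursive filter. The filter constant (alpha, typically around 0.925 for daily data) and the warm-up value are conventional assumptions, not details taken from SEPHYDRO itself.

```python
def lyne_hollick(q, alpha=0.925):
    """Split a streamflow series q into (baseflow, quickflow).

    Quickflow is filtered recursively and clamped to [0, q[t]] so that
    baseflow stays non-negative and never exceeds total streamflow.
    """
    quick = [min(max(0.5 * q[0], 0.0), q[0])]  # crude warm-up for the first step
    for t in range(1, len(q)):
        f = alpha * quick[t - 1] + 0.5 * (1.0 + alpha) * (q[t] - q[t - 1])
        quick.append(min(max(f, 0.0), q[t]))
    base = [qt - qf for qt, qf in zip(q, quick)]
    return base, quick
```

In practice the filter is usually run in several forward and backward passes; a single pass is shown here to keep the recursion visible.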

  15. Geologic mapping of the Hekla volcano (Iceland) using integrated data sets from optic and radar sensors

    Science.gov (United States)

    Wever, Tobias; Loercher, Gerhard

    1994-12-01

During the MAC-Europe campaign in June/July 1991, different airborne data sets (AIRSAR, TMS and AVIRIS) were collected over Iceland. One test site is situated around the Hekla volcano in southern Iceland. This area is characterised by a sequence of lava flows of different ages, together with tuffs and ashes. This case study contributes to demonstrating the potential of MAC-Europe data for geological mapping. The optical and SAR data were analysed separately to establish the strengths of the different sensors. An approach was then developed to produce a single image combining the advantages of the respective sensors. The synergetic approach clearly improves the separation of geological units by combining two completely different data sets, exploiting spectral bands in the visible and infrared region on one side and in the microwave region on the other. Besides the petrographical information extracted from the optical data using spectral signatures, the combination includes physical information such as the roughness and dielectric properties of a target. The geologic setting of the test area is characterised by a very uniform petrography; hence the spectral signatures show only little variation. Because of this, the differentiation of geological units using optical data alone is limited. The additional use of SAR data introduces the new dimension of surface roughness, which clearly improves the discrimination. This additional parameter provides new information about the state of weathering, age and sequence of the different lava flows. The NASA/JPL AIRSAR system is very suitable for this kind of investigation thanks to its multifrequency and polarimetric capabilities. The three SAR frequencies (C-, L- and P-band) enable the detection of a broad range of roughness differences. These results can be enhanced by exploiting the full scattering matrix of the polarimetric AIRSAR data.

  16. atBioNet– an integrated network analysis tool for genomics and biomarker discovery

    Directory of Open Access Journals (Sweden)

    Ding Yijun

    2012-07-01

Full Text Available Abstract Background Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format, with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. Results atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but that this information can also be used to hypothesize novel biomarkers for future analysis. Conclusion atBioNet is a free web-based network analysis tool that provides systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining the underlying mechanisms of disease and for biomarker discovery.
It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools

  17. atBioNet--an integrated network analysis tool for genomics and biomarker discovery.

    Science.gov (United States)

    Ding, Yijun; Chen, Minjun; Liu, Zhichao; Ding, Don; Ye, Yanbin; Zhang, Min; Kelly, Reagan; Guo, Li; Su, Zhenqiang; Harris, Stephen C; Qian, Feng; Ge, Weigong; Fang, Hong; Xu, Xiaowei; Tong, Weida

    2012-07-20

Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format, with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but that this information can also be used to hypothesize novel biomarkers for future analysis. atBioNet is a free web-based network analysis tool that provides systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining the underlying mechanisms of disease and for biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm.
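The pathway enrichment analysis mentioned in both atBioNet records is commonly formulated as a hypergeometric tail test; the sketch below uses that standard formulation (the abstract does not state atBioNet's exact statistic, so treat this as an illustration of the idea rather than the tool's implementation).

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n).

    N: total genes in the background network
    K: background genes annotated to the pathway
    n: genes in the module under test
    k: module genes annotated to the pathway
    """
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)
```

A small p-value means the module contains more pathway members than expected by chance, which is the basis for calling the module functionally enriched.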

  18. Experimental investigation into effect of cutting parameters on surface integrity of hardened tool steel

    Science.gov (United States)

    Bashir, K.; Alkali, A. U.; Elmunafi, M. H. S.; Yusof, N. M.

    2018-04-01

The recent trend of turning hardened materials has gained popularity because of its immense machinability benefits. Moreover, several machining processes, such as thermally assisted machining and cryogenic machining, have revealed machinability benefits superior to conventional dry turning of hardened materials. Various engineering materials have been studied; however, investigations on AISI O1 tool steel have not been widely reported. In this paper, the surface finish and surface integrity obtained when hard turning AISI O1 tool steel are analysed. The study focuses on the performance of a wiper coated ceramic tool with respect to the surface roughness and surface integrity of the hardened tool steel. The hardened tool steel was machined at varying cutting speeds of 100, 155 and 210 m/min and feed rates of 0.05, 0.125 and 0.20 mm/rev. A constant depth of cut of 0.2 mm was maintained throughout the machining trials. Machining was conducted as dry turning on a 200E-axis CNC lathe. The experimental study revealed that the surface finish is relatively superior at the higher cutting speed of 210 m/min. The surface finish improves as cutting speed increases, and is generally better at the lower feed rate of 0.05 mm/rev. The experiments also revealed that phenomena such as workpiece vibration, caused by poor or improper mounting on the spindle, contributed to a higher surface roughness value of 0.66 Ra when turning at 0.2 mm/rev. Traces of white layer were observed under the optical microscope, showing evidence of cutting effects on the turned work material at a feed rate of 0.2 mm/rev.
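For readers unfamiliar with the Ra values reported above: Ra is the arithmetic mean of the absolute deviations of the measured surface profile from its mean line. A minimal sketch with a synthetic profile (the profile values are invented, not measurement data from the study):

```python
def ra(profile):
    """Arithmetic mean roughness Ra of a sampled surface profile."""
    mean_line = sum(profile) / len(profile)
    return sum(abs(z - mean_line) for z in profile) / len(profile)
```

Real roughness measurement additionally involves filtering the raw profile into waviness and roughness components over a standardized evaluation length.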

  19. Validation of Nurse Practitioner Primary Care Organizational Climate Questionnaire: A New Tool to Study Nurse Practitioner Practice Settings.

    Science.gov (United States)

    Poghosyan, Lusine; Chaplin, William F; Shaffer, Jonathan A

    2017-04-01

Favorable organizational climate in primary care settings is necessary to expand the nurse practitioner (NP) workforce and promote their practice. Only one NP-specific tool, the Nurse Practitioner Primary Care Organizational Climate Questionnaire (NP-PCOCQ), measures NP organizational climate. We confirmed NP-PCOCQ's factor structure and established its predictive validity. A cross-sectional survey design was used to collect data from 314 NPs in Massachusetts in 2012. Confirmatory factor analysis and regression models were used. The 4-factor model characterized NP-PCOCQ. The NP-PCOCQ score predicted job satisfaction (beta = .36), supporting the use of the NP-PCOCQ to assess organizational climate in clinics. Further testing of NP-PCOCQ is needed.

  20. Vertical integration in medical settings: A brief introduction to its potential effects on professional psychology.

    Science.gov (United States)

    Sumerall, S W; Oehlert, M E; Trent, D D

    1995-12-01

    Vertical integration in medical settings typically involves the merging of independent physicians, physician groups, and hospitals to render an organized health care network. Such systems are considered to be vertical, as they may allow for a seamless continuation of services throughout the range of needs a patient may require. Mergers often result in the redefining of professional services offered in the acquired facility or across the network. As such, mergers have the potential of adversely impacting psychological practices. Professional psychology needs to take a proactive stance in this changing health care landscape. Research regarding empirically validated treatments and effects of psychological interventions on overall health-care costs needs to be properly disseminated to health care administrators to assure their knowledge of the utility of psychological services in the medical setting. Training psychologists to assume leadership positions in health-care institutions, gaining representation on hospital staff boards, and linking psychologists and physicians through collaborative training, to provide improved care, may allow for psychology to influence health care delivery.

  1. The use of an integrated variable fuzzy sets in water resources management

    Science.gov (United States)

    Qiu, Qingtai; Liu, Jia; Li, Chuanzhe; Yu, Xinzhe; Wang, Yang

    2018-06-01

Based on evaluation of the present situation of water resources and the development of water conservancy projects and the social economy, optimal allocation of regional water resources is an increasing need in water resources management, as well as the most effective way to promote a harmonious relationship between humans and water. In view of the limitations of traditional evaluations, which always choose a single-index model for the optimal allocation of regional water resources, an integrated variable fuzzy sets model (IVFS), based on the theory of variable fuzzy sets (VFS) and system dynamics (SD), is proposed in this paper to address dynamically complex problems in regional water resources management. The model is applied to evaluate the level of optimal allocation of the regional water resources of Zoucheng in China. Results show that the levels of the water resources allocation schemes range from 2.5 to 3.5, generally showing a trend toward lower levels. For optimal regional management of water resources, the model provides a measure for assessing water resources management that prominently improves the authenticity of the assessment by using the eigenvector of level H.
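The "eigenvector of level H" used above to rank allocation schemes can be read as a membership-weighted mean evaluation level: given membership degrees of a scheme to ordered levels 1..c, H is the weighted average level. This is a simplified sketch of that single step (the membership values below are invented; the full VFS model also includes relative membership functions and parameter variation).

```python
def level_eigenvector(memberships):
    """Membership-weighted mean level H for ordered levels 1..len(memberships)."""
    total = sum(memberships)
    return sum(h * u for h, u in enumerate(memberships, start=1)) / total

# Illustrative: four evaluation levels, most membership on level 3
H = level_eigenvector([0.1, 0.3, 0.4, 0.2])
```

Values of H between 2.5 and 3.5, as reported in the abstract, would indicate schemes sitting between the middle and lower evaluation levels.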

  2. [Comparison of the "Trigger" tool with the minimum basic data set for detecting adverse events in general surgery].

    Science.gov (United States)

    Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P

Surgery carries a high risk for the occurrence of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool (TT) with the Hospital National Health System registration of discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. This was an observational, descriptive, retrospective study of patients admitted to the general surgery department of a tertiary hospital and undergoing surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS registered for the same patients. Once the AE were identified, they were classified according to damage and to the extent to which they could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools, and the Hanley and McNeil test was used to compare them. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%; the TT also provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS, a statistically significant difference (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Publicado por Elsevier España, S.L.U. All rights reserved.
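The Hanley and McNeil comparison used above tests the difference between two AUCs measured on the same cases, given their correlation r. A sketch of the standard formulation follows; the sample sizes and correlation below are illustrative placeholders, not the study's actual data (only the two AUCs, 0.89 and 0.66, come from the abstract).

```python
from math import sqrt

def auc_se(auc, n_pos, n_neg):
    """Hanley & McNeil standard error of an AUC (n_pos cases, n_neg non-cases)."""
    q1 = auc / (2 - auc)
    q2 = 2 * auc ** 2 / (1 + auc)
    num = (auc * (1 - auc)
           + (n_pos - 1) * (q1 - auc ** 2)
           + (n_neg - 1) * (q2 - auc ** 2))
    return sqrt(num / (n_pos * n_neg))

def hanley_mcneil_z(auc1, auc2, n_pos, n_neg, r=0.0):
    """Z-statistic for the difference of two correlated AUCs."""
    se1, se2 = auc_se(auc1, n_pos, n_neg), auc_se(auc2, n_pos, n_neg)
    return (auc1 - auc2) / sqrt(se1 ** 2 + se2 ** 2 - 2 * r * se1 * se2)

# Hypothetical sample sizes and correlation, for illustration only
z = hanley_mcneil_z(0.89, 0.66, n_pos=120, n_neg=200, r=0.3)
```

With a gap of 0.23 between the AUCs, the z-statistic comfortably exceeds conventional significance thresholds for any plausible sample size, consistent with the P<.001 reported.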

  3. An efficient tool for metabolic pathway construction and gene integration for Aspergillus niger.

    Science.gov (United States)

    Sarkari, Parveen; Marx, Hans; Blumhoff, Marzena L; Mattanovich, Diethard; Sauer, Michael; Steiger, Matthias G

    2017-12-01

    Metabolic engineering requires functional genetic tools for easy and quick generation of multiple pathway variants. A genetic engineering toolbox for A. niger is presented, which facilitates the generation of strains carrying heterologous expression cassettes at a defined genetic locus. The system is compatible with Golden Gate cloning, which facilitates the DNA construction process and provides high design flexibility. The integration process is mediated by a CRISPR/Cas9 strategy involving the cutting of both the genetic integration locus (pyrG) as well as the integrating plasmid. Only a transient expression of Cas9 is necessary and the carrying plasmid is readily lost using a size-reduced AMA1 variant. A high integration efficiency into the fungal genome of up to 100% can be achieved, thus reducing the screening process significantly. The feasibility of the approach was demonstrated by the integration of an expression cassette enabling the production of aconitic acid in A. niger. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. A Conceptual Framework for Integration of Evidence-Based Design with Lighting Simulation Tools

    Directory of Open Access Journals (Sweden)

    Anahita Davoodi

    2017-09-01

Full Text Available The use of lighting simulation tools has been growing over the past years, which has improved lighting analysis. While computer simulations have proven to be a viable tool for analyzing lighting in physical environments, they have difficulty in assessing the effects of light on occupants' perception. Evidence-based design (EBD) is a design method that is gaining traction in building design due to its strength in providing means to assess the effects of built environments on humans. The aim of this study was to develop a conceptual framework for integrating EBD with lighting simulation tools. Based on a literature review, it was investigated how EBD and lighting simulation can be combined to provide a holistic lighting performance evaluation method. The results show that they can mutually benefit from each other. EBD makes it possible to evaluate and/or improve performance metrics by utilizing user feedback. On the other hand, performance metrics can be used for a better description of evidence, and to analyze the effects of lighting in more detail. The results also show that EBD can be used to evaluate light simulations to better understand when and how they should be performed. A framework is presented for the integration of lighting simulation and EBD.

  5. Day 1 for the Integrated Multi-Satellite Retrievals for GPM (IMERG) Data Sets

    Science.gov (United States)

    Huffman, G. J.; Bolvin, D. T.; Braithwaite, D.; Hsu, K. L.; Joyce, R.; Kidd, C.; Sorooshian, S.; Xie, P.

    2014-12-01

    The Integrated Multi-satellitE Retrievals for GPM (IMERG) is designed to compute the best time series of (nearly) global precipitation from "all" precipitation-relevant satellites and global surface precipitation gauge analyses. IMERG was developed to use GPM Core Observatory data as a reference for the international constellation of satellites of opportunity that constitute the GPM virtual constellation. Computationally, IMERG is a unified U.S. algorithm drawing on strengths in the three contributing groups, whose previous work includes: 1) the TRMM Multi-satellite Precipitation Analysis (TMPA); 2) the CPC Morphing algorithm with Kalman Filtering (K-CMORPH); and 3) the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks using a Cloud Classification System (PERSIANN-CCS). We review the IMERG design, development, testing, and current status. IMERG provides 0.1°x0.1° half-hourly data, and will be run at multiple times, providing successively more accurate estimates: 4 hours, 8 hours, and 2 months after observation time. In Day 1 the spatial extent is 60°N-S, for the period March 2014 to the present. In subsequent reprocessing the data will extend to fully global, covering the period 1998 to the present. Both the set of input data set retrievals and the IMERG system are substantially different than those used in previous U.S. products. The input passive microwave data are all being produced with GPROF2014, which is substantially upgraded compared to previous versions. For the first time, this includes microwave sounders. Accordingly, there is a strong need to carefully check the initial test data sets for performance. IMERG output will be illustrated using pre-operational test data, including the variety of supporting fields, such as the merged-microwave and infrared estimates, and the precipitation type. Finally, we will summarize the expected release of various output products, and the subsequent reprocessing sequence.

  6. Integrating Soft Set Theory and Fuzzy Linguistic Model to Evaluate the Performance of Training Simulation Systems.

    Science.gov (United States)

    Chang, Kuei-Hu; Chang, Yung-Chia; Chain, Kai; Chung, Hsiang-Yu

    2016-01-01

The advancement of high technologies and the arrival of the information age have changed modern warfare. The military forces of many countries have partially replaced real training drills with training simulation systems to achieve combat readiness. However, many types of training simulation systems are used in military settings. In addition, differences in system set-up time, functions, the environment, and the competency of system operators, as well as incomplete information, have made it difficult to evaluate the performance of training simulation systems. To address the aforementioned problems, this study integrated the analytic hierarchy process (AHP), soft set theory, and the fuzzy linguistic representation model to evaluate the performance of various training simulation systems. Furthermore, importance-performance analysis was adopted to examine the influence of cost savings and training safety of training simulation systems. The findings of this study are expected to facilitate applying military training simulation systems, avoiding waste of resources (e.g., low utility and idle time), and providing data for subsequent applications and analysis. To verify the method proposed in this study, numerical examples of the performance evaluation of training simulation systems were adopted and compared with the numerical results of an AHP and a novel AHP-based ranking technique. The results verified that not only could expert-provided questionnaire information be fully considered to lower the repetition rate of performance rankings, but a two-dimensional graph could also be used to help administrators allocate limited resources, thereby enhancing the investment benefits and training effectiveness of a training simulation system.
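The AHP step in the approach above derives criterion weights from a pairwise comparison matrix. A minimal sketch using the geometric-mean approximation of the priority vector (the matrix values are illustrative; the study's actual comparison data and its soft-set/fuzzy extensions are not reproduced here):

```python
from math import prod

def ahp_weights(matrix):
    """Geometric-mean priority vector for an AHP pairwise comparison matrix.

    matrix[i][j] holds how strongly criterion i is preferred over criterion j
    (with matrix[j][i] = 1 / matrix[i][j]). Returns weights summing to 1.
    """
    n = len(matrix)
    gms = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Perfectly consistent example: criterion 1 is twice criterion 2, four times criterion 3
weights = ahp_weights([[1.0, 2.0, 4.0],
                       [0.5, 1.0, 2.0],
                       [0.25, 0.5, 1.0]])
```

For a consistent matrix the geometric-mean method coincides with the principal-eigenvector solution; inconsistent expert judgements would additionally call for a consistency-ratio check.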

  7. Reliability and validity of a novel tool to comprehensively assess food and beverage marketing in recreational sport settings.

    Science.gov (United States)

    Prowse, Rachel J L; Naylor, Patti-Jean; Olstad, Dana Lee; Carson, Valerie; Mâsse, Louise C; Storey, Kate; Kirk, Sara F L; Raine, Kim D

    2018-05-31

Current methods for evaluating food marketing to children often study a single marketing channel or approach. As the World Health Organization urges the removal of unhealthy food marketing in children's settings, methods are needed that comprehensively explore the exposure and power of food marketing within a setting, across multiple marketing channels and approaches. The purpose of this study was to test the inter-rater reliability and the validity of a novel settings-based food marketing audit tool. The Food and beverage Marketing Assessment Tool for Settings (FoodMATS) was developed and its psychometric properties evaluated in five public recreation and sport facilities (sites); it was subsequently used in 51 sites across Canada for a cross-sectional analysis of food marketing. Raters recorded the count of food marketing occasions, the presence of child-targeted and sports-related marketing techniques, and the physical size of marketing occasions. Marketing occasions were classified by healthfulness. Inter-rater reliability was tested using Cohen's kappa (κ) and intra-class correlations (ICC). FoodMATS scores for each site were calculated using an algorithm that represented the theoretical impact of the marketing environment on food preferences, purchases, and consumption. Higher FoodMATS scores represented sites with higher exposure to, and more powerful (unhealthy, child-targeted, sports-related, large), food marketing. Validity of the scoring algorithm was tested through (1) Pearson's correlations between FoodMATS scores and facility sponsorship dollars, and (2) sequential multiple regression for predicting "Least Healthy" food sales from FoodMATS scores. Inter-rater reliability was very good to excellent (κ = 0.88-1.00). As a comprehensive audit of food marketing in recreation facilities, the FoodMATS provides a novel means to track changes in food marketing environments that can assist in developing and monitoring the impact of policies and interventions.
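The inter-rater statistic reported above, Cohen's kappa, corrects raw agreement between two raters for the agreement expected by chance. A minimal sketch with toy labels (not the study's rating data):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical judgements of the same items."""
    n = len(rater_a)
    cats = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance; values of 0.88-1.00, as in the abstract, are conventionally read as very good to excellent.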

  8. Capnography as a tool to detect metabolic changes in patients cared for in the emergency setting.

    Science.gov (United States)

    Cereceda-Sánchez, Francisco José; Molina-Mula, Jesús

    2017-05-15

To evaluate the usefulness of capnography for the detection of metabolic changes in spontaneously breathing patients, in the emergency and intensive care settings. An in-depth and structured bibliographical search was conducted in the databases EBSCOhost, Virtual Health Library, PubMed, Cochrane Library, among others, identifying studies that assessed the relationship between capnography values and the variables involved in blood acid-base balance. 19 studies were found; two were reviews and 17 were observational studies. In nine studies, capnography values were correlated with carbon dioxide (CO2), in eight with bicarbonate (HCO3), in three with lactate, and in four with blood pH. Most studies found a good correlation between capnography values and blood biomarkers, suggesting the usefulness of this parameter for detecting patients at risk of severe metabolic change in a fast, economical and accurate way.

  9. Hypogeal geological survey in the "Grotta del Re Tiberio" natural cave (Apennines, Italy): a valid tool for reconstructing the structural setting

    Science.gov (United States)

    Ghiselli, Alice; Merazzi, Marzio; Strini, Andrea; Margutti, Roberto; Mercuriali, Michele

    2011-06-01

As karst systems are natural windows to the underground, speleology, combined with geological surveys, can be a useful tool for helping understand the geological evolution of karst areas. In order to enhance the reconstruction of the structural setting in a gypsum karst area (Vena del Gesso, Romagna Apennines), a detailed analysis has been carried out on hypogeal data. Structural features (faults, fractures, tectonic foliations, bedding) have been mapped in the "Grotta del Re Tiberio" cave, in the nearby gypsum quarry tunnels and open pit benches. Five fracture systems and six fault systems have been identified. The fault systems have been further analyzed through stereographic projections and geometric-kinematic evaluations in order to reconstruct the relative chronology of these structures. This analysis led to the detection of two deformation phases. The results permitted linking of the hypogeal data with the surface data at both local and regional scales. At the local scale, fracture data collected underground have been compared with previous authors' surface data from the quarry area. The two data sets show a very good correspondence, as every underground fracture system matches one of the surface fracture systems. Moreover, in the cave, a larger number of fractures belonging to each system could be mapped. At the regional scale, the two deformation phases detected can be integrated into the structural setting of the study area, thereby enhancing the tectonic interpretation of the area (e.g., structures belonging to a new deformation phase, not reported before, have been identified underground). The detailed structural hypogeal survey has thus provided very useful data, both by integrating the existing information and by revealing new data not detected at the surface.
In particular, some small structures (e.g., displacement markers and short fractures) are better preserved in the hypogeal environment than on the surface where the outcropping

  10. Integrated hydraulic booster/tool string technology for unfreezing of stuck downhole strings in horizontal wells

    Science.gov (United States)

    Tian, Q. Z.

    2017-12-01

    It is common to use a jarring tool to unfreeze a stuck downhole string. However, in a horizontal well, jarring is poorly effective because of the friction caused by the deviated section; on the other hand, the forcing point can be located in the horizontal section by a hydraulic booster and the friction can be reduced, but applying a large-tonnage, constant pull force is time-consuming and can easily break the downhole string. A hydraulic booster-jar tool string has therefore been developed for unfreezing operations in horizontal wells. The technical solution involves three elements: a two-stage parallel spring-cylinder structure that increases the energy storage capacity of the spring accelerators; multiple groups of spring accelerators connected in series to increase the working stroke; and a hydraulic booster that intensifies the jarring force. The integrated unfreezing tool string based on these three elements can effectively overcome the friction caused by a deviated borehole, and thus unfreeze a stuck string through the interaction of the hydraulic booster and the mechanical jar, which together form an alternating dynamic load. Experimental results show that the jarring performance parameters of the hydraulic booster-jar unfreezing tool string for horizontal wells are in accordance with the original design requirements. Field technical parameters were then developed based on numerical simulation and experimental data. Field application shows that the hydraulic booster-jar unfreezing tool string is effective in freeing stuck downhole tools in a horizontal well; it reduces hook load by 80% and lessens the requirement on workover equipment. This provides a new technology for unfreezing a stuck downhole string in a horizontal well.
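The three design elements above rest on two elementary spring facts: stiffnesses add in parallel (more stored energy at the same compression), and working strokes add in series. A back-of-envelope check with invented numbers (the stiffness and compression values are illustrative assumptions, not taken from the paper):

```python
# Illustrative spring arithmetic; k_single and x are assumed values.

def spring_energy(k, x):
    """Energy stored in a linear spring of stiffness k compressed by x (J)."""
    return 0.5 * k * x**2

k_single = 2.0e5                  # one spring stage, N/m (assumed)
x = 0.05                          # compression, m (assumed)

k_parallel = 2 * k_single         # two stages in parallel: stiffnesses add
e_single = spring_energy(k_single, x)        # ~250 J
e_parallel = spring_energy(k_parallel, x)    # doubled storage at the same x

n_groups = 3                      # accelerator groups connected in series
total_stroke = n_groups * x       # series connection: strokes add
```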

  11. VarB Plus: An Integrated Tool for Visualization of Genome Variation Datasets

    KAUST Repository

    Hidayah, Lailatul

    2012-07-01

    Research on genomic sequences has been improving significantly as more advanced technology for sequencing has been developed. This opens enormous opportunities for sequence analysis. Various analytical tools have been built for purposes such as sequence assembly, read alignment, genome browsing, comparative genomics, and visualization. From the visualization perspective, there is an increasing trend towards the use of large-scale computation. However, more than computing power is required to produce an informative image. This is a challenge that we address by providing several ways of representing biological data in order to advance the inference endeavors of biologists. This thesis focuses on visualization of variations found in genomic sequences. We develop several visualization functions and embed them in an existing variation visualization tool as extensions. The tool we improved is named VarB; hence the name of our enhancement is VarB Plus. To the best of our knowledge, besides VarB, there is no tool that provides dynamic visualization of genome variation datasets together with statistical analysis; hence we focused our efforts on this tool. Dynamic visualization allows users to toggle different parameters on and off and see the results on the fly. The statistical analysis includes the Fixation Index, Relative Variant Density, and Tajima's D. The scope of our work includes plots of per-base genome coverage, Principal Coordinate Analysis (PCoA), integration with a read alignment viewer named LookSeq, and visualization of geo-biological data. In addition to a description of the embedded functionalities, their significance, and their limitations, future improvements are discussed. The result is four extensions embedded successfully in the original tool, which is built on the Qt framework in C++ and is therefore portable to numerous platforms. Our extensions have shown acceptable execution time in beta testing with various high-volume published datasets, as well as positive
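One of the statistics named above, Relative Variant Density, amounts to counting variants per fixed-size genomic window. The sketch below is a hypothetical illustration of that computation; the window size, positions, and per-base normalization are assumptions, not taken from VarB Plus:

```python
# Hypothetical sketch of a per-window variant density statistic.

def variant_density(positions, genome_length, window=1000):
    """Variants per base in each non-overlapping window of the genome."""
    n_windows = (genome_length + window - 1) // window
    counts = [0] * n_windows
    for pos in positions:            # 0-based variant coordinates
        counts[pos // window] += 1
    return [c / window for c in counts]

# five variants on a toy 3 kb genome -> three windows of density values
density = variant_density([10, 950, 1100, 1150, 2990], genome_length=3000)
```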

  12. Rough Sets as a Knowledge Discovery and Classification Tool for the Diagnosis of Students with Learning Disabilities

    Directory of Open Access Journals (Sweden)

    Yu-Chi Lin

    2011-02-01

    Full Text Available Due to the implicit characteristics of learning disabilities (LDs), the diagnosis of students with learning disabilities has long been a difficult issue. Artificial intelligence techniques like the artificial neural network (ANN) and support vector machine (SVM) have been applied to the LD diagnosis problem with satisfactory outcomes. However, special education teachers or professionals tend to be skeptical of these kinds of black-box predictors. In this study, we apply rough set theory (RST), which can not only perform as a classifier but may also produce meaningful explanations or rules, to the LD diagnosis application. Our experiments indicate that the RST approach is competitive as a tool for feature selection, and it performs better in terms of prediction accuracy than other rule-based algorithms such as the decision tree and RIPPER algorithms. We also propose to mix samples collected from sources with different LD diagnosis procedures and criteria. By pre-processing these mixed samples with simple and readily available clustering algorithms, we are able to improve the quality and support of the rules generated by the RST. Overall, our study shows that the rough set approach, as a classification and knowledge discovery tool, may have great potential to play an essential role in LD diagnosis.
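The explanatory appeal of RST over black-box models comes from its lower and upper approximations over indiscernibility classes: objects with identical attribute values are grouped, and a target set is bracketed by the classes certainly inside it and those possibly inside it. A minimal illustration (not the authors' code; the attribute names and records are invented):

```python
# Minimal rough-set approximation sketch with invented data.
from collections import defaultdict

def approximations(records, attributes, target):
    """Return (lower, upper) approximations of `target`, a set of object ids."""
    classes = defaultdict(set)
    for rid, attrs in records.items():
        key = tuple(attrs[a] for a in attributes)   # indiscernibility class
        classes[key].add(rid)
    lower, upper = set(), set()
    for members in classes.values():
        if members <= target:        # class certainly inside the target
            lower |= members
        if members & target:         # class possibly inside the target
            upper |= members
    return lower, upper

records = {1: {"reading": "low", "memory": "low"},
           2: {"reading": "low", "memory": "low"},
           3: {"reading": "high", "memory": "low"}}
lower, upper = approximations(records, ["reading", "memory"], target={1, 3})
# object 3 is alone in its class -> certain member; 1 shares a class
# with 2 -> only a possible member, which is what a rule would report
```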

  13. Simulation Tools and Techniques for Analyzing the Impacts of Photovoltaic System Integration

    Science.gov (United States)

    Hariri, Ali

    Solar photovoltaic (PV) energy integration in distribution networks is one of the fastest growing sectors of distributed energy integration. The growth in solar PV integration is incentivized by various clean power policies, global interest in solar energy, and reductions in the manufacturing and installation costs of solar energy systems. The increase in solar PV integration has raised a number of concerns regarding the potential impacts that might arise as a result of high PV penetration. Some impacts have already been recorded in networks with high PV penetration, such as in China, Germany, and the USA (Hawaii and California). Therefore, network planning is becoming more intricate as new technologies are integrated into the existing electric grid. These newly integrated technologies pose certain compatibility concerns regarding the existing electric grid infrastructure. PV integration impact studies are therefore becoming more essential in order to better understand how to advance solar PV integration efforts without introducing adverse impacts into the network. PV impact studies are important for understanding the nature of the newly introduced phenomena, and understanding the nature of the potential impacts is a key factor for mitigating and accommodating those impacts. Traditionally, electric power utilities have relied on phasor-based power flow simulations for planning their electric networks. However, the conventional, commercially available, phasor-based simulation tools do not provide proper visibility across a wide spectrum of electric phenomena. Moreover, different types of simulation approaches are suitable for specific types of studies. For instance, power flow software cannot be used for studying time-varying phenomena. At the same time, it is not practical to use electromagnetic transient (EMT) tools to perform power flow solutions. Therefore, some electric phenomena caused by the variability of PV generation are not visible using the conventional

  14. To select the best tool for generating 3D maintenance data and to set the detailed process for obtaining the 3D maintenance data

    Science.gov (United States)

    Prashanth, B. N.; Roy, Kingshuk

    2017-07-01

    Three-dimensional (3D) maintenance data provides a link between design and technical documentation, creating interactive 3D graphical training and maintenance material. It is difficult for an operator to always go through huge paper manuals or come running to the computer while maintaining a machine, which makes the maintenance work fatiguing. That being the case, a 3D animation makes maintenance work very simple, since there is no language barrier. The research deals with the generation of 3D maintenance data for any given machine. The best tool for obtaining the 3D maintenance data is selected and analyzed, and, using the same tool, a detailed process for extracting the 3D maintenance data for any machine is set. This project aims at selecting the best tool for obtaining 3D maintenance data and at setting out the detailed process for obtaining it. 3D maintenance reduces the use of big volumes of manuals, which cause human errors and make the work of an operator fatiguing; it would therefore help in training and maintenance and would increase productivity. When compared with Cortona 3D and Deep Exploration, 3Dvia proves to be better than both: it is good in data translation and it has the best renderings of the three 3D maintenance software packages. 3Dvia is very user friendly and has various options for creating 3D animations. Its Interactive Electronic Technical Publication (IETP) integration is also better than that of the other two packages. Hence 3Dvia proves to be the best software for obtaining 3D maintenance data for any machine.

  15. Regional Energy Planning Tool for Renewable Integrated Low-Energy District Heating Systems

    DEFF Research Database (Denmark)

    Tol, Hakan; Dincer, Ibrahim; Svendsen, Svend

    2013-01-01

    Low-energy district heating systems, operating at a low supply temperature of 55 °C and a return temperature of 25 °C, can be the energy solution for the prevailing heating infrastructure in urban areas, considering future energy schemes aiming at increased exploitation of renewable energy sources together... with low-energy houses in focus with intensified energy efficiency measures. Employing low-temperature operation makes it easy to exploit not only any type of heat source but also low-grade sources, i.e., renewable and industrial waste heat, which would otherwise be lost. In this chapter, a regional... energy planning tool is described, considering various energy conversion systems based on renewable energy sources supplying an integrated energy infrastructure involving a low-energy district heating network, a district cooling network, and an electricity grid. The developed tool is applied to two case...

  16. The Virtual UNICOS Process Expert: integration of Artificial Intelligence tools in Control Systems

    CERN Multimedia

    Vilches Calvo, I; Barillere, R

    2009-01-01

    UNICOS is a CERN framework to produce control applications. It provides operators with ways to interact with all process items, from the simplest (e.g. I/O channels) to the most abstract objects (e.g. a part of the plant). This possibility of fine-grained operation is particularly useful for recovering from abnormal situations, provided operators have the required knowledge. The Virtual UNICOS Process Expert project aims at providing operators with means to handle difficult operation cases for which the intervention of process experts is usually requested. The main idea of the project is to use the openness of UNICOS-based applications to integrate tools (e.g. Artificial Intelligence tools) which will act as Process Experts to analyze complex situations, and to propose and execute smooth recovery procedures.

  17. A Tool for Interactive Data Visualization: Application to Over 10,000 Brain Imaging and Phantom MRI Data Sets

    Directory of Open Access Journals (Sweden)

    Sandeep R Panta

    2016-03-01

    Full Text Available In this paper we propose a web-based approach for quick visualization of big data from brain magnetic resonance imaging (MRI) scans using a combination of an automated image capture and processing system, nonlinear embedding, and interactive data visualization tools. We draw upon thousands of MRI scans captured via the COllaborative Imaging and Neuroinformatics Suite (COINS). We then interface the output of several analysis pipelines based on structural and functional data to a t-distributed stochastic neighbor embedding (t-SNE) algorithm which reduces the number of dimensions for each scan in the input data set to two dimensions while preserving the local structure of data sets. Finally, we interactively display the output of this approach via a web page, based on the Data-Driven Documents (D3) JavaScript library. Two distinct approaches were used to visualize the data. In the first approach, we computed multiple quality control (QC) values from pre-processed data, which were used as inputs to the t-SNE algorithm. This approach helps in assessing the quality of each data set relative to others. In the second case, computed variables of interest (e.g. brain volume or voxel values from segmented gray matter images) were used as inputs to the t-SNE algorithm. This approach helps in identifying interesting patterns in the data sets. We demonstrate these approaches using multiple examples including (1) quality control measures calculated from phantom data over time, (2) quality control data from human functional MRI data across various studies, scanners, and sites, (3) volumetric and density measures from human structural MRI data across various studies, scanners and sites. Results from (1) and (2) show the potential of our approach to combine t-SNE data reduction with interactive color coding of variables of interest to quickly identify visually unique clusters of data (i.e. data sets with poor QC) and clustering of data by site. Results from (3) demonstrate
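As a self-contained sketch of the first approach (per-scan QC metrics reduced to two dimensions for plotting), the snippet below projects a mock QC matrix with PCA via plain NumPy SVD; a faithful reproduction of the paper's method would swap in t-SNE (e.g. scikit-learn's TSNE) at the marked line. All data here are invented.

```python
# PCA stands in for t-SNE in this dependency-light sketch; data are mock.
import numpy as np

rng = np.random.default_rng(0)
qc = rng.normal(size=(200, 8))        # 200 scans x 8 QC metrics (invented)
qc[:20] += 4.0                        # simulate a cluster of poor-QC scans

centered = qc - qc.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
embedding = centered @ vt[:2].T       # <- replace with t-SNE for the paper's method

# `embedding` is (200, 2): one x/y coordinate per scan, ready to serialize
# as JSON for an interactive D3 scatter plot; the poor-QC scans separate
# clearly along the first component.
```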

  18. Application of eco-friendly tools and eco-bio-social strategies to control dengue vectors in urban and peri-urban settings in Thailand.

    Science.gov (United States)

    Kittayapong, Pattamaporn; Thongyuan, Suporn; Olanratmanee, Phanthip; Aumchareoun, Worawit; Koyadun, Surachart; Kittayapong, Rungrith; Butraporn, Piyarat

    2012-12-01

    Dengue is considered one of the most important vector-borne diseases in Thailand. Its incidence is increasing despite routine implementation of national dengue control programmes. This study, conducted during 2010, aimed to demonstrate an application of integrated, community-based, eco-bio-social strategies in combination with locally-produced eco-friendly vector control tools in the dengue control programme, emphasizing urban and peri-urban settings in eastern Thailand. Three different community settings were selected and randomly assigned to intervention and control clusters. Key community leaders and relevant governmental authorities were approached to participate in this intervention programme. Ecohealth volunteers were identified and trained in each study community. They were selected among active community health volunteers and were trained by public health experts to conduct vector control activities in their own communities using environmental management in combination with eco-friendly vector control tools. These trained ecohealth volunteers carried out outreach health education and vector control during household visits. Management of public spaces and public properties, especially solid waste management, was efficiently carried out by local municipalities. A significant reduction in the pupae per person index in the intervention clusters compared to the control clusters was used as a proxy to determine the impact of this programme. Our community-based dengue vector control programme demonstrated a significant reduction in the pupae per person index during entomological surveys, which were conducted at two-month intervals from May 2010 for a total of six months in the intervention and control clusters. The programme also raised awareness of applying eco-friendly vector control approaches and increased intersectoral and household participation in dengue control activities. An eco-friendly dengue vector control programme was successfully implemented in

  19. Application of eco-friendly tools and eco-bio-social strategies to control dengue vectors in urban and peri-urban settings in Thailand

    Science.gov (United States)

    Kittayapong, Pattamaporn; Thongyuan, Suporn; Olanratmanee, Phanthip; Aumchareoun, Worawit; Koyadun, Surachart; Kittayapong, Rungrith; Butraporn, Piyarat

    2012-01-01

    Background Dengue is considered one of the most important vector-borne diseases in Thailand. Its incidence is increasing despite routine implementation of national dengue control programmes. This study, conducted during 2010, aimed to demonstrate an application of integrated, community-based, eco-bio-social strategies in combination with locally-produced eco-friendly vector control tools in the dengue control programme, emphasizing urban and peri-urban settings in eastern Thailand. Methodology Three different community settings were selected and randomly assigned to intervention and control clusters. Key community leaders and relevant governmental authorities were approached to participate in this intervention programme. Ecohealth volunteers were identified and trained in each study community. They were selected among active community health volunteers and were trained by public health experts to conduct vector control activities in their own communities using environmental management in combination with eco-friendly vector control tools. These trained ecohealth volunteers carried out outreach health education and vector control during household visits. Management of public spaces and public properties, especially solid waste management, was efficiently carried out by local municipalities. A significant reduction in the pupae per person index in the intervention clusters compared to the control clusters was used as a proxy to determine the impact of this programme. Results Our community-based dengue vector control programme demonstrated a significant reduction in the pupae per person index during entomological surveys, which were conducted at two-month intervals from May 2010 for a total of six months in the intervention and control clusters. The programme also raised awareness of applying eco-friendly vector control approaches and increased intersectoral and household participation in dengue control activities. Conclusion An eco-friendly dengue vector control

  20. System and method for integrating hazard-based decision making tools and processes

    Science.gov (United States)

    Hodgin, C Reed [Westminster, CO]

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  1. Electronic Dictionary as a Tool for Integration of Additional Learning Content

    Directory of Open Access Journals (Sweden)

    Stefka Kovacheva

    2015-12-01

    Full Text Available Electronic Dictionary as a Tool for Integration of Additional Learning Content This article discusses the electronic dictionary as an element of the „Bulgarian cultural and historical heritage under the protection of UNESCO” database developed at IMI (BAS) that will be used to integrate additional learning content. The electronic dictionary is described as an easily accessible reference book, offering information on the shape, meaning, usage, and origin of words connected to the cultural and historical heritage sites in Bulgaria protected by UNESCO. The dictionary targets 9–11-year-old students in Bulgarian schools, who study the subjects "Man and Society" in 4th grade and "History and Civilization" in 5th grade.

  2. Web-based management of research groups - using the right tools and an adequate integration strategy

    Energy Technology Data Exchange (ETDEWEB)

    Barroso, Antonio Carlos de Oliveira; Menezes, Mario Olimpio de, E-mail: barroso@ipen.b, E-mail: mario@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Grupo de Pesquisa em Gestao do Conhecimento Aplicada a Area Nuclear

    2011-07-01

    Nowadays, broad interest in a couple of interlinked subject areas can make the configuration of a research group much diversified, both in terms of its components and of the binding relationships that glue the group together. That is the case of the research group for knowledge management and its applications to nuclear technology - KMANT at IPEN, a living entity born 7 years ago that has sustainably attracted new collaborators. This paper describes the strategic planning of the group, its charter and credo, the present components of the group, and the diversified nature of their relations with the group and with IPEN. Then the technical competencies and current research lines (or programs) are described, as well as the research projects and the management scheme of the group. Subsequently, the web-based management and collaboration tools are described, as well as our experience with their use. KMANT has experimented with over 20 systems and software packages in this area, but we will focus on those aimed at: (a) web-based project management (RedMine, ClockinIT, Who does, PhProjekt and Dotproject); (b) teaching platform (Moodle); (c) mapping and knowledge representation tools (Cmap, Freemind and VUE); (d) simulation tools (Matlab, Vensim and NetLogo); (e) social network analysis tools (ORA, MultiNet and UciNet); (f) statistical analysis and modeling tools (R and SmartPLS). Special emphasis is given to the coupling with the group's permanent activities, like graduate courses and regular seminars, and to how newcomers are selected and trained to be able to enroll in the group. A global assessment of the role of the management strategy and the available tool set in the group's performance is presented. (author)

  3. Web-based management of research groups - using the right tools and an adequate integration strategy

    International Nuclear Information System (INIS)

    Barroso, Antonio Carlos de Oliveira; Menezes, Mario Olimpio de

    2011-01-01

    Nowadays, broad interest in a couple of interlinked subject areas can make the configuration of a research group much diversified, both in terms of its components and of the binding relationships that glue the group together. That is the case of the research group for knowledge management and its applications to nuclear technology - KMANT at IPEN, a living entity born 7 years ago that has sustainably attracted new collaborators. This paper describes the strategic planning of the group, its charter and credo, the present components of the group, and the diversified nature of their relations with the group and with IPEN. Then the technical competencies and current research lines (or programs) are described, as well as the research projects and the management scheme of the group. Subsequently, the web-based management and collaboration tools are described, as well as our experience with their use. KMANT has experimented with over 20 systems and software packages in this area, but we will focus on those aimed at: (a) web-based project management (RedMine, ClockinIT, Who does, PhProjekt and Dotproject); (b) teaching platform (Moodle); (c) mapping and knowledge representation tools (Cmap, Freemind and VUE); (d) simulation tools (Matlab, Vensim and NetLogo); (e) social network analysis tools (ORA, MultiNet and UciNet); (f) statistical analysis and modeling tools (R and SmartPLS). Special emphasis is given to the coupling with the group's permanent activities, like graduate courses and regular seminars, and to how newcomers are selected and trained to be able to enroll in the group. A global assessment of the role of the management strategy and the available tool set in the group's performance is presented. (author)

  4. Organizational contextual features that influence the implementation of evidence-based practices across healthcare settings: a systematic integrative review.

    Science.gov (United States)

    Li, Shelly-Anne; Jeffs, Lianne; Barwick, Melanie; Stevens, Bonnie

    2018-05-05

    Organizational contextual features have been recognized as important determinants for implementing evidence-based practices across healthcare settings for over a decade. However, implementation scientists have not reached consensus on which features are most important for implementing evidence-based practices. The aims of this review were to identify the most commonly reported organizational contextual features that influence the implementation of evidence-based practices across healthcare settings, and to describe how these features affect implementation. An integrative review was undertaken following literature searches in CINAHL, MEDLINE, PsycINFO, EMBASE, Web of Science, and Cochrane databases from January 2005 to June 2017. English language, peer-reviewed empirical studies exploring organizational context in at least one implementation initiative within a healthcare setting were included. Quality appraisal of the included studies was performed using the Mixed Methods Appraisal Tool. Inductive content analysis informed data extraction and reduction. The search generated 5152 citations. After removing duplicates and applying eligibility criteria, 36 journal articles were included. The majority (n = 20) of the study designs were qualitative, 11 were quantitative, and 5 used a mixed methods approach. Six main organizational contextual features (organizational culture; leadership; networks and communication; resources; evaluation, monitoring and feedback; and champions) were most commonly reported to influence implementation outcomes in the selected studies across a wide range of healthcare settings. We identified six organizational contextual features that appear to be interrelated and work synergistically to influence the implementation of evidence-based practices within an organization. Organizational contextual features did not influence implementation efforts independently from other features. Rather, features were interrelated and often influenced each

  5. A database and tool, IM Browser, for exploring and integrating emerging gene and protein interaction data for Drosophila

    Directory of Open Access Journals (Sweden)

    Parrish Jodi R

    2006-04-01

    Full Text Available Abstract Background Biological processes are mediated by networks of interacting genes and proteins. Efforts to map and understand these networks are resulting in the proliferation of interaction data derived from both experimental and computational techniques for a number of organisms. The volume of this data combined with the variety of specific forms it can take has created a need for comprehensive databases that include all of the available data sets, and for exploration tools to facilitate data integration and analysis. One powerful paradigm for the navigation and analysis of interaction data is an interaction graph or map that represents proteins or genes as nodes linked by interactions. Several programs have been developed for graphical representation and analysis of interaction data, yet there remains a need for alternative programs that can provide casual users with rapid, easy access to many existing and emerging data sets. Description Here we describe a comprehensive database of Drosophila gene and protein interactions collected from a variety of sources, including low- and high-throughput screens, genetic interactions, and computational predictions. We also present a program for exploring multiple interaction data sets and for combining data from different sources. The program, referred to as the Interaction Map (IM) Browser, is a web-based application for searching and visualizing interaction data stored in a relational database system. Use of the application requires no downloads and minimal user configuration or training, thereby enabling rapid initial access to interaction data. IM Browser was designed to readily accommodate and integrate new types of interaction data as they become available. Moreover, all information associated with interaction measurements or predictions and the genes or proteins involved are accessible to the user. This allows combined searches and analyses based on either common or technique-specific attributes.
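The data model described above is essentially a graph whose nodes are genes or proteins and whose edges carry their evidence source, so data sets from different techniques can be merged and filtered. A minimal sketch of that idea (not IM Browser's schema; the gene names and source labels are illustrative):

```python
# Toy interaction-map merge: edges keyed by unordered gene pair, each edge
# accumulating the set of techniques that support it. Data are illustrative.
from collections import defaultdict

edges = defaultdict(set)                 # (gene_a, gene_b) -> {sources}

def add_interactions(pairs, source):
    for a, b in pairs:
        edges[tuple(sorted((a, b)))].add(source)

add_interactions([("dpp", "tkv"), ("dpp", "put")], source="yeast-two-hybrid")
add_interactions([("dpp", "tkv")], source="computational-prediction")

# an edge supported by more than one technique is higher confidence
multi_source = [pair for pair, srcs in edges.items() if len(srcs) > 1]
# -> [("dpp", "tkv")]
```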

  6. Evaluating the Auto-MODS Assay, a Novel Tool for Tuberculosis Diagnosis for Use in Resource-Limited Settings

    Science.gov (United States)

    Wang, Linwei; Mohammad, Sohaib H.; Li, Qiaozhi; Rienthong, Somsak; Rienthong, Dhanida; Nedsuwan, Supalert; Mahasirimongkol, Surakameth; Yasui, Yutaka

    2014-01-01

    There is an urgent need for simple, rapid, and affordable diagnostic tests for tuberculosis (TB) to combat the great burden of the disease in developing countries. The microscopic observation drug susceptibility assay (MODS) is a promising tool to fill this need, but it is not widely used due to concerns regarding its biosafety and efficiency. This study evaluated the automated MODS (Auto-MODS), which operates on principles similar to those of MODS but with several key modifications, making it an appealing alternative to MODS in resource-limited settings. In the operational setting of Chiang Rai, Thailand, we compared the performance of Auto-MODS with the gold standard liquid culture method in Thailand, mycobacterial growth indicator tube (MGIT) 960 plus the SD Bioline TB Ag MPT64 test, in terms of accuracy and efficiency in differentiating TB and non-TB samples as well as distinguishing TB and multidrug-resistant (MDR) TB samples. Sputum samples from clinically diagnosed TB and non-TB subjects across 17 hospitals in Chiang Rai were consecutively collected from May 2011 to September 2012. A total of 360 samples were available for evaluation, of which 221 (61.4%) were positive and 139 (38.6%) were negative for mycobacterial cultures according to MGIT 960. Of the 221 true-positive samples, Auto-MODS identified 212 as positive and 9 as negative (sensitivity, 95.9%; 95% confidence interval [CI], 92.4% to 98.1%). Of the 139 true-negative samples, Auto-MODS identified 135 as negative and 4 as positive (specificity, 97.1%; 95% CI, 92.8% to 99.2%). The median time to culture positivity was 10 days, with an interquartile range of 8 to 13 days for Auto-MODS. Auto-MODS is an effective and cost-sensitive alternative diagnostic tool for TB diagnosis in resource-limited settings. PMID:25378569
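The accuracy figures quoted above follow directly from the reported counts (212 of 221 culture-positive and 135 of 139 culture-negative samples correctly identified). The sketch below recomputes them and attaches a Wilson score interval; the study's own CI method may differ (e.g. an exact interval), so the bounds are illustrative.

```python
# Sensitivity/specificity from the reported Auto-MODS counts, with a
# Wilson 95% score interval (the paper's exact CI method is an assumption).
from math import sqrt

def proportion_ci(successes, total, z=1.96):
    """Point estimate and Wilson score interval for a binomial proportion."""
    p = successes / total
    denom = 1 + z**2 / total
    centre = (p + z**2 / (2 * total)) / denom
    half = z * sqrt(p * (1 - p) / total + z**2 / (4 * total**2)) / denom
    return p, centre - half, centre + half

sens, s_lo, s_hi = proportion_ci(212, 221)   # vs. MGIT-positive samples
spec, p_lo, p_hi = proportion_ci(135, 139)   # vs. MGIT-negative samples
print(f"sensitivity {sens:.1%} ({s_lo:.1%}-{s_hi:.1%})")
print(f"specificity {spec:.1%} ({p_lo:.1%}-{p_hi:.1%})")
```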

  7. A Novel Computational Tool for Mining Real-Life Data: Application in the Metastatic Colorectal Cancer Care Setting.

    Directory of Open Access Journals (Sweden)

    Nava Siegelmann-Danieli

    Full Text Available Randomized clinical trials constitute the gold standard for evaluating new anti-cancer therapies; however, real-life data are key in complementing clinically useful information. We developed a computational tool for real-life data analysis and applied it to the metastatic colorectal cancer (mCRC) setting. This tool addressed the impact of oncology/non-oncology parameters on treatment patterns and clinical outcomes. The developed tool enables extraction of any computerized information, including comorbidities and use of drugs (oncological/non-oncological) per individual HMO member. The study in which we evaluated this tool was a retrospective cohort study that included Maccabi Healthcare Services members with mCRC receiving bevacizumab with fluoropyrimidines (FP), FP plus oxaliplatin (FP-O), or FP plus irinotecan (FP-I) in the first line between 9/2006 and 12/2013. The analysis included 753 patients, of whom 15.4% underwent subsequent metastasectomy (the Surgery group). For the entire cohort, median overall survival (OS) was 20.5 months; in the Surgery group, median duration of bevacizumab-containing therapy (DOT) pre-surgery was 6.1 months; median OS was not reached. In the Non-surgery group, median OS and DOT were 18.7 and 11.4 months, respectively; no significant OS differences were noted between FP-O and FP-I, whereas FP use was associated with shorter OS (12.3 months; p < 0.002); notably, these patients were older. Patients who received both FP-O- and FP-I-based regimens achieved numerically longer OS vs. those who received only one of these regimens (22.1 [19.9-24.0] vs. 18.9 [15.5-21.9] months). Among patients assessed for wild-type KRAS and treated with a subsequent anti-EGFR agent, OS was 25.4 months and 18.7 months for 124 treated vs. 37 non-treated patients (non-significant). Cox analysis (controlling for age and gender) identified several non-oncology parameters associated with poorer clinical outcomes, including concurrent use of diuretics and proton

  8. A web GIS based integrated flood assessment modeling tool for coastal urban watersheds

    Science.gov (United States)

    Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.

    2014-03-01

    Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area considered and to run the model within this framework. The integrated flood model consists of a mass-balance-based 1-D overland flow model, a 1-D finite-element channel flow model based on the diffusion wave approximation, and a quasi-2-D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, viz. Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and jQuery technologies. Its user interface is developed using OpenLayers, and the attribute data are stored in the MySQL open-source DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment to facilitate data access and visualization of GIS datasets and simulation results.
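    The quasi-2-D raster inundation component described above is essentially a continuity (mass-balance) statement applied cell by cell. As a rough illustrative sketch only (the actual IFAM code is written in MATLAB; the grid values and the transfer coefficient `k` below are hypothetical), water can be routed from each raster cell toward lower-water-surface neighbours while conserving total volume:

```python
def spread_flood(elev, depth, dt=1.0, k=0.5):
    """One explicit step of a toy quasi-2-D raster inundation model.

    elev, depth: 2-D lists (ground elevation and water depth per cell).
    Water moves from a cell to any neighbour with a lower water surface
    (elevation + depth); total water volume is conserved because every
    transfer is subtracted from one cell and added to another.
    """
    rows, cols = len(elev), len(elev[0])
    new = [row[:] for row in depth]          # write to a copy, read old state
    for i in range(rows):
        for j in range(cols):
            h = elev[i][j] + depth[i][j]     # water surface level of this cell
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    hn = elev[ni][nj] + depth[ni][nj]
                    if h > hn:
                        # cap the flux so a cell can never go negative:
                        # at most 1/4 of its depth per neighbour per step
                        flux = min(depth[i][j], k * (h - hn)) * dt / 4
                        new[i][j] -= flux
                        new[ni][nj] += flux
    return new
```

A single step on flat terrain with one wet cell spreads water symmetrically into the neighbours without changing the total volume.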

  9. Systematic Review: Concept and Tool Development with Application in the Integrated Risk Information System (IRIS) Assessment Process

    Science.gov (United States)

    Systematic Review: Concept and tool development with application to the National Toxicology Program (NTP) and the Integrated Risk Information System (IRIS) Assessment Processes. There is growing interest within the environmental health community to incorporate systematic review m...

  10. An Integrated Tool for Low Thrust Optimal Control Orbit Transfers in Interplanetary Trajectories

    Science.gov (United States)

    Dargent, T.; Martinot, V.

    In recent years significant progress has been made in optimal-control orbit transfers using low-thrust electric propulsion for interplanetary missions. The system objective is always the same: decrease the transfer duration and increase the useful satellite mass. The optimal control strategy for minimum time to orbit or minimum fuel consumption requires sophisticated mathematical tools, most of the time dedicated to a specific mission and therefore hardly reusable. To improve this situation and enable Alcatel Space to perform rather quick trajectory design as requested by mission analysis, we have developed a software tool, T-3D, dedicated to optimal-control orbit transfers, which integrates various initial and terminal rendezvous conditions - e.g. fixed arrival time for a planet encounter - and engine thrust profiles - e.g. thrust law variation with respect to the distance to the Sun. This single and quite versatile tool supports analyses such as minimum consumption for orbit insertion around a planet from a hyperbolic trajectory, interplanetary orbit transfers, low-thrust minimum-time multiple-revolution orbit transfers, etc. From a mathematical point of view, the software relies on the minimum principle formulation to find the necessary conditions of optimality. The satellite dynamics is a two-body model and relies on an equinoctial formulation of the Gauss equations. This choice has been made for numerical reasons and to solve the two-point boundary value problem more quickly. To handle the classical problem of initializing the co-state variables, problems simpler than the actual one can be solved straightforwardly by the tool, and the resulting co-state values are kept as a first guess for a more complex problem. Finally, a synthesis of the test cases is presented to illustrate the capabilities of the tool, mixing examples of interplanetary missions, orbit insertions, and multiple-revolution orbit transfers.
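    The minimum-principle formulation mentioned above turns the optimal-control problem into a two-point boundary value problem in the states and co-states: unknown initial co-states must be guessed, the dynamics integrated forward, and the guess adjusted until the terminal conditions hold. The toy sketch below (hypothetical scalar dynamics y'' = 6x with y(0) = 0, y(1) = 1, unrelated to the actual Gauss equinoctial equations used by T-3D) illustrates this shooting idea with bisection:

```python
def integrate(s, n=1000):
    """Euler-integrate y'' = 6x from x = 0 with y(0) = 0, y'(0) = s.

    Returns y(1). The exact solution for s = 0 is y = x**3, so the
    shooting target y(1) = 1 is met near s = 0 (up to Euler error).
    """
    h = 1.0 / n
    x, y, v = 0.0, 0.0, s
    for _ in range(n):
        y += h * v
        v += h * 6.0 * x
        x += h
    return y

def shoot(target=1.0, lo=-10.0, hi=10.0):
    """Bisection shooting: find the initial slope y'(0) hitting y(1) = target.

    Works here because y(1) is monotone increasing in the initial slope,
    mirroring how a simpler problem's co-states seed a harder one.
    """
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if integrate(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Real indirect solvers replace the bisection with a multidimensional root-finder and, as the abstract notes, warm-start it from the solution of a simpler neighbouring problem.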

  11. BEopt-CA (Ex): A Tool for Optimal Integration of EE, DR and PV in Existing California Homes

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, Craig [National Renewable Energy Lab. (NREL), Golden, CO (United States); Horowitz, Scott [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maguire, Jeff [National Renewable Energy Lab. (NREL), Golden, CO (United States); Velasco, Paulo Tabrares [National Renewable Energy Lab. (NREL), Golden, CO (United States); Springer, David [Davis Energy Group, Davis, CA (United States); Coates, Peter [Davis Energy Group, Davis, CA (United States); Bell, Christy [Davis Energy Group, Davis, CA (United States); Price, Snuller [Energy & Environmental Economics, San Francisco, CA (United States); Sreedharan, Priya [Energy & Environmental Economics, San Francisco, CA (United States); Pickrell, Katie [Energy & Environmental Economics, San Francisco, CA (United States)

    2014-04-01

    This project targeted the development of a software tool, BEopt-CA (Ex) (Building Energy Optimization Tool for California Existing Homes), that aims to facilitate balanced integration of energy efficiency (EE), demand response (DR), and photovoltaics (PV) in the residential retrofit market. The intent is to provide utility program managers and contractors in the EE/DR/PV marketplace with a means of balancing the integration of EE, DR, and PV

  12. SOCIOCULTURAL INTEGRATION AS A TOOL FOR CONSTRUCTIVE CONFLICT RESOLUTION: THE CASE OF THE NORTH CAUCASUS

    Directory of Open Access Journals (Sweden)

    M. E. Popov

    2017-01-01

    Full Text Available The paper is devoted to research on sociocultural integration as a tool for resolving regional conflicts. The modern theory of conflict resolution focuses on the ability of sociocultural integration to transform destructive identity-based conflicts into conflicts of interest. The author considers the systemic factors of identity-based conflicts and emphasizes the destabilizing role of the politicization of ethnicity. Ethnic mobilization, social inequalities, economic polarization and a crisis of civic identity are structural factors that determine the acuity of ethnic tension and, as a result, the escalation of regional identity conflicts. Contradictions between system modernization and social disintegration are the primary source of identity conflicts in the North Caucasus. Regionalization takes a conflict-prone form in this case, i.e. the specifics of regional conflicts are associated with a collision between static (traditionalization) and dynamic (modernization) types of social development. Structurally, escalation of violence in regional conflicts is determined by the intensity and scope of ethnic mobilization and social dissatisfaction as necessary conditions of a collision. Regional conflicts affect existentially meaningful collective values and group identities, which is why the participants become emotionally involved in identification conflicts; due to their emotional charge and irrationality, identity conflicts are no longer a means of overcoming social frustrations but a destructive goal in themselves, i.e. ethnic polarization and negative cultural stereotypes in perceiving “the others” play a key role in initiating such conflicts. The following must be considered when discussing the anti-conflict mechanisms of sociocultural integration in the North Caucasus. First, sociocultural integration is a political project whose content is determined to a wide extent by the defense challenges of the polyethnic Russian society. Second, development of the

  13. Enhancing adult therapeutic interpersonal relationships in the acute health care setting: an integrative review

    Directory of Open Access Journals (Sweden)

    Kornhaber R

    2016-10-01

    Full Text Available Rachel Kornhaber,1 Kenneth Walsh,1,2 Jed Duff,1,3 Kim Walker1,3 1School of Health Sciences, Faculty of Health, University of Tasmania, Alexandria, NSW, 2Tasmanian Health Services – Southern Region, Hobart, TAS, 3St Vincent’s Private Hospital, Sydney, NSW, Australia Abstract: Therapeutic interpersonal relationships are the primary component of all health care interactions that facilitate the development of positive clinician–patient experiences. Therapeutic interpersonal relationships have the capacity to transform and enrich the patients’ experiences. Consequently, with an increasing necessity to focus on patient-centered care, it is imperative for health care professionals to therapeutically engage with patients to improve health-related outcomes. Studies were identified through an electronic search, using the PubMed, Cumulative Index to Nursing and Allied Health Literature, and PsycINFO databases of peer-reviewed research, limited to the English language, with search terms developed to reflect therapeutic interpersonal relationships between health care professionals and patients in the acute care setting. This study found that therapeutic listening, responding to patient emotions and unmet needs, and patient centeredness were key characteristics of strategies for improving therapeutic interpersonal relationships. Keywords: health, acute care, therapeutic interpersonal relationships, relational care, integrative review

  14. An integrated extended Kalman filter–implicit level set algorithm for monitoring planar hydraulic fractures

    International Nuclear Information System (INIS)

    Peirce, A; Rochinha, F

    2012-01-01

    We describe a novel approach to the inversion of elasto-static tiltmeter measurements to monitor planar hydraulic fractures propagating within three-dimensional elastic media. The technique combines the extended Kalman filter (EKF), which predicts and updates state estimates using tiltmeter measurement time-series, with a novel implicit level set algorithm (ILSA), which solves the coupled elasto-hydrodynamic equations. The EKF and ILSA are integrated to produce an algorithm to locate the unknown fracture-free boundary. A scaling argument is used to derive a strategy to tune the algorithm parameters to enable measurement information to compensate for unmodeled dynamics. Synthetic tiltmeter data for three numerical experiments are generated by introducing significant changes to the fracture geometry by altering the confining geological stress field. Even though there is no confining stress field in the dynamic model used by the new EKF-ILSA scheme, it is able to use synthetic data to arrive at remarkably accurate predictions of the fracture widths and footprints. These experiments also explore the robustness of the algorithm to noise and to placement of tiltmeter arrays operating in the near-field and far-field regimes. In these experiments, the appropriate parameter choices and strategies to improve the robustness of the algorithm to significant measurement noise are explored. (paper)
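    The EKF half of the scheme is the standard predict/update recursion; the novelty in the paper lies in coupling it to the ILSA forward model. As a minimal, purely illustrative sketch (scalar state and measurement, whereas the real fracture-monitoring problem is high-dimensional), the measurement update step looks like:

```python
def ekf_update(x, P, z, h, H, R):
    """One scalar extended Kalman filter measurement update.

    x: prior state estimate      P: prior estimate variance
    z: measurement               h: (nonlinear) measurement function
    H: derivative dh/dx at x     R: measurement noise variance
    Returns the posterior estimate and variance.
    """
    y = z - h(x)            # innovation: measurement minus prediction
    S = H * P * H + R       # innovation variance
    K = P * H / S           # Kalman gain
    return x + K * y, (1.0 - K * H) * P
```

With a linear measurement and equal prior and noise variances, the update splits the difference between prediction and measurement and halves the uncertainty, which is the expected textbook behaviour.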

  15. Integrating Genomic Data Sets for Knowledge Discovery: An Informed Approach to Management of Captive Endangered Species

    Directory of Open Access Journals (Sweden)

    Kristopher J. L. Irizarry

    2016-01-01

    Full Text Available Many endangered captive populations exhibit reduced genetic diversity resulting in health issues that impact reproductive fitness and quality of life. Numerous cost-effective genomic sequencing and genotyping technologies provide unparalleled opportunity for incorporating genomics knowledge in management of endangered species. Genomic data, such as sequence data, transcriptome data, and genotyping data, provide critical information about a captive population that, when leveraged correctly, can be utilized to maximize population genetic variation while simultaneously reducing unintended introduction or propagation of undesirable phenotypes. Current approaches aimed at managing endangered captive populations utilize species survival plans (SSPs) that rely upon mean kinship estimates to maximize genetic diversity while simultaneously avoiding artificial selection in the breeding program. However, as genomic resources increase for each endangered species, the potential knowledge available for management also increases. Unlike model organisms in which considerable scientific resources are used to experimentally validate genotype-phenotype relationships, endangered species typically lack the necessary sample sizes and economic resources required for such studies. Even so, in the absence of experimentally verified genetic discoveries, genomics data still provides value. In fact, bioinformatics and comparative genomics approaches offer mechanisms for translating these raw genomics data sets into integrated knowledge that enables an informed approach to endangered species management.

  16. Delineating Facies Spatial Distribution by Integrating Ensemble Data Assimilation and Indicator Geostatistics with Level Set Transformation.

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, Glenn Edward; Song, Xuehang; Ye, Ming; Dai, Zhenxue; Zachara, John; Chen, Xingyuan

    2017-03-01

    A new approach is developed to delineate the spatial distribution of discrete facies (geological units that have unique distributions of hydraulic, physical, and/or chemical properties) conditioned not only on direct data (measurements directly related to facies properties, e.g., grain size distribution obtained from borehole samples) but also on indirect data (observations indirectly related to facies distribution, e.g., hydraulic head and tracer concentration). Our method integrates for the first time ensemble data assimilation with traditional transition probability-based geostatistics. The concept of the level set is introduced to build a shape parameterization that allows transformation between discrete facies indicators and continuous random variables. The spatial structure of different facies is simulated by indicator models using conditioning points selected adaptively during the iterative process of data assimilation. To evaluate the new method, a two-dimensional semi-synthetic example is designed to estimate the spatial distribution and permeability of two distinct facies from transient head data induced by pumping tests. The example demonstrates that the new method adequately captures the spatial pattern of facies distribution by imposing spatial continuity through conditioning points. The new method also reproduces the overall response of the hydraulic head field with better accuracy than data assimilation with no constraints on the spatial continuity of facies.
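    The level-set transformation referred to above can be illustrated in miniature: a continuous field, which an ensemble method can update, is thresholded into discrete facies indicators, and an indicator sequence can in turn be re-parameterized as a signed distance to the nearest facies boundary. The sketch below is a simplified, hypothetical version of that idea (1-D re-parameterization, fixed threshold), not the authors' implementation:

```python
def facies_indicator(phi, threshold=0.0):
    """Discrete facies from a continuous level-set field:
    facies 1 where phi exceeds the threshold, facies 0 elsewhere."""
    return [[1 if v > threshold else 0 for v in row] for row in phi]

def signed_distance_1d(indicator):
    """Continuous re-parameterization of a 1-D facies indicator sequence:
    signed distance to the nearest facies boundary, positive inside
    facies 1. This is the kind of transform that lets an ensemble
    smoother update a continuous variable instead of a categorical one."""
    n = len(indicator)
    # boundaries sit halfway between cells whose indicators differ
    boundaries = [i + 0.5 for i in range(n - 1) if indicator[i] != indicator[i + 1]]
    out = []
    for i, v in enumerate(indicator):
        d = min((abs(i - b) for b in boundaries), default=float(n))
        out.append(d if v == 1 else -d)
    return out
```

Thresholding the signed-distance field at zero recovers the original indicators, so the two representations are interchangeable.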

  17. Integrating Genomic Data Sets for Knowledge Discovery: An Informed Approach to Management of Captive Endangered Species.

    Science.gov (United States)

    Irizarry, Kristopher J L; Bryant, Doug; Kalish, Jordan; Eng, Curtis; Schmidt, Peggy L; Barrett, Gini; Barr, Margaret C

    2016-01-01

    Many endangered captive populations exhibit reduced genetic diversity resulting in health issues that impact reproductive fitness and quality of life. Numerous cost-effective genomic sequencing and genotyping technologies provide unparalleled opportunity for incorporating genomics knowledge in management of endangered species. Genomic data, such as sequence data, transcriptome data, and genotyping data, provide critical information about a captive population that, when leveraged correctly, can be utilized to maximize population genetic variation while simultaneously reducing unintended introduction or propagation of undesirable phenotypes. Current approaches aimed at managing endangered captive populations utilize species survival plans (SSPs) that rely upon mean kinship estimates to maximize genetic diversity while simultaneously avoiding artificial selection in the breeding program. However, as genomic resources increase for each endangered species, the potential knowledge available for management also increases. Unlike model organisms in which considerable scientific resources are used to experimentally validate genotype-phenotype relationships, endangered species typically lack the necessary sample sizes and economic resources required for such studies. Even so, in the absence of experimentally verified genetic discoveries, genomics data still provides value. In fact, bioinformatics and comparative genomics approaches offer mechanisms for translating these raw genomics data sets into integrated knowledge that enables an informed approach to endangered species management.
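    The mean-kinship criterion underlying the SSPs mentioned above is simple to state: each candidate's mean kinship is its average kinship to the whole population, and candidates with the lowest values carry the rarest genomes, so breeding them maximizes retained diversity. A minimal sketch, using a small hypothetical kinship matrix:

```python
def mean_kinship(K):
    """Mean kinship of each individual: the average of its kinship
    coefficients with every individual in the population (including
    itself, as in standard MK calculations). K is a symmetric matrix."""
    n = len(K)
    return [sum(row) / n for row in K]

def rank_breeders(K):
    """Rank individuals for breeding priority: lowest mean kinship
    (i.e., genetically rarest) first."""
    mk = mean_kinship(K)
    return sorted(range(len(mk)), key=lambda i: mk[i])
```

With a 3-individual example where individual 2 is least related to the others, individual 2 comes out as the top breeding candidate.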

  18. Evaluating the Outcomes of Transactional Analysis and Integrative Counselling Psychology within UK Primary Care Settings

    Directory of Open Access Journals (Sweden)

    Biljana van Rijn

    2011-07-01

    Full Text Available The paper reports on a naturalistic study that replicated the evaluative design associated with the UK National Health Service initiative IAPT − Improving Access to Psychological Therapies (CSIP 2008, NHS 2011) − as previously used to assess Cognitive Behavioural Therapy (CBT), with the aim of evaluating 12-session treatments for anxiety and depression, applying Transactional Analysis and Integrative Counselling Psychology approaches within real clinical settings in primary care. Standard outcome measures were used in line with the IAPT model (CORE 10 and 34, GAD-7, PHQ-9), supplemented with measurement of the working alliance (WAI; Horvath, 1986), an additional depression inventory, the BDI-II (Beck, 1996), and adherence to the therapeutic model using newly designed questionnaires. Results indicated that severity of problems was reduced using either approach, comparable to Cognitive Behavioural Therapy; that initial severity was predictive of outcome; and that working alliance increased as therapy progressed but was not directly related to outcomes. Adherence was high for both approaches. Several areas for enhancements to future research are suggested.

  19. The Integrated Medical Model: A Risk Assessment and Decision Support Tool for Human Space Flight Missions

    Science.gov (United States)

    Kerstman, Eric L.; Minard, Charles; FreiredeCarvalho, Mary H.; Walton, Marlei E.; Myers, Jerry G., Jr.; Saile, Lynn G.; Lopez, Vilma; Butler, Douglas J.; Johnson-Throop, Kathy A.

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) and its use as a risk assessment and decision support tool for human space flight missions. The IMM is an integrated, quantified, evidence-based decision support tool useful to NASA crew health and mission planners. It is intended to assist in optimizing crew health, safety and mission success within the constraints of the space flight environment for in-flight operations. It uses ISS data to assist in planning for the Exploration Program and is not intended to assist in post-flight research. The IMM was used to update the Probabilistic Risk Assessment (PRA) for the purpose of updating forecasts for the conditions requiring evacuation (EVAC) or Loss of Crew Life (LOC) for the ISS. The IMM validation approach includes comparison with actual events and involves both qualitative and quantitative approaches. The results of these comparisons are reviewed. Another use of the IMM is to optimize the medical kits, taking into consideration the specific mission and the crew profile. An example of the use of the IMM to optimize the medical kits is reviewed.

  20. INTEGRATING CORPUS-BASED RESOURCES AND NATURAL LANGUAGE PROCESSING TOOLS INTO CALL

    Directory of Open Access Journals (Sweden)

    Pascual Cantos Gomez

    2002-06-01

    Full Text Available This paper aims at presenting a survey of computational linguistic tools presently available but whose potential has been neither fully considered nor exploited to the full in modern CALL. It starts with a discussion of the rationale of DDL in language learning, presenting typical DDL activities, DDL software, and potential extensions of non-typical DDL software (electronic dictionaries and electronic dictionary facilities) to DDL. An extended section is devoted to describing NLP technology and how it can be integrated into CALL, within already existing software or as stand-alone resources. A range of NLP tools is presented (MT programs, taggers, lemmatizers, parsers and speech technologies), with special emphasis on tagged concordancing. The paper finishes with a number of reflections and ideas on how language technologies can be used efficiently within the language learning context and how extensive exploration and integration of these technologies might change and extend both modern CALL and the present language learning paradigm.

  1. Development of an integrated knowledge-base and its management tool for computerized alarm processing system

    International Nuclear Information System (INIS)

    Heo, Gyun Young; Choi, Seong Soo; Kim, Han Gon; Chang, Soon Heung

    1997-01-01

    For a long time, a number of alarm processing techniques have been researched to reduce the number of actuated alarms that operators must deal with, during abnormal as well as normal conditions. However, in actual alarm processing systems the strategy has been to use only systems based on a few well-proven techniques as part of an alarm annunciation system, for reasons of effectiveness and reliability. As a consequence, alarm processing systems suffer from difficult knowledge-base maintenance problems and limited scope for expansion or enhancement. To address these shortcomings, an integrated knowledge-base that can express the general information related to all the alarm processing techniques is proposed, and its management tool, Knowledge Input Tool for Alarm (KIT-A), which can handle the data of the knowledge-base efficiently, has been developed. Since the integrated knowledge-base with KIT-A can manipulate all the alarm information without modification of the alarm processing system itself, it is expected to considerably advance the overall maintainability and extensibility of alarm processing systems.
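    One classic alarm-processing technique that such an integrated knowledge-base must be able to represent is causal (consequence) suppression: hiding alarms that are expected side effects of another alarm that is already active. A minimal sketch, with hypothetical alarm identifiers and a hand-written rule table standing in for the knowledge-base:

```python
def suppress_consequences(active_alarms, causal_rules):
    """Rule-based alarm reduction.

    active_alarms: ordered list of currently actuated alarm ids.
    causal_rules:  mapping alarm id -> alarms that can explain it.
    Returns the alarms to annunciate: any alarm whose known cause is
    also active is suppressed as an expected consequence.
    """
    active = set(active_alarms)
    return [a for a in active_alarms
            if not any(cause in active for cause in causal_rules.get(a, ()))]
```

Here a pump trip explains the downstream low-flow and low-pressure alarms, so only the root-cause alarm is annunciated.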

  2. Towards an integrated petrophysical tool for multiphase flow properties of core samples

    Energy Technology Data Exchange (ETDEWEB)

    Lenormand, R. [Institut Francais du Petrole, Rueil Malmaison (France)

    1997-08-01

    This paper describes the first use of an Integrated Petrophysical Tool (IPT) on reservoir rock samples. The IPT simultaneously measures the following petrophysical properties: (1) the complete capillary pressure cycle: primary drainage, spontaneous and forced imbibition, and secondary drainage (the cycle yields the wettability of the core via the USBM index); (2) end-points and parts of the relative permeability curves; (3) formation factor and resistivity index. The IPT is based on the steady-state injection of one fluid through the sample placed in a Hassler cell. The experiment leading to the whole Pc cycle on two reservoir sandstones consists of about 30 steps at various oil or water flow rates. It takes about four weeks and is operated at room conditions. Relative permeabilities are in line with standard steady-state measurements. Capillary pressures are in accordance with standard centrifuge measurements. There is no comparison for the resistivity index, but the results are in agreement with literature data. However, the accurate determination of saturation remains the main difficulty, and some improvements are proposed. In conclusion, the Integrated Petrophysical Tool is as accurate as standard methods and has the advantage of providing the various parameters on the same sample during a single experiment. The IPT is easy to use and can be automated. In addition, it can be operated at reservoir conditions.
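    The USBM wettability index mentioned above is conventionally computed from the areas under the capillary pressure curves: the base-10 logarithm of the ratio of the area under the secondary drainage curve to the area under the forced imbibition curve. A minimal sketch (the area values in the usage example are hypothetical; in practice they come from integrating the measured Pc cycle):

```python
import math

def usbm_index(area_drainage, area_imbibition):
    """USBM wettability index from the Pc cycle.

    area_drainage:   area under the secondary drainage Pc curve.
    area_imbibition: area under the forced imbibition Pc curve.
    Positive values indicate a water-wet core, negative values an
    oil-wet core, and values near zero neutral wettability.
    """
    return math.log10(area_drainage / area_imbibition)
```

A drainage area ten times the imbibition area gives an index of +1 (strongly water-wet); the inverse ratio gives -1 (strongly oil-wet).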

  3. Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.

    Science.gov (United States)

    Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A

    2013-01-01

    Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.

  4. BIG: a large-scale data integration tool for renal physiology.

    Science.gov (United States)

    Zhao, Yue; Yang, Chin-Rang; Raghuram, Viswanathan; Parulekar, Jaya; Knepper, Mark A

    2016-10-01

    Due to recent advances in high-throughput techniques, we and others have generated multiple proteomic and transcriptomic databases to describe and quantify gene expression, protein abundance, or cellular signaling on the scale of the whole genome/proteome in kidney cells. The existence of so much data from diverse sources raises the following question: "How can researchers find information efficiently for a given gene product over all of these data sets without searching each data set individually?" This is the type of problem that has motivated the "Big-Data" revolution in Data Science, which has driven progress in fields such as marketing. Here we present an online Big-Data tool called BIG (Biological Information Gatherer) that allows users to submit a single online query to obtain all relevant information from all indexed databases. BIG is accessible at http://big.nhlbi.nih.gov/.
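    The core idea of BIG, returning everything known about one gene product from a single query over several pre-indexed data sets, can be sketched in a few lines. The data-set names and records below are hypothetical (BIG's actual indexing and lookup happen server-side):

```python
def gather(gene, datasets):
    """Single-query aggregation across multiple indexed data sets.

    datasets: mapping of data-set name -> {gene id: record}.
    Returns every record found for `gene`, keyed by data-set name,
    so the user never has to search each source individually.
    """
    return {name: db[gene] for name, db in datasets.items() if gene in db}
```

One call answers "what do all my data sets say about this gene?", which is exactly the question the abstract poses.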

  5. ACE-it: a tool for genome-wide integration of gene dosage and RNA expression data

    NARCIS (Netherlands)

    van Wieringen, W.N.; Belien, J.A.M.; Vosse, S.; Achame, E.M.; Ylstra, B.

    2006-01-01

    Summary: We describe a tool called ACE-it (Array CGH Expression integration tool). ACE-it links the chromosomal position of the gene dosage measured by array CGH to the genes measured by the expression array. ACE-it uses this link to statistically test whether gene dosage affects RNA expression.

  6. Optically Driven Mobile Integrated Micro-Tools for a Lab-on-a-Chip

    Directory of Open Access Journals (Sweden)

    Yi-Jui Liu

    2013-04-01

    Full Text Available This study proposes an optically driven complex micromachine with an Archimedes microscrew as the mechanical power source, a sphere as a coupler, and three knives as the mechanical tools. The micromachine is fabricated by two-photon polymerization and is portably driven by optical tweezers. Because the microscrew can be optically trapped and rotates spontaneously, it provides driving power for the complex micro-tools. In other words, when a laser beam focuses on the micromachine, the microscrew is trapped toward the focal point and simultaneously rotates. A demonstration showed that the integrated micromachines can be grasped by the optical tweezers and rotated by the Archimedes screw. The rotation efficiencies of the microrotors with and without knives are 1.9 rpm/mW and 13.5 rpm/mW, respectively. The micromachine can also be portably dragged along planned routes. Such Archimedes-screw-based optically driven complex mechanical micro-tools enable rotation similar to moving machines or mixers, which could contribute to applications in biological microfluidic chips or labs-on-a-chip.

  7. Application of the NCSA Habanero tool for collaboration on structural integrity assessments

    International Nuclear Information System (INIS)

    Bass, B.R.; Kruse, K.; Dodds, R.H. Jr.; Malik, S.N.M.

    1998-11-01

    The Habanero software was developed by the National Center for Supercomputing Applications at the University of Illinois, Urbana-Champaign, as a framework for the collaborative sharing of Java applications. The Habanero tool extends single-user software interactions into a multiuser collaborative environment through distributed communication. An investigation was conducted to evaluate the capabilities of the Habanero tool in providing an Internet-based collaborative framework for researchers located at different sites and operating on different workstations. These collaborative sessions focused on the sharing of test data and analysis results from materials engineering areas (i.e., fracture mechanics and structural integrity evaluations) related to reactor pressure vessel safety research sponsored by the US Nuclear Regulatory Commission. This report defines collaborative-system requirements for engineering applications and provides an overview of collaborative systems within the project. The installation, application, and detailed evaluation of the performance of the Habanero collaborative tool are compared to those of another commercially available collaborative product. Recommendations are given for future work in collaborative communications.

  8. Integrated management tool for controls software problems, requests and project tasking at SLAC

    International Nuclear Information System (INIS)

    Rogind, D.; Allen, W.; Colocho, W.; DeContreras, G.; Gordon, J.; Pandey, P.; Shoaee, H.

    2012-01-01

    The Accelerator Directorate (AD) Instrumentation and Controls (ICD) Software (SW) Department at SLAC, with its service center model, continuously receives engineering requests to design, build and support controls for accelerator systems lab-wide. Each customer request can vary in complexity from a small software engineering change to a major enhancement. SLAC's Accelerator Improvement Projects (AIPs), along with DOE Construction projects, also contribute heavily to the work load. The various customer requests and projects, paired with the ongoing operational maintenance and problem reports, place a demand on the department that consistently exceeds the capacity of available resources. A centralized repository - comprised of all requests, project tasks, and problems - available to physicists, operators, managers, and engineers alike, is essential to capture, communicate, prioritize, assign, schedule, track, and finally, commission all work components. The Software Department has recently integrated request / project tasking into SLAC's custom online problem tracking 'Comprehensive Accelerator Tool for Enhancing Reliability' (CATER) tool. This paper discusses the newly implemented software request management tool - the workload it helps to track, its structure, features, reports, work-flow and its many usages. (authors)

  9. Tools and measures for stimulation the efficient energy consumption. Integrated resource planning in Romania

    International Nuclear Information System (INIS)

    Scripcariu, Daniela; Scripcariu, Mircea; Leca, Aureliu

    1996-01-01

    Integrated resource planning is based on analyses of energy generation and energy consumption as a whole. Thus, increasing energy efficiency appears to be the cheapest, most available and most cost-effective energy resource. In order to stimulate increased efficiency of energy consumption, tools and measures beyond economic-efficiency criteria for selecting technical solutions are necessary. The paper presents the main tools and measures needed to foster efficient energy consumption. Actions meant to stimulate DSM (Demand-Side Management) implementation in Romania are proposed. The paper contains 5 sections. In the introduction, the main aspects of DSM are considered, namely where the programs are implemented, who is responsible, what the objectives are and, finally, how DSM programs are implemented. The following tools in the management of energy use are examined: energy prices, regulation in the field of energy efficiency, standards and norms, energy labelling of products, and energy education. Among the measures for managing energy use, the paper considers the institutions responsible for DSM, such as the Romanian Agency for Energy Conservation (ARCE), decentralization of decision making, program approaches, and financing of actions aimed at improving energy efficiency. Finally, the paper analyses the criteria for choosing adequate solutions for improving energy efficiency.

  10. Climate Change Adaptation Tools at the Community Level: An Integrated Literature Review

    Directory of Open Access Journals (Sweden)

    Elvis Modikela Nkoana

    2018-03-01

    Full Text Available The negative impacts of climate change are experienced at the global, regional and local levels, but rural communities in sub-Saharan Africa face socio-political, cultural and economic challenges in addition to climate change. Decision support tools have been developed and applied to assist rural communities in coping with and adapting to climate change. However, poorly planned participatory processes and the lack of context-specific approaches in these tools are obstacles to strengthening the resilience of these rural communities. This paper uses an integrated literature review to identify best practices for involving rural communities in climate change adaptation efforts through the application of context-specific and culturally sensitive climate change adaptation tools. These best practices include the use of a livelihoods approach to engage communities; the explicit acknowledgement of local cultural do's and don'ts; the recognition of local champions appointed from within the local community; the identification and prioritisation of vulnerable stakeholders; and the implementation of two-way climate change risk communication instead of a one-sided information sharing approach.

  11. The e-Reader — an Educational or an Entertainment Tool? e-Readers in an Academic Setting

    Directory of Open Access Journals (Sweden)

    Peter Ahlroos

    2012-01-01

    Full Text Available In this paper the authors will discuss a pilot project conducted at the Tritonia Academic Library, Vaasa, in Finland, from September 2010 until May 2011. The project was designed to investigate the application of e-readers in academic settings and to learn how teachers and students experience the use of e-readers in academic education. Four groups of students and one group of teachers used Kindle readers for varied periods of time in different courses. The course material and the textbooks were downloaded on the e-readers. The feedback from the participants was collected through questionnaires and teacher interviews. The results suggest that the e-reader is a future tool for learning, though some features need to be improved before e-readers can really enable efficient learning and researching.

  12. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Advancing tools and methods

    International Nuclear Information System (INIS)

    Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Balge, Marci Z.; Singer, Burton H.; Utzinger, Juerg

    2010-01-01

    In the developing world, large-scale projects in the extractive industry and natural resources sectors are often controversial and associated with long-term adverse health consequences to local communities. In many industrialised countries, health impact assessment (HIA) has been institutionalized for the mitigation of anticipated negative health effects while enhancing the benefits of projects, programmes and policies. However, in developing country settings, relatively few HIAs have been performed. Hence, more HIAs with a focus on low- and middle-income countries are needed to advance and refine tools and methods for impact assessment and subsequent mitigation measures. We present a promising HIA approach, developed within the frame of a large gold-mining project in the Democratic Republic of the Congo. The articulation of environmental health areas, the spatial delineation of potentially affected communities and the use of a diversity of sources to obtain quality baseline health data are utilized for risk profiling. We demonstrate how these tools and data are fed into a risk analysis matrix, which facilitates ranking of potential health impacts for subsequent prioritization of mitigation strategies. The outcomes encapsulate a multitude of environmental and health determinants in a systematic manner, and will assist decision-makers in the development of mitigation measures that minimize potential adverse health effects and enhance positive ones.
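    The risk analysis matrix described above can be illustrated with a minimal sketch. The health impact areas and the likelihood/severity ratings below are hypothetical placeholders, not values from the study:

```python
# Minimal sketch of a risk analysis matrix for ranking potential health
# impacts. Impact areas and ordinal likelihood/severity ratings (1-5)
# are illustrative placeholders, not data from the HIA described above.

def risk_score(likelihood, severity):
    """Combine ordinal likelihood and severity ratings into a single score."""
    return likelihood * severity

impacts = {
    "vector-related diseases": (4, 5),
    "respiratory disorders": (3, 3),
    "road traffic injuries": (2, 4),
}

# Rank impacts from highest to lowest score to prioritize mitigation measures.
ranked = sorted(impacts, key=lambda k: risk_score(*impacts[k]), reverse=True)
for name in ranked:
    print(name, risk_score(*impacts[name]))
```

A real HIA matrix would typically also record the affected community, the health determinant involved, and the proposed mitigation for each ranked impact.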

  13. Integrity of clay till aquitards to DNAPL migration: Assessment using current and emerging characterization tools

    DEFF Research Database (Denmark)

    Fjordbøge, Annika Sidelmann; Janniche, Gry Sander; Jørgensen, Torben H.

    2017-01-01

    Field investigations were carried out to determine the occurrence of tetrachloroethene (PCE) dense non-aqueous phase liquid (DNAPL), the source zone architecture and the aquitard integrity at a 30-50 year old DNAPL release site. The DNAPL source zone is located in the clay till unit overlying......% of the total PCE mass. The data set, and associated data analysis, supported vertical migration of DNAPL through fractures in the upper part of the clay till, horizontal migration along high permeability features around the redox boundary in the clay till, and to some extent vertical migration through...... the fractures in the reduced part of the clay till aquitard to the underlying limestone aquifer. The aquitard integrity to DNAPL migration was found to be compromised at a thickness of reduced clay till of less than 2 m....

  14. Web-Enabled Mechanistic Case Diagramming: A Novel Tool for Assessing Students' Ability to Integrate Foundational and Clinical Sciences.

    Science.gov (United States)

    Ferguson, Kristi J; Kreiter, Clarence D; Haugen, Thomas H; Dee, Fred R

    2018-02-20

    As medical schools move from discipline-based courses to more integrated approaches, identifying assessment tools that parallel this change is an important goal. The authors describe the use of test item statistics to assess the reliability and validity of web-enabled mechanistic case diagrams (MCDs) as a potential tool to assess students' ability to integrate basic science and clinical information. Students review a narrative clinical case and construct an MCD using items provided by the case author. Students identify the relationships among underlying risk factors, etiology, pathogenesis and pathophysiology, and the patients' signs and symptoms. They receive one point for each correctly-identified link. In 2014-15 and 2015-16, case diagrams were implemented in consecutive classes of 150 medical students. The alpha reliability coefficient for the overall score, constructed using each student's mean proportion correct across all cases, was 0.82. Discrimination indices for each of the case scores with the overall score ranged from 0.23 to 0.51. In a G study using those students with complete data (n = 251) on all 16 cases, 10% of the variance was true score variance, and systematic case variance was large. Using 16 cases generated a G coefficient (relative score reliability) equal to .72 and a Phi equal to .65. The next phase of the project will involve deploying MCDs in higher-stakes settings to determine whether similar results can be achieved. Further analyses will determine whether these assessments correlate with other measures of higher-order thinking skills.
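    The scoring and reliability analysis described above can be sketched as follows. The links, answer key and score data are hypothetical, and Cronbach's alpha is computed with the standard formula rather than the authors' exact procedure:

```python
# Sketch of the MCD scoring logic: one point per correctly identified link,
# expressed as the proportion of the case's correct links, plus a standard
# Cronbach's alpha over per-case scores. All data here are illustrative.
from statistics import variance

def case_score(student_links, answer_key):
    """Proportion of the case's correct links the student identified."""
    return len(set(student_links) & set(answer_key)) / len(answer_key)

def cronbach_alpha(scores_by_case):
    """scores_by_case: one list of student scores per case (item)."""
    k = len(scores_by_case)
    item_vars = sum(variance(case) for case in scores_by_case)
    totals = [sum(s) for s in zip(*scores_by_case)]  # per-student totals
    return k / (k - 1) * (1 - item_vars / variance(totals))

key = [("smoking", "atherosclerosis"), ("atherosclerosis", "angina")]
print(case_score([("smoking", "atherosclerosis")], key))  # 0.5
```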

  15. Using a blog as an integrated eLearning tool and platform.

    Science.gov (United States)

    Goh, Poh Sun

    2016-06-01

    Technology enhanced learning or eLearning allows educators to expand access to educational content, promotes engagement with students and makes it easier for students to access educational material at a time, place and pace which suits them. The challenge for educators beginning their eLearning journey is to decide where to start, which includes the choice of an eLearning tool and platform. This article will share one educator's decision making process, and experience using blogs as a flexible and versatile integrated eLearning tool and platform. Apart from being a cost effective/free tool and platform, blogs offer the possibility of creating a hyperlinked indexed content repository, for both created and curated educational material; as well as a distribution and engagement tool and platform. Incorporating pedagogically sound activities and educational practices into a blog promote a structured templated teaching process, which can be reproduced. Moving from undergraduate to postgraduate training, educational blogs supported by a comprehensive online case-based repository offer the possibility of training beyond competency towards proficiency and expert level performance through a process of deliberate practice. By documenting educational content and the student engagement and learning process, as well as feedback and personal reflection of educational sessions, blogs can also form the basis for a teaching portfolio, and provide evidence and data of scholarly teaching and educational scholarship. Looking into the future, having a collection of readily accessible indexed hyperlinked teaching material offers the potential to do on the spot teaching with illustrative material called up onto smart surfaces, and displayed on holographic interfaces.

  16. Wheat Rust Information Resources - Integrated tools and data for improved decision making

    DEFF Research Database (Denmark)

    Hodson, David; Hansen, Jens Grønbech; Lassen, Poul

    giving access to an unprecedented set of data for rust surveys, alternate hosts (barberry), rust pathotypes, trap nurseries and resistant cultivars. Standardized protocols for data collection have permitted the development of a comprehensive data management system, named the Wheat Rust Toolbox....... Integration of the CIMMYT Wheat Atlas and the Genetic Resources Information System (GRIS) databases provides a rich resource on wheat cultivars and their resistance to important rust races. Data access is facilitated via dedicated web portals such as Rust Tracker (www.rusttracker.org) and the Global Rust...

  17. CaliBayes and BASIS: integrated tools for the calibration, simulation and storage of biological simulation models.

    Science.gov (United States)

    Chen, Yuhui; Lawless, Conor; Gillespie, Colin S; Wu, Jake; Boys, Richard J; Wilkinson, Darren J

    2010-05-01

    Dynamic simulation modelling of complex biological processes forms the backbone of systems biology. Discrete stochastic models are particularly appropriate for describing sub-cellular molecular interactions, especially when critical molecular species are thought to be present at low copy numbers. For example, these stochastic effects play an important role in models of human ageing, where ageing results from the long-term accumulation of random damage at various biological scales. Unfortunately, realistic stochastic simulation of discrete biological processes is highly computationally intensive, requiring specialist hardware, and can benefit greatly from parallel and distributed approaches to computation and analysis. For these reasons, we have developed the BASIS system for the simulation and storage of stochastic SBML models together with associated simulation results. This system is exposed as a set of web services to allow users to incorporate its simulation tools into their workflows. Parameter inference for stochastic models is also difficult and computationally expensive. The CaliBayes system provides a set of web services (together with an R package for consuming these and formatting data) which addresses this problem for SBML models. It uses a sequential Bayesian MCMC method, which is powerful and flexible, providing very rich information. However, this approach is exceptionally computationally intensive and requires a carefully designed architecture. Again, these tools are exposed as web services to allow users to take advantage of this system. In this article, we describe these two systems and demonstrate their integrated use with an example workflow to estimate the parameters of a simple model of Saccharomyces cerevisiae growth on agar plates.
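    Discrete stochastic simulation of the kind described here is commonly based on Gillespie's direct method. The following is an illustrative sketch for a simple birth-death process; it is not BASIS code, and the model and rate constants are invented for demonstration:

```python
# Minimal Gillespie direct-method simulation of a birth-death process,
# illustrating discrete stochastic simulation at low copy numbers.
# Model and rates are illustrative only, not taken from BASIS.
import random

def gillespie_birth_death(x0, birth, death, t_end, seed=42):
    rng = random.Random(seed)
    t, x = 0.0, x0
    trajectory = [(0.0, x0)]
    while t < t_end:
        rates = [birth, death * x]      # propensity of each reaction
        total = sum(rates)
        if total == 0:
            break
        t += rng.expovariate(total)     # exponential waiting time to next event
        # choose which reaction fires, with probability proportional to its rate
        if rng.random() < rates[0] / total:
            x += 1                      # birth
        else:
            x -= 1                      # death
        trajectory.append((t, x))
    return trajectory

traj = gillespie_birth_death(x0=10, birth=1.0, death=0.1, t_end=50.0)
```

Averaging many such trajectories (run in parallel, as the abstract suggests) approximates the distribution of copy numbers over time.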

  18. Integrating hypermedia into the environmental education setting: Developing a program and evaluating its effect

    Science.gov (United States)

    Parker, Tehri Davenport

    1997-09-01

    This study designed, implemented, and evaluated an environmental education hypermedia program for use in a residential environmental education facility. The purpose of the study was to ascertain whether a hypermedia program could increase student knowledge and positive attitudes toward the environment and environmental education. A student/computer interface, based on the theory of social cognition, was developed to direct student interactions with the computer. A quasi-experimental research design was used. Students were randomly assigned to either the experimental or control group. The experimental group used the hypermedia program to learn about the topic of energy. The control group received the same conceptual information from a teacher/naturalist. An Environmental Awareness Quiz was administered to measure differences in the students' cognitive understanding of energy issues. Students participated in one on one interviews to discuss their attitudes toward the lesson and the overall environmental education experience. Additionally, members of the experimental group were tape recorded while they used the hypermedia program. These tapes were analyzed to identify aspects of the hypermedia program that promoted student learning. The findings of this study suggest that computers, and hypermedia programs, can be integrated into residential environmental education facilities, and can assist environmental educators in meeting their goals for students. The study found that the hypermedia program was as effective as the teacher/naturalist for teaching about environmental education material. Students who used the computer reported more positive attitudes toward the lesson on energy, and thought that they had learned more than the control group. Students in the control group stated that they did not learn as much as the computer group. The majority of students had positive attitudes toward the inclusion of computers in the camp setting, and stated that they were a good

  19. Initial validation of the prekindergarten Classroom Observation Tool and goal setting system for data-based coaching.

    Science.gov (United States)

    Crawford, April D; Zucker, Tricia A; Williams, Jeffrey M; Bhavsar, Vibhuti; Landry, Susan H

    2013-12-01

    Although coaching is a popular approach for enhancing the quality of Tier 1 instruction, limited research has addressed observational measures specifically designed to focus coaching on evidence-based practices. This study explains the development of the prekindergarten (pre-k) Classroom Observation Tool (COT) designed for use in a data-based coaching model. We examined psychometric characteristics of the COT and explored how coaches and teachers used the COT goal-setting system. The study included 193 coaches working with 3,909 pre-k teachers in a statewide professional development program. Classrooms served 3 and 4 year olds (n = 56,390) enrolled mostly in Title I, Head Start, and other need-based pre-k programs. Coaches used the COT during a 2-hr observation at the beginning of the academic year. Teachers collected progress-monitoring data on children's language, literacy, and math outcomes three times during the year. Results indicated a theoretically supported eight-factor structure of the COT across language, literacy, and math instructional domains. Overall interrater reliability among coaches was good (.75). Although correlations with an established teacher observation measure were small, significant positive relations between COT scores and children's literacy outcomes indicate promising predictive validity. Patterns of goal-setting behaviors indicate teachers and coaches set an average of 43.17 goals during the academic year, and coaches reported that 80.62% of goals were met. Both coaches and teachers reported the COT was a helpful measure for enhancing quality of Tier 1 instruction. Limitations of the current study and implications for research and data-based coaching efforts are discussed. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  20. Implementing and measuring the level of laboratory service integration in a program setting in Nigeria.

    Directory of Open Access Journals (Sweden)

    Henry Mbah

    Full Text Available The surge of donor funds to fight the HIV&AIDS epidemic inadvertently resulted in the setup of laboratories as parallel structures to rapidly respond to the identified need. However these parallel structures are a threat to the existing fragile laboratory systems. Laboratory service integration is critical to remedy this situation. This paper describes an approach to quantitatively measure and track integration of HIV-related laboratory services into the mainstream laboratory services, and highlights some key intervention steps taken to enhance service integration. A quantitative before-and-after study was conducted in 122 Family Health International (FHI360) supported health facilities across Nigeria. A minimum service package was identified, including management structure; trainings; equipment utilization and maintenance; and information, commodity and quality management for laboratory integration. A check list was used to assess facilities at baseline and 3 months follow-up. Level of integration was assessed on an ordinal scale (0 = no integration, 1 = partial integration, 2 = full integration) for each service package. A composite score grading, expressed as a percentage of the total obtainable score of 14, was defined and used to classify facilities (≥ 80% FULL, 25% to 79% PARTIAL and <25% NO integration). Weaknesses were noted and addressed. We analyzed 9 (7.4%) primary, 104 (85.2%) secondary and 9 (7.4%) tertiary level facilities. There were statistically significant differences in integration levels between baseline and the 3 months follow-up period (p<0.01). Baseline median total integration score was 4 (IQR 3 to 5) compared to 7 (IQR 4 to 9) at 3 months follow-up (p = 0.000). Partially and fully integrated laboratory systems numbered 64 (52.5%) and 0 (0.0%) at baseline, compared to 100 (82.0%) and 3 (2.4%) respectively at 3 months follow-up (p = 0.000). This project showcases our novel approach to measuring the status of each laboratory on the integration continuum.
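    The composite scoring described above (service packages each rated 0-2, a composite expressed as a percentage of the maximum score of 14, then classification) can be sketched as follows. The package names and example ratings are illustrative, not the study's instrument:

```python
# Sketch of the laboratory integration composite score: seven service
# packages rated 0 (none), 1 (partial) or 2 (full), giving a maximum of 14,
# expressed as a percentage and classified with the reported thresholds.
# Package names and ratings below are illustrative.

PACKAGES = ["management structure", "trainings", "equipment utilization",
            "equipment maintenance", "information management",
            "commodity management", "quality management"]

def classify(ratings):
    """ratings: dict mapping package -> 0/1/2. Returns (percent, class)."""
    percent = 100 * sum(ratings.values()) / (2 * len(PACKAGES))
    if percent >= 80:
        return percent, "FULL"
    if percent >= 25:
        return percent, "PARTIAL"
    return percent, "NO integration"

example = dict.fromkeys(PACKAGES, 1)   # partial integration on every package
print(classify(example))               # (50.0, 'PARTIAL')
```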

  1. Do nurses provide a safe sleep environment for infants in the hospital setting? An integrative review.

    Science.gov (United States)

    Patton, Carla; Stiltner, Denise; Wright, Kelly Barnhardt; Kautz, Donald D

    2015-02-01

    Sudden infant death syndrome (SIDS) may be the most preventable cause of death for infants 0 to 6 months of age. The American Academy of Pediatrics (AAP) first published safe sleep recommendations for parents and healthcare professionals in 1992. In 1994, new guidelines were published and they became known as the "Back to Sleep" campaign. After this, a noticeable decline occurred in infant deaths from SIDS. However, this number seems to have plateaued with no continuing significant improvements in infant deaths. The objective of this review was to determine whether nurses provide a safe sleep environment for infants in the hospital setting. Research studies that dealt with nursing behaviors and nursing knowledge in the hospital setting were included in the review. A search was conducted of Google Scholar, CINAHL, PubMed, and Cochrane, using the key words "NICU," "newborn," "SIDS," "safe sleep environment," "nurse," "education," "supine sleep," "prone sleep," "safe sleep," "special care nursery," "hospital policy for safe sleep," "research," "premature," "knowledge," "practice," "health care professionals," and "parents." The review included research reports on nursing knowledge and behaviors as well as parental knowledge obtained through education and role modeling of nursing staff. Only research studies were included to ensure that our analysis was based on rigorous research-based findings. Several international studies were included because they mirrored findings noted in the United States. All studies were published between 1999 and 2012. Healthcare professionals and parents were included in the studies. They were primarily self-report surveys, designed to determine what nurses, other healthcare professionals, and parents knew or had been taught about SIDS. Integrative review. Thirteen of the 16 studies included in the review found that some nurses and some mothers continued to use nonsupine positioning. Four of the 16 studies discussed nursing knowledge and

  2. Impact of electronic medical record integration of a handoff tool on sign-out in a newborn intensive care unit

    Science.gov (United States)

    Palma, JP; Sharek, PJ; Longhurst, CA

    2016-01-01

    Objective To evaluate the impact of integrating a handoff tool into the electronic medical record (EMR) on sign-out accuracy, satisfaction and workflow in a neonatal intensive care unit (NICU). Study Design Prospective surveys of neonatal care providers in an academic children’s hospital 1 month before and 6 months following EMR integration of a standalone Microsoft Access neonatal handoff tool. Result Providers perceived sign-out information to be somewhat or very accurate at a rate of 78% with the standalone handoff tool and 91% with the EMR-integrated tool (P < 0.01). Before integration of neonatal sign-out into the EMR, 35% of providers were satisfied with the process of updating sign-out information and 71% were satisfied with the printed sign-out document; following EMR integration, 92% of providers were satisfied with the process of updating sign-out information (P < 0.01) and 98% were satisfied with the printed sign-out document (P < 0.01). Neonatal care providers reported spending a median of 11 to 15 min/day updating the standalone sign-out and 16 to 20 min/day updating the EMR-integrated sign-out (P = 0.026). The median percentage of total sign-out preparation time dedicated to transcribing information from the EMR was 25 to 49% before and <25% after EMR integration of the handoff tool (P < 0.01). Conclusion Integration of a NICU-specific handoff tool into an EMR resulted in improvements in perceived sign-out accuracy, provider satisfaction and at least one aspect of workflow. PMID:21273990

  3. WARCProcessor: An Integrative Tool for Building and Management of Web Spam Corpora

    Directory of Open Access Journals (Sweden)

    Miguel Callón

    2017-12-01

    Full Text Available In this work we present the design and implementation of WARCProcessor, a novel multiplatform integrative tool aimed at building scientific datasets to facilitate experimentation in web spam research. The developed application allows the user to specify multiple criteria that change the way in which new corpora are generated, while reducing the number of repetitive and error-prone tasks involved in maintaining existing corpora. To this end, WARCProcessor supports up to six commonly used data sources for web spam research and stores the output corpus in standard WARC format together with complementary metadata files. Additionally, the application facilitates the automatic and concurrent download of web sites from the Internet, with the possibility of configuring the depth of the links to be followed as well as the behaviour when redirected URLs appear. WARCProcessor provides both an interactive GUI and a command line utility for execution in the background.

  4. A model of integration among prediction tools: applied study to road freight transportation

    Directory of Open Access Journals (Sweden)

    Henrique Dias Blois

    Full Text Available Abstract This study developed a scenario analysis model that integrates decision-making tools for investment: prospective scenarios (the Grumbach Method) and systems dynamics (hard modeling), together with multivariate analysis of expert opinion. The model was applied through scenario analysis and simulation, showing which events have the greatest impact on the object of study and highlighting actions that could redirect the future of the analyzed system. Predictions can also be developed from the generated scenarios. The model was validated empirically with road freight transport data from the state of Rio Grande do Sul, Brazil. The results showed that the model contributes to investment analysis because it identifies the probabilities of events that impact decision making and identifies priorities for action, reducing uncertainty about the future. Moreover, it allows an interdisciplinary discussion that correlates different areas of knowledge, which is fundamental when greater consistency in scenario building is desired.

  5. WARCProcessor: An Integrative Tool for Building and Management of Web Spam Corpora.

    Science.gov (United States)

    Callón, Miguel; Fdez-Glez, Jorge; Ruano-Ordás, David; Laza, Rosalía; Pavón, Reyes; Fdez-Riverola, Florentino; Méndez, Jose Ramón

    2017-12-22

    In this work we present the design and implementation of WARCProcessor, a novel multiplatform integrative tool aimed at building scientific datasets to facilitate experimentation in web spam research. The developed application allows the user to specify multiple criteria that change the way in which new corpora are generated, while reducing the number of repetitive and error-prone tasks involved in maintaining existing corpora. To this end, WARCProcessor supports up to six commonly used data sources for web spam research and stores the output corpus in standard WARC format together with complementary metadata files. Additionally, the application facilitates the automatic and concurrent download of web sites from the Internet, with the possibility of configuring the depth of the links to be followed as well as the behaviour when redirected URLs appear. WARCProcessor provides both an interactive GUI and a command line utility for execution in the background.

  6. Implementing and measuring the level of laboratory service integration in a program setting in Nigeria.

    Science.gov (United States)

    Mbah, Henry; Negedu-Momoh, Olubunmi Ruth; Adedokun, Oluwasanmi; Ikani, Patrick Anibbe; Balogun, Oluseyi; Sanwo, Olusola; Ochei, Kingsley; Ekanem, Maurice; Torpey, Kwasi

    2014-01-01

    The surge of donor funds to fight the HIV&AIDS epidemic inadvertently resulted in the setup of laboratories as parallel structures to rapidly respond to the identified need. However these parallel structures are a threat to the existing fragile laboratory systems. Laboratory service integration is critical to remedy this situation. This paper describes an approach to quantitatively measure and track integration of HIV-related laboratory services into the mainstream laboratory services, and highlights some key intervention steps taken to enhance service integration. A quantitative before-and-after study was conducted in 122 Family Health International (FHI360) supported health facilities across Nigeria. A minimum service package was identified, including management structure; trainings; equipment utilization and maintenance; and information, commodity and quality management for laboratory integration. A check list was used to assess facilities at baseline and 3 months follow-up. Level of integration was assessed on an ordinal scale (0 = no integration, 1 = partial integration, 2 = full integration) for each service package. A composite score grading, expressed as a percentage of the total obtainable score of 14, was defined and used to classify facilities (≥ 80% FULL, 25% to 79% PARTIAL and <25% NO integration). Partially and fully integrated laboratory systems numbered 64 (52.5%) and 0 (0.0%) at baseline, compared to 100 (82.0%) and 3 (2.4%) respectively at 3 months follow-up (p = 0.000). This project showcases our novel approach to measuring the status of each laboratory on the integration continuum.

  7. Bringing it All Together: NODC's Geoportal Server as an Integration Tool for Interoperable Data Services

    Science.gov (United States)

    Casey, K. S.; Li, Y.

    2011-12-01

    The US National Oceanographic Data Center (NODC) has implemented numerous interoperable data technologies in recent years to enhance the discovery, understanding, and use of the vast quantities of data in the NODC archives. These services include OPeNDAP's Hyrax server, Unidata's THREDDS Data Server (TDS), NOAA's Live Access Server (LAS), and most recently the ESRI ArcGIS Server. Combined, these technologies enable NODC to provide access to its data holdings and products through most of the commonly-used standardized web services like the Data Access Protocol (DAP) and the Open Geospatial Consortium suite of services such as the Web Mapping Service (WMS) and Web Coverage Service (WCS). Despite the strong demand for and use of these services, the acronym-rich environment of services can also result in confusion for producers of data to the NODC archives, for consumers of data from the NODC archives, and for the data stewards at the archives as well. The situation is further complicated by the fact that NODC also maintains some ad hoc services like WODselect, and that not all services can be applied to all of the tens of thousands of collections in the NODC archive; where once every data set was available only through FTP and HTTP servers, now many are also available from the LAS, TDS, Hyrax, and ArcGIS Server. To bring order and clarity to this potentially confusing collection of services, NODC deployed the Geoportal Server into its Archive Management System as an integrating technology that brings together its various data access, visualization, and discovery services as well as its overall metadata management workflows. While providing an enhanced web-based interface for more integrated human-to-machine discovery and access, the deployment also enables NODC for the first time to support a robust set of machine-to-machine discovery services such as the Catalog Service for the Web (CS/W), OpenSearch, and Search and Retrieval via URL (SRU). This approach allows NODC
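    Machine-to-machine discovery services of the kind mentioned above can be exercised with plain HTTP requests. The following sketch builds a CSW 2.0.2 GetRecords query URL using standard KVP parameters; the endpoint address is a placeholder, not NODC's actual service:

```python
# Sketch of a CSW (Catalog Service for the Web) GetRecords discovery request
# built as a keyword-value-pair GET URL. The endpoint below is a placeholder.
from urllib.parse import urlencode

def csw_getrecords_url(endpoint, keyword, max_records=10):
    """Build a CSW 2.0.2 GetRecords URL filtering records on a keyword."""
    params = {
        "service": "CSW",
        "version": "2.0.2",
        "request": "GetRecords",
        "typeNames": "csw:Record",
        "constraintLanguage": "CQL_TEXT",
        "constraint": f"AnyText LIKE '%{keyword}%'",
        "maxRecords": max_records,
    }
    return endpoint + "?" + urlencode(params)

url = csw_getrecords_url("https://example.org/geoportal/csw",
                         "sea surface temperature")
```

Fetching such a URL would return an XML response of matching catalog records; OpenSearch and SRU endpoints follow the same URL-template pattern with different parameter names.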

  8. Integrating research tools to support the management of social-ecological systems under climate change

    Science.gov (United States)

    Miller, Brian W.; Morisette, Jeffrey T.

    2014-01-01

    Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.

  9. The Integrated Medical Model: A Risk Assessment and Decision Support Tool for Space Flight Medical Systems

    Science.gov (United States)

    Kerstman, Eric; Minard, Charles; Saile, Lynn; deCarvalho, Mary Freire; Myers, Jerry; Walton, Marlei; Butler, Douglas; Iyengar, Sriram; Johnson-Throop, Kathy; Baumann, David

    2009-01-01

    The Integrated Medical Model (IMM) is a decision support tool that is useful to mission planners and medical system designers in assessing risks and designing medical systems for space flight missions. The IMM provides an evidence based approach for optimizing medical resources and minimizing risks within space flight operational constraints. The mathematical relationships among mission and crew profiles, medical condition incidence data, in-flight medical resources, potential crew functional impairments, and clinical end-states are established to determine probable mission outcomes. Stochastic computational methods are used to forecast probability distributions of crew health and medical resource utilization, as well as estimates of medical evacuation and loss of crew life. The IMM has been used in support of the International Space Station (ISS) medical kit redesign, the medical component of the ISS Probabilistic Risk Assessment, and the development of the Constellation Medical Conditions List. The IMM also will be used to refine medical requirements for the Constellation program. The IMM outputs for ISS and Constellation design reference missions will be presented to demonstrate the potential of the IMM in assessing risks, planning missions, and designing medical systems. The implementation of the IMM verification and validation plan will be reviewed. Additional planned capabilities of the IMM, including optimization techniques and the inclusion of a mission timeline, will be discussed. Given the space flight constraints of mass, volume, and crew medical training, the IMM is a valuable risk assessment and decision support tool for medical system design and mission planning.
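The stochastic forecasting idea behind the IMM can be illustrated with a toy Monte Carlo sketch. Everything here (the condition list, incidence rates, resource costs, and kit capacity) is a made-up illustration, not data or code from the actual model:

```python
import random

KIT_CAPACITY = 5  # hypothetical number of treatment units carried

def simulate_mission(conditions, n_trials=10_000, seed=0):
    """Estimate the probability that the medical kit is exhausted.

    `conditions` is a list of (incidence_probability, units_required)
    pairs; each trial samples which conditions occur during the mission
    and sums the resources they consume.  Purely illustrative.
    """
    rng = random.Random(seed)
    exhausted = 0
    for _ in range(n_trials):
        used = sum(units for p, units in conditions if rng.random() < p)
        if used > KIT_CAPACITY:
            exhausted += 1
    return exhausted / n_trials

# Hypothetical condition profile: (incidence over mission, units consumed)
conditions = [(0.30, 2), (0.10, 4), (0.05, 3)]
print(f"P(kit exhausted) ~ {simulate_mission(conditions):.3f}")
```

The real IMM forecasts full probability distributions of crew health and resource utilization over mission and crew profiles; this sketch only shows the shape of the computation.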

  10. Catalyst synthesis and evaluation using an integrated atomic layer deposition synthesis–catalysis testing tool

    International Nuclear Information System (INIS)

    Camacho-Bunquin, Jeffrey; Shou, Heng; Marshall, Christopher L.; Aich, Payoli; Beaulieu, David R.; Klotzsch, Helmut; Bachman, Stephen; Hock, Adam; Stair, Peter

    2015-01-01

    An integrated atomic layer deposition synthesis-catalysis (I-ALD-CAT) tool was developed. It combines an ALD manifold in-line with a plug-flow reactor system for the synthesis of supported catalytic materials by ALD and immediate evaluation of catalyst reactivity using gas-phase probe reactions. The I-ALD-CAT delivery system consists of 12 different metal ALD precursor channels, 4 oxidizing or reducing agents, and 4 catalytic reaction feeds to either of the two plug-flow reactors. The system can employ reactor pressures and temperatures in the range of 10 −3 to 1 bar and 300–1000 K, respectively. The instrument is also equipped with a gas chromatograph and a mass spectrometer unit for the detection and quantification of volatile species from ALD and catalytic reactions. In this report, we demonstrate the use of the I-ALD-CAT tool for the synthesis of platinum active sites and Al 2 O 3 overcoats, and evaluation of catalyst propylene hydrogenation activity

  11. Investigation of potential integration of spectroradiometer data with GIS technology: The Spectro-GIS tools

    International Nuclear Information System (INIS)

    Salleh, S A; Hamid, J R A; Ariffin, I M

    2014-01-01

    The Earth's surface consists of different ground cover types. The spectral signature of these ground cover targets is unique and can be determined in the field through quantitative measurement of radiance and reflectance response by portable spectroradiometers. In this study, a field portable spectroradiometer, the GER 1500, covering the ultraviolet, visible and near-infrared wavelengths from 350 nm to 1050 nm, was used to record the spectral response of different ground cover types. The measurements were made when the Sun was at several instantaneous positions, determined via spherical computation, in order to find out their influence on the spectroradiometer observations. The outcome of the measurements made against selected target features is an output file containing signature plot data generated in .sig and/or ASCII format. The aim of the study was to convert that spectroradiometer data into a GIS-enabled format. The paper highlights the development of the Spectro-GIS tool, customized in the Visual Basic .NET programming language, which runs independently and automates the conversion process and the generation of a spectral library of the surface targets. The results of this study will benefit the earth observation community by providing an alternative automation of spatial data archiving as well as data integration and fusion of land spectral signatures

  12. Catalyst synthesis and evaluation using an integrated atomic layer deposition synthesis–catalysis testing tool

    Energy Technology Data Exchange (ETDEWEB)

    Camacho-Bunquin, Jeffrey; Shou, Heng; Marshall, Christopher L. [Chemical Sciences and Engineering Division, Argonne National Laboratory, Lemont, Illinois 60439 (United States); Aich, Payoli [Chemical Sciences and Engineering Division, Argonne National Laboratory, Lemont, Illinois 60439 (United States); Department of Chemical Engineering, University of Illinois at Chicago, Chicago, Illinois 60607 (United States); Beaulieu, David R.; Klotzsch, Helmut; Bachman, Stephen [Arradiance Inc., Sudbury, Massachusetts 01776 (United States); Hock, Adam [Chemical Sciences and Engineering Division, Argonne National Laboratory, Lemont, Illinois 60439 (United States); Department of Chemistry, Illinois Institute of Technology, Chicago, Illinois 60616 (United States); Stair, Peter [Chemical Sciences and Engineering Division, Argonne National Laboratory, Lemont, Illinois 60439 (United States); Department of Chemistry, Northwestern University, Evanston, Illinois 60208 (United States)

    2015-08-15

    An integrated atomic layer deposition synthesis-catalysis (I-ALD-CAT) tool was developed. It combines an ALD manifold in-line with a plug-flow reactor system for the synthesis of supported catalytic materials by ALD and immediate evaluation of catalyst reactivity using gas-phase probe reactions. The I-ALD-CAT delivery system consists of 12 different metal ALD precursor channels, 4 oxidizing or reducing agents, and 4 catalytic reaction feeds to either of the two plug-flow reactors. The system can employ reactor pressures and temperatures in the range of 10{sup −3} to 1 bar and 300–1000 K, respectively. The instrument is also equipped with a gas chromatograph and a mass spectrometer unit for the detection and quantification of volatile species from ALD and catalytic reactions. In this report, we demonstrate the use of the I-ALD-CAT tool for the synthesis of platinum active sites and Al{sub 2}O{sub 3} overcoats, and evaluation of catalyst propylene hydrogenation activity.

  13. Basic data, computer codes and integral experiments: The tools for modelling in nuclear technology

    International Nuclear Information System (INIS)

    Sartori, E.

    2001-01-01

    When studying applications in nuclear technology we need to understand and be able to predict the behavior of systems manufactured by human enterprise. First, the underlying basic physical and chemical phenomena need to be understood. We then have to predict the macroscopic effects that result from the interplay of a large number of different basic events. In order to build confidence in our modelling capability, we then need to compare these results against measurements carried out on such systems. The different levels of modelling require the solution of different types of equations using different types of parameters. The tools required for carrying out a complete validated analysis are: - the basic nuclear or chemical data; - the computer codes; and - the integral experiments. This article describes the role each component plays in a computational scheme designed for modelling purposes. It also describes which tools have been developed and are internationally available, and the roles that the OECD/NEA Data Bank, the Radiation Shielding Information Computational Center (RSICC) and the IAEA Nuclear Data Section play in making these elements available to the community of scientists and engineers. (author)

  14. Integrating Web-Based Teaching Tools into Large University Physics Courses

    Science.gov (United States)

    Toback, David; Mershin, Andreas; Novikova, Irina

    2005-12-01

    Teaching students in our large, introductory, calculus-based physics courses to be good problem-solvers is a difficult task. Not only must students be taught to understand and use the physics concepts in a problem, they must become adept at turning the physical quantities into symbolic variables, translating the problem into equations, and "turning the crank" on the mathematics to find both a closed-form solution and a numerical answer. Physics education research has shown that students' poor math skills and instructors' lack of pen-and-paper homework grading resources, two problems we face at our institution, can have a significant impact on problem-solving skill development.2-4 While Interactive Engagement methods appear to be the preferred mode of instruction,5 for practical reasons we have not been able to widely implement them. In this paper, we describe three Internet-based "teaching-while-quizzing" tools we have developed and how they have been integrated into our traditional lecture course in powerful but easy to incorporate ways.6 These are designed to remediate students' math deficiencies, automate homework grading, and guide study time toward problem solving. Our intent is for instructors who face similar obstacles to adopt these tools, which are available upon request.7

  15. INTEGRATION OF COST MODELS AND PROCESS SIMULATION TOOLS FOR OPTIMUM COMPOSITE MANUFACTURING PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    Pack, Seongchan [General Motors; Wilson, Daniel [General Motors; Aitharaju, Venkat [General Motors; Kia, Hamid [General Motors; Yu, Hang [ESI, Group.; Doroudian, Mark [ESI Group

    2017-09-05

    Manufacturing cost of resin transfer molded composite parts is significantly influenced by the cycle time, which is strongly related to the time for both filling and curing of the resin in the mold. The time for filling can be optimized by various injection strategies, and by suitably reducing the length of the resin flow distance during the injection. The curing time can be reduced by the usage of faster curing resins, but it requires a high pressure injection equipment, which is capital intensive. Predictive manufacturing simulation tools that are being developed recently for composite materials are able to provide various scenarios of processing conditions virtually well in advance of manufacturing the parts. In the present study, we integrate the cost models with process simulation tools to study the influence of various parameters such as injection strategies, injection pressure, compression control to minimize high pressure injection, resin curing rate, and demold time on the manufacturing cost as affected by the annual part volume. A representative automotive component was selected for the study and the results are presented in this paper

  16. Integrative analysis of survival-associated gene sets in breast cancer.

    Science.gov (United States)

    Varn, Frederick S; Ung, Matthew H; Lou, Shao Ke; Cheng, Chao

    2015-03-12

    Patient gene expression information has recently become a clinical feature used to evaluate breast cancer prognosis. The emergence of prognostic gene sets that take advantage of these data has led to a rich library of information that can be used to characterize the molecular nature of a patient's cancer. Identifying robust gene sets that are consistently predictive of a patient's clinical outcome has become one of the main challenges in the field. We inputted our previously established BASE algorithm with patient gene expression data and gene sets from MSigDB to develop the gene set activity score (GSAS), a metric that quantitatively assesses a gene set's activity level in a given patient. We utilized this metric, along with patient time-to-event data, to perform survival analyses to identify the gene sets that were significantly correlated with patient survival. We then performed cross-dataset analyses to identify robust prognostic gene sets and to classify patients by metastasis status. Additionally, we created a gene set network based on component gene overlap to explore the relationship between gene sets derived from MSigDB. We developed a novel gene set based on this network's topology and applied the GSAS metric to characterize its role in patient survival. Using the GSAS metric, we identified 120 gene sets that were significantly associated with patient survival in all datasets tested. The gene overlap network analysis yielded a novel gene set enriched in genes shared by the robustly predictive gene sets. This gene set was highly correlated to patient survival when used alone. Most interestingly, removal of the genes in this gene set from the gene pool on MSigDB resulted in a large reduction in the number of predictive gene sets, suggesting a prominent role for these genes in breast cancer progression. The GSAS metric provided a useful medium by which we systematically investigated how gene sets from MSigDB relate to breast cancer patient survival. 
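The idea of collapsing a gene set into a single per-patient activity score can be illustrated with a deliberately simplified stand-in. The published GSAS is computed with the BASE algorithm; this toy version merely averages the z-scores of the member genes within one patient's profile, and the gene names are hypothetical:

```python
from statistics import mean, pstdev

def activity_score(expression, gene_set):
    """Toy gene-set activity score: the average z-score of the gene
    set's member genes within one patient's expression profile.
    Illustrative only; not the BASE-derived GSAS from the paper.
    """
    values = list(expression.values())
    mu, sigma = mean(values), pstdev(values)
    members = [expression[g] for g in gene_set if g in expression]
    if not members or sigma == 0:
        return 0.0
    return mean((v - mu) / sigma for v in members)

# Hypothetical single-patient profile (log-expression values)
profile = {"GENE_A": 2.0, "GENE_B": 1.5, "GENE_C": 0.1, "GENE_D": 0.2}
print(activity_score(profile, {"GENE_A", "GENE_B"}))  # positive: set is up
```

A score computed per patient this way can then be correlated with time-to-event data, which is the survival-analysis step the abstract describes.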

  17. A browser-based 3D Visualization Tool designed for comparing CERES/CALIOP/CloudSAT level-2 data sets.

    Science.gov (United States)

    Chu, C.; Sun-Mack, S.; Chen, Y.; Heckert, E.; Doelling, D. R.

    2017-12-01

    At NASA Langley, Clouds and the Earth's Radiant Energy System (CERES) and Moderate Resolution Imaging Spectroradiometer (MODIS) data are merged with Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) data from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) and with the CloudSat Cloud Profiling Radar (CPR). The CERES merged product (C3M) matches up to three CALIPSO footprints with each MODIS pixel along its ground track. It then assigns the nearest CloudSat footprint to each of those MODIS pixels. The cloud properties from MODIS, retrieved using the CERES algorithms, are included in C3M with the matched CALIPSO and CloudSat products along with radiances from 18 MODIS channels. The dataset is used to validate the CERES-retrieved MODIS cloud properties and the computed TOA and surface flux difference using MODIS or CALIOP/CloudSat retrieved clouds. This information is then used to tune the computed fluxes to match the CERES observed TOA flux. A visualization tool will be invaluable for determining the cause of these large cloud and flux differences in order to improve the methodology. This effort is part of a larger effort to allow users to order the CERES C3M product sub-setted by time and parameter as well as the previously mentioned visualization capabilities. This presentation will show a new graphical 3D interface, 3D-CERESVis, that allows users to view both passive remote sensing satellites (MODIS and CERES) and active satellites (CALIPSO and CloudSat), such that the detailed vertical structures of cloud properties from CALIPSO and CloudSat are displayed side by side with horizontally retrieved cloud properties from MODIS and CERES. Similarly, the CERES computed profile fluxes, whether using MODIS or CALIPSO and CloudSat clouds, can also be compared. 3D-CERESVis is a browser-based visualization tool that makes use of techniques such as multiple synchronized cursors, COLLADA format data and Cesium.

  18. PROSPER: an integrated feature-based tool for predicting protease substrate cleavage sites.

    Directory of Open Access Journals (Sweden)

    Jiangning Song

    The ability to catalytically cleave protein substrates after synthesis is fundamental for all forms of life. Accordingly, site-specific proteolysis is one of the most important post-translational modifications. The key to understanding the physiological role of a protease is to identify its natural substrate(s). Knowledge of the substrate specificity of a protease can dramatically improve our ability to predict its target protein substrates, but this information must be utilized in an effective manner in order to efficiently identify protein substrates by in silico approaches. To address this problem, we present PROSPER, an integrated feature-based server for in silico identification of protease substrates and their cleavage sites for twenty-four different proteases. PROSPER utilizes established specificity information for these proteases (derived from the MEROPS database) with a machine learning approach to predict protease cleavage sites by using different, but complementary, sequence and structure characteristics. Features used by PROSPER include the local amino acid sequence profile, predicted secondary structure, solvent accessibility and predicted native disorder. Thus, for proteases with known amino acid specificity, PROSPER provides a convenient, pre-prepared tool for identifying protein substrates of the enzymes. Systematic prediction analysis for the twenty-four proteases thus far included in the database revealed that the features included in the tool strongly improve performance in terms of cleavage site prediction, as evidenced by their contribution to identifying known cleavage sites in substrates of these enzymes. In comparison with two state-of-the-art prediction tools, PoPS and SitePrediction, PROSPER achieves greater accuracy and coverage. To our knowledge, PROSPER is the first comprehensive server capable of predicting cleavage sites of multiple proteases within a single substrate.
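The "local amino acid sequence profile" feature described above can be sketched as a one-hot window encoding around a candidate cleavage site. This is an illustrative simplification only: PROSPER's real feature set also folds in predicted secondary structure, solvent accessibility and native disorder, none of which appear here:

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def window_features(sequence, site, flank=4):
    """Encode the local sequence around a candidate cleavage site
    (between positions site-1 and site) as a flat one-hot vector,
    padding with '-' beyond the termini.  Sketch of one feature type
    only; the flank width of 4 is an arbitrary choice.
    """
    window = [
        sequence[i] if 0 <= i < len(sequence) else "-"
        for i in range(site - flank, site + flank)
    ]
    vec = []
    for aa in window:
        vec.extend(1 if aa == a else 0 for a in AMINO_ACIDS)
    return vec

# A window of 2*flank residues over a 20-letter alphabet gives a
# fixed-length vector suitable for a standard classifier.
f = window_features("ACDEF", 2)
print(len(f), sum(f))
```

Feature vectors of this fixed length can then be fed to any off-the-shelf classifier, which is the machine-learning step the abstract refers to.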

  19. Speech recognition by means of a three-integrated-circuit set

    Energy Technology Data Exchange (ETDEWEB)

    Zoicas, A.

    1983-11-03

    The author uses pattern recognition methods for detecting word boundaries, and monitors incoming speech at 12 millisecond intervals. Frequency is divided into eight bands and analysis is achieved in an analogue interface integrated circuit, a pipeline digital processor and a control integrated circuit. Applications are suggested, including speech input to personal computers. 3 references.

  20. Integrating Information Services in an Academic Setting: The Organizational and Technical Challenge.

    Science.gov (United States)

    Branin, Joseph J.; And Others

    1993-01-01

    Describes a project to integrate the support and delivery of information services to faculty and staff at the University of Minnesota from the planning phase to implementation of a new organizational entity. Topics addressed include technical and organizational integration, control and delivery of services, and networking and organizational fit.…

  1. Test Review for Preschool-Wide Evaluation Tool (PreSET) Manual: Assessing Universal Program-Wide Positive Behavior Support in Early Childhood

    Science.gov (United States)

    Rodriguez, Billie Jo

    2013-01-01

    The Preschool-Wide Evaluation Tool (PreSET; Steed & Pomerleau, 2012) is published by Paul H. Brookes Publishing Company in Baltimore, MD. The PreSET purports to measure universal and program-wide features of early childhood programs' implementation fidelity of program-wide positive behavior intervention and support (PW-PBIS) and is,…

  2. Evaluating patient care communication in integrated care settings: application of a mixed method approach in cerebral palsy programs

    NARCIS (Netherlands)

    Gulmans, J.; Gulmans, J.; Vollenbroek-Hutten, Miriam Marie Rosé; van Gemert-Pijnen, Julia E.W.C.; van Harten, Willem H.

    2009-01-01

    Objective. In this study, we evaluated patient care communication in the integrated care setting of children with cerebral palsy in three Dutch regions in order to identify relevant communication gaps experienced by both parents and involved professionals. - Design. A three-step mixed method

  3. Impact of Vicarious Learning Experiences and Goal Setting on Preservice Teachers' Self-Efficacy for Technology Integration: A Pilot Study.

    Science.gov (United States)

    Wang, Ling; Ertmer, Peggy A.

    This pilot study was designed to explore how vicarious learning experiences and goal setting influence preservice teachers' self-efficacy for integrating technology into the classroom. Twenty undergraduate students who were enrolled in an introductory educational technology course at a large midwestern university participated and were assigned…

  4. An Integrative Review of In-Class Activities That Enable Active Learning in College Science Classroom Settings

    Science.gov (United States)

    Arthurs, Leilani A.; Kreager, Bailey Zo

    2017-01-01

    Engaging students in active learning is linked to positive learning outcomes. This study aims to synthesise the peer-reviewed literature about "active learning" in college science classroom settings. Using the methodology of an integrative literature review, 337 articles archived in the Educational Resources Information Center (ERIC) are…

  5. Simulation of Logging-while-drilling Tool Response Using Integral Equation Fast Fourier Transform

    Directory of Open Access Journals (Sweden)

    Sun Xiang-Yang

    2017-01-01

    We rely on the volume integral equation (VIE) method for the simulation of logging-while-drilling (LWD) tool response, using the integral equation fast Fourier transform (IE-FFT) algorithm to accelerate the computation of the matrix-vector product in the iterative solver. Exploiting the Toeplitz structure of the interpolation of the Green's function on uniform Cartesian grids, the method uses the FFT to calculate the matrix-vector multiplication, reducing both the memory requirement and the CPU time. In this paper, the IE-FFT method is used for the first time in the simulation of LWD. Numerical results are presented to demonstrate the accuracy and efficiency of the method. Compared with the method of moments (MoM) and other fast algorithms, IE-FFT has distinct advantages in terms of memory requirement and CPU time. In addition, this paper studies the truncation, the mesh elements, the size of the IE-FFT interpolation grids and the dip of the formation, and draws some conclusions with wide applicability.
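The speed-up described above rests on a standard identity: an n-by-n Toeplitz matrix can be embedded in a 2n-by-2n circulant matrix, whose matrix-vector product is a circular convolution and therefore a pointwise product in the Fourier domain. A minimal NumPy sketch of that O(n log n) product (not the authors' code):

```python
import numpy as np

def toeplitz_matvec(c, r, x):
    """Multiply a Toeplitz matrix by a vector in O(n log n) via the FFT.

    `c` is the first column and `r` the first row (c[0] == r[0]).
    The Toeplitz matrix is embedded in a circulant of size 2n; a
    circulant matvec is a circular convolution with its first column,
    i.e. a pointwise product after an FFT.
    """
    n = len(x)
    # First column of the circulant embedding: [c, 0, r[n-1], ..., r[1]]
    col = np.concatenate([c, [0.0], r[:0:-1]])
    pad = np.concatenate([x, np.zeros(n)])
    y = np.fft.ifft(np.fft.fft(col) * np.fft.fft(pad))
    return y[:n].real
```

In the IE-FFT scheme this trick is applied (per dimension) to the Green's-function interaction matrix, so only the first column and row need to be stored, which is where the memory saving comes from.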

  6. An Assessment Tool to Integrate Sustainability Principles into the Global Supply Chain

    Directory of Open Access Journals (Sweden)

    María Jesús Muñoz-Torres

    2018-02-01

    The integration of sustainability principles into the assessment of companies along supply chains is a growing research area. However, there is no generally accepted method to evaluate corporate sustainability performance (CSP), and the models and frameworks proposed in the literature present various important challenges to be addressed. A systematic literature review on the supply chain at the corporate level has been conducted, analyzing the main strengths and gaps in the sustainability assessment literature. This paper therefore aims to contribute to the development of this field by proposing an assessment framework that a leading company can adopt to extend sustainability principles to the rest of the members of the supply chain. The proposal is based on best practices and integrates and shares efforts with key initiatives (for instance, the Organizational Environmental Footprint from the European Commission, and the United Nations Environment Programme and Society of Environmental Toxicology and Chemistry (UNEP/SETAC)); moreover, it overcomes important limitations of current sustainability tools in a supply chain context consistent with the circular economy, the Sustainable Development Goals (SDGs), planetary boundaries, and social foundation requirements. The results obtained create, on the one hand, new opportunities for academics; on the other hand, in further research, the use of this framework could be a means of actively engaging companies in their supply chains and of achieving the implementation of a practical and comprehensive CSP assessment.

  7. Opportunites for Integrated Landscape Planning – the Broker, the Arena, the Tool

    Directory of Open Access Journals (Sweden)

    Julia Carlsson

    2017-12-01

    As an integrated social and ecological system, the forest landscape includes multiple values. The need for a landscape approach in land use planning is increasingly advocated in research, policy and practice. This paper explores how institutional conditions in the forest policy and management sector can be developed to meet demands for a multifunctional landscape perspective. Departing from obstacles recognised in the collaborative planning literature, we build an analytical framework which is operationalised in a Swedish context at the municipal level. Our illustrating case is Vilhelmina Model Forest, where actual barriers and opportunities for a multiple-value landscape approach are identified through 32 semi-structured interviews displaying stakeholders' views on forest values, ownership rights and willingness to consider multiple values, forest policy and management premises, and collaboration. As an opportunity to overcome the barriers, we suggest and discuss three key components by which an integrated landscape planning approach could be realised in forest management planning: the need for a landscape coordinator (broker), the need for a collaborative forum (arena), and the development of the existing forest management plan into an advanced multifunctional landscape plan (tool).

  8. Thinking Critically about Critical Thinking: Integrating Online Tools to Promote Critical Thinking

    Directory of Open Access Journals (Sweden)

    B. Jean Mandernach

    2006-01-01

    The value and importance of critical thinking is clearly established; the challenge for instructors lies in successfully promoting students’ critical thinking skills within the confines of a traditional classroom experience. Since instructors are faced with limited student contact time to meet their instructional objectives and facilitate learning, they are often forced to make instructional decisions between content coverage, depth of understanding, and critical analysis of course material. To address this dilemma, it is essential to integrate instructional strategies and techniques that can efficiently and effectively maximize student learning and critical thinking. Modern advances in educational technology have produced a range of online tools to assist instructors in meeting this instructional goal. This review will examine the theoretical foundations of critical thinking in higher education, discuss empirically-based strategies for integrating online instructional supplements to enhance critical thinking, offer techniques for expanding instructional opportunities outside the limitations of traditional class time, and provide practical suggestions for the innovative use of critical thinking strategies via online resources.

  9. Tools and methods for integrated resource planning. Improving energy efficiency and protecting the environment

    International Nuclear Information System (INIS)

    Swisher, J.N.; Martino Jannuzzi, G. de; Redlinger, R.Y.

    1997-01-01

    This book resulted from our recognition of the need to have systematic teaching and training materials on energy efficiency, end-use analysis, demand-side management (DSM) and integrated resource planning (IRP). This book addresses energy efficiency programs and IRP, exploring their application in the electricity sector. We believe that these methods will provide powerful and practical tools for designing efficient and environmentally-sustainable energy supply and demand-side programs to minimize the economic, environmental and other social costs of electricity conversion and use. Moreover, the principles of IRP can be and already are being applied in other areas such as natural gas, water supply, and even transportation and health services. Public authorities can use IRP principles to design programs to encourage end-use efficiency and environmental protection through environmental charges and incentives, non-utility programs, and utility programs applied to the functions remaining in monopoly concessions such as the distribution wires. Competitive supply firms can use IRP principles to satisfy customer needs for efficiency and low prices, to comply with present and future environmental restrictions, and to optimize supply and demand-side investments and returns, particularly at the distribution level, where local-area IRP is now being actively practiced. Finally, in those countries where a strong planning function remains in place, IRP provides a way to integrate end-use efficiency and environmental protection into energy development. (EG) 181 refs

  10. Tools and methods for integrated resource planning. Improving energy efficiency and protecting the environment

    Energy Technology Data Exchange (ETDEWEB)

    Swisher, J N; Martino Jannuzzi, G de; Redlinger, R Y

    1997-11-01

    This book resulted from our recognition of the need to have systematic teaching and training materials on energy efficiency, end-use analysis, demand-side management (DSM) and integrated resource planning (IRP). This book addresses energy efficiency programs and IRP, exploring their application in the electricity sector. We believe that these methods will provide powerful and practical tools for designing efficient and environmentally-sustainable energy supply and demand-side programs to minimize the economic, environmental and other social costs of electricity conversion and use. Moreover, the principles of IRP can be and already are being applied in other areas such as natural gas, water supply, and even transportation and health services. Public authorities can use IRP principles to design programs to encourage end-use efficiency and environmental protection through environmental charges and incentives, non-utility programs, and utility programs applied to the functions remaining in monopoly concessions such as the distribution wires. Competitive supply firms can use IRP principles to satisfy customer needs for efficiency and low prices, to comply with present and future environmental restrictions, and to optimize supply and demand-side investments and returns, particularly at the distribution level, where local-area IRP is now being actively practiced. Finally, in those countries where a strong planning function remains in place, IRP provides a way to integrate end-use efficiency and environmental protection into energy development. (EG) 181 refs.

  11. Methodology, Algorithms, and Emerging Tool for Automated Design of Intelligent Integrated Multi-Sensor Systems

    Directory of Open Access Journals (Sweden)

    Andreas König

    2009-11-01

    The emergence of novel sensing elements, computing nodes, wireless communication and integration technology provides unprecedented possibilities for the design and application of intelligent systems. Each new application system must be designed from scratch, employing sophisticated methods ranging from conventional signal processing to computational intelligence. Currently, a significant part of this overall algorithmic chain of the computational system model still has to be assembled manually by experienced designers in a time- and labor-consuming process. In this research work, this challenge is taken up, and a methodology and algorithms for the automated design of intelligent, integrated and resource-aware multi-sensor systems employing multi-objective evolutionary computation are introduced. The proposed methodology tackles the challenge of rapid prototyping of such systems under realization constraints and, additionally, includes features of system-instance-specific self-correction for sustained operation at large volume and in a dynamically changing environment. The extension of these concepts to reconfigurable hardware platforms yields so-called self-x sensor systems, which stands, e.g., for self-monitoring, -calibrating, -trimming, and -repairing/-healing systems. Selected experimental results prove the applicability and effectiveness of the proposed methodology and emerging tool. With this approach, competitive results were achieved with regard to classification accuracy, flexibility, and design speed under additional design constraints.
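Multi-objective evolutionary flows of the kind described above repeatedly prune candidate system designs by Pareto dominance. A minimal sketch of that filter, assuming every objective (e.g. classification error, power, silicon area) is to be minimised:

```python
def pareto_front(points):
    """Return the non-dominated subset of `points`.

    Each point is a tuple of objective values to be minimised.  A point
    p is dominated if some other point q is no worse in every objective
    and differs from p (hence strictly better in at least one).
    Illustrative helper, not code from the cited work.
    """
    return [
        p for p in points
        if not any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
    ]

# Hypothetical candidates as (error, power) pairs:
designs = [(1, 5), (2, 4), (3, 3), (2, 6), (4, 4)]
print(pareto_front(designs))
```

In an actual evolutionary design loop, the survivors of this filter would seed the next generation of candidate parameterisations.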

  12. A review of the potential for competitive cereal cultivars as a tool in integrated weed management.

    Science.gov (United States)

    Andrew, I K S; Storkey, J; Sparkes, D L

    2015-06-01

    Competitive crop cultivars offer a potentially cheap option to include in integrated weed management strategies (IWM). Although cultivars with high competitive potential have been identified amongst cereal crops, competitiveness has not traditionally been considered a priority for breeding or farmer cultivar choice. The challenge of managing herbicide-resistant weed populations has, however, renewed interest in cultural weed control options, including competitive cultivars. We evaluated the current understanding of the traits that explain variability in competitive ability between cultivars, the relationship between suppression of weed neighbours and tolerance of their presence and the existence of trade-offs between competitive ability and yield in weed-free scenarios. A large number of relationships between competitive ability and plant traits have been reported in the literature, including plant height, speed of development, canopy architecture and partitioning of resources. There is uncertainty over the relationship between suppressive ability and tolerance, although tolerance is a less stable trait over seasons and locations. To realise the potential of competitive crop cultivars as a tool in IWM, a quick and simple-to-use protocol for assessing the competitive potential of new cultivars is required; it is likely that this will not be based on a single trait, but will need to capture the combined effect of multiple traits. A way needs to be found to make this information accessible to farmers, so that competitive cultivars can be better integrated into their weed control programmes.

  13. Microencapsulation Technology: A Powerful Tool for Integrating Expansion and Cryopreservation of Human Embryonic Stem Cells

    Science.gov (United States)

    Malpique, Rita; Brito, Catarina; Jensen, Janne; Bjorquist, Petter; Carrondo, Manuel J. T.; Alves, Paula M.

    2011-01-01

    The successful implementation of human embryonic stem cell (hESC)-based technologies requires the production of relevant numbers of well-characterized cells and their efficient long-term storage. In this study, cells were microencapsulated in alginate to develop an integrated bioprocess for expansion and cryopreservation of pluripotent hESCs. Different three-dimensional (3D) culture strategies were evaluated and compared, specifically, microencapsulation of hESCs as: (i) single cells, (ii) aggregates and (iii) cells immobilized on microcarriers. In order to establish a scalable bioprocess, hESC-microcapsules were cultured in stirred tank bioreactors. The combination of microencapsulation and microcarrier technology resulted in a highly efficient protocol for the production and storage of pluripotent hESCs. This strategy ensured high expansion ratios (an approximately twenty-fold increase in cell concentration) and high cell recovery yields (>70%) after cryopreservation. When compared with non-encapsulated cells, cell survival post-thawing demonstrated a three-fold improvement without compromising hESC characteristics. Microencapsulation also improved the culture of hESC aggregates by protecting cells from hydrodynamic shear stress, controlling aggregate size and maintaining cell pluripotency for two weeks. This work establishes that microencapsulation technology may prove a powerful tool for integrating the expansion and cryopreservation of pluripotent hESCs. The 3D culture strategy developed herein represents a significant breakthrough towards the implementation of hESCs in clinical and industrial applications. PMID:21850261

  14. oPOSSUM: integrated tools for analysis of regulatory motif over-representation

    Science.gov (United States)

    Ho Sui, Shannan J.; Fulton, Debra L.; Arenillas, David J.; Kwon, Andrew T.; Wasserman, Wyeth W.

    2007-01-01

    The identification of over-represented transcription factor binding sites from sets of co-expressed genes provides insights into the mechanisms of regulation for diverse biological contexts. oPOSSUM, an internet-based system for such studies of regulation, has been improved and expanded in this new release. New features include a worm-specific version for investigating binding sites conserved between Caenorhabditis elegans and C. briggsae, as well as a yeast-specific version for the analysis of co-expressed sets of Saccharomyces cerevisiae genes. The human and mouse applications feature improvements in ortholog mapping, sequence alignments and the delineation of multiple alternative promoters. oPOSSUM2, introduced for the analysis of over-represented combinations of motifs in human and mouse genes, has been integrated with the original oPOSSUM system. Analysis using user-defined background gene sets is now supported. The transcription factor binding site models have been updated to include new profiles from the JASPAR database. oPOSSUM is available at http://www.cisreg.ca/oPOSSUM/ PMID:17576675
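
    The statistical core of a system like oPOSSUM is an over-representation test: the number of co-expressed (foreground) genes containing a binding site is compared against the rate in a background gene set. The sketch below illustrates the idea with a one-sided hypergeometric tail probability; this is a generic stand-in, not oPOSSUM's exact implementation, and the gene counts are invented.

```python
from math import comb

def overrep_pvalue(hits_fg, n_fg, hits_bg, n_bg):
    """One-sided hypergeometric p-value that a motif is over-represented
    in a foreground gene set relative to a background set.

    hits_fg: foreground genes containing the motif
    n_fg:    size of the foreground set
    hits_bg: background genes containing the motif
    n_bg:    size of the background (which includes the foreground)
    """
    # P(X >= hits_fg) when drawing n_fg genes without replacement from a
    # background of n_bg genes, hits_bg of which contain the motif.
    total = comb(n_bg, n_fg)
    tail = sum(
        comb(hits_bg, k) * comb(n_bg - hits_bg, n_fg - k)
        for k in range(hits_fg, min(hits_bg, n_fg) + 1)
    )
    return tail / total

# Hypothetical counts: 8 of 20 co-expressed genes carry the site,
# versus 50 of 500 genes in the background.
p = overrep_pvalue(8, 20, 50, 500)
```

A small p-value indicates the site occurs in the foreground set more often than the background rate would predict; user-defined background sets, as supported in this oPOSSUM release, simply change `hits_bg` and `n_bg`.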

  15. Effect of cutting fluids and cutting conditions on surface integrity and tool wear in turning of Inconel 713C

    Science.gov (United States)

    Hikiji, R.

    2018-01-01

    The trend toward downsizing of engines is increasing the number of turbochargers in use around Europe. In a turbocharger, the exhaust gas temperature is so high that parts made of the nickel-base superalloy Inconel 713C are used for their high-temperature strength. External turning of Inconel 713C parts used in actual automotive components was carried out. The effect of the cutting fluids and cutting conditions on surface integrity and tool wear was investigated, with consideration for the global environment and cost performance. As a result, within the range of cutting conditions used here, good surface integrity and tool life were obtained when the depth of cut was small. With a large corner radius, however, tool wear increased as the cutting length increased; at large cutting lengths, surface integrity and tool life deteriorated. As for the cutting fluids, the synthetic type showed better performance in surface integrity and tool life than the conventional emulsion. It was also clear that a large corner radius improved surface roughness and tool life but affected the dimensional error, etc., when machining a workpiece held in a cantilever style.

  16. Identification of similar regions of protein structures using integrated sequence and structure analysis tools

    Directory of Open Access Journals (Sweden)

    Heiland Randy

    2006-03-01

    Full Text Available Abstract Background Understanding protein function from its structure is a challenging problem. Sequence-based approaches for finding homology have broad use for annotation of both structure and function. 3D structural information of protein domains and their interactions provides a complementary view of structure-function relationships to sequence information. We have developed a web site http://www.sblest.org/ and an API of web services that enables users to submit protein structures and identify statistically significant neighbors and the underlying structural environments that underlie the match, using a suite of sequence and structure analysis tools. To do this, we have integrated S-BLEST, PSI-BLAST and HMMer-based superfamily predictions to give a unique integrated view of the prediction of SCOP superfamilies, EC numbers, and GO terms, as well as identification of the protein structural environments that are associated with that prediction. Additionally, we have extended UCSF Chimera and PyMOL to support our web services, so that users can characterize their own proteins of interest. Results Users are able to submit their own queries or use a structure already in the PDB. Currently the databases that a user can query include the popular structural datasets ASTRAL 40 v1.69, ASTRAL 95 v1.69, CLUSTER50, CLUSTER70, CLUSTER90 and PDBSELECT25. The results can be downloaded directly from the site and include function prediction, analysis of the most conserved environments and automated annotation of query proteins. These results reflect both the hits found with PSI-BLAST, HMMer and with S-BLEST. We have evaluated how well annotation transfer can be performed on SCOP IDs, Gene Ontology (GO) IDs and EC numbers. The method is very efficient and totally automated, generally taking around fifteen minutes for a 400-residue protein. Conclusion With structural genomics initiatives determining structures with little, if any, functional characterization

  17. Single Molecule Analysis Research Tool (SMART): an integrated approach for analyzing single molecule data.

    Directory of Open Access Journals (Sweden)

    Max Greenfeld

    Full Text Available Single molecule studies have expanded rapidly over the past decade and have the ability to provide an unprecedented level of understanding of biological systems. A common challenge upon introduction of novel, data-rich approaches is the management, processing, and analysis of the complex data sets that are generated. We provide a standardized approach for analyzing these data in the freely available software package SMART: Single Molecule Analysis Research Tool. SMART provides a format for organizing and easily accessing single molecule data, a general hidden Markov modeling algorithm for fitting an array of possible models specified by the user, a standardized data structure and graphical user interfaces to streamline the analysis and visualization of data. This approach guides experimental design, facilitating acquisition of the maximal information from single molecule experiments. SMART also provides a standardized format to allow dissemination of single molecule data and transparency in the analysis of reported data.
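
    The central fitting step described above is hidden Markov modeling of noisy single-molecule traces. As a minimal illustration of the idea only (not SMART's actual algorithm, which also fits model parameters and handles user-specified model arrays), the sketch below decodes the most likely two-state path from a quantized trace with the Viterbi algorithm; the state names, probabilities and trace are invented.

```python
import math

def viterbi(obs, states, log_start, log_trans, log_emit):
    """Most likely hidden-state path for a sequence of observations,
    given log start, transition and emission probabilities."""
    V = [{s: log_start[s] + log_emit[s][obs[0]] for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            # Best predecessor state for arriving in s at this step.
            best_prev, best_lp = max(
                ((p, V[-2][p] + log_trans[p][s]) for p in states),
                key=lambda t: t[1],
            )
            V[-1][s] = best_lp + log_emit[s][o]
            new_path[s] = path[best_prev] + [s]
        path = new_path
    last = max(states, key=lambda s: V[-1][s])
    return path[last]

# Two FRET-like states with "sticky" transitions and noisy low/high readouts.
lg = math.log
states = ("folded", "unfolded")
log_start = {"folded": lg(0.5), "unfolded": lg(0.5)}
log_trans = {"folded": {"folded": lg(0.9), "unfolded": lg(0.1)},
             "unfolded": {"folded": lg(0.1), "unfolded": lg(0.9)}}
log_emit = {"folded": {"high": lg(0.8), "low": lg(0.2)},
            "unfolded": {"high": lg(0.2), "low": lg(0.8)}}
trace = ["high", "high", "low", "low", "low", "high"]
states_path = viterbi(trace, states, log_start, log_trans, log_emit)
```

Because the transitions are sticky, the decoder smooths over the single noisy "high" at the end of the trace rather than inserting a one-frame state change, which is exactly the behavior that makes HMM fits preferable to naive thresholding of dwell times.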

  18. Development of an integrated e-health tool for people with, or at high risk of, cardiovascular disease: The Consumer Navigation of Electronic Cardiovascular Tools (CONNECT) web application.

    Science.gov (United States)

    Neubeck, Lis; Coorey, Genevieve; Peiris, David; Mulley, John; Heeley, Emma; Hersch, Fred; Redfern, Julie

    2016-12-01

    Cardiovascular disease is the leading killer globally and secondary prevention substantially reduces risk. Uptake of, and adherence to, face-to-face preventive programs is often low. Alternative models of care are exploiting the prominence of technology in daily life to facilitate lifestyle behavior change. To inform the development of a web-based application integrated with the primary care electronic health record, we undertook a collaborative user-centered design process to develop a consumer-focused e-health tool for cardiovascular disease risk reduction. A four-phase iterative process involved ten multidisciplinary clinicians and academics (primary care physician, nurses and allied health professionals), two design consultants, one graphic designer, three software developers and fourteen proposed end-users. This 18-month process involved: (1) defining the target audience and needs, (2) pilot testing and refinement, (3) software development including validation and testing the algorithm, (4) user acceptance testing and beta testing. From this process, researchers were able to better understand end-user needs and preferences, thereby improving and enriching the increasingly detailed system designs and prototypes for a mobile-responsive web application. We reviewed 14 relevant applications/websites and sixteen observational and interventional studies to derive a set of core components and ideal features for the system. These included the need for interactivity, visual appeal, credible health information, virtual rewards, and emotional and physical support. The features identified as essential were: (i) both mobile and web-enabled 'apps', (ii) an emphasis on medication management, (iii) a strong psychosocial support component. Subsequent workshops (n=6; 2×1.5h) informed the development of functionality and low-fidelity sketches of application interfaces. These ideas were next tested in consumer focus groups (n=9; 3×1.5h). Specifications for the application were

  19. APMS: An Integrated Suite of Tools for Measuring Performance and Safety

    Science.gov (United States)

    Statler, Irving C.; Lynch, Robert E.; Connors, Mary M. (Technical Monitor)

    1997-01-01

    statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.

  20. The Aviation Performance Measuring System (APMS): An Integrated Suite of Tools for Measuring Performance and Safety

    Science.gov (United States)

    Statler, Irving C.; Connor, Mary M. (Technical Monitor)

    1998-01-01

    statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the aircrew.

  1. FEDERAL USERS CONFERENCE PRODUCT LINE TOOL SET (PLTS) MAP PRODUCTION SYSTEM (MPS) ATLAS CUSTOM GRIDS [Rev 0 was draft]

    Energy Technology Data Exchange (ETDEWEB)

    HAYENGA, J.L.

    2006-12-19

    Maps, and more importantly Atlases, are assisting the user community in managing a large land area with complex issues, the most complex of which is the management of nuclear waste. The techniques and experiences discussed herein were gained while developing several atlases for use at the US Department of Energy's Hanford Site. The user community requires the ability to locate not only waste sites, but other features as well. Finding a specific waste site on a map and in the field is a difficult task at a site the size of Hanford. To find a specific waste site, the user begins by locating the item or object in an index, then locating the feature on the corresponding map within an atlas. Locating features requires a method for indexing them. The location index and how to place it on a map or atlas is the central theme presented in this article. The user requirements for atlases forced the design team to develop new and innovative solutions for requirements that Product Line Tool Set (PLTS) Map Production System (MPS)-Atlas was not designed to handle. The layout of the most complex atlases includes custom reference grids, multiple data frames, multiple map series, and up to 250 maps. All of these functional requirements are at the extreme edge of the capabilities of PLTS MPS-Atlas. This document outlines the setup of an atlas using PLTS MPS-Atlas to meet these requirements.

  2. FEDERAL USERS CONFERENCE PRODUCT LINE TOOL SET (PLTS) MAP PRODUCTION SYSTEM (MPS) ATLAS CUSTOM GRIDS [Rev 0 was draft]

    International Nuclear Information System (INIS)

    HAYENGA, J.L.

    2006-01-01

    Maps, and more importantly Atlases, are assisting the user community in managing a large land area with complex issues, the most complex of which is the management of nuclear waste. The techniques and experiences discussed herein were gained while developing several atlases for use at the US Department of Energy's Hanford Site. The user community requires the ability to locate not only waste sites, but other features as well. Finding a specific waste site on a map and in the field is a difficult task at a site the size of Hanford. To find a specific waste site, the user begins by locating the item or object in an index, then locating the feature on the corresponding map within an atlas. Locating features requires a method for indexing them. The location index and how to place it on a map or atlas is the central theme presented in this article. The user requirements for atlases forced the design team to develop new and innovative solutions for requirements that Product Line Tool Set (PLTS) Map Production System (MPS)-Atlas was not designed to handle. The layout of the most complex atlases includes custom reference grids, multiple data frames, multiple map series, and up to 250 maps. All of these functional requirements are at the extreme edge of the capabilities of PLTS MPS-Atlas. This document outlines the setup of an atlas using PLTS MPS-Atlas to meet these requirements

  3. Integration of resilience capabilities for Critical Infrastructures into the Emergency Management set-up

    DEFF Research Database (Denmark)

    Kozine, Igor; Andersen, Henning Boje

    2015-01-01

    We suggest an approach for maintaining and enhancing resilience that integrates the resilience capabilities of Critical Infrastructures (CIs) into the emergency management cycle (prevention, preparedness, response, and recovery). This allows emergency services to explicitly address resilience...

  4. EasyCloneMulti: A Set of Vectors for Simultaneous and Multiple Genomic Integrations in Saccharomyces cerevisiae

    DEFF Research Database (Denmark)

    Maury, Jerome; Germann, Susanne Manuela; Jacobsen, Simo Abdessamad

    2016-01-01

    Saccharomyces cerevisiae is widely used in the biotechnology industry for production of ethanol, recombinant proteins, food ingredients and other chemicals. In order to generate highly producing and stable strains, genome integration of genes encoding metabolic pathway enzymes is the preferred...... of integrative vectors, EasyCloneMulti, that enables multiple and simultaneous integration of genes in S. cerevisiae. By creating vector backbones that combine consensus sequences that aim at targeting subsets of Ty sequences and a quickly degrading selective marker, integrations at multiple genomic loci...... and a range of expression levels were obtained, as assessed with the green fluorescent protein (GFP) reporter system. The EasyCloneMulti vector set was applied to balance the expression of the rate-controlling step in the β-alanine pathway for biosynthesis of 3-hydroxypropionic acid (3HP). The best 3HP

  5. New tools for integrative thermochronology, and their application to the Colombian Eastern Cordillera

    Science.gov (United States)

    Ketcham, R. A.; Mora, A.; Almendral, A.; Parra-Amezquita, M.; Casallas, W.; Robles, W.

    2013-12-01

    We present two new tools for interpreting thermochronometric data that facilitate the joint use of multiple samples to better constrain thermal history, and demonstrate their utilization in the Colombian Eastern Cordillera. The first, Fetkin, is a finite element solver that takes as input a series of detailed balanced cross sections created using dedicated software such as (2D)Move, and solves the heat flow equation in 2D along with predicted thermochronometric ages which can be compared against measured data. It also performs an independent analysis of the cross sections and flags aspects that are structurally out of balance. It is distinguished from similar tools in 2D and 3D principally by providing a level of detail that allows for investigation of samples in very specific and complex structural contexts, and a workflow that allows the interpreter to engage in successive refinements of the structural model using the inferences provided by thermochronometric data. The second tool is a new set of functionality in HeFTy for inverse modeling of thermochronometric data that allows for simultaneous modeling of samples down a well or borehole. This extension forces attention on issues that have previously been relatively neglected in such modeling, in particular that of multiple provenance. It is axiomatic that mineral grains in different strata may have come from different regions and have different inherited thermal histories. Interpreting such data in a realistic geological context thus requires allowing for different inherited populations within and between samples. The rewards in doing so include more robust modeling and interpretation and, in some cases, insights concerning the unroofing histories of the source rocks that contributed to a given sedimentary unit. 
Similarly, the mutual constraints imposed by modeling multiple samples with known or constrained depositional and structural context considerably amplifies the resolving power of thermochronometric data
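
    Fetkin's core numerical task is solving the heat flow equation across an evolving cross section. As a toy stand-in for that 2D finite element solve, the sketch below advances the 1D conduction equation dT/dt = κ·d²T/dz² with an explicit finite-difference scheme between fixed surface and basal temperatures; the grid spacing, diffusivity and temperatures are illustrative values only, not Fetkin's numerics.

```python
def heat_1d(temps, kappa, dz, dt, steps):
    """Explicit finite-difference integration of the 1D heat equation
    with fixed (Dirichlet) boundary temperatures at both ends."""
    r = kappa * dt / dz**2
    assert r <= 0.5, "explicit scheme unstable for r > 0.5"
    T = list(temps)
    for _ in range(steps):
        # Interior nodes diffuse; boundary nodes are held fixed.
        T = [T[0]] + [
            T[i] + r * (T[i-1] - 2*T[i] + T[i+1]) for i in range(1, len(T)-1)
        ] + [T[-1]]
    return T

# 10 km crustal column, 1 km node spacing, 0 degC surface, 300 degC base,
# initially cold interior; dt = 4e11 s (~13 kyr) gives r = 0.4 <= 0.5.
T0 = [0.0] * 10 + [300.0]
T = heat_1d(T0, kappa=1e-6, dz=1000.0, dt=4.0e11, steps=1000)
```

After enough steps the column relaxes to the linear steady-state geotherm (here 30 degC/km); in a real thermochronometric workflow that evolving temperature field, not the steady state, is what converts structural restoration steps into predicted cooling ages.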

  6. Next-generation phage display: integrating and comparing available molecular tools to enable cost-effective high-throughput analysis.

    Directory of Open Access Journals (Sweden)

    Emmanuel Dias-Neto

    2009-12-01

    Full Text Available Combinatorial phage display has been used in the last 20 years in the identification of protein-ligands and protein-protein interactions, uncovering relevant molecular recognition events. Rate-limiting steps of combinatorial phage display library selection are (i) the counting of transducing units and (ii) the sequencing of the encoded displayed ligands. Here, we adapted emerging genomic technologies to minimize such challenges. We gained efficiency by applying in tandem real-time PCR for rapid quantification to enable bacteria-free phage display library screening, and added phage DNA next-generation sequencing for large-scale ligand analysis, reporting a fully integrated set of high-throughput quantitative and analytical tools. The approach is far less labor-intensive and allows rigorous quantification; for medical applications, including selections in patients, it also represents an advance for quantitative distribution analysis and ligand identification of hundreds of thousands of targeted particles from patient-derived biopsy or autopsy in a longer timeframe post library administration. Additional advantages over current methods include increased sensitivity, less variability, enhanced linearity, scalability, and accuracy at much lower cost. Sequences obtained by qPhage plus pyrosequencing were similar to a dataset produced from conventional Sanger-sequenced transducing units (TU), with no biases due to GC content, codon usage, and amino acid or peptide frequency. These tools allow phage display selection and ligand analysis at a >1,000-fold faster rate, and reduce costs approximately 250-fold for generating 10(6) ligand sequences. Our analyses demonstrate that whereas this approach correlates with the traditional colony-counting, it is also capable of a much larger sampling, allowing a faster, less expensive, more accurate and consistent analysis of phage enrichment. Overall, qPhage plus pyrosequencing is superior to TU-counting plus Sanger
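
    The real-time PCR quantification step replaces colony counting with a standard curve: Ct values measured on a dilution series are regressed against log10 copy number, and unknown samples are read off the fitted line. The sketch below shows that generic calculation (not the authors' exact protocol; the dilution data are invented, chosen to mimic a near-perfect-efficiency curve with slope about -3.3).

```python
def fit_standard_curve(log10_copies, ct):
    """Least-squares line Ct = slope * log10(copies) + intercept,
    fitted to a qPCR dilution series."""
    n = len(ct)
    mx = sum(log10_copies) / n
    my = sum(ct) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    # Invert the standard curve to estimate log10 copies from a measured Ct.
    return (ct - intercept) / slope

# Ten-fold dilution series (hypothetical): log10 phage copies vs. Ct.
dilutions = [3, 4, 5, 6, 7]
cts = [30.0, 26.7, 23.4, 20.1, 16.8]
slope, intercept = fit_standard_curve(dilutions, cts)
log_copies = quantify(21.75, slope, intercept)
```

Because titer estimates come straight from the fitted line, no overnight plating or colony counting is needed, which is the source of the speed-up the abstract describes.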

  7. Integration of modern statistical tools for the analysis of climate extremes into the web-GIS “CLIMATE”

    Science.gov (United States)

    Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrences, and mitigate their effects. For this purpose, we augmented web-GIS called “CLIMATE” to include a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage processing and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing the spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new powerful methods of time-dependent statistics of extremes, quantile regression and copula approach for the detailed analysis of various climate extreme events. Specifically, the very promising copula approach allows obtaining the structural connections between the extremes and the various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.

  8. Qualification of integrated tool environments (QUITE) for the development of computer-based safety systems in NPP

    International Nuclear Information System (INIS)

    Miedl, Horst

    2004-01-01

    In NPPs, I&C systems are meanwhile increasingly being backfitted with computer-based systems (I&C platforms). The corresponding safety functions are implemented in software, and this software is developed, configured and administered with the help of integrated tool environments (ITEs). An ITE offers a set of services which are used to construct an I&C system; it typically consists of software packages for project control and documentation, specification and design, automatic code generation and so on. Commercial ITEs are not necessarily conceived and qualified (type-tested) for nuclear-specific applications, but they are used - and will increasingly be used - for the implementation of nuclear safety-related I&C systems. Therefore, it is necessary to qualify commercial ITEs with respect to their influence on the quality of the target system for each I&C platform (dependent on the safety category of the target system). Examples of commercial ITEs are those of I&C platforms like SPINLINE 3, TELEPERM XP, Common Q, TRICON, etc. (Author)

  9. The Integrated Waste Tracking Systems (IWTS) - A Comprehensive Waste Management Tool

    International Nuclear Information System (INIS)

    Robert S. Anderson

    2005-01-01

    The US Department of Energy (DOE) Idaho National Laboratory (INL) site located near Idaho Falls, ID USA, has developed a comprehensive waste management and tracking tool that integrates multiple operational activities with characterization data from waste declaration through final waste disposition. The Integrated Waste Tracking System (IWTS) provides information necessary to help facility personnel properly manage their waste and demonstrate a wide range of legal and regulatory compliance. As a client-server database system, the IWTS is a proven tracking, characterization, compliance, and reporting tool that meets the needs of both operations and management while providing a high level of flexibility. This paper describes some of the history involved with the development and current use of IWTS as a comprehensive waste management tool as well as a discussion of IWTS deployments performed by the INL for outside clients. Waste management spans a wide range of activities including: work group interactions, regulatory compliance management, reporting, procedure management, and similar activities. The IWTS documents these activities and performs tasks in a computer-automated environment. Waste characterization data, container characterization data, shipments, waste processing, disposals, reporting, and limit compliance checks are just a few of the items that IWTS documents and performs to help waste management personnel perform their jobs. Throughout most hazardous and radioactive waste generating, storage and disposal sites, waste management is performed by many different groups of people in many facilities. Several organizations administer their areas of waste management using their own procedures and documentation independent of other organizations. Files are kept, some of which are treated as quality records, others less stringently. 
Quality records maintain a history of: changes performed after approval, the reason for the change(s), and a record of whom and when

  10. The Integrated Waste Tracking Systems (IWTS) - A Comprehensive Waste Management Tool

    Energy Technology Data Exchange (ETDEWEB)

    Robert S. Anderson

    2005-09-01

    The US Department of Energy (DOE) Idaho National Laboratory (INL) site located near Idaho Falls, ID USA, has developed a comprehensive waste management and tracking tool that integrates multiple operational activities with characterization data from waste declaration through final waste disposition. The Integrated Waste Tracking System (IWTS) provides information necessary to help facility personnel properly manage their waste and demonstrate a wide range of legal and regulatory compliance. As a client-server database system, the IWTS is a proven tracking, characterization, compliance, and reporting tool that meets the needs of both operations and management while providing a high level of flexibility. This paper describes some of the history involved with the development and current use of IWTS as a comprehensive waste management tool as well as a discussion of IWTS deployments performed by the INL for outside clients. Waste management spans a wide range of activities including: work group interactions, regulatory compliance management, reporting, procedure management, and similar activities. The IWTS documents these activities and performs tasks in a computer-automated environment. Waste characterization data, container characterization data, shipments, waste processing, disposals, reporting, and limit compliance checks are just a few of the items that IWTS documents and performs to help waste management personnel perform their jobs. Throughout most hazardous and radioactive waste generating, storage and disposal sites, waste management is performed by many different groups of people in many facilities. Several organizations administer their areas of waste management using their own procedures and documentation independent of other organizations. Files are kept, some of which are treated as quality records, others less stringently. 
Quality records maintain a history of: changes performed after approval, the reason for the change(s), and a record of whom and when

  11. Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models

    Science.gov (United States)

    Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.

    2017-12-01

    Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability conditions and resultant fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool that is capable of linking and overseeing the operations of two existing models, a water resource planning tool Water Evaluation and Planning (WEAP) model and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® in one Excel® file that provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls to run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California is managed for both hydropower and the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tshawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for use with different rivers and fish populations, assuming basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. Furthermore, the data-viewing capabilities of MMW facilitate the rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, in order to accelerate learning and management decision
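
    MMW itself is implemented as VBA macros in Excel, but the coupling pattern it embodies is language-independent: run the hydrology model, hand its flow output to the habitat model, and collect both results for display. The sketch below restates that pattern in Python; `toy_weap` and `toy_weaphish` are invented stand-ins, not the real WEAP/WEAPhish interfaces.

```python
from typing import Callable, Dict, List

class MiddleWare:
    """Orchestrates two model callables: a water model producing a flow
    series and a fish model consuming it."""
    def __init__(self, water_model: Callable[[dict], List[float]],
                 fish_model: Callable[[List[float]], float]):
        self.water_model = water_model
        self.fish_model = fish_model

    def run_scenario(self, scenario: dict) -> Dict[str, object]:
        flows = self.water_model(scenario)   # e.g. monthly flows (m^3/s)
        abundance = self.fish_model(flows)   # habitat-based population index
        return {"flows": flows, "abundance": abundance}

# Stand-in models: a fixed base flow minus diversions, and a threshold
# habitat rule counting how often flow stays above a minimum bypass flow.
def toy_weap(scenario):
    return [max(scenario["base_flow"] - d, 0.0) for d in scenario["diversions"]]

def toy_weaphish(flows):
    return sum(1.0 for f in flows if f >= 5.0) / len(flows)

mmw = MiddleWare(toy_weap, toy_weaphish)
result = mmw.run_scenario({"base_flow": 10.0, "diversions": [0, 2, 6, 8]})
```

Because the middleware owns the data handoff, either model can be swapped for a different river or species without touching the other, which is the reuse argument the abstract makes for MMW.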

  12. COMSY - A software tool for PLIM + PLEX with integrated risk-informed approaches

    International Nuclear Information System (INIS)

    Zander, A.; Nopper, H.; Roessner, R.

    2004-01-01

    The majority of mechanical components and structures in a thermal power plant are designed to experience a service life which is far above the intended design life. In most cases, only a small percentage of mechanical components are subject to significant degradation which may affect the integrity or the function of the component. If plant life extension (PLEX) is considered as an option, a plant specific PLIM strategy needs to be developed. One of the most important tasks of such a PLIM strategy is to identify those components which (i) are relevant for the safety and/or availability of the plant and (ii) experience elevated degradation due to their operating and design conditions. For these components special life management strategies need to be established to reliably monitor their condition. FRAMATOME ANP GmbH has developed the software tool COMSY, which is designed to efficiently support a plant-wide lifetime management strategy for static mechanical components, providing the basis for plant life extension (PLEX) activities. The objective is the economical and safe operation of power plants over their design lifetime - and beyond. The tool provides the capability to establish a program guided technical documentation of the plant by utilizing a virtual plant data model. The software integrates engineering analysis functions and comprehensive material libraries to perform a lifetime analysis for various degradation mechanisms typically experienced in power plants (e.g. flow-accelerated corrosion, intergranular stress corrosion cracking, strain-induced cracking, material fatigue, cavitation erosion, droplet impingement erosion, pitting, etc.). A risk-based prioritization serves to focus inspection activities on safety or availability relevant locations, where a degradation potential exists. Trending functions support the comparison of the as-measured condition with the predicted progress of degradation while making allowance for measurement tolerances. The

  13. Setting value optimization method in integration for relay protection based on improved quantum particle swarm optimization algorithm

    Science.gov (United States)

    Yang, Guo Sheng; Wang, Xiao Yang; Li, Xue Dong

    2018-03-01

    With the establishment of the integrated model of relay protection and the expanding scale of the power system, the global setting and optimization of relay protection is an extremely difficult task. This paper presents an application of an improved quantum particle swarm optimization algorithm to the global optimization of relay protection, using inverse-time overcurrent protection as an example: reliability, selectivity, speed of operation, and flexibility of the relay protection are selected as the four requirements used to establish the optimization targets, and the protection setting values of the whole system are optimized. Finally, for an actual power system, the setting value results of the proposed method are compared with those of the standard particle swarm algorithm. The results show that the improved quantum particle swarm optimization algorithm has strong search ability and good robustness, and is suitable for optimizing setting values in the relay protection of the whole power system.
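
The paper's specific improvements to the algorithm are not detailed in the abstract. As background, the baseline quantum-behaved PSO being improved can be sketched as follows, with a toy objective standing in for the composite reliability/selectivity/speed/flexibility target:

```python
import math
import random

def qpso(objective, dim, n_particles=20, iters=200, bounds=(-10.0, 10.0),
         beta=0.75, seed=0):
    """Baseline quantum-behaved PSO: each coordinate jumps around a stochastic
    attractor between the personal and global bests; the jump is scaled by the
    distance to the mean of all personal bests (no velocity term)."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pval = [objective(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        mbest = [sum(p[d] for p in pbest) / n_particles for d in range(dim)]
        for i, x in enumerate(xs):
            for d in range(dim):
                phi = rng.random()
                u = 1.0 - rng.random()            # u in (0, 1], keeps log finite
                attractor = phi * pbest[i][d] + (1.0 - phi) * gbest[d]
                jump = beta * abs(mbest[d] - x[d]) * math.log(1.0 / u)
                x[d] = attractor + jump if rng.random() < 0.5 else attractor - jump
                x[d] = min(max(x[d], lo), hi)     # keep settings within limits
            v = objective(x)
            if v < pval[i]:
                pbest[i], pval[i] = x[:], v
                if v < gval:
                    gbest, gval = x[:], v
    return gbest, gval
```

For relay setting optimization, the decision vector would hold the setting values and the objective would penalize violations of the four requirements; those details are specific to the paper and are not reproduced here.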

  14. IT Tools for Foresight: The Integrated Insight and Response System of Deutsche Telekom Innovation Laboratories

    DEFF Research Database (Denmark)

    Rohrbeck, René; Thom, Nico; Arnold, Heinrich

    2015-01-01

    The overall system consists of a tool for scanning for weak signals on change (PEACOQ Scouting Tool), a tool for collecting internal ideas (PEACOQ Gate 0.5), and a tool for triggering organizational responses (Foresight Landing page). Particularly the link to innovation management and R&D strategy...

  15. Changes in language development among autistic and peer children in segregated and integrated preschool settings.

    Science.gov (United States)

    Harris, S L; Handleman, J S; Kristoff, B; Bass, L; Gordon, R

    1990-03-01

    Five young children with autism enrolled in a segregated class, five other children with autism in an integrated class, and four normally developing peer children in the integrated class were compared for developmental changes in language ability as measured by the Preschool Language Scale before and after training. The results, based on Mann-Whitney U tests, showed that (a) all of the children as a group made better than normative progress in rate of language development, (b) the scores of the autistic children were significantly lower than the peers before and after treatment, and (c) there were no significant differences in changes in language ability between the autistic children in the segregated and integrated classes.

  16. An integrated simulation tool for analyzing the Operation and Interdependency of Natural Gas and Electric Power Systems

    OpenAIRE

    PAMBOUR Kwabena A.; CAKIR BURCIN; BOLADO LAVIN Ricardo; DIJKEMA Gerard

    2016-01-01

    In this paper, we present an integrated simulation tool for analyzing the interdependency of natural gas and electric power systems in terms of security of energy supply. In the first part, we develop mathematical models for the individual systems. In part two, we identify the interconnections between both systems and propose a method for coupling the combined simulation model. Next, we develop the algorithm for solving the combined system and integrate this algorithm into a simulation softwa...

  17. Exploring Challenges and Opportunities of Coproduction: USDA Climate Hub Efforts to Integrate Coproduction with Applied Research and Decision Support Tool Development in the Northwest

    Science.gov (United States)

    Roesch-McNally, G.; Prendeville, H. R.

    2017-12-01

    A lack of coproduction, the joint production of new technologies or knowledge among technical experts and other groups, is arguably one of the reasons why much scientific information and resulting decision support systems are not very usable. Increasingly, public agencies and academic institutions are emphasizing the importance of coproduction of scientific knowledge and decision support systems in order to facilitate greater engagement between the scientific community and key stakeholder groups. Coproduction has been embraced as a way for the scientific community to develop actionable scientific information that will assist end users in solving real-world problems. Increasing the level of engagement and stakeholder buy-in to the scientific process is particularly necessary in the context of growing politicization of science and the scientific process. Coproduction can be an effective way to build trust and can build on and integrate local and traditional knowledge. Employing coproduction strategies may enable the development of more relevant and useful information and decision support tools that address stakeholder challenges at relevant scales. The USDA Northwest Climate Hub has increasingly sought ways to integrate coproduction in the development of both applied research projects and decision support systems. Integrating coproduction within existing institutions, however, is not always simple, given that coproduction is often more focused on process than products, and products are, for better or worse, often the primary focus of applied research and tool development projects. The USDA Northwest Climate Hub sought to integrate coproduction into our FY2017 call-for-proposals process. As a result, we have a set of proposals and fledgling projects that fall along the engagement continuum (see Figure 1, attached). We will share the challenges and opportunities that emerged from this purposeful integration of coproduction into the work

  18. Image Navigation and Registration Performance Assessment Tool Set for the GOES-R Advanced Baseline Imager and Geostationary Lightning Mapper

    Science.gov (United States)

    De Luccia, Frank J.; Houchin, Scott; Porter, Brian C.; Graybill, Justin; Haas, Evan; Johnson, Patrick D.; Isaacson, Peter J.; Reth, Alan D.

    2016-01-01

    The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. For ABI, these metrics are the 3-sigma errors in navigation (NAV), channel-to-channel registration (CCR), frame-to-frame registration (FFR), swath-to-swath registration (SSR), and within frame registration (WIFR) for the Level 1B image products. For GLM, the single metric of interest is the 3-sigma error in the navigation of background images (GLM NAV) used by the system to navigate lightning strikes. 3-sigma errors are estimates of the 99.73rd percentile of the errors accumulated over a 24-hour data collection period. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24-hour evaluation period. Another aspect of the IPATS design that vastly reduces execution time is the off-line propagation of Landsat based truth images to the fixed grid coordinates system for each of the three GOES-R satellite locations, operational East and West and initial checkout locations. This paper describes the algorithmic design and implementation of IPATS and provides preliminary test results.
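
The abstract defines each 3-sigma metric as an estimate of the 99.73rd percentile of the errors collected over 24 hours. IPATS's exact estimator is not given; a simple order-statistic sketch of such a percentile estimate might look like:

```python
def three_sigma_metric(errors, q=0.9973):
    """Estimate a '3-sigma' INR metric as the q-th quantile of the absolute
    errors, with linear interpolation between order statistics."""
    s = sorted(abs(e) for e in errors)
    if not s:
        raise ValueError("no error samples")
    pos = q * (len(s) - 1)
    i = int(pos)
    if i + 1 == len(s):
        return s[i]
    return s[i] + (pos - i) * (s[i + 1] - s[i])
```

For a Gaussian error distribution this quantile of the absolute error coincides with three standard deviations, which is why the metric carries the "3-sigma" name.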

  19. Citizen's Charter in a primary health-care setting of Nepal: An accountability tool or a "mere wall poster"?

    Science.gov (United States)

    Gurung, Gagan; Gauld, Robin; Hill, Philip C; Derrett, Sarah

    2018-02-01

    Despite some empirical findings on the usefulness of citizen's charters on awareness of rights and services, there is a dearth of literature about charter implementation and impact on health service delivery in low-income settings. To gauge the level of awareness of the Charter within Nepal's primary health-care (PHC) system, perceived impact and factors affecting Charter implementation. Using a case study design, a quantitative survey was administered to 400 participants from 22 of 39 PHC facilities in the Dang District to gauge awareness of the Charter. Additionally, qualitative interviews with 39 key informants were conducted to explore the perceived impact of the Charter and factors affecting its implementation. Few service users (15%) were aware of the existence of the Charter. Among these, a greater proportion were literate, and there were also differences according to ethnicity and occupational group. The Charter was usually not properly displayed and had been implemented with no prior public consultation. It contained information that provided awareness of health facility services, particularly the more educated public, but had limited potential for increasing transparency and holding service providers accountable to citizens. Proper display, consultation with stakeholders, orientation or training and educational factors, follow-up and monitoring, and provision of sanctions were all lacking, negatively influencing the implementation of the Charter. Poor implementation and low public awareness of the Charter limit its usefulness. Provision of sanctions and consultation with citizens in Charter development are needed to expand the scope of Charters from information brochures to tools for accountability. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  20. An integrated approach using high time-resolved tools to study the origin of aerosols

    International Nuclear Information System (INIS)

    Di Gilio, A.; Gennaro, G. de; Dambruoso, P.; Ventrella, G.

    2015-01-01

    and confirm the influence of aerosol transported from heavily polluted areas on the receptor site. - Highlights: • An integrated approach was developed to characterize local and LRT contributions to PM. • The use of high time-resolved tools allowed one to identify also time-limited events. • Hourly ion concentrations enabled an accurate characterization of high-PM events. • Hourly BT cluster analysis enabled the identification of the LRT pathways. • This approach is useful to understand air pollution phenomena and support decision making

  1. An integrated approach using high time-resolved tools to study the origin of aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Di Gilio, A. [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy); ARPA PUGLIA, Corso Trieste, 27, 70126 Bari (Italy); Gennaro, G. de, E-mail: gianluigi.degennaro@uniba.it [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy); ARPA PUGLIA, Corso Trieste, 27, 70126 Bari (Italy); Dambruoso, P. [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy); ARPA PUGLIA, Corso Trieste, 27, 70126 Bari (Italy); Ventrella, G. [Chemistry Department, University of Bari, via Orabona, 4, 70126 Bari (Italy)

    2015-10-15

    framework and confirm the influence of aerosol transported from heavily polluted areas on the receptor site. - Highlights: • An integrated approach was developed to characterize local and LRT contributions to PM. • The use of high time-resolved tools allowed one to identify also time-limited events. • Hourly ion concentrations enabled an accurate characterization of high-PM events. • Hourly BT cluster analysis enabled the identification of the LRT pathways. • This approach is useful to understand air pollution phenomena and support decision making.

  2. Technology Integration with Teacher Candidates in a Summer-Camp Setting

    Science.gov (United States)

    Pilgrim, Jodi; Berry, Joan

    2014-01-01

    Many districts have implemented one-to-one technology initiatives, where students have access to computers or tablets for use in and out of school. Teachers participating in these initiatives may lack knowledge about ways to integrate technology into classroom practices (Pilgrim and Bledsoe, 2012); therefore, teacher preparation programs must…

  3. Integrating Behavioral Health Support into a Pediatric Setting: What Happens in the Exam Room?

    Science.gov (United States)

    Cuno, Kate; Krug, Laura M.; Umylny, Polina

    2015-01-01

    This article presents an overview of the Healthy Steps for Young Children (Healthy Steps) program at Montefiore Medical Center, in the Bronx, NY. The authors review the theoretical underpinnings of this national program for the promotion of early childhood mental health. The Healthy Steps program at Montefiore is integrated into outpatient…

  4. Barriers to Successful Implementation of Technology Integration in Educational Settings: A Case Study

    Science.gov (United States)

    Laferrière, T.; Hamel, C.; Searson, M.

    2013-01-01

    Representing issues discussed at the EduSummIT 2011 relative to essential conditions and barriers to successful technology integration, this article presents a systemic analysis of barriers that needed to be overcome for an information technology initiative (Remote Networked School project) to be successfully implemented. The analysis was…

  5. Integration of Information and Communication Technology and Pupils' Motivation in a Physical Education Setting

    Science.gov (United States)

    Legrain, Pascal; Gillet, Nicolas; Gernigon, Christophe; Lafreniere, Marc-André

    2015-01-01

    The purpose of this study was to test an integrative model regarding the impact of information and communication technology (ICT) on achievement in physical education. Pupils' perceptions of autonomy-support from teacher, satisfaction of basic psychological needs, and self-determined motivation were considered to mediate the impact of ICT on…

  6. The Impact of Social Integration Interventions and Job Coaches in Work Settings.

    Science.gov (United States)

    Chadsey, Janis G.; Linneman, Dan; Rusch, Frank R.; Cimera, Robert E.

    1997-01-01

    A study investigated effects of two intervention strategies (contextual and coworker) on the social interactions and integration with peers of five workers with mental retardation. Neither intervention had a significant impact on the frequency of interactions; however, it appeared that the presence of a job coach suppressed interaction rates.…

  7. Lessons learned developing a diagnostic tool for HIV-associated dementia feasible to implement in resource-limited settings: pilot testing in Kenya.

    Directory of Open Access Journals (Sweden)

    Judith Kwasa

    Full Text Available To conduct a preliminary evaluation of the utility and reliability of a diagnostic tool for HIV-associated dementia (HAD) for use by primary health care workers (HCW) which would be feasible to implement in resource-limited settings. In resource-limited settings, HAD is an indication for anti-retroviral therapy regardless of CD4 T-cell count. Anti-retroviral therapy, the treatment for HAD, is now increasingly available in resource-limited settings. Nonetheless, HAD remains under-diagnosed, likely because of limited clinical expertise and availability of diagnostic tests. Thus, a simple diagnostic tool which is practical to implement in resource-limited settings is an urgent need. A convenience sample of 30 HIV-infected outpatients was enrolled in Western Kenya. We assessed the sensitivity and specificity of a diagnostic tool for HAD as administered by a primary HCW. This was compared to an expert clinical assessment which included examination by a physician, neuropsychological testing, and in selected cases, brain imaging. Agreement between HCW and an expert examiner on certain tool components was measured using the Kappa statistic. The sample was 57% male, mean age was 38.6 years, mean CD4 T-cell count was 323 cells/µL, and 54% had less than a secondary school education. Six (20%) of the subjects were diagnosed with HAD by expert clinical assessment. The diagnostic tool was 63% sensitive and 67% specific for HAD. Agreement between HCW and expert examiners was poor for many individual items of the diagnostic tool (K = .03-.65). This diagnostic tool had moderate sensitivity and specificity for HAD. However, reliability was poor, suggesting that substantial training and formal evaluations of training adequacy will be critical to enable HCW to reliably administer a brief diagnostic tool for HAD.
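
The reported figures (63% sensitivity, 67% specificity, Kappa .03-.65) come from standard confusion-matrix and chance-corrected agreement calculations, which can be sketched as:

```python
def diagnostic_stats(tool_positive, expert_positive):
    """Sensitivity and specificity of a screening tool against an expert
    reference standard, plus Cohen's kappa for chance-corrected agreement."""
    pairs = list(zip(tool_positive, expert_positive))
    tp = sum(1 for t, e in pairs if t and e)          # tool +, expert +
    tn = sum(1 for t, e in pairs if not t and not e)  # tool -, expert -
    fp = sum(1 for t, e in pairs if t and not e)
    fn = sum(1 for t, e in pairs if not t and e)
    n = len(pairs)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    observed = (tp + tn) / n                          # raw agreement
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (observed - expected) / (1 - expected)
    return sensitivity, specificity, kappa
```

Kappa corrects raw agreement for the agreement expected by chance, which is why two raters can agree often yet score near zero when one response dominates.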

  8. MODexplorer: an integrated tool for exploring protein sequence, structure and function relationships.

    KAUST Repository

    Kosinski, Jan

    2013-02-08

    SUMMARY: MODexplorer is an integrated tool aimed at exploring the sequence, structural and functional diversity in protein families useful in homology modeling and in analyzing protein families in general. It takes as input either the sequence or the structure of a protein and provides alignments with its homologs along with a variety of structural and functional annotations through an interactive interface. The annotations include sequence conservation, similarity scores, ligand-, DNA- and RNA-binding sites, secondary structure, disorder, crystallographic structure resolution and quality scores of models implied by the alignments to the homologs of known structure. MODexplorer can be used to analyze sequence and structural conservation among the structures of similar proteins, to find structures of homologs solved in different conformational state or with different ligands and to transfer functional annotations. Furthermore, if the structure of the query is not known, MODexplorer can be used to select the modeling templates taking all this information into account and to build a comparative model. AVAILABILITY AND IMPLEMENTATION: Freely available on the web at http://modorama.biocomputing.it/modexplorer. Website implemented in HTML and JavaScript with all major browsers supported. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.

  9. The systems integration operations/logistics model as a decision-support tool

    International Nuclear Information System (INIS)

    Miller, C.; Vogel, L.W.; Joy, D.S.

    1989-01-01

    Congress has enacted legislation specifying Yucca Mountain, Nevada, for characterization as the candidate site for the disposal of spent fuel and high-level wastes and has authorized a monitored retrievable storage (MRS) facility if one is warranted. Nevertheless, the exact configuration of the facilities making up the Federal Waste Management System (FWMS) was not specified. This has left the Office of Civilian Radioactive Waste Management (OCRWM) the responsibility for assuring the design of a safe and reliable disposal system. In order to assist in the analysis of potential configuration alternatives, operating strategies, and other factors for the FWMS and its various elements, a decision-support tool known as the systems integration operations/logistics model (SOLMOD) was developed. SOLMOD is a discrete event simulation model that emulates the movement and interaction of equipment and radioactive waste as it is processed through the FWMS - from pickup at reactor pools to emplacement. The model can be used to measure the impacts of different operating schedules and rules, system configurations, and equipment and other resource availabilities on the performance of processes comprising the FWMS and how these factors combine to determine overall system performance. SOLMOD can assist in identifying bottlenecks and can be used to assess capacity utilization of specific equipment and staff as well as overall system resilience
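
SOLMOD is described as a discrete event simulation model; the core of any such simulator is a time-ordered event queue whose handlers schedule follow-up events. A minimal illustrative loop (the event names and durations below are invented for the sketch, not taken from SOLMOD):

```python
import heapq
import itertools

def simulate(initial_events):
    """Minimal discrete-event loop: an event is (time, label, handler);
    handlers may return follow-up events to schedule."""
    counter = itertools.count()            # tie-breaker so the heap never
    queue = []                             # has to compare handler functions
    for time, label, handler in initial_events:
        heapq.heappush(queue, (time, next(counter), label, handler))
    log = []
    while queue:
        time, _, label, handler = heapq.heappop(queue)
        log.append((time, label))
        for t, lab, h in handler(time):
            heapq.heappush(queue, (t, next(counter), lab, h))
    return log

# Toy waste-shipment flow: pickup at a reactor pool, then emplacement
# five time units later.
def pickup(now):
    return [(now + 5, "emplace", lambda t: [])]

schedule = simulate([(0, "pickup", pickup)])
```

Measuring bottlenecks and capacity utilization, as SOLMOD does, amounts to instrumenting such a loop with resource queues and waiting-time statistics.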

  10. Recent enhancements of the INSIGHT integrated in-core fuel management tool

    International Nuclear Information System (INIS)

    Akio, Yamamoto

    2001-01-01

    Recent enhancements of the INSIGHT system are described in this paper. The INSIGHT system is an integrated in-core fuel management tool for pressurized water reactors (PWRs) that runs on UNIX workstations. The INSIGHT system provides various capabilities which contribute to reducing fuel cycle cost and the workload of in-core fuel management tasks, i.e. core follow calculations, interactive loading pattern design, automated multicycle analysis and interface between detailed core calculation codes. To minimize engineers' workload, most of the input data for the analysis modules are automatically generated by the INSIGHT system through specification of calculation conditions in the graphic user interface. Recent enhancements of the INSIGHT system are mainly focused on improving the efficiency of loading pattern optimization and the flexibility of multicycle analyses. To increase optimization efficiency, a parallel calculation capability, various optimization theories, extension of heuristic rules, screening by neural networks and so on were incorporated in the loading pattern optimization module. The multicycle analysis module was rewritten to increase flexibility, such as cycle-dependent specification of loading pattern search methods. The INSIGHT system is currently used by Japanese utilities not only for regular in-core fuel management tasks but also for strategic fuel management studies to reduce fuel cycle cost.

  11. Planning principles as integral tool of financial planning implementation at enterprise

    Directory of Open Access Journals (Sweden)

    A.V. Overchuk

    2016-09-01

    Full Text Available The main problems of using financial planning tools, namely planning principles, in order to achieve an effective system of financial planning at the enterprise are considered. Planning is an integral part of management and supports sustainable enterprise development; effective financial planning is therefore a necessary means of realizing the main objective of the company – profit maximization. It is determined that the principles of financial planning are an objective category of planning science, serving as fundamental starting concepts that express the cumulative effect of a number of laws. In addition, on the basis of the aggregation and analysis of planning principles, the system of financial planning principles is specified, providing efficient use of the financial planning and control functions, achievement of the mission and objectives of the economic entity, and assurance of its profitability. The scientific novelty of the research lies in the specification of the financial planning principles defined at the enterprise and their semantic characteristics.

  12. MODexplorer: an integrated tool for exploring protein sequence, structure and function relationships.

    KAUST Repository

    Kosinski, Jan; Barbato, Alessandro; Tramontano, Anna

    2013-01-01

    SUMMARY: MODexplorer is an integrated tool aimed at exploring the sequence, structural and functional diversity in protein families useful in homology modeling and in analyzing protein families in general. It takes as input either the sequence or the structure of a protein and provides alignments with its homologs along with a variety of structural and functional annotations through an interactive interface. The annotations include sequence conservation, similarity scores, ligand-, DNA- and RNA-binding sites, secondary structure, disorder, crystallographic structure resolution and quality scores of models implied by the alignments to the homologs of known structure. MODexplorer can be used to analyze sequence and structural conservation among the structures of similar proteins, to find structures of homologs solved in different conformational state or with different ligands and to transfer functional annotations. Furthermore, if the structure of the query is not known, MODexplorer can be used to select the modeling templates taking all this information into account and to build a comparative model. AVAILABILITY AND IMPLEMENTATION: Freely available on the web at http://modorama.biocomputing.it/modexplorer. Website implemented in HTML and JavaScript with all major browsers supported. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.

  13. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    International Nuclear Information System (INIS)

    Ahmadi, Rouhollah; Khamehchi, Ehsan

    2013-01-01

    Conditioning stochastic simulations are very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike the previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space where all the primary and secondary data are easily mapped onto. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data

  14. Human Ageing Genomic Resources: Integrated databases and tools for the biology and genetics of ageing

    Science.gov (United States)

    Tacutu, Robi; Craig, Thomas; Budovsky, Arie; Wuttke, Daniel; Lehmann, Gilad; Taranukha, Dmitri; Costa, Joana; Fraifeld, Vadim E.; de Magalhães, João Pedro

    2013-01-01

    The Human Ageing Genomic Resources (HAGR, http://genomics.senescence.info) is a freely available online collection of research databases and tools for the biology and genetics of ageing. HAGR now features several databases with high-quality manually curated data: (i) GenAge, a database of genes associated with ageing in humans and model organisms; (ii) AnAge, an extensive collection of longevity records and complementary traits for >4000 vertebrate species; and (iii) GenDR, a newly incorporated database, containing both gene mutations that interfere with dietary restriction-mediated lifespan extension and consistent gene expression changes induced by dietary restriction. Since its creation about 10 years ago, major efforts have been undertaken to maintain the quality of data in HAGR, while further continuing to develop, improve and extend it. This article briefly describes the content of HAGR and details the major updates since its previous publications, in terms of both structure and content. The completely redesigned interface, more intuitive and more integrative of HAGR resources, is also presented. Altogether, we hope that through its improvements, the current version of HAGR will continue to provide users with the most comprehensive and accessible resources available today in the field of biogerontology. PMID:23193293

  15. Mendeley as an integral tool in the arsenal of modern scientist

    Directory of Open Access Journals (Sweden)

    Taras Kotyk

    2016-11-01

    Full Text Available This paper presents the possibilities of Mendeley – a reference manager and social network for researchers. The key aspects of using this software as an effective reference manager, as well as a tool for organizing a full-text archive of publications and processing scientific sources when conducting research, are highlighted. The possibilities of Mendeley as a social network, namely a means of communication and collaboration between researchers, sharing of reference databases and searching for new scientific publications, are presented as well. In general, Mendeley, due to its functionality, is an integral part of the scientific research carried out by students, scientists or laboratory research groups. The use of Mendeley by all members of a research project will allow them to effectively search for original sources and analyze them; to quickly create the reference list according to different styles; to follow other researchers in order to view relevant papers; to greatly enhance the quality of the research; and to expand the potential readership of their publications.

  16. Sensory Integration Training Tool Design for Children with Autism Spectrum Disorder

    Directory of Open Access Journals (Sweden)

    Jiang Lijun

    2017-01-01

    Full Text Available This study aims to design a training tool for the therapy of children with autism spectrum disorder (ASD). During sensory integration therapy, the children typically pass through an obstacle track several times carrying sandbags, which are picked up at the starting point and thrown into a box at the end. Counting the sandbags helps the children grasp their progress through the task. We redesigned the counting box, named “Skybox”, which assists counting by playing a sound whenever it detects something thrown into it. To probe the sound preferences of the two main subjects, an experiment with four kinds of sounds was conducted using the method of paired comparisons. The results show that they like animal sounds most, followed by human voice and nature sounds, with musical instruments last. The material preference experiment shows the two subjects like acrylic most, wood and paper are intermediate, and fur is last. Skybox shortened their training time by 23.53%, 29.87% and 37.37% in three different projects. We consider that Skybox holds the children's attention, thereby reducing distraction and improving their performance in the usability test.
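
The preference experiment uses the method of paired comparisons: every pair of options is presented, wins are tallied, and options are ranked by total wins. A minimal sketch, where the `prefer` callback stands in for a subject's choice between two options:

```python
from itertools import combinations

def paired_comparison_ranking(items, prefer):
    """Method of paired comparisons: present every pair of items, tally
    which item wins each comparison, rank by total wins."""
    wins = {item: 0 for item in items}
    for a, b in combinations(items, 2):
        wins[prefer(a, b)] += 1           # prefer() returns the chosen item
    return sorted(items, key=lambda i: -wins[i])
```

With k options this requires k(k-1)/2 presentations per subject, which is why the method suits small option sets like the four sounds tested here.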

  17. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com [Amirkabir University of Technology, PhD Student at Reservoir Engineering, Department of Petroleum Engineering (Iran, Islamic Republic of); Khamehchi, Ehsan [Amirkabir University of Technology, Faculty of Petroleum Engineering (Iran, Islamic Republic of)

    2013-12-15

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data into reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques, which call for an approximate forward model (filter) to integrate secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Defining the intermediate space, as may be achieved by applying artificial intelligence tools such as neural networks and fuzzy inference systems, eliminates the need for the filters used in previous techniques. The applicability of the proposed approach for conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, and the model is also consistent with the map of secondary data.
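    As a rough illustration of the intermediate-space idea (not the paper's calibrated models), heterogeneous data types can each be mapped onto a common scale, for example via fuzzy membership functions, and then fused into a single conditioning value. All thresholds, variable names, and the fusion rule below are assumptions for the sketch.

    ```python
    def tri_membership(x, lo, peak, hi):
        """Triangular fuzzy membership on [lo, hi], peaking at `peak`."""
        rising = max(0.0, min(1.0, (x - lo) / (peak - lo)))
        falling = max(0.0, min(1.0, (hi - x) / (hi - peak)))
        return min(rising, falling)

    # Map each data type onto the shared [0, 1] "fracture indicator" space.
    # The numeric readings and membership bounds are invented for illustration.
    log_val = tri_membership(75.0, lo=40.0, peak=80.0, hi=120.0)   # well-log reading
    seis_val = tri_membership(0.6, lo=0.0, peak=0.7, hi=1.0)       # seismic attribute

    # A simple fusion rule (minimum = fuzzy AND) yields one conditioning value
    # in the intermediate space, usable by the simulation alongside hard data.
    combined = min(log_val, seis_val)
    print(combined)  # ≈ 0.857
    ```

    The point of the construction is that once both data types live on the same scale, no per-data-type forward model (filter) is needed to bring them into the simulation.
    
    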

  18. SNP-RFLPing 2: an updated and integrated PCR-RFLP tool for SNP genotyping

    Directory of Open Access Journals (Sweden)

    Chang Hsueh-Wei

    2010-04-01

    Full Text Available Abstract Background The PCR-restriction fragment length polymorphism (RFLP) assay is a cost-effective method for SNP genotyping and mutation detection, but manually mining for restriction enzyme sites is challenging and cumbersome. Three years after we constructed SNP-RFLPing, a freely accessible database and analysis tool for restriction enzyme mining of SNPs, significant improvements over the 2006 version have been made and incorporated into the latest version, SNP-RFLPing 2. Results The primary aim of SNP-RFLPing 2 is to provide comprehensive PCR-RFLP information with multiple functionalities for SNPs, such as SNP retrieval for multiple species, different polymorphism types (bi-allelic, tri-allelic, tetra-allelic, or indels), gene-centric searching, HapMap tagSNPs, gene ontology-based searching, miRNAs, and SNP500Cancer. The RFLP restriction enzymes and the corresponding PCR primers for the natural and mutagenic types of each SNP are analyzed simultaneously. All RFLP restriction enzyme prices are also provided to aid selection. Furthermore, the updating problems previously encountered by most SNP-related databases are resolved by an on-line retrieval system. Conclusions The user interfaces for functional SNP analyses have been substantially improved and integrated. SNP-RFLPing 2 offers a new and user-friendly interface for RFLP genotyping that can be used in association studies and is freely available at http://bio.kuas.edu.tw/snp-rflping2.
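    The core of restriction enzyme mining for a SNP can be sketched as follows: an enzyme is useful for RFLP genotyping when its recognition site is present for one allele but absent for the other. The enzyme recognition sites below are real, but the flanking sequences are invented for illustration; a production tool such as SNP-RFLPing would also handle degenerate bases and the reverse strand.

    ```python
    # Recognition sites for a few common restriction enzymes (exact sites only,
    # no IUPAC ambiguity codes in this sketch).
    ENZYMES = {
        "EcoRI": "GAATTC",
        "TaqI":  "TCGA",
        "AluI":  "AGCT",
        "DraI":  "TTTAAA",
    }

    def discriminating_enzymes(flank5, flank3, alleles, enzymes=ENZYMES):
        """Return enzymes whose recognition site is present for exactly one
        allele, i.e. enzymes that can distinguish the SNP genotypes by RFLP."""
        result = {}
        for name, site in enzymes.items():
            cuts = {a: site in (flank5 + a + flank3) for a in alleles}
            if len(set(cuts.values())) == 2:  # cuts one allele but not the other
                result[name] = cuts
        return result

    # A C/T SNP whose C allele completes an EcoRI site (GAATTC); the flanking
    # sequences here are made up for the example.
    hits = discriminating_enzymes("TTGAATT", "GGC", ("C", "T"))
    print(hits)  # {'EcoRI': {'C': True, 'T': False}}
    ```

    In a real pipeline this check runs over every enzyme in a catalog such as REBASE, and the PCR primers are then designed around the discriminating site.
    
    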

  19. SNP-RFLPing 2: an updated and integrated PCR-RFLP tool for SNP genotyping.

    Science.gov (United States)

    Chang, Hsueh-Wei; Cheng, Yu-Huei; Chuang, Li-Yeh; Yang, Cheng-Hong

    2010-04-08

    The PCR-restriction fragment length polymorphism (RFLP) assay is a cost-effective method for SNP genotyping and mutation detection, but manually mining for restriction enzyme sites is challenging and cumbersome. Three years after we constructed SNP-RFLPing, a freely accessible database and analysis tool for restriction enzyme mining of SNPs, significant improvements over the 2006 version have been made and incorporated into the latest version, SNP-RFLPing 2. The primary aim of SNP-RFLPing 2 is to provide comprehensive PCR-RFLP information with multiple functionalities for SNPs, such as SNP retrieval for multiple species, different polymorphism types (bi-allelic, tri-allelic, tetra-allelic, or indels), gene-centric searching, HapMap tagSNPs, gene ontology-based searching, miRNAs, and SNP500Cancer. The RFLP restriction enzymes and the corresponding PCR primers for the natural and mutagenic types of each SNP are analyzed simultaneously. All RFLP restriction enzyme prices are also provided to aid selection. Furthermore, the updating problems previously encountered by most SNP-related databases are resolved by an on-line retrieval system. The user interfaces for functional SNP analyses have been substantially improved and integrated. SNP-RFLPing 2 offers a new and user-friendly interface for RFLP genotyping that can be used in association studies and is freely available at http://bio.kuas.edu.tw/snp-rflping2.

  20. Transboundary Water: Improving Methodologies and Developing Integrated Tools to Support Water Security

    Science.gov (United States)

    Hakimdavar, Raha; Wood, Danielle; Eylander, John; Peters-Lidard, Christa; Smith, Jane; Doorn, Brad; Green, David; Hummel, Corey; Moore, Thomas C.

    2018-01-01

    River basins for which transboundary coordination and governance are factors are of concern to US national security, yet there is often a lack of sufficient data-driven information, at the needed time horizons, to inform transboundary water decision-making for the intelligence, defense, and foreign policy communities. To address this need, a two-day workshop entitled Transboundary Water: Improving Methodologies and Developing Integrated Tools to Support Global Water Security was held in August 2017 in Maryland. The committee that organized and convened the workshop (the Organizing Committee) included representatives from the National Aeronautics and Space Administration (NASA), the US Army Corps of Engineers Engineer Research and Development Center (ERDC), and the US Air Force. The primary goal of the workshop was to advance knowledge of the current technical information needs and gaps of the US Government and its partners in supporting national security interests related to transboundary water. The workshop also aimed to identify avenues for greater communication and collaboration among the scientific, intelligence, defense, and foreign policy communities. The discussion of transboundary water was set in the context of the broader global water challenges facing US national security.