WorldWideScience

Sample records for integrated tool set

  1. PACOM: A Versatile Tool for Integrating, Filtering, Visualizing, and Comparing Multiple Large Mass Spectrometry Proteomics Data Sets.

    Science.gov (United States)

    Martínez-Bartolomé, Salvador; Medina-Aunon, J Alberto; López-García, Miguel Ángel; González-Tejedo, Carmen; Prieto, Gorka; Navajas, Rosana; Salazar-Donate, Emilio; Fernández-Costa, Carolina; Yates, John R; Albar, Juan Pablo

    2018-04-06

    Mass-spectrometry-based proteomics has evolved into a high-throughput technology in which numerous large-scale data sets are generated from diverse analytical platforms. Furthermore, several scientific journals and funding agencies have emphasized the storage of proteomics data in public repositories to facilitate its evaluation, inspection, and reanalysis. (1) As a consequence, public proteomics data repositories are growing rapidly. However, tools are needed to integrate multiple proteomics data sets to compare different experimental features or to perform quality control analysis. Here, we present a new stand-alone Java tool, Proteomics Assay COMparator (PACOM), that can import, combine, and simultaneously compare numerous proteomics experiments to check the integrity of the proteomic data as well as verify data quality. With PACOM, the user can detect sources of error that may have been introduced at any step of a proteomics workflow and that influence the final results. Data sets can be easily compared and integrated, and data quality and reproducibility can be visually assessed through a rich set of graphical representations of proteomics data features as well as a wide variety of data filters. Its flexibility and easy-to-use interface make PACOM a unique tool for daily use in a proteomics laboratory. PACOM is available at https://github.com/smdb21/pacom.
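
    As a rough illustration of the kind of cross-experiment comparison PACOM automates (this sketch is not PACOM's API; the run names and protein accessions are hypothetical), the overlap between identification lists can be computed with plain Python sets:

        # Hypothetical protein accessions identified in three MS runs.
        run_a = {"P01308", "P04406", "P68871", "P02768"}
        run_b = {"P01308", "P04406", "P62988"}
        run_c = {"P04406", "P02768", "P69905"}
        runs = {"run_a": run_a, "run_b": run_b, "run_c": run_c}

        # Proteins identified in every run.
        print("in all runs:", sorted(set.intersection(*runs.values())))

        # Proteins unique to each run: a quick reproducibility check.
        for name, ids in runs.items():
            others = set.union(*(v for k, v in runs.items() if k != name))
            print(f"unique to {name}:", sorted(ids - others))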

  2. Integrated Radiation Analysis and Design Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Radiation Analysis and Design Tools (IRADT) Project develops and maintains an integrated tool set that collects the current best practices, databases,...

  3. Idea: an integrated set of tools for sustainable nuclear decommissioning projects

    International Nuclear Information System (INIS)

    Detilleux, M.; Centner, B.; Vanderperre, S.; Wacquier, W.

    2008-01-01

    Decommissioning of nuclear installations constitutes an important challenge and shall prove to the public that the whole nuclear life cycle is fully mastered by the nuclear industry. This could lead to easier public acceptance of the construction of new nuclear power plants. When ceasing operation, owners and operators of nuclear installations look for solutions to assess and keep decommissioning costs at a reasonable level, to fully characterise waste streams (in particular radiological inventories of difficult-to-measure radionuclides) and to reduce personnel exposure during the decommissioning activities, taking into account several project-, site- and country-specific constraints. In response to this need, Tractebel Engineering has developed IDEA (Integrated DEcommissioning Application), an integrated set of computer tools to support the engineering activities carried out in the frame of a decommissioning project. IDEA provides optimized solutions from an economic, environmental, social and safety perspective. (authors)

  4. VLM Tool for IDS Integration

    Directory of Open Access Journals (Sweden)

    Cǎtǎlin NAE

    2010-03-01

    Full Text Available This paper is dedicated to a very specific type of analysis tool (VLM - Vortex Lattice Method) to be integrated in an IDS - Integrated Design System - tailored to the needs of the small aircraft industry. The major interest is the possibility to simulate, at very low computational cost, a preliminary set of aerodynamic characteristics: basic global aerodynamic characteristics (Lift, Drag, Pitching Moment) and aerodynamic derivatives for longitudinal and lateral-directional stability analysis. This enables fast investigation of the influence of configuration changes in a very efficient computational environment. Using experimental data and/or CFD information for a specific calibration of the VLM method, the reliability of the analysis may be increased, so that a first (iteration zero) aerodynamic evaluation of the preliminary 3D configuration is possible. The output of this tool is the basic-state aerodynamics and the associated stability and control derivatives, as well as a complete set of information on specific loads on major airframe components. The major interest in using and validating this type of method comes from the possibility to integrate it as a tool in an IDS system for the conceptual design phase, as considered for development in the CESAR project (IP, EU FP6).
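
    For readers new to the method, the core of a vortex lattice solver (standard textbook formulation, not taken from this paper) is a linear system enforcing flow tangency at the panel control points, followed by a Kutta-Joukowski evaluation of the lift:

        $$\sum_j A_{ij}\,\Gamma_j = -\,\mathbf{V}_\infty \cdot \mathbf{n}_i, \qquad L = \rho\, V_\infty \sum_j \Gamma_j\, \Delta y_j$$

    Here A_ij is the normal-velocity influence coefficient of horseshoe vortex j at control point i, Gamma_j are the panel circulations, n_i the panel normals, and Delta y_j the spanwise panel widths. Calibrating against experiment or CFD, as the paper suggests, amounts to correcting the loads obtained from this linear model.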

  5. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extent of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. It is therefore important to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms combines low-level image processing and geospatial information handling tools with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented to developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: In the absence of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform guarantees support and enlarges the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30 January 2014.
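
    A minimal sketch of the low-level building blocks such a toolbox composes (the GDAL raster calls are real; the file name and the crude brightness threshold used as a built-up proxy are invented for illustration):

        import numpy as np
        from osgeo import gdal

        # Open a (hypothetical) single-band satellite scene and read it as an array.
        dataset = gdal.Open("scene.tif")
        band = dataset.GetRasterBand(1).ReadAsArray().astype(float)

        # Toy vulnerability proxy: fraction of pixels brighter than a threshold,
        # a crude stand-in for built-up area extraction.
        built_up_fraction = np.mean(band > 120.0)
        print(f"approximate built-up fraction: {built_up_fraction:.2%}")

    The value of the SENSUM tools is precisely in replacing such ad-hoc thresholds with validated, vulnerability-oriented workflows on top of these libraries.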

  6. The Overture Initiative Integrating Tools for VDM

    DEFF Research Database (Denmark)

    Larsen, Peter Gorm; Battle, Nick; Ferreira, Miguel

    2010-01-01

    Overture is a community-based initiative that aims to develop a common open-source platform integrating a range of tools for constructing and analysing formal models of systems using VDM. The mission is both to provide an industrial-strength tool set for VDM and to provide an environment...

  7. Integration of g4tools in Geant4

    International Nuclear Information System (INIS)

    Hřivnáčová, Ivana

    2014-01-01

    g4tools, originally part of the inlib and exlib packages, provides a very light and easy-to-install set of C++ classes that can be used to perform analysis in a Geant4 batch program. It allows the user to create and manipulate histograms and ntuples and to write them in supported file formats (ROOT, AIDA XML, CSV and HBOOK). It is integrated in Geant4 through analysis manager classes, thus providing a uniform interface to the g4tools objects and also hiding the differences between the classes for the different supported output formats. Moreover, additional features, such as histogram activation or support for Geant4 units, are implemented in the analysis classes following user requests. A set of Geant4 user interface commands allows the user to create histograms and set their properties interactively or in Geant4 macros. g4tools was first introduced in the Geant4 9.5 release, where its use was demonstrated in one basic example, and it is already used in the majority of the Geant4 examples within the Geant4 9.6 release. In this paper, we give an overview and the present status of the integration of g4tools in Geant4 and report on upcoming new features.

  8. APMS: An Integrated Set of Tools for Measuring Safety

    Science.gov (United States)

    Statler, Irving C.; Reynard, William D. (Technical Monitor)

    1996-01-01

    statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.

  9. Set-Valued Stochastic Equation with Set-Valued Square Integrable Martingale

    Directory of Open Access Journals (Sweden)

    Li Jun-Gang

    2017-01-01

    Full Text Available In this paper, we shall introduce the stochastic integral of a stochastic process with respect to a set-valued square integrable martingale. Then we shall give the Aumann integral measurable theorem, and give the set-valued stochastic Lebesgue integral and the set-valued square integrable martingale integral equation. The existence and uniqueness of the solution to the set-valued stochastic integral equation are proved. The discussion will be useful in optimal control and mathematical finance with psychological factors.
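
    For context, the Aumann-type construction that this line of work builds on defines the integral of a set-valued map F through its integrable selections (standard formulation, not quoted from the paper):

        $$\int_A F\,d\mu = \left\{ \int_A f\,d\mu \;:\; f \in S_F \right\}, \qquad S_F = \{ f \in L^1(\mu) : f(\omega) \in F(\omega)\ \mu\text{-a.e.} \}$$

    Broadly speaking, the stochastic versions studied here are built the same way, selection by selection, with the measure integral replaced by an integral against time or against a martingale.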

  10. Activity-Centred Tool Integration

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius

    2003-01-01

    This paper is concerned with integration of heterogeneous tools for system development. We argue that such tools should support concrete activities (e.g., programming, unit testing, conducting workshops) in contrast to abstract concerns (e.g., analysis, design, implementation). A consequence of this is that tools — or components — that support activities well should be integrated in ad-hoc, dynamic, and heterogeneous ways. We present a peer-to-peer architecture for this based on type-based publish subscribe and give an example of its use.
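
    A minimal sketch of type-based publish/subscribe, the integration mechanism the architecture relies on (illustrative only, not the authors' implementation; the event bus and event type are invented):

        from collections import defaultdict
        from dataclasses import dataclass

        class EventBus:
            """Routes published events to subscribers keyed by the event's type."""
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, event_type, handler):
                self._subscribers[event_type].append(handler)

            def publish(self, event):
                for handler in self._subscribers[type(event)]:
                    handler(event)

        @dataclass
        class TestFailed:
            test_name: str

        bus = EventBus()
        bus.subscribe(TestFailed, lambda e: print("editor: jump to", e.test_name))
        bus.publish(TestFailed("test_parser"))  # only TestFailed subscribers fire

    Because tools subscribe by message type rather than by naming each other, new components can join or leave the tool federation at runtime, which is the ad-hoc, dynamic integration the paper argues for.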

  11. Wind Integration Data Sets | Grid Modernization | NREL

    Science.gov (United States)

    NREL's wind integration data sets provide ten-minute time-series wind data for 2004, 2005, and 2006 to help energy professionals perform wind integration studies and estimate power production from hypothetical wind power plants.
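
    A sketch of the kind of calculation these data support: deriving a capacity factor and annual energy from a ten-minute power time series (synthetic data here, not the NREL files):

        import numpy as np

        rng = np.random.default_rng(seed=1)
        rated_mw = 100.0
        # Synthetic ten-minute power output for one year (365 * 24 * 6 samples).
        power_mw = np.clip(rng.gamma(shape=2.0, scale=15.0, size=52_560), 0, rated_mw)

        capacity_factor = power_mw.mean() / rated_mw
        energy_gwh = power_mw.sum() * (10 / 60) / 1000  # MW over 10 min -> MWh -> GWh
        print(f"capacity factor: {capacity_factor:.1%}, annual energy: {energy_gwh:.0f} GWh")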

  12. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    Science.gov (United States)

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

    Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from the ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing the testing of thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge of individual proteins.
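
    A toy version of the keyword-intersection idea at PANDORA's core (the keyword-to-protein annotations below are invented; PANDORA itself draws them from SwissProt, GO and the other sources listed above):

        from itertools import combinations

        # Invented keyword -> protein-set annotations.
        annotations = {
            "kinase": {"P1", "P2", "P5"},
            "membrane": {"P2", "P3", "P5"},
            "apoptosis": {"P4", "P5"},
        }

        # Pairwise keyword intersections: the shared-property subsets
        # that PANDORA detects and draws as a diagram.
        for (kw1, s1), (kw2, s2) in combinations(annotations.items(), 2):
            shared = s1 & s2
            if shared:
                print(f"{kw1} & {kw2}: {sorted(shared)}")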

  13. Open environments to support systems engineering tool integration: A study using the Portable Common Tool Environment (PCTE)

    Science.gov (United States)

    Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.

    1993-01-01

    A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.

  14. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, M. H.; Hahmann, Andrea N.; Nielsen, T. S.

    This poster describes the status, as of April 2012, of the Public Service Obligation (PSO) funded project PSO 10464, "Integrated Wind Power Planning Tool". The project goal is to integrate a mesoscale numerical weather prediction (NWP) model with a statistical tool in order to better predict short-term power variation from offshore wind farms, as well as to conduct forecast error assessment studies in preparation for later implementation of such a feature in an existing simulation model. The addition of a forecast error estimation feature will further increase the value of this tool, as it...

  15. Interactive Electronic Decision Trees for the Integrated Primary Care Management of Febrile Children in Low Resource Settings - Review of existing tools.

    Science.gov (United States)

    Keitel, Kristina; D'Acremont, Valérie

    2018-04-20

    The lack of effective, integrated diagnostic tools poses a major challenge to the primary care management of febrile childhood illnesses. These limitations are especially evident in low-resource settings and are often inappropriately compensated by antimicrobial over-prescription. Interactive electronic decision trees (IEDTs) have the potential to close these gaps: guiding antibiotic use and better identifying serious disease. This narrative review summarizes existing IEDTs to provide an overview of their degree of validation and to identify gaps in current knowledge and prospects for future innovation. Structured literature review in PubMed and Embase complemented by Google search and contact with developers. Ten integrated IEDTs were identified: three (eIMCI, REC, and Bangladesh digital IMCI) based on Integrated Management of Childhood Illnesses (IMCI); four (SL eCCM, MEDSINC, e-iCCM, and D-Tree eCCM) on Integrated Community Case Management (iCCM); two (ALMANACH, MSFeCARE) with a modified IMCI content; and one (ePOCT) that integrates novel content with biomarker testing. The types of publications and evaluation studies varied greatly: the content and evidence base were published for two (ALMANACH and ePOCT); ALMANACH and ePOCT were validated in efficacy studies. Other types of evaluations, such as compliance and acceptability, were available for D-Tree eCCM, eIMCI, and ALMANACH. Several evaluations are still ongoing. Future prospects include conducting effectiveness and impact studies using data gathered through larger studies to adapt the medical content to local epidemiology, improving the software and sensors, and assessing factors that influence compliance and scale-up. IEDTs are valuable tools that have the potential to improve management of febrile children in primary care and increase the rational use of diagnostics and antimicrobials. Next steps in the evidence pathway should be larger effectiveness and impact studies (including cost analysis) and

  16. A tool to guide the process of integrating health system responses to public health problems

    Directory of Open Access Journals (Sweden)

    Tilahun Nigatu Haregu

    2015-06-01

    Full Text Available An integrated model of health system responses to public health problems is considered the most preferable approach. Accordingly, there are several models that stipulate what an integrated architecture should look like. However, tools that can guide the overall process of integration are lacking. This tool is designed to guide the entire process of integration of health system responses to major public health problems. It was developed by taking into account the contexts of health systems of developing countries and the emergence of the double burden of chronic diseases in these settings. Chronic diseases – HIV/AIDS and NCDs – represented the evidence base for the development of the model. System-level horizontal integration of health system responses was considered in the development of this tool.

  17. Solar Integration Data Sets | Grid Modernization | NREL

    Science.gov (United States)

    NREL provides the energy community with modeled solar data for energy professionals, such as transmission planners, utility planners, project developers, and university researchers, who perform solar integration studies.

  18. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    Science.gov (United States)

    Ma, Tianle; Zhang, Aidong

    2017-01-01

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of the omics informatics landscape: the explosion of scattered individual omics informatics tools, each of which focuses on a specific task in both single- and multi-omic settings, and the fast-evolving integrated software platforms, such as workflow management systems, that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels, from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging need for intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in the systematic evaluation of omics informatics tools. We conclude by providing future perspectives on emerging fields and new frontiers in omics informatics.
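
    A minimal sketch of the service a WMS adds on top of individual tools: declaring steps with dependencies and running them in order (illustrative only; production WMSs such as Galaxy or Snakemake add provenance tracking, retries, and distributed execution):

        # Each step maps to a function plus the names of the steps it consumes.
        steps = {
            "align":     (lambda inputs: "aligned_reads", []),
            "quantify":  (lambda inputs: "expression_matrix", ["align"]),
            "integrate": (lambda inputs: "multi_omic_model", ["quantify"]),
        }

        def run(name, cache={}):
            """Run a step after recursively running its dependencies (memoized)."""
            if name not in cache:
                func, deps = steps[name]
                cache[name] = func([run(d) for d in deps])
            return cache[name]

        print(run("integrate"))  # resolves align -> quantify -> integrate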

  19. A Framework for IT-based Design Tools

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    The thesis presents a new approach to developing design tools that can be integrated, by presenting a framework consisting of a set of guidelines for design tools, an integration and communication scheme, and a set of design tool schemes. This framework is based on analysis of requirements to integrated design environments and on analysis of engineering design and design problem solving methods. The developed framework has been tested by applying it to the development of prototype design tools for realistic design scenarios.

  20. Ontology-based integration of topographic data sets

    NARCIS (Netherlands)

    Uitermark, HT; van Oosterom, PJM; Mars, NJI; Molenaar, M

    The integration of topographic data sets is defined as the process of establishing relationships between corresponding object instances in different, autonomously produced, topographic data sets of the same geographic space. The problem of integrating topographic data sets lies in finding these relationships.

  1. Set-Valued Stochastic Lebesgue Integral and Representation Theorems

    Directory of Open Access Journals (Sweden)

    Jungang Li

    2008-06-01

    Full Text Available In this paper, we shall firstly illustrate why we should introduce set-valued stochastic integrals, and then we shall discuss some properties of set-valued stochastic processes and the relation between a set-valued stochastic process and its selection set. After recalling the Aumann type definition of stochastic integral, we shall introduce a new definition of the Lebesgue integral of a set-valued stochastic process with respect to the time t. Finally we shall prove the representation theorem of set-valued stochastic integral and discuss further properties that will be useful to study set-valued stochastic differential equations with their applications.

  2. Integrating Risk Analyses and Tools at the DOE Hanford Site

    International Nuclear Information System (INIS)

    LOBER, R.W.

    2002-01-01

    Risk assessment and environmental impact analysis at the U.S. Department of Energy (DOE) Hanford Site in Washington State has made significant progress in refining the strategy for using risk analysis to support closing several hundred waste sites plus 149 single-shell tanks at the Hanford Site. A Single-Shell Tank System Closure Work Plan outlines the current basis for closing the single-shell tank systems. An analogous-site approach has been developed to address closure of aggregated groups of similar waste sites. Because of the complexity, decision time frames, proximity of non-tank-farm waste sites to tank farms, scale, and regulatory considerations, various projects are providing integrated assessments to support risk analyses and decision-making. Projects and the tools that are being developed and applied at Hanford to support retrieval and cleanup decisions include: (1) Life Cycle Model (LCM) and Risk Receptor Model (RRM)--a site-level set of tools to support strategic analyses through scoping-level risk management to assess different alternatives and options for tank closure. (2) Systems Assessment Capability for Integrated Groundwater/Vadose Zone (SAC) and the Site-Wide Groundwater Model (SWGM)--a site-wide groundwater modeling system coupled with a risk-based uncertainty analysis of inventory, vadose zone, groundwater, and river interactions for evaluating cumulative impacts from individual and aggregate waste sites. (3) Retrieval Performance Evaluation (RPE)--a site-specific, risk-based methodology developed to evaluate the performance of waste retrieval, leak detection and closure on a tank-specific basis as a function of past tank leaks, potential leakage during retrieval operations, and the residual waste inventories remaining after completion of retrieval operations. (4) Field Investigation Report (FIR)--a corrective action program to investigate the nature and extent of past tank leaks through characterization activities and assess future impacts to groundwater.

  3. Debating Life on Mars: The Knowledge Integration Environment (KIE) in Varied School Settings.

    Science.gov (United States)

    Shear, Linda

    Technology-enabled learning environments are beginning to come of age. Tools and frameworks are now available that have been shown to improve learning and are being deployed more widely in varied school settings. Teachers are now faced with the formidable challenge of integrating these promising new environments with the everyday context in which…

  4. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    by proposing potential subsequent design issues. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, these decisions are typically not connected to the models created during... The integration of formerly disconnected tools improves tool usability as well as decision maker productivity.

  5. IDMT, Integrated Decommissioning Management Tools

    International Nuclear Information System (INIS)

    Alemberti, A.; Castagna, P.; Marsiletti, M.; Orlandi, S.; Perasso, L.; Susco, M.

    2005-01-01

    Nuclear power plant decommissioning requires a number of demolition activities related to civil works and systems, as well as the construction of temporary facilities used for treatment and conditioning of the dismantled parts. The presence of a radiological, potentially hazardous environment, due to the specific configuration and history of the plant, requires a professional, expert and qualified approach approved by the national safety authority. Dismantling activities must be designed, planned and analysed in detail during an evaluation phase, taking into account different scenarios generated by possible dismantling sequences and the specific waste treatments to be implemented. The optimisation of the activities becomes very challenging given the requirement to minimise the radiological impact on exposed workers and people during normal and accident conditions. While remote-operated equipment and waste treatment and conditioning facilities may be designed with this primary goal in mind, a centralised management system and corresponding software tools also have to be designed and operated in order to guarantee the fulfilment of the imposed limits as well as the traceability of wastes. Ansaldo Nuclear Division has been strongly involved in the development of a qualified and certified software environment to manage the most critical activities of a decommissioning project. The IDMT system (Integrated Decommissioning Management Tools) provides a set of stand-alone, user-friendly applications able to work in an integrated configuration to guarantee waste identification and traceability during the treatment and conditioning process, as well as location and identification at the final repository site. Additionally, the system can be used to identify, analyse and compare different operating scenarios to be optimised in terms of both economic and radiological considerations. The paper provides an overview of the different phases of the project.

  6. Evaluation of Oracle Big Data Integration Tools

    OpenAIRE

    Urhan, Harun; Baranowski, Zbigniew

    2015-01-01

    The project's objective is to evaluate Oracle's Big Data Integration Tools. The project covers the evaluation of two Oracle tools: Oracle Data Integrator (Application Adapters for Hadoop), used to load data from an Oracle Database into Hadoop, and Oracle SQL Connectors for HDFS, used to query data stored on a Hadoop file system through SQL statements executed on an Oracle Database.

  7. Installing and Setting Up Git Software Tool on Windows | High-Performance Computing | NREL

    Science.gov (United States)

    Learn how to set up the Git software tool on Windows for use with the Peregrine system. In this doc, we'll show you how to get Git installed on Windows 7 and how to get things set up on NREL's Peregrine system.

  8. Integrating the hospital library with patient care, teaching and research: model and Web 2.0 tools to create a social and collaborative community of clinical research in a hospital setting.

    Science.gov (United States)

    Montano, Blanca San José; Garcia Carretero, Rafael; Varela Entrecanales, Manuel; Pozuelo, Paz Martin

    2010-09-01

    Research in hospital settings faces several difficulties. Information technologies and certain Web 2.0 tools may provide new models to tackle these problems, allowing for a collaborative approach and bridging the gap between clinical practice, teaching and research. We aim to gather a community of researchers involved in the development of a network of learning and investigation resources in a hospital setting. A multi-disciplinary work group analysed the needs of the research community. We studied the opportunities provided by Web 2.0 tools and finally we defined the spaces that would be developed, describing their elements, members and different access levels. WIKINVESTIGACION is a collaborative web space with the aim of integrating the management of all the hospital's teaching and research resources. It is composed of five spaces, with different access privileges. The spaces are: Research Group Space 'wiki for each individual research group', Learning Resources Centre devoted to the Library, News Space, Forum and Repositories. The Internet, and most notably the Web 2.0 movement, is introducing some overwhelming changes in our society. Research and teaching in the hospital setting will join this current and take advantage of these tools to socialise and improve knowledge management.

  9. Integration issues of information engineering based I-CASE tools

    OpenAIRE

    Kurbel, Karl; Schnieder, Thomas

    1994-01-01

    Problems and requirements regarding integration of methods and tools across phases of the software-development life cycle are discussed. Information engineering (IE) methodology and I-CASE (integrated CASE) tools supporting IE claim to have an integrated view across major stages of enterprise-wide information-system development: information strategy planning, business area analysis, system design, and construction. In the main part of this paper, two comprehensive I-CASE tools, ADW (Applicati...

  10. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
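
    The unreachable-code check such analysers perform reduces to graph reachability over the program's control-flow graph; a compact sketch (toy block graph, not the Intel 8086/68000 front end the report describes):

        # Control-flow graph of basic blocks: block -> possible successors.
        cfg = {
            "entry": ["init"],
            "init": ["loop"],
            "loop": ["loop", "exit"],
            "exit": [],
            "debug_dump": ["exit"],   # no path from entry: dead code
        }

        def reachable(graph, start):
            seen, stack = set(), [start]
            while stack:
                node = stack.pop()
                if node not in seen:
                    seen.add(node)
                    stack.extend(graph[node])
            return seen

        unreachable = set(cfg) - reachable(cfg, "entry")
        print("unreachable blocks:", sorted(unreachable))  # -> ['debug_dump']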

  11. Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis

    Directory of Open Access Journals (Sweden)

    Esther Suter

    2017-11-01

    Full Text Available Background: Despite far-reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. Methods: We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. Findings: From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Discussion: Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, “overall integration” tools may be useful for a broad assessment of the overall state of a system. Conclusions: Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and contexts. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework.

  12. Indicators and Measurement Tools for Health Systems Integration: A Knowledge Synthesis

    Science.gov (United States)

    Oelke, Nelly D.; da Silva Lima, Maria Alice Dias; Stiphout, Michelle; Janke, Robert; Witt, Regina Rigatto; Van Vliet-Brown, Cheryl; Schill, Kaela; Rostami, Mahnoush; Hepp, Shelanne; Birney, Arden; Al-Roubaiai, Fatima; Marques, Giselda Quintana

    2017-01-01

    Background: Despite far-reaching support for integrated care, conceptualizing and measuring integrated care remains challenging. This knowledge synthesis aimed to identify indicator domains and tools to measure progress towards integrated care. Methods: We used an established framework and a Delphi survey with integration experts to identify relevant measurement domains. For each domain, we searched and reviewed the literature for relevant tools. Findings: From 7,133 abstracts, we retrieved 114 unique tools. We found many quality tools to measure care coordination, patient engagement and team effectiveness/performance. In contrast, there were few tools in the domains of performance measurement and information systems, alignment of organizational goals and resource allocation. The search yielded 12 tools that measure overall integration or three or more indicator domains. Discussion: Our findings highlight a continued gap in tools to measure foundational components that support integrated care. In the absence of such targeted tools, “overall integration” tools may be useful for a broad assessment of the overall state of a system. Conclusions: Continued progress towards integrated care depends on our ability to evaluate the success of strategies across different levels and contexts. This study has identified 114 tools that measure integrated care across 16 domains, supporting efforts towards a unified measurement framework. PMID:29588637

  13. Integrated Wind Power Planning Tool

    DEFF Research Database (Denmark)

    Rosgaard, M. H.; Giebel, Gregor; Nielsen, T. S.

    2012-01-01

    This poster presents the current state of the public service obligation (PSO) funded project PSO 10464, with the working title "Integrated Wind Power Planning Tool". The project commenced October 1, 2011, and the goal is to integrate a numerical weather prediction (NWP) model with a purely statistical model to be developed in collaboration with ENFOR A/S, a Danish company that specialises in forecasting and optimisation for the energy sector. This integrated prediction model will allow for the description of the expected variability in wind power production in the coming hours to days, accounting...

  14. Integrated Tools for Future Distributed Engine Control Technologies

    Science.gov (United States)

    Culley, Dennis; Thomas, Randy; Saus, Joseph

    2013-01-01

    Turbine engines are highly complex mechanical systems that are becoming increasingly dependent on control technologies to achieve system performance and safety metrics. However, the contribution of controls to these measurable system objectives is difficult to quantify due to a lack of tools capable of informing the decision makers. This shortcoming hinders technology insertion in the engine design process. NASA Glenn Research Center is developing a Hardware-in-the-Loop (HIL) platform and analysis tool set that will serve as a focal point for new control technologies, especially those related to the hardware development and integration of distributed engine control. The HIL platform is intended to enable rapid and detailed evaluation of new engine control applications, from conceptual design through hardware development, in order to quantify their impact on engine systems. This paper discusses the complex interactions of the control system, within the context of the larger engine system, and how new control technologies are changing that paradigm. The conceptual design of the new HIL platform is then described as a primary tool to address those interactions and how it will help feed the insertion of new technologies into future engine systems.

  15. An Integrative Review of Pediatric Fall Risk Assessment Tools.

    Science.gov (United States)

    DiGerolamo, Kimberly; Davis, Katherine Finn

    Patient fall prevention begins with accurate risk assessment. However, sustained improvements in prevention and quality of care include use of validated fall risk assessment tools (FRATs). The goal of FRATs is to identify patients at highest risk. Tools for pediatric patients are often created by borrowing from adult FRATs. Though factors associated with pediatric falls in the hospital setting are similar to those in adults, such as mobility, medication use, and cognitive impairment, adult FRATs and the factors associated with them do not adequately assess risk in children. Articles were limited to English language, ages 0-21 years, and publication dates 2006-2015. The search yielded 22 articles. Ten were excluded as the population was primarily adult or lacked discussion of a FRAT. Critical appraisal and findings were synthesized using the Johns Hopkins Nursing evidence appraisal system. Twelve articles relevant to fall prevention in the pediatric hospital setting that discussed fall risk assessment and use of a FRAT were reviewed. Comparison between FRATs and assessment of their accuracy are challenged when different classifications, definitions, risk stratification, and inclusion criteria are used. Though there are several pediatric FRATs published in the literature, none have been found to be reliable and valid across institutions and diverse populations. This integrative review highlights the importance of choosing a FRAT based on an institution's identified risk factors and validating the tool for one's own patient population, as well as using the tool in conjunction with nursing clinical judgment to guide interventions. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Aumann Type Set-valued Lebesgue Integral and Representation Theorem

    Directory of Open Access Journals (Sweden)

    Jungang Li

    2009-03-01

    Full Text Available In this paper, we shall firstly illustrate why we should discuss the Aumann type set-valued Lebesgue integral of a set-valued stochastic process with respect to time t, under the condition that the set-valued stochastic process takes values in the nonempty compact subsets of d-dimensional Euclidean space. After recalling some basic results about set-valued stochastic processes, we shall secondly prove that the Aumann type set-valued Lebesgue integral of such a set-valued stochastic process is itself a set-valued stochastic process. Finally we shall give the representation theorem, and prove an important inequality for the Aumann type set-valued Lebesgue integrals of set-valued stochastic processes with respect to t, which is useful for studying set-valued stochastic differential inclusions with applications in finance.

  17. An integrated audio-visual impact tool for wind turbine installations

    International Nuclear Information System (INIS)

    Lymberopoulos, N.; Belessis, M.; Wood, M.; Voutsinas, S.

    1996-01-01

    An integrated software tool was developed for the design of wind parks that takes into account their visual and audio impact. The application is built on a powerful hardware platform and is fully operated through a graphical user interface. The topography, the wind turbines and the daylight conditions are rendered digitally. The wind park can be animated in real time and the user can take virtual walks in it while the layout of the park is altered interactively. In parallel, the wind speed levels on the terrain, the emitted noise intensity, the annual energy output and the cash flow can be estimated at any stage of the session and prompt the user for rearrangements. The tool has been used to visually simulate existing wind parks in St. Breok, UK and on Andros Island, Greece. The results lead to the conclusion that such a tool can assist in the public acceptance and licensing procedures of wind parks. (author)
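
    The audio side of such a tool can be approximated with the standard point-source spreading law; a sketch under the simplifying assumptions of hemispherical propagation and no ground or atmospheric attenuation (the turbine sound power level is an assumed figure, not from the paper):

        import math

        def sound_pressure_level(l_w_db, distance_m):
            """Received level from a point source with hemispherical spreading:
            Lp = Lw - 20*log10(r) - 8  (use -11 for full spherical spreading)."""
            return l_w_db - 20 * math.log10(distance_m) - 8

        # Assume a turbine sound power level of roughly Lw = 105 dB(A).
        for r in (100, 300, 500):
            print(f"{r:4d} m: {sound_pressure_level(105, r):.1f} dB(A)")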

  18. Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program

    Science.gov (United States)

    Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas

    2016-04-01

    The recently adopted Sustainable Development Goals (SDGs) - a set of 17 measurable and time-bound goals with 169 associated targets for 2030 - are highly inclusive challenges before the world community, ranging from eliminating poverty to human rights, inequality, a secure world and protection of the environment. Each individual goal or target presents an enormous task by itself; taken together they are overwhelming. There are strong and weak interlinkages, hence trade-offs and complementarities, among goals and targets. Some targets may affect several goals, while other goals and targets may conflict or be mutually exclusive (Ref). Meeting each of these requires the judicious exploitation of resources, with energy playing an important role. Such complexity demands to be addressed in an integrated way, using systems analysis tools to support informed policy formulation, planning, allocation of scarce resources, monitoring of progress and effectiveness, and review at different scales. There is no one-size-fits-all methodology that could conceivably include all goals and targets simultaneously. But there are methodologies encapsulating critical subsets of the goals and targets with strong interlinkages, with a 'soft' reflection on the weak interlinkages. Universal food security or sustainable energy for all inherently supports goals and targets on human rights and equality, but possibly at the cost of biodiversity or desertification. Integrated analysis and planning tools are not yet commonplace at national universities - or indeed in many policy-making organs. What is needed is a fundamental realignment of institutions and integration of their planning processes and decision making. We introduce a series of open source tools to support the SDG planning and implementation process. The Global User-friendly CLEW Open Source (GLUCOSE) tool optimizes resource interactions and constraints; the Global Electrification Tool kit (GETit) provides the first global spatially explicit

  19. SIRSALE: integrated video database management tools

    Science.gov (United States)

    Brunie, Lionel; Favory, Loic; Gelas, J. P.; Lefevre, Laurent; Mostefaoui, Ahmed; Nait-Abdesselam, F.

    2002-07-01

    Video databases became an active field of research during the last decade. The main objective of such systems is to provide users with capabilities to search, access and play back, in a user-friendly way, distributed stored video data, much as they do with traditional distributed databases. Hence, such systems need to deal with hard issues: (a) video documents generate huge volumes of data and are time sensitive (streams must be delivered at a specific bitrate), and (b) the content of video data is very hard to extract automatically and needs to be annotated by humans. To cope with these issues, many approaches have been proposed in the literature, including data models, query languages, video indexing, etc. In this paper, we present SIRSALE: a set of video database management tools that allow users to manipulate video documents and streams stored in large distributed repositories. All the proposed tools are based on generic models that can be customized for specific applications using ad-hoc adaptation modules. More precisely, SIRSALE allows users to: (a) browse video documents by structures (sequences, scenes, shots) and (b) query the video database content by using a graphical tool adapted to the nature of the target video documents. This paper also presents an annotation interface which allows archivists to describe the content of video documents. All these tools are coupled to a video player integrating remote VCR functionalities and are based on active network technology. We present how dedicated active services allow optimized transport of video streams (with Tamanoir active nodes). We then describe experiments using SIRSALE on an archive of news video and soccer matches. The system has been demonstrated to professionals with positive feedback. Finally, we discuss open issues and present some perspectives.

  20. Ontology-based geographic data set integration

    NARCIS (Netherlands)

    Uitermark, H.T.J.A.; Uitermark, Harry T.; Oosterom, Peter J.M.; Mars, Nicolaas; Molenaar, Martien; Molenaar, M.

    1999-01-01

    In order to develop a system to propagate updates, we investigate the semantic and spatial relationships between independently produced geographic data sets of the same region (data set integration). The goal of this system is to reduce operator intervention in update operations between corresponding object instances.

  1. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
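
    Gene set enrichment scoring of the kind such tools apply is commonly based on the hypergeometric tail; a generic sketch (a textbook test with invented numbers, not GeneAnalytics' proprietary scoring algorithm):

        from scipy.stats import hypergeom

        # Of M = 20,000 genes, n = 150 belong to a pathway; a query list of
        # N = 300 genes hits k = 12 of them. P(X >= k) under random draws:
        M, n, N, k = 20_000, 150, 300, 12
        p_value = hypergeom.sf(k - 1, M, n, N)
        print(f"enrichment p-value: {p_value:.3g}")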

  2. Tav4SB: integrating tools for analysis of kinetic models of biological systems.

    Science.gov (United States)

    Rybiński, Mikołaj; Lula, Michał; Banasik, Paweł; Lasota, Sławomir; Gambin, Anna

    2012-04-05

    Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases the integration of software tools for life science research and provides a common workflow-based framework for computational experiments in biology. The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow you to perform numerical simulations or model checking of, respectively, deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project we apply multi-parameter sensitivity analysis. To visualize the results of model analysis a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as the accessibility and usability of remote services.
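
    The deterministic-semantics simulations such services delegate to ODE solvers amount to integrating a kinetic rate system; a self-contained sketch with a toy two-species model (not an actual Tav4SB workflow; species and rate constants are invented):

        import numpy as np
        from scipy.integrate import solve_ivp

        def kinetics(t, y, k1=0.3, k2=0.1):
            """Toy model: A -> B at rate k1*A, with B degraded at rate k2*B."""
            a, b = y
            return [-k1 * a, k1 * a - k2 * b]

        solution = solve_ivp(kinetics, t_span=(0, 50), y0=[10.0, 0.0],
                             t_eval=np.linspace(0, 50, 6))
        for t, a, b in zip(solution.t, *solution.y):
            print(f"t={t:5.1f}  A={a:6.3f}  B={b:6.3f}")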

  3. Competency-based evaluation tools for integrative medicine training in family medicine residency: a pilot study

    Directory of Open Access Journals (Sweden)

    Schneider Craig

    2007-04-01

    Full Text Available Background: As more integrative medicine educational content is integrated into conventional family medicine teaching, the need for effective evaluation strategies grows. Through the Integrative Family Medicine program, a six-site pilot program of a four-year residency training model combining integrative medicine and family medicine training, we have developed and tested a set of competency-based evaluation tools to assess residents' skills in integrative medicine history-taking and treatment planning. This paper presents the results from the implementation of direct observation and treatment plan evaluation tools, as well as the results of two Objective Structured Clinical Examinations (OSCEs) developed for the program. Methods: The direct observation (DO) and treatment plan (TP) evaluation tools developed for the IFM program were implemented by faculty at each of the six sites during the PGY-4 year (n = 11 on DO and n = 8 on TP). The OSCE I was implemented first in 2005 (n = 6), revised, and then implemented with a second class of IFM participants in 2006 (n = 7). OSCE II was implemented in fall 2005 with only one class of IFM participants (n = 6). Data from the initial implementation of these tools are described using descriptive statistics. Results: Results from the implementation of these tools at the IFM sites suggest that we need more emphasis in our curriculum on incorporating spirituality into history-taking and treatment planning, and more training for IFM residents on effective assessment of readiness for change and strategies for delivering integrative medicine treatment recommendations. Focusing our OSCE assessment more narrowly on integrative medicine history-taking skills was much more effective in delineating strengths and weaknesses in our residents' performance than using the OSCE for both integrative and more basic communication competencies. Conclusion: As these tools are refined further they will be of value both in improving

  4. Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools

    Science.gov (United States)

    Tonini, F.; Liu, J.

    2016-12-01

    Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.

  5. Tools for integrating environmental objectives into policy and practice: What works where?

    Energy Technology Data Exchange (ETDEWEB)

    Runhaar, Hens

    2016-07-15

    An abundance of approaches, strategies, and instruments – in short: tools – have been developed that intend to stimulate or facilitate the integration of a variety of environmental objectives into development planning, national or regional sectoral policies, international agreements, business strategies, etc. These tools include legally mandatory procedures, such as Environmental Impact Assessment and Strategic Environmental Assessment; more voluntary tools such as environmental indicators developed by scientists and planning tools; green budgeting, etc. A relatively underexplored question is what integration tool fits what particular purposes and contexts, in short: “what works where?”. This paper intends to contribute to answering this question, by first providing conceptual clarity about what integration entails, by suggesting and illustrating a classification of integration tools, and finally by summarising some of the lessons learned about how and why integration tools are (not) used and with what outcomes, particularly in terms of promoting the integration of environmental objectives.

  6. Tools for integrating environmental objectives into policy and practice: What works where?

    International Nuclear Information System (INIS)

    Runhaar, Hens

    2016-01-01

    An abundance of approaches, strategies, and instruments – in short: tools – have been developed that intend to stimulate or facilitate the integration of a variety of environmental objectives into development planning, national or regional sectoral policies, international agreements, business strategies, etc. These tools include legally mandatory procedures, such as Environmental Impact Assessment and Strategic Environmental Assessment; more voluntary tools such as environmental indicators developed by scientists and planning tools; green budgeting, etc. A relatively underexplored question is what integration tool fits what particular purposes and contexts, in short: “what works where?”. This paper intends to contribute to answering this question, by first providing conceptual clarity about what integration entails, by suggesting and illustrating a classification of integration tools, and finally by summarising some of the lessons learned about how and why integration tools are (not) used and with what outcomes, particularly in terms of promoting the integration of environmental objectives.

  7. Integrating New Technologies and Existing Tools to Promote Programming Learning

    Directory of Open Access Journals (Sweden)

    Álvaro Santos

    2010-04-01

    Full Text Available In recent years, many tools have been proposed to reduce programming learning difficulties felt by many students. Our group has contributed to this effort through the development of several tools, such as VIP, SICAS, OOP-Anim, SICAS-COL and H-SICAS. Even though we had some positive results, the utilization of these tools doesn't seem to significantly reduce weaker students' difficulties. These students need stronger support to motivate them to get engaged in learning activities, inside and outside the classroom. Nowadays, many technologies are available to create contexts that may help to accomplish this goal. We consider that a promising path goes through the integration of solutions. In this paper we analyze the features, strengths and weaknesses of the tools developed by our group. Based on these considerations we present a new environment, integrating different types of pedagogical approaches, resources, tools and technologies for programming learning support. With this environment, currently under development, it will be possible to review contents and lessons, based on video and screen captures. The support for collaborative tasks is another key point to improve and stimulate different models of teamwork. The platform will also allow the creation of various alternative models (learning objects) for the same subject, enabling personalized learning paths adapted to each student's knowledge level, needs and preferential learning styles. The learning sequences will work as a study organizer, following a suitable taxonomy, according to students' cognitive skills. Although the main goal of this environment is to support students with more difficulties, it will provide a set of resources supporting the learning of more advanced topics. Software engineering techniques and representations, object orientation and event programming are features that will be available in order to promote the learning progress of students.

  8. Integrated Land-Water-Energy assessment using the Foreseer Tool

    Science.gov (United States)

    Allwood, Julian; Konadu, Dennis; Mourao, Zenaida; Lupton, Rick; Richards, Keith; Fenner, Richard; Skelton, Sandy; McMahon, Richard

    2016-04-01

    This study presents an integrated energy and resource modelling and visualisation approach, Foreseer™, which characterises the interdependencies and evaluates the land and water requirements for energy system pathways. The Foreseer Tool maps linked energy, water and land resource futures by outputting a set of Sankey diagrams for energy, water and land, showing the flow from basic resources (e.g. coal, surface water, and forested land) through transformations (e.g. fuel refining and desalination) to final services (e.g. sustenance, hygiene and transportation). By 'mapping' resources in this way, policy-makers can more easily understand the competing uses of each resource through the identification of the services it delivers (e.g. food production, landscaping, energy), the potential opportunities for improving the management of the resource, and the connections with other resources which are often overlooked in a traditional sector-based management strategy. This paper will present a case study of the UK Carbon Plan and highlights the need for integrated resource planning and policy development.
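
    For readers unfamiliar with this style of output, the hedged sketch below draws a toy resource-to-service Sankey diagram with the plotly library; the node names and flow values are invented for illustration and are unrelated to the Foreseer data.

        import plotly.graph_objects as go

        # Toy resource -> transformation -> service Sankey; values are arbitrary.
        labels = ["Coal", "Surface water", "Fuel refining", "Desalination",
                  "Transportation", "Hygiene"]
        fig = go.Figure(go.Sankey(
            node=dict(label=labels),
            link=dict(
                source=[0, 1, 2, 3],  # indices into `labels`
                target=[2, 3, 4, 5],
                value=[8.0, 5.0, 8.0, 5.0],
            ),
        ))
        fig.show()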

  9. Nutrition screening tools: Does one size fit all? A systematic review of screening tools for the hospital setting

    NARCIS (Netherlands)

    van Bokhorst-de van der Schueren, M.A.E.; Guaitoli, P.R.; Jansma, E.P.; de Vet, H.C.W.

    2014-01-01

    Background & aims: Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. Methods: A systematic review of

  10. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.
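
    Of the analysis algorithms mentioned, over-representation analysis is the easiest to make concrete. The sketch below shows the standard hypergeometric formulation on toy gene lists; it illustrates the general technique, not MONGKIE's actual implementation, and the gene names are arbitrary.

        from scipy.stats import hypergeom

        def overrepresentation_pvalue(module_genes, pathway_genes, background_size):
            """P(overlap >= observed) when drawing len(module_genes) genes at random."""
            overlap = len(set(module_genes) & set(pathway_genes))
            # hypergeom.sf(k - 1, M, n, N) gives P(X >= k)
            return hypergeom.sf(overlap - 1, background_size,
                                len(pathway_genes), len(module_genes))

        p = overrepresentation_pvalue(
            module_genes=["TP53", "EGFR", "PTEN"],
            pathway_genes=["TP53", "PTEN", "RB1", "CDKN2A"],
            background_size=20000,
        )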

  11. LEARNING TOOLS INTEROPERABILITY – A NEW STANDARD FOR INTEGRATION OF DISTANCE LEARNING PLATFORMS

    Directory of Open Access Journals (Sweden)

    Oleksandr A. Shcherbyna

    2015-06-01

    Full Text Available For information technology in education, the reuse of electronic educational resources and the possibility of transferring them from one virtual learning environment to another is a perennial issue. Previously, standardized sets of files were used to serve this purpose, for example, SCORM packages. In this article the new standard Learning Tools Interoperability (LTI) is reviewed, which allows users of one environment to access resources in another environment. This makes it possible to integrate them into a single distributed learning environment that is created and shared. The article gives examples of the practical use of the LTI standard in the Moodle learning management system using the External tool and LTI provider plugins.
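
    Under the LTI 1.1 specification, a launch is an OAuth 1.0a-signed form POST from the consumer (e.g. Moodle) to the tool provider. The sketch below assembles and signs such a request with HMAC-SHA1; the URL, key and secret are placeholders, and a production system would use a vetted OAuth library rather than this hand-rolled signing.

        import base64, hashlib, hmac, time, uuid
        from urllib.parse import quote

        def sign_lti_launch(url, params, consumer_key, shared_secret):
            # Add the OAuth 1.0a protocol parameters required by LTI 1.1.
            params = dict(params,
                          oauth_consumer_key=consumer_key,
                          oauth_nonce=uuid.uuid4().hex,
                          oauth_signature_method="HMAC-SHA1",
                          oauth_timestamp=str(int(time.time())),
                          oauth_version="1.0")
            # Signature base string: METHOD & encoded URL & encoded sorted params.
            encoded = "&".join(f"{quote(k, safe='')}={quote(v, safe='')}"
                               for k, v in sorted(params.items()))
            base_string = "&".join(quote(p, safe="") for p in ("POST", url, encoded))
            key = quote(shared_secret, safe="") + "&"  # no token secret in an LTI launch
            digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
            params["oauth_signature"] = base64.b64encode(digest).decode()
            return params  # POST these as form fields to the provider's launch URL

        launch_fields = sign_lti_launch(
            "https://tool.example.org/launch",  # hypothetical provider URL
            {"lti_message_type": "basic-lti-launch-request",
             "lti_version": "LTI-1p0",
             "resource_link_id": "course42-item7"},
            consumer_key="demo_key", shared_secret="demo_secret",
        )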

  12. Development of a multilevel health and safety climate survey tool within a mining setting.

    Science.gov (United States)

    Parker, Anthony W; Tones, Megan J; Ritchie, Gabrielle E

    2017-09-01

    This study aimed to design, implement and evaluate the reliability and validity of a multifactorial and multilevel health and safety climate survey (HSCS) tool with utility in the Australian mining setting. An 84-item questionnaire was developed and pilot tested on a sample of 302 Australian miners across two open cut sites. A 67-item, 10 factor solution was obtained via exploratory factor analysis (EFA) representing prioritization and attitudes to health and safety across multiple domains and organizational levels. Each factor demonstrated a high level of internal reliability, and a series of ANOVAs determined a high level of consistency in responses across the workforce, and generally irrespective of age, experience or job category. Participants tended to hold favorable views of occupational health and safety (OH&S) climate at the management, supervisor, workgroup and individual level. The survey tool demonstrated reliability and validity for use within an open cut Australian mining setting and supports a multilevel, industry specific approach to OH&S climate. Findings suggested a need for mining companies to maintain high OH&S standards to minimize risks to employee health and safety. Future research is required to determine the ability of this measure to predict OH&S outcomes and its utility within other mine settings. As this tool integrates health and safety, it may have benefits for assessment, monitoring and evaluation in the industry, and improving the understanding of how health and safety climate interact at multiple levels to influence OH&S outcomes. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.
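
    As an aside on the internal-reliability statistic reported here, Cronbach's alpha for a single survey factor can be computed as below; the response matrix is toy data, not the study's.

        import numpy as np

        def cronbach_alpha(items):
            """items: respondents x items matrix of scores for one factor."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_vars / total_var)

        # Five respondents answering a three-item factor on a 1-5 Likert scale.
        responses = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]]
        print(round(cronbach_alpha(responses), 2))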

  13. Force feedback facilitates multisensory integration during robotic tool use

    NARCIS (Netherlands)

    Sengül, A.; Rognini, G.; van Elk, M.; Aspell, J.E.; Bleuler, H.; Blanke, O.

    2013-01-01

    The present study investigated the effects of force feedback in relation to tool use on the multisensory integration of visuo-tactile information. Participants learned to control a robotic tool through a surgical robotic interface. Following tool-use training, participants performed a crossmodal

  14. WINS. Market Simulation Tool for Facilitating Wind Energy Integration

    Energy Technology Data Exchange (ETDEWEB)

    Shahidehpour, Mohammad [Illinois Inst. of Technology, Chicago, IL (United States)

    2012-10-30

    Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision making capability that can handle very large scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy in power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling integration services, and adopting new transmission scheduling practices. Such practices can be examined by the decision tool developed by this project. This project developed a very efficient decision tool called Wind INtegration Simulator (WINS) and applied WINS to facilitate wind energy integration studies. WINS focused on augmenting the existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS also had the capability of simulating energy storage facilities so that feasibility studies of integrated wind energy system applications could be performed for systems with high wind energy penetrations. The development of WINS represents a major expansion of a very efficient decision tool called POwer Market Simulator (POMS), which was developed by IIT and has been used extensively for power system studies for decades. Specifically, WINS provides the following advantages: (1) An integrated framework is included in WINS for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connection, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) An existing shortcoming of traditional decision tools for wind integration is the limited availability of user interface, i.e., decision

  15. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    Science.gov (United States)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly with limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy to perform changes, and centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the 'middle-ware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.

  16. Development of data analysis tool for combat system integration

    Directory of Open Access Journals (Sweden)

    Seung-Chun Shin

    2013-03-01

    Full Text Available System integration is an important element in the construction of naval combat ships. In particular, because impeccable combat system integration together with the sensors and weapons can ensure the combat capability and survivability of the ship, the integrated performance of the combat system should be verified and validated to confirm whether or not it fulfills the requirements of the end user. In order to conduct systematic verification and validation, a data analysis tool is requisite. This paper suggests the Data Extraction, Recording and Analysis Tool (DERAT) for the data analysis of the integrated performance of the combat system, including the functional definition, architecture and effectiveness of the DERAT, by presenting the test results.

  17. An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System

    Science.gov (United States)

    1989-09-01

    AFIT/GCA/LSQ/89S-5: An Evaluation of the Automated Cost Estimating Integrated Tools (ACEIT) System. Thesis, Caroline L. Hanson, Major, USAF.

  18. Integrated Data Visualization and Virtual Reality Tool

    Science.gov (United States)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design and changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  19. Indicators and measurement tools for health system integration: a knowledge synthesis protocol.

    Science.gov (United States)

    Oelke, Nelly D; Suter, Esther; da Silva Lima, Maria Alice Dias; Van Vliet-Brown, Cheryl

    2015-07-29

    accessible set of indicators and tools to measure health system integration across different contexts and cultures. Being able to evaluate the success of integration strategies and initiatives will lead to better health system design and improved health outcomes for patients.

  20. THE MANAGEMENT ACCOUNTING TOOLS AND THE INTEGRATED REPORTING

    Directory of Open Access Journals (Sweden)

    Gabriel JINGA

    2015-04-01

    Full Text Available In recent years, stakeholders have been asking for other pieces of information to be published along with the financial ones, such as risk reporting, intangibles, and social and environmental accounting. The type of corporate reporting which incorporates the elements enumerated above is integrated reporting. In this article, we argue that the information disclosed in integrated reports is prepared by management accounting, not only by financial accounting. Thus, we search for the management accounting tools that are used by companies which prepare integrated reports. In order to do this, we analytically reviewed all the reports available on the website of a selected company. Our results show that the company is using most of the management accounting tools mentioned in the literature review.

  1. Sharing clinical information across care settings: the birth of an integrated assessment system

    Directory of Open Access Journals (Sweden)

    Henrard Jean-Claude

    2009-04-01

    Full Text Available Background: Population ageing, the emergence of chronic illness, and the shift away from institutional care challenge conventional approaches to assessment systems which traditionally are problem and setting specific. Methods: From 2002, the interRAI research collaborative undertook development of a suite of assessment tools to support assessment and care planning of persons with chronic illness, frailty, disability, or mental health problems across care settings. The suite constitutes an early example of a "third generation" assessment system. Results: The rationale and development strategy for the suite is described, together with a description of potential applications. To date, ten instruments comprise the suite, each comprising "core" items shared among the majority of instruments and "optional" items that are specific to particular care settings or situations. Conclusion: This comprehensive suite offers the opportunity for integrated multi-domain assessment, enabling electronic clinical records, data transfer, ease of interpretation and streamlined training.

  2. Sharing clinical information across care settings: the birth of an integrated assessment system

    Science.gov (United States)

    Gray, Leonard C; Berg, Katherine; Fries, Brant E; Henrard, Jean-Claude; Hirdes, John P; Steel, Knight; Morris, John N

    2009-01-01

    Background: Population ageing, the emergence of chronic illness, and the shift away from institutional care challenge conventional approaches to assessment systems which traditionally are problem and setting specific. Methods: From 2002, the interRAI research collaborative undertook development of a suite of assessment tools to support assessment and care planning of persons with chronic illness, frailty, disability, or mental health problems across care settings. The suite constitutes an early example of a "third generation" assessment system. Results: The rationale and development strategy for the suite is described, together with a description of potential applications. To date, ten instruments comprise the suite, each comprising "core" items shared among the majority of instruments and "optional" items that are specific to particular care settings or situations. Conclusion: This comprehensive suite offers the opportunity for integrated multi-domain assessment, enabling electronic clinical records, data transfer, ease of interpretation and streamlined training. PMID:19402891

  3. Laboratory informatics tools integration strategies for drug discovery: integration of LIMS, ELN, CDS, and SDMS.

    Science.gov (United States)

    Machina, Hari K; Wild, David J

    2013-04-01

    There are technologies on the horizon that could dramatically change how informatics organizations design, develop, deliver, and support applications and data infrastructures to deliver maximum value to drug discovery organizations. Effective integration of data and laboratory informatics tools promises the ability of organizations to make better informed decisions about resource allocation during the drug discovery and development process and for more informed decisions to be made with respect to the market opportunity for compounds. We propose in this article a new integration model called ELN-centric laboratory informatics tools integration.

  4. Integrated Decision Tools for Sustainable Watershed/Ground Water and Crop Health using Predictive Weather, Remote Sensing, and Irrigation Decision Tools

    Science.gov (United States)

    Jones, A. S.; Andales, A.; McGovern, C.; Smith, G. E. B.; David, O.; Fletcher, S. J.

    2017-12-01

    US agricultural and Govt. lands have a unique co-dependent relationship, particularly in the Western US. More than 30% of all irrigated US agricultural output comes from lands sustained by the Ogallala Aquifer in the western Great Plains. Six US Forest Service National Grasslands reside within the aquifer region, consisting of over 375,000 ha (3,759 km²) of USFS managed lands. Likewise, National Forest lands are the headwaters to many intensive agricultural regions. Our Ogallala Aquifer team is enhancing crop irrigation decision tools with predictive weather and remote sensing data to better manage water for irrigated crops within these regions. An integrated multi-model software framework is used to link irrigation decision tools, resulting in positive management benefits on natural water resources. Teams and teams-of-teams can build upon these multi-disciplinary, multi-faceted modeling capabilities. For example, the CSU Catalyst for Innovative Partnerships program has formed a new multidisciplinary team that will address "Rural Wealth Creation", focusing on the many integrated links between economic, agricultural production and management, natural resource availabilities, and key social aspects of govt. policy recommendations. By enhancing tools like these with predictive weather and other related data (in situ measurements, hydrologic models, remotely sensed data sets, and, in the near future, links to agro-economic and life cycle assessment models), this work demonstrates an integrated data-driven future vision of inter-meshed dynamic systems that can address challenging multi-system problems. We will present the current state of the work and opportunities for future involvement.

  5. Computational Design Tools for Integrated Design

    DEFF Research Database (Denmark)

    Holst, Malene Kirstine; Kirkegaard, Poul Henning

    2010-01-01

    In an architectural conceptual sketching process, where an architect is working with the initial ideas for a design, the process is characterized by three phases: sketching, evaluation and modification. Basically the architect needs to address three areas in the conceptual sketching phase: aesthetical, functional and technical requirements. The aim of the present paper is to address the problem of a vague or non-existent link between digital conceptual design tools used by architects and designers and engineering analysis and simulation tools. Based on an analysis of the architectural design process, different digital design methods are related to tasks in an integrated design process.

  6. Nutrition screening tools: does one size fit all? A systematic review of screening tools for the hospital setting.

    Science.gov (United States)

    van Bokhorst-de van der Schueren, Marian A E; Guaitoli, Patrícia Realino; Jansma, Elise P; de Vet, Henrica C W

    2014-02-01

    Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. A systematic review of English, French, German, Spanish, Portuguese and Dutch articles identified via MEDLINE, Cinahl and EMBASE (from inception to the 2nd of February 2012). Additional studies were identified by checking reference lists of identified manuscripts. Search terms included key words for malnutrition, screening or assessment instruments, and terms for hospital setting and adults. Data were extracted independently by 2 authors. Only studies expressing the (construct, criterion or predictive) validity of a tool were included. 83 studies (32 screening tools) were identified: 42 studies on construct or criterion validity versus a reference method and 51 studies on predictive validity on outcome (i.e. length of stay, mortality or complications). None of the tools performed consistently well to establish the patients' nutritional status. For the elderly, MNA performed fair to good, for the adults MUST performed fair to good. SGA, NRS-2002 and MUST performed well in predicting outcome in approximately half of the studies reviewed in adults, but not in older patients. Not one single screening or assessment tool is capable of adequate nutrition screening as well as predicting poor nutrition related outcome. Development of new tools seems redundant and will most probably not lead to new insights. New studies comparing different tools within one patient population are required. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  7. Effect of different machining processes on the tool surface integrity and fatigue life

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Chuan Liang [College of Mechanical and Electrical Engineering, Nanchang University, Nanchang (China); Zhang, Xianglin [School of Materials Science and Engineering, Huazhong University of Science and Technology, Wuhan (China)

    2016-08-15

    Ultra-precision grinding, wire-cut electro discharge machining and lapping are often used to machine the tools in the fine blanking industry, and the surface integrity produced by these machining processes is a major concern in this research field. To study the effect of the processed surface integrity on fine blanking tool life, the surface integrity of different tool materials under different processing conditions and its influence on fatigue life were thoroughly analyzed in the present study. The results show that the surface integrity of different materials was quite different under the same processing conditions. For the same tool material, the surface integrity under varying processing conditions also differed considerably and deeply influenced the fatigue life.

  8. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    Science.gov (United States)

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available among these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations on the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results in different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms from different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available at the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics .
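
    The co-occurrence idea can be illustrated with a toy merge: keep a taxon only when enough tools report it, then average its abundance. This is a simplification for intuition only; MetaMeta's actual integration step is more sophisticated.

        from collections import Counter

        def integrate_profiles(profiles, min_support=2):
            """profiles: list of {taxon: abundance} dicts, one per tool."""
            support = Counter(taxon for p in profiles for taxon in p)
            merged = {}
            for taxon, count in support.items():
                if count >= min_support:  # keep taxa supported by several tools
                    values = [p[taxon] for p in profiles if taxon in p]
                    merged[taxon] = sum(values) / len(values)  # mean abundance
            return merged

        tool_a = {"E. coli": 0.40, "B. subtilis": 0.10}
        tool_b = {"E. coli": 0.50, "S. aureus": 0.20}
        tool_c = {"E. coli": 0.45, "B. subtilis": 0.12}
        print(integrate_profiles([tool_a, tool_b, tool_c]))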

  9. A validated set of tool pictures with matched objects and non-objects for laterality research.

    Science.gov (United States)

    Verma, Ark; Brysbaert, Marc

    2015-01-01

    Neuropsychological and neuroimaging research has established that knowledge related to tool use and tool recognition is lateralized to the left cerebral hemisphere. Recently, behavioural studies with the visual half-field technique have confirmed the lateralization. A limitation of this research was that different sets of stimuli had to be used for the comparison of tools to other objects and objects to non-objects. Therefore, we developed a new set of stimuli containing matched triplets of tools, other objects and non-objects. With the new stimulus set, we successfully replicated the findings of no visual field advantage for objects in an object recognition task combined with a significant right visual field advantage for tools in a tool recognition task. The set of stimuli is available as supplemental data to this article.

  10. An Integrated Tool for Calculating and Reducing Institution Carbon and Nitrogen Footprints

    Science.gov (United States)

    Galloway, James N.; Castner, Elizabeth A.; Andrews, Jennifer; Leary, Neil; Aber, John D.

    2017-01-01

    Abstract The development of nitrogen footprint tools has allowed a range of entities to calculate and reduce their contribution to nitrogen pollution, but these tools represent just one aspect of environmental pollution. For example, institutions have been calculating their carbon footprints to track and manage their greenhouse gas emissions for over a decade. This article introduces an integrated tool that institutions can use to calculate, track, and manage their nitrogen and carbon footprints together. It presents the methodology for the combined tool, describes several metrics for comparing institution nitrogen and carbon footprint results, and discusses management strategies that reduce both the nitrogen and carbon footprints. The data requirements for the two tools overlap substantially, although integrating the two tools does necessitate the calculation of the carbon footprint of food. Comparison results for five institutions suggest that the institution nitrogen and carbon footprints correlate strongly, especially in the utilities and food sectors. Scenario analyses indicate benefits to both footprints from a range of utilities and food footprint reduction strategies. Integrating these two footprints into a single tool will account for a broader range of environmental impacts, reduce data entry and analysis, and promote integrated management of institutional sustainability. PMID:29350217
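
    The overlap in data requirements can be seen in a toy joint calculation: one activity inventory feeds both footprints through separate emission factors. The factors below are invented placeholders, not values from the tool.

        # Shared activity inventory (quantities per year); both footprints read it.
        activities = {"electricity_kwh": 1_000_000, "food_kg_beef": 20_000}

        # Hypothetical emission factors, for illustration only.
        CO2_FACTORS = {"electricity_kwh": 0.5, "food_kg_beef": 27.0}      # kg CO2e / unit
        N_FACTORS = {"electricity_kwh": 0.0003, "food_kg_beef": 0.12}     # kg N / unit

        carbon = sum(q * CO2_FACTORS[a] for a, q in activities.items())
        nitrogen = sum(q * N_FACTORS[a] for a, q in activities.items())
        print(f"{carbon:.0f} kg CO2e, {nitrogen:.0f} kg N")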

  11. A database of immunoglobulins with integrated tools: DIGIT.

    KAUST Repository

    Chailyan, Anna; Tramontano, Anna; Marcatili, Paolo

    2011-01-01

    The DIGIT (Database of ImmunoGlobulins with Integrated Tools) database (http://biocomputing.it/digit) is an integrated resource storing sequences of annotated immunoglobulin variable domains and enriched with tools for searching and analyzing them. The annotations in the database include information on the type of antigen, the respective germline sequences and on pairing information between light and heavy chains. Other annotations, such as the identification of the complementarity determining regions, assignment of their structural class and identification of mutations with respect to the germline, are computed on the fly and can also be obtained for user-submitted sequences. The system allows customized BLAST searches and automatic building of 3D models of the domains to be performed.

  12. A database of immunoglobulins with integrated tools: DIGIT.

    KAUST Repository

    Chailyan, Anna

    2011-11-10

    The DIGIT (Database of ImmunoGlobulins with Integrated Tools) database (http://biocomputing.it/digit) is an integrated resource storing sequences of annotated immunoglobulin variable domains and enriched with tools for searching and analyzing them. The annotations in the database include information on the type of antigen, the respective germline sequences and on pairing information between light and heavy chains. Other annotations, such as the identification of the complementarity determining regions, assignment of their structural class and identification of mutations with respect to the germline, are computed on the fly and can also be obtained for user-submitted sequences. The system allows customized BLAST searches and automatic building of 3D models of the domains to be performed.

  13. miRQuest: integration of tools on a Web server for microRNA research.

    Science.gov (United States)

    Aguiar, R R; Ambrosio, L A; Sepúlveda-Hermosilla, G; Maracaja-Coutinho, V; Paschoal, A R

    2016-03-28

    This report describes miRQuest, a novel middleware available on a Web server that allows end users to carry out miRNA research in a user-friendly way. Many prediction tools for microRNA (miRNA) identification exist, using different programming languages and methods to accomplish this task. It is difficult to understand each tool and apply it to the diverse datasets and organisms available for miRNA analysis. miRQuest can easily be used by biologists and researchers with limited experience in bioinformatics. We built it using a middleware architecture on a Web platform for miRNA research that performs two main functions: i) integration of different miRNA prediction tools for miRNA identification in a user-friendly environment; and ii) comparison of these prediction tools. In both cases, the user provides sequences (in FASTA format) as an input set for the analyses and comparisons. All the tools were selected on the basis of a survey of the literature on the available tools for miRNA prediction. Three use cases of the tools are also described, one of which is miRNA identification analysis in 30 different species. Finally, miRQuest seems to be a novel and useful tool; it is freely available for both benchmarking and miRNA identification at http://mirquest.integrativebioinformatics.me/.

  14. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    Science.gov (United States)

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, neither the performance of this method nor its implementation as a web-based tool had been assessed. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
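
    To make the geometry concrete, the sketch below scores a toy gene set by the principal angles between two subspaces using scipy; this illustrates only the underlying linear algebra and is not the published PAEA scoring procedure. All data are synthetic.

        import numpy as np
        from scipy.linalg import subspace_angles

        rng = np.random.default_rng(0)
        n_genes = 100
        # Columns span a toy "differential expression" subspace (e.g. top components).
        expression_directions = rng.normal(size=(n_genes, 3))

        # Indicator vectors spanning the subspace of a hypothetical gene set.
        gene_set = [2, 5, 17, 40, 88]
        indicators = np.zeros((n_genes, len(gene_set)))
        indicators[gene_set, np.arange(len(gene_set))] = 1.0

        angles = subspace_angles(expression_directions, indicators)
        score = np.cos(angles).max()  # smaller angle -> stronger enrichment signal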

  15. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates a document analysis method and an ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks consistently, from knowledge acquisition to knowledge verification.

  16. Integrating and scheduling an open set of static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Mezini, Mira; Kloppenburg, Sven

    2006-01-01

    to keep the set of analyses open. We propose an approach to integrating and scheduling an open set of static analyses which decouples the individual analyses and coordinates the analysis executions such that the overall time and space consumption is minimized. The approach has been implemented for the Eclipse IDE and has been used to integrate a wide range of analyses such as finding bug patterns, detecting violations of design guidelines, or type system extensions for Java.

  17. Validating the WHO maternal near miss tool: comparing high- and low-resource settings.

    Science.gov (United States)

    Witteveen, Tom; Bezstarosti, Hans; de Koning, Ilona; Nelissen, Ellen; Bloemenkamp, Kitty W; van Roosmalen, Jos; van den Akker, Thomas

    2017-06-19

    WHO proposed the WHO Maternal Near Miss (MNM) tool, classifying women according to several (potentially) life-threatening conditions, to monitor and improve quality of obstetric care. The objective of this study is to analyse merged data from one high- and two low-resource settings where this tool was applied, and to test whether the tool may be suitable for comparing severe maternal outcome (SMO) between these settings. Using three cohort studies that included SMO cases during two-year time frames in the Netherlands, Tanzania and Malawi, we reassessed all SMO cases (as defined by the original studies) with the WHO MNM tool (five disease-, four intervention- and seven organ dysfunction-based criteria). Main outcome measures were prevalence of MNM criteria and case fatality rates (CFR). A total of 3172 women were studied; 2538 (80.0%) from the Netherlands, 248 (7.8%) from Tanzania and 386 (12.2%) from Malawi. Total SMO detection was 2767 (87.2%) for disease-based criteria, 2504 (78.9%) for intervention-based criteria and 1211 (38.2%) for organ dysfunction-based criteria. Including every woman who received ≥1 unit of blood in low-resource settings as life-threatening, as defined by organ dysfunction criteria, led to more equally distributed populations. In one third of all Dutch and Malawian maternal death cases, organ dysfunction criteria could not be identified from medical records. Applying solely organ dysfunction-based criteria may lead to underreporting of SMO. Therefore, a tool based on defining MNM only upon establishing organ failure is of limited use for comparing settings with varying resources. In low-resource settings, lowering the threshold of transfused units of blood leads to a higher detection rate of MNM. We recommend refined disease-based criteria, accompanied by a limited set of intervention- and organ dysfunction-based criteria, to set a measure of severity.

  18. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis.

    Science.gov (United States)

    Lal, Aparna

    2016-02-02

    Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change.

  19. Spatial Modelling Tools to Integrate Public Health and Environmental Science, Illustrated with Infectious Cryptosporidiosis

    Directory of Open Access Journals (Sweden)

    Aparna Lal

    2016-02-01

    Full Text Available Contemporary spatial modelling tools can help examine how environmental exposures such as climate and land use together with socio-economic factors sustain infectious disease transmission in humans. Spatial methods can account for interactions across global and local scales, geographic clustering and continuity of the exposure surface, key characteristics of many environmental influences. Using cryptosporidiosis as an example, this review illustrates how, in resource rich settings, spatial tools have been used to inform targeted intervention strategies and forecast future disease risk with scenarios of environmental change. When used in conjunction with molecular studies, they have helped determine location-specific infection sources and environmental transmission pathways. There is considerable scope for such methods to be used to identify data/infrastructure gaps and establish a baseline of disease burden in resource-limited settings. Spatial methods can help integrate public health and environmental science by identifying the linkages between the physical and socio-economic environment and health outcomes. Understanding the environmental and social context for disease spread is important for assessing the public health implications of projected environmental change.

  20. Development of tools for integrated monitoring and assessment of hazardous substances and their biological effects in the Baltic Sea.

    Science.gov (United States)

    Lehtonen, Kari K; Sundelin, Brita; Lang, Thomas; Strand, Jakob

    2014-02-01

    The need to develop biological effects monitoring to facilitate a reliable assessment of hazardous substances has been emphasized in the Baltic Sea Action Plan of the Helsinki Commission. An integrated chemical-biological approach is vitally important for the understanding and proper assessment of anthropogenic pressures and their effects on the Baltic Sea. Such an approach is also necessary for prudent management aiming at safeguarding the sustainable use of ecosystem goods and Services. The BEAST project (Biological Effects of Anthropogenic Chemical Stress: Tools for the Assessment of Ecosystem Health) set out to address this topic within the BONUS Programme. BEAST generated a large amount of quality-assured data on several biological effects parameters (biomarkers) in various marine species in different sub-regions of the Baltic Sea. New indicators (biological response measurement methods) and management tools (integrated indices) with regard to the integrated monitoring approach were suggested.

  1. An integrated computational tool for precipitation simulation

    Science.gov (United States)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer aided materials design is of increasing interest because the conventional approach solely relying on experimentation is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  2. Integrating total quality management in a library setting

    CERN Document Server

    Jurow, Susan

    2013-01-01

    Improve the delivery of library services by implementing total quality management (TQM), a system of continuous improvement employing participative management and centered on the needs of customers. Although TQM was originally designed for and successfully applied in business and manufacturing settings, this groundbreaking volume introduces strategies for translating TQM principles from the profit-based manufacturing sector to the library setting. Integrating Total Quality Management in a Library Setting shows librarians how to improve library services by implementing strategies such as employ

  3. From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool

    Science.gov (United States)

    Scheibler, Thorsten; Leymann, Frank

    One of the predominant problems IT companies are facing today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns are gaining importance for non-technical business users to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way. Therefore, one can use those patterns to describe enterprise architectures in a technology neutral manner. However, patterns are only documentation, used by developers and systems architects to decide how to implement an integration scenario manually. Thus, patterns were not originally conceived as artefacts that can be executed directly. This paper presents a tool supporting a method by which EAI patterns can be used to generate executable artefacts for various target platforms automatically, using a model-driven development approach, hence turning patterns into something executable. We therefore introduce a continuous tool chain that begins at the design phase and ends in executing an integration solution in a completely automatic manner. For evaluation purposes we introduce a scenario demonstrating how the tool is utilized for modelling and actually executing an integration scenario.

  4. Data integration through brain atlasing: Human Brain Project tools and strategies.

    Science.gov (United States)

    Bjerke, Ingvild E; Øvsthus, Martin; Papp, Eszter A; Yates, Sharon C; Silvestri, Ludovico; Fiorilli, Julien; Pennartz, Cyriel M A; Pavone, Francesco S; Puchades, Maja A; Leergaard, Trygve B; Bjaalie, Jan G

    2018-04-01

    The Human Brain Project (HBP), an EU Flagship Initiative, is currently building an infrastructure that will allow integration of large amounts of heterogeneous neuroscience data. The ultimate goal of the project is to develop a unified multi-level understanding of the brain and its diseases, and beyond this to emulate the computational capabilities of the brain. Reference atlases of the brain are one of the key components in this infrastructure. Based on a new generation of three-dimensional (3D) reference atlases, new solutions for analyzing and integrating brain data are being developed. HBP will build services for spatial query and analysis of brain data comparable to current online services for geospatial data. The services will provide interactive access to a wide range of data types that have information about anatomical location tied to them. The 3D volumetric nature of the brain, however, introduces a new level of complexity that requires a range of tools for making use of and interacting with the atlases. With such new tools, neuroscience research groups will be able to connect their data to atlas space, share their data through online data systems, and search and find other relevant data through the same systems. This new approach partly replaces earlier attempts to organize research data based only on a set of semantic terminologies describing the brain and its subdivisions. Copyright © 2018 The Authors. Published by Elsevier Masson SAS. All rights reserved.

  5. Developing health science students into integrated health professionals: a practical tool for learning

    Directory of Open Access Journals (Sweden)

    Duncan Madeleine

    2007-11-01

    Full Text Available Background: An integrated sense of professionalism enables health professionals to draw on relevant knowledge in context and to apply a set of professional responsibilities and ethical principles in the midst of changing work environments [1,2]. Inculcating professionalism is therefore a critical goal of health professional education. Two multi-professional courses for first year Health Science students at the University of Cape Town, South Africa aim to lay the foundation for becoming an integrated health professional [3]. In these courses a diagram depicting the domains of the integrated health professional is used to focus the content of small group experiential exercises towards an appreciation of professionalism. The diagram serves as an organising framework for conceptualising an emerging professional identity and for directing learning towards the domains of 'self as professional' [4,5]. Objective: This paper describes how a diagrammatic representation of the core elements of an integrated health professional is used as a template for framing course content and for organising student learning. Based on the assumption that all health care professionals should be knowledgeable, empathic and reflective, the diagram provides students and educators with a visual tool for investigating the subjective and objective dimensions of professionalism. The use of the diagram as an integrating point of reference for individual and small group learning is described and substantiated with relevant literature. Conclusion: The authors have applied the diagram with positive impact for the past six years with students and educators reporting that "it just makes sense". The article includes plans for formal evaluation. Evaluation to date is based on preliminary, informal feedback on the value of the diagram as a tool for capturing the domains of professionalism at an early stage in the undergraduate education of health professional students.

  6. Evaluation of integrated data sets: four examples

    International Nuclear Information System (INIS)

    Bolivar, S.L.; Freeman, S.B.; Weaver, T.A.

    1982-01-01

    Several large data sets have been integrated and utilized for rapid evaluation on a reconnaissance scale for the Montrose 1° x 2° quadrangle, Colorado. The data sets include Landsat imagery, hydrogeochemical and stream sediment analyses, airborne geophysical data, known mineral occurrences, and a geologic map. All data sets were registered to a 179 x 119 rectangular grid and projected onto Universal Transverse Mercator coordinates. A grid resolution of 1 km was used. All possible combinations of three, for most data sets, were examined for general geologic correlations by utilizing a color microfilm output. In addition, gray-level pictures of statistical output, e.g., factor analysis, have been employed to aid evaluations. Examples for the data sets dysprosium-calcium, lead-copper-zinc, and equivalent uranium-uranium in water-uranium in sediment are described with respect to geologic applications, base-metal regimes, and geochemical associations.
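
    Registering scattered analyses onto such a grid is straightforward to emulate; the sketch below bins synthetic point samples into the 1 km cells of a 179 x 119 grid with numpy. The coordinates and values are simulated, not the quadrangle data.

        import numpy as np

        rng = np.random.default_rng(1)
        easting = rng.uniform(0, 179_000, 5000)   # metres, synthetic sample locations
        northing = rng.uniform(0, 119_000, 5000)
        uranium = rng.lognormal(0.0, 1.0, 5000)   # synthetic analytical values

        bins = (179, 119)                         # 1 km cells
        sums, xe, ye = np.histogram2d(easting, northing, bins=bins, weights=uranium)
        counts, _, _ = np.histogram2d(easting, northing, bins=bins)
        with np.errstate(divide="ignore", invalid="ignore"):
            cell_means = sums / counts            # NaN where a cell holds no samples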

  7. Freiburg RNA Tools: a web server integrating IntaRNA, ExpaRNA and LocARNA.

    Science.gov (United States)

    Smith, Cameron; Heyne, Steffen; Richter, Andreas S; Will, Sebastian; Backofen, Rolf

    2010-07-01

    The Freiburg RNA tools web server integrates three tools for the advanced analysis of RNA in a common web-based user interface. The tools IntaRNA, ExpaRNA and LocARNA support the prediction of RNA-RNA interaction, exact RNA matching and alignment of RNA, respectively. The Freiburg RNA tools web server and the software packages of the stand-alone tools are freely accessible at http://rna.informatik.uni-freiburg.de.

  8. A new tool for man/machine integration

    International Nuclear Information System (INIS)

    Sommer, W.C.

    1981-01-01

    A popular term within the nuclear power industry today, as a result of TMI, is man/machine interface. It has been determined that greater acknowledgement of this interface is necessary within the industry to integrate the design and operational aspects of a system. What is required is an operational tool that can be used early in the engineering stages of a project and passed on later in time to those who will be responsible to operate that particular system. This paper discusses one such fundamental operations tool that is applied to a process system, its display devices, and its operator actions in a methodical fashion to integrate the machine for man's understanding and proper use. This new tool, referred to as an Operational Schematic, is shown and described. Briefly, it unites, in one location, the important operational display devices with the system process devices. A man can now see the beginning and end of each information and control loop to better understand its function within the system. A method is presented whereby in designing for operability, the schematic is utilized in three phases. The method results in two basic documents, one describes ''what'' is to be operated and the other ''how'' it is to be operated. This integration concept has now considered the hardware spectrum from sensor-to-display and operated the display (on paper) to confirm its operability. Now that the design aspects are complete, the later-in-time operational aspects need to be addressed for the man using the process system. Training personnel in operating and testing the process system is as important as the original design. To accomplish these activities, documents are prepared to instruct personnel how to operate (and test) the system under a variety of circumstances

  9. Intervene: a tool for intersection and visualization of multiple gene or genomic region sets.

    Science.gov (United States)

    Khan, Aziz; Mathelier, Anthony

    2017-05-31

    A common task for scientists relies on comparing lists of genes or genomic regions derived from high-throughput sequencing experiments. While several tools exist to intersect and visualize sets of genes, similar tools dedicated to the visualization of genomic region sets are currently limited. To address this gap, we have developed the Intervene tool, which provides an easy and automated interface for the effective intersection and visualization of genomic region or list sets, thus facilitating their analysis and interpretation. Intervene contains three modules: venn to generate Venn diagrams of up to six sets, upset to generate UpSet plots of multiple sets, and pairwise to compute and visualize intersections of multiple sets as clustered heat maps. Intervene, and its interactive web ShinyApp companion, generate publication-quality figures for the interpretation of genomic region and list sets. Intervene and its web application companion provide an easy command line and an interactive web interface to compute intersections of multiple genomic and list sets. They have the capacity to plot intersections using easy-to-interpret visual approaches. Intervene is developed and designed to meet the needs of both computer scientists and biologists. The source code is freely available at https://bitbucket.org/CBGR/intervene , with the web application available at https://asntech.shinyapps.io/intervene .
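
    The pairwise module's core computation reduces to intersection counts over all set pairs, as in the toy sketch below; the gene lists are invented, and real usage would go through the Intervene command line or ShinyApp.

        from itertools import combinations

        # Toy gene-list sets; Intervene accepts genomic region files as well.
        sets = {
            "ChIP_A": {"TP53", "MYC", "EGFR", "GATA1"},
            "ChIP_B": {"MYC", "EGFR", "SOX2"},
            "RNA_up": {"EGFR", "GATA1", "SOX2", "KLF4"},
        }
        pairwise = {(a, b): len(sets[a] & sets[b])
                    for a, b in combinations(sets, 2)}
        for (a, b), n in pairwise.items():
            print(f"{a} & {b}: {n} shared")   # the matrix behind the heat map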

  10. Uni- and omnidirectional simulation tools for integrated optics

    NARCIS (Netherlands)

    Stoffer, Remco

    2001-01-01

    This thesis presents several improvements on simulation methods in integrated optics, as well as some new methods. Both uni- and omnidirectional tools are presented; for the unidirectional methods, the emphasis is on higher-order accuracy; for the omnidirectional methods, the boundary conditions are

  11. MiniWall Tool for Analyzing CFD and Wind Tunnel Large Data Sets

    Science.gov (United States)

    Schuh, Michael J.; Melton, John E.; Stremel, Paul M.

    2017-01-01

    It is challenging to review and assimilate large data sets created by Computational Fluid Dynamics (CFD) simulations and wind tunnel tests. Over the past 10 years, NASA Ames Research Center has developed and refined a software tool dubbed the MiniWall to increase productivity in reviewing and understanding large CFD-generated data sets. Under the recent NASA ERA project, the application of the tool expanded to enable rapid comparison of experimental and computational data. The MiniWall software is browser-based so that it runs on any computer or device that can display a web page. It can also be used remotely and securely by using web server software such as the Apache HTTP server. The MiniWall software has recently been rewritten and enhanced to make it even easier for analysts to review large data sets and extract knowledge and understanding from these data sets. This paper describes the MiniWall software and demonstrates how the different features are used to review and assimilate large data sets.
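
    The general idea of a browser-based wall of plots can be pictured as a static HTML grid generated from a directory of images. The following sketch only illustrates that idea under assumed file locations; it is not the MiniWall source:

        # Walk a directory of PNG plots and emit one static HTML page with an
        # image grid. Illustrative only; layout and paths are assumptions.
        import glob
        import html

        cells = []
        for path in sorted(glob.glob("plots/*.png")):
            safe = html.escape(path)
            cells.append('<td><img src="%s" width="300"><br>%s</td>' % (safe, safe))

        page = "<html><body><table><tr>%s</tr></table></body></html>" % "".join(cells)
        with open("wall.html", "w") as out:
            out.write(page)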

  12. Decoding the genome with an integrative analysis tool: combinatorial CRM Decoder.

    Science.gov (United States)

    Kang, Keunsoo; Kim, Joomyeong; Chung, Jae Hoon; Lee, Daeyoup

    2011-09-01

    The identification of genome-wide cis-regulatory modules (CRMs) and characterization of their associated epigenetic features are fundamental steps toward the understanding of gene regulatory networks. Although integrative analysis of available genome-wide information can provide new biological insights, the lack of novel methodologies has become a major bottleneck. Here, we present a comprehensive analysis tool called combinatorial CRM decoder (CCD), which utilizes the publicly available information to identify and characterize genome-wide CRMs in a species of interest. CCD first defines a set of the epigenetic features which is significantly associated with a set of known CRMs as a code called 'trace code', and subsequently uses the trace code to pinpoint putative CRMs throughout the genome. Using 61 genome-wide data sets obtained from 17 independent mouse studies, CCD successfully catalogued ∼12 600 CRMs (five distinct classes) including polycomb repressive complex 2 target sites as well as imprinting control regions. Interestingly, we discovered that ∼4% of the identified CRMs belong to at least two different classes named 'multi-functional CRM', suggesting their functional importance for regulating spatiotemporal gene expression. From these examples, we show that CCD can be applied to any potential genome-wide datasets and therefore will shed light on unveiling genome-wide CRMs in various species.
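
    The 'trace code' idea can be illustrated as deriving a consensus set of epigenetic features over known CRMs and scoring candidate regions against it. This is a simplified sketch with invented feature names and thresholds, not CCD's algorithm:

        # Sketch of the 'trace code' idea: learn which binary epigenetic
        # features are enriched in known CRMs, then score candidate regions
        # by how well they match that code. Features/thresholds are made up.
        known_crms = [  # one row per known CRM: presence/absence of features
            {"H3K4me1": 1, "H3K27ac": 1, "DNase": 1},
            {"H3K4me1": 1, "H3K27ac": 0, "DNase": 1},
            {"H3K4me1": 1, "H3K27ac": 1, "DNase": 1},
        ]

        features = sorted(known_crms[0])
        # A feature joins the trace code if present in most known CRMs.
        trace_code = {f for f in features
                      if sum(crm[f] for crm in known_crms) / len(known_crms) >= 0.5}

        def score(region):
            """Fraction of trace-code features a candidate region carries."""
            return sum(region.get(f, 0) for f in trace_code) / len(trace_code)

        candidate = {"H3K4me1": 1, "H3K27ac": 1, "DNase": 0}
        print(sorted(trace_code), round(score(candidate), 2))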

  13. Risk Informed Design Using Integrated Vehicle Rapid Assessment Tools

    Data.gov (United States)

    National Aeronautics and Space Administration — A successful proof of concept was performed in FY 2012 integrating the Envision tool for parametric estimates of vehicle mass and the Rapid Response Risk Assessment...

  14. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, and also enables experimental measurements after compiling to configurable systems, within the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the high-level block description by the user to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list on the configurable analog–digital system. The resulting tool uses an analog and mixed-signal library of components, giving users and future researchers access to the basic analog operations/computations that are possible.

  15. From psychotherapy to e-therapy: the integration of traditional techniques and new communication tools in clinical settings.

    Science.gov (United States)

    Castelnuovo, Gianluca; Gaggioli, Andrea; Mantovani, Fabrizia; Riva, Giuseppe

    2003-08-01

    Technology is starting to influence psychological fields. In particular, computer-mediated communication (CMC) is providing new tools that can be fruitfully applied in psychotherapy. These new technologies do not substitute for traditional techniques and approaches, but they can be integrated into the clinical process, enhancing particular steps of it or making them easier. This paper focuses on the concept of e-therapy as a new modality of helping people resolve life and relationship issues. It utilizes the power and convenience of the Internet to allow synchronous and asynchronous communication between patient and therapist. It is important to underline that e-therapy is not an alternative treatment, but a resource that can be added to traditional psychotherapy. The paper also discusses how different forms of CMC can be fruitfully applied in psychology and psychotherapy, evaluating their effectiveness in clinical practice. To enhance the diffusion of e-therapy, further research is needed to evaluate all the pros and cons.

  16. Tools and approaches for simplifying serious games development in educational settings

    OpenAIRE

    Calvo, Antonio; Rotaru, Dan C.; Freire, Manuel; Fernandez-Manjon, Baltasar

    2016-01-01

    Serious Games can benefit from the commercial video games industry by taking advantage of current development tools. However, the economics and requirements of serious games and commercial games are very different. In this paper, we describe the factors that impact the total cost of ownership of serious games used in educational settings, review the specific requirements of games used as learning material, and analyze the different development tools available in the industry highlighting thei...

  17. Goal setting: an integral component of effective diabetes care.

    Science.gov (United States)

    Miller, Carla K; Bauman, Jennifer

    2014-08-01

    Goal setting is a widely used behavior change tool in diabetes education and training. Prior research found that specific, relatively difficult but attainable goals set within a specific timeframe improved performance in sports and at the workplace. However, the impact of goal setting in diabetes self-care has not received extensive attention. This review examined the mechanisms underlying behavioral change according to goal setting theory and evaluated the impact of goal setting in diabetes intervention studies. Eight studies were identified, which incorporated goal setting as the primary strategy to promote behavioral change in individual, group-based, and primary care settings among patients with type 2 diabetes. Improvements in diabetes-related self-efficacy, dietary intake, physical activity, and A1c were observed in some but not all studies. More systematic research is needed to determine the conditions and behaviors for which goal setting is most effective. Initial recommendations for using goal setting in diabetes patient encounters are offered.

  18. An Integrated Development Tool for a safety application using FBD language

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young Jun; Lee, Jang Soo; Lee, Dong Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-05-15

    With the digitalization of nuclear instrumentation and control systems, the application program responsible for the safety functions of nuclear I and C systems shall ensure the robustness of the safety function through development, testing, and validation roles over the life cycle of the software. The importance of software in nuclear systems increases continuously. The integrated engineering tools used to develop, test, and validate safety application programs must handle increasingly complex parts among the many components within nuclear digital I and C systems. This paper introduces the integrated engineering tool (SafeCASE-PLC) developed by our project. The SafeCASE-PLC is a software engineering tool to develop, test, and validate the nuclear application program performed in an automatic controller.

  19. Taming the data wilderness with the VHO: Integrating heliospheric data sets

    Science.gov (United States)

    Schroeder, P.; Szabo, A.; Narock, T.

    Currently, space physicists are faced with a bewildering array of heliospheric missions, experiments, and data sets available at archives distributed around the world. Daunting even for those most familiar with the field, physicists in other concentrations (solar physics, magnetospheric physics, etc.) find locating the heliospheric data that they need extremely challenging, if not impossible. The Virtual Heliospheric Observatory (VHO) will help to solve this problem by creating an Application Programming Interface (API) and web portal that integrates these data sets to find the highest quality data for a given task. The VHO will locate the best available data, often found only at PI institutions rather than at national archives like the NSSDC. The VHO will therefore facilitate a dynamic data environment where improved data products are made available immediately. In order to accomplish this, the VHO will enforce a metadata standard on participating data providers with sufficient depth to allow for meaningful scientific evaluation of similar data products. The VHO will provide an automated way for secondary sites to keep mirrors of data archives up to date, encouraging the generation of secondary or added-value data products. The VHO will interact seamlessly with the Virtual Solar Observatory (VSO) and other Virtual Observatories (VxOs) to allow for inter-disciplinary data searching. Software tools for these data sets will also be available through the VHO. Finally, the VHO will provide linkages to the modeling community and will develop metadata standards for the

  20. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools
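
    The connector idea, pairing access to a data source with transformation rules applied to the exchanged records, can be sketched as follows. The record fields and rules are illustrative assumptions, not the paper's reference ontology:

        # Minimal sketch of a software connector: it wraps access to a data
        # source and applies transformation rules to exchanged records.
        import math

        class Connector:
            def __init__(self, source, rules):
                self.source = source      # any iterable of dict records
                self.rules = rules        # field -> transformation function

            def fetch(self):
                for record in self.source:
                    yield {field: self.rules.get(field, lambda v: v)(value)
                           for field, value in record.items()}

        # Example rules: map probe IDs to gene symbols, log-scale intensities.
        id_map = {"1007_s_at": "DDR1"}    # invented one-entry mapping
        rules = {"probe": lambda p: id_map.get(p, p),
                 "intensity": lambda x: math.log2(x + 1)}

        source = [{"probe": "1007_s_at", "intensity": 255.0}]
        for rec in Connector(source, rules).fetch():
            print(rec)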

  1. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    Because of human development, water use is increasing in importance, and this worldwide trend is leading to a growing number of user conflicts, with a strong need for assessment tools to measure the impacts both on the ecosystem and on the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, since impact assessments involve different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured, while impacts on user groups are not included. The paper concentrates on fish habitat simulation models, with methods and examples from Norway, and includes some ideas on integrated modelling tools for impact assessment studies. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost-effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored in a multi-disciplinary study. Suitable model choice should be based on available data and possible data acquisition, available manpower, computer, and software resources, and the needed output and accuracy of the output. 58 refs
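
    For background, the standard bookkeeping of habitat hydraulic models combines cell-level hydraulics with suitability curves into a weighted usable area (WUA). The sketch below uses invented placeholder curves, not the authors' calibrated models:

        # Background sketch: weighted usable area (WUA) sums hydraulic-model
        # cell areas weighted by a composite suitability index. The curves
        # here are invented placeholders for illustration.
        def depth_suitability(depth_m):
            return max(0.0, min(1.0, depth_m / 0.5)) if depth_m < 1.0 else 0.5

        def velocity_suitability(velocity_ms):
            return max(0.0, 1.0 - abs(velocity_ms - 0.4))

        cells = [  # (area m^2, depth m, velocity m/s) per model cell
            (2.0, 0.3, 0.35), (2.0, 0.8, 0.50), (2.0, 1.2, 0.90),
        ]

        wua = sum(area * depth_suitability(d) * velocity_suitability(v)
                  for area, d, v in cells)
        print(round(wua, 2), "m^2 of weighted usable area")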

  2. Tools for integrated sequence-structure analysis with UCSF Chimera

    Directory of Open Access Journals (Sweden)

    Huang Conrad C

    2006-07-01

    Full Text Available Abstract Background Comparing related structures and viewing the structures in the context of sequence alignments are important tasks in protein structure-function research. While many programs exist for individual aspects of such work, there is a need for interactive visualization tools that: (a) provide a deep integration of sequence and structure, far beyond mapping where a sequence region falls in the structure and vice versa; (b) facilitate changing data of one type based on the other (for example, using only sequence-conserved residues to match structures, or adjusting a sequence alignment based on spatial fit); (c) can be used with a researcher's own data, including arbitrary sequence alignments and annotations, closely or distantly related sets of proteins, etc.; and (d) interoperate with each other and with a full complement of molecular graphics features. We describe enhancements to UCSF Chimera to achieve these goals. Results The molecular graphics program UCSF Chimera includes a suite of tools for interactive analyses of sequences and structures. Structures automatically associate with sequences in imported alignments, allowing many kinds of crosstalk. A novel method is provided to superimpose structures in the absence of a pre-existing sequence alignment. The method uses both sequence and secondary structure, and can match even structures with very low sequence identity. Another tool constructs structure-based sequence alignments from superpositions of two or more proteins. Chimera is designed to be extensible, and mechanisms for incorporating user-specific data without Chimera code development are also provided. Conclusion The tools described here apply to many problems involving comparison and analysis of protein structures and their sequences. Chimera includes complete documentation and is intended for use by a wide range of scientists, not just those in the computational disciplines. UCSF Chimera is free for non-commercial use and is
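
    At its geometric core, superimposing matched residues reduces to a least-squares fit such as the Kabsch algorithm; Chimera's method additionally uses sequence and secondary-structure scoring to choose the residue pairing. Below is a numpy sketch of only the fitting step, with made-up coordinates:

        # Kabsch least-squares superposition of two matched coordinate sets.
        # Illustrative; Chimera layers alignment scoring on top of this step.
        import numpy as np

        def kabsch(moving, fixed):
            """Rotation R and translation t minimizing |R @ p + t - q|."""
            mov_center = moving.mean(axis=0)
            fix_center = fixed.mean(axis=0)
            h = (moving - mov_center).T @ (fixed - fix_center)
            u, _, vt = np.linalg.svd(h)
            d = np.sign(np.linalg.det(vt.T @ u.T))   # avoid reflections
            rot = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
            return rot, fix_center - rot @ mov_center

        moving = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
        # Fabricated target: the same points rotated 90 deg and shifted.
        fixed = moving @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.0]]).T + 2.0
        rot, trans = kabsch(moving, fixed)
        rmsd = np.sqrt(((moving @ rot.T + trans - fixed) ** 2).sum(axis=1).mean())
        print(round(rmsd, 6))   # ~0 when the fit recovers the transform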

  3. Critical chain project management and drum-buffer-rope tools integration in construction industry - case study

    Directory of Open Access Journals (Sweden)

    Piotr Cyplik

    2012-03-01

    Full Text Available Background: The concept of integrating the theory of constraints tools in reorganizing the management system of a mechanical engineering company was presented in this article. The main aim of the concept is to enable the enterprise to satisfy the customers' expectations at reasonable costs, which allows for making a profit and creating an agile enterprise in the long run. Methods: Due to the individual character of the production and service processes in the analyzed company, the described concept uses theory of constraints project management (CCPM) and manufacturing (DBR) tools. The authors use the performance levels conception to build an integration tool focused on the interaction and collaboration between different departments. The integration tool has been developed and verified in a Polish manufacturing company. Results: In the described model, a tool compatible with CCPM operates on the level of the customer service process. The shop floor is controlled based on the DBR method. The authors hold that the integration between TOC tools is of key importance. The integration of TOC tools dedicated to managing customer service and to shop floor scheduling and controlling requires a mechanism for repeatedly transmitting information between them; this mechanism has been developed. Conclusions: The conducted research showed that the developed tool integrating CCPM and DBR had a positive impact on the enterprise performance. It enables improving the company's performance in meeting target group requirements by focusing on enhancing the efficiency of processes running in the company and of tasks processed at particular work stations. The described model has been successfully implemented in one of the Polish mechanical engineering companies.

  4. Cancer survival classification using integrated data sets and intermediate information.

    Science.gov (United States)

    Kim, Shinuk; Park, Taesung; Kon, Mark

    2014-09-01

    Although numerous studies related to cancer survival have been published, increasing the prediction accuracy of survival classes still remains a challenge. Integration of different data sets, such as microRNA (miRNA) and mRNA, might increase the accuracy of survival class prediction. Therefore, we suggested a machine learning (ML) approach to integrate different data sets, and developed a novel method based on feature selection with Cox proportional hazard regression model (FSCOX) to improve the prediction of cancer survival time. FSCOX provides us with intermediate survival information, which is usually discarded when separating survival into 2 groups (short- and long-term), and allows us to perform survival analysis. We used an ML-based protocol for feature selection, integrating information from miRNA and mRNA expression profiles at the feature level. To predict survival phenotypes, we used the following classifiers, first, existing ML methods, support vector machine (SVM) and random forest (RF), second, a new median-based classifier using FSCOX (FSCOX_median), and third, an SVM classifier using FSCOX (FSCOX_SVM). We compared these methods using 3 types of cancer tissue data sets: (i) miRNA expression, (ii) mRNA expression, and (iii) combined miRNA and mRNA expression. The latter data set included features selected either from the combined miRNA/mRNA profile or independently from miRNAs and mRNAs profiles (IFS). In the ovarian data set, the accuracy of survival classification using the combined miRNA/mRNA profiles with IFS was 75% using RF, 86.36% using SVM, 84.09% using FSCOX_median, and 88.64% using FSCOX_SVM with a balanced 22 short-term and 22 long-term survivor data set. These accuracies are higher than those using miRNA alone (70.45%, RF; 75%, SVM; 75%, FSCOX_median; and 75%, FSCOX_SVM) or mRNA alone (65.91%, RF; 63.64%, SVM; 72.73%, FSCOX_median; and 70.45%, FSCOX_SVM). Similarly in the glioblastoma multiforme data, the accuracy of miRNA/mRNA using IFS
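
    The independent feature selection (IFS) step can be sketched with scikit-learn: select top-ranked features separately from the miRNA and mRNA blocks, concatenate them, and cross-validate a classifier. The synthetic data and parameter choices below are assumptions, and this is not the authors' FSCOX code:

        # Sketch of independent feature selection (IFS) plus SVM classification
        # in the spirit of the integration described above. For brevity the
        # selection is done outside the CV loop; a rigorous analysis would
        # nest it inside cross-validation to avoid information leakage.
        import numpy as np
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        n = 44                            # e.g. 22 short- + 22 long-term survivors
        mirna = rng.normal(size=(n, 200)) # miRNA expression block (synthetic)
        mrna = rng.normal(size=(n, 1000)) # mRNA expression block (synthetic)
        y = np.repeat([0, 1], n // 2)

        # IFS: pick top features from each block independently, then combine.
        mirna_sel = SelectKBest(f_classif, k=10).fit_transform(mirna, y)
        mrna_sel = SelectKBest(f_classif, k=10).fit_transform(mrna, y)
        features = np.hstack([mirna_sel, mrna_sel])

        clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
        print(cross_val_score(clf, features, y, cv=4).mean())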

  5. QFD: a methodological tool for integration of ergonomics at the design stage.

    Science.gov (United States)

    Marsot, Jacques

    2005-03-01

    As a marked increase in the number of musculoskeletal disorders was noted in many industrialized countries, and more specifically in companies that require the use of hand tools, the French National Research and Safety Institute launched a research program in 1999 on the topic of integrating ergonomics into hand tool design. After a brief review of the problems of integrating ergonomics at the design stage, the paper shows how the "Quality Function Deployment" method has been applied to the design of a boning knife and highlights the difficulties encountered. It then demonstrates how this method can be a methodological tool geared to greater consideration of ergonomics in product design.

  6. A Nonparametric, Multiple Imputation-Based Method for the Retrospective Integration of Data Sets

    Science.gov (United States)

    Carrig, Madeline M.; Manrique-Vallier, Daniel; Ranby, Krista W.; Reiter, Jerome P.; Hoyle, Rick H.

    2015-01-01

    Complex research questions often cannot be addressed adequately with a single data set. One sensible alternative to the high cost and effort associated with the creation of large new data sets is to combine existing data sets containing variables related to the constructs of interest. The goal of the present research was to develop a flexible, broadly applicable approach to the integration of disparate data sets that is based on nonparametric multiple imputation and the collection of data from a convenient, de novo calibration sample. We demonstrate proof of concept for the approach by integrating three existing data sets containing items related to the extent of problematic alcohol use and associations with deviant peers. We discuss both necessary conditions for the approach to work well and potential strengths and weaknesses of the method compared to other data set integration approaches. PMID:26257437
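
    The general flavor of the approach, imputing variables missing from one data set by borrowing from a calibration sample that measured both variable blocks, can be sketched with a nearest-neighbor hot deck. This simplification omits the authors' nonparametric Bayesian machinery:

        # Sketch: impute a block of variables missing from an existing data
        # set by drawing from nearest neighbors in a calibration sample that
        # observed both blocks. Data and sizes are synthetic assumptions.
        import numpy as np

        rng = np.random.default_rng(2)
        calib_x = rng.normal(size=(50, 3))     # calibration: block X observed
        calib_y = calib_x[:, :1] + rng.normal(scale=0.1, size=(50, 1))  # block Y

        study_x = rng.normal(size=(10, 3))     # existing data: block Y missing

        def impute(x_row, draws=5):
            """Draw block-Y values from nearest calibration neighbors."""
            dist = np.linalg.norm(calib_x - x_row, axis=1)
            nearest = np.argsort(dist)[:draws]
            return calib_y[rng.choice(nearest, size=draws)]  # multiple draws

        print(impute(study_x[0]).ravel())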

  7. The integration of FMEA with other problem solving tools: A review of enhancement opportunities

    Science.gov (United States)

    Ng, W. C.; Teh, S. Y.; Low, H. C.; Teoh, P. C.

    2017-09-01

    Failure Mode Effect Analysis (FMEA) is one of the most effective and accepted problem solving (PS) tools for most of the companies in the world. Since FMEA was first introduced in 1949, practitioners have implemented FMEA in various industries for their quality improvement initiatives. However, studies have shown that there are drawbacks that hinder the effectiveness of FMEA for continuous quality improvement from product design to manufacturing. Therefore, FMEA is integrated with other PS tools such as the inventive problem solving methodology (TRIZ), Quality Function Deployment (QFD), Root Cause Analysis (RCA) and the seven basic tools of quality to address the drawbacks. This study begins by identifying the drawbacks in FMEA. A comprehensive literature review on the integration of FMEA with other tools is carried out to categorise the integrations based on the drawbacks identified. The three categories are inefficiency of failure analysis, psychological inertia and neglect of customers' perspective. This study concludes by discussing the gaps and opportunities in the integration for future research.
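
    As background to the review, FMEA's standard prioritization ranks failure modes by a risk priority number, RPN = severity × occurrence × detection; the integrations surveyed above aim to compensate for the weaknesses of this simple ranking. A minimal sketch with invented ratings:

        # Standard FMEA prioritization (background, not this paper's method):
        # rank failure modes by RPN = severity * occurrence * detection.
        failure_modes = [
            # (name, severity 1-10, occurrence 1-10, detection 1-10)
            ("seal leak", 8, 3, 4),
            ("sensor drift", 5, 6, 7),
            ("weld crack", 9, 2, 5),
        ]

        ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3],
                        reverse=True)
        for name, sev, occ, det in ranked:
            print(name, sev * occ * det)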

  8. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing, and consequently there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used. In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling…; an important part of this technique is attaching OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration

  9. Advertising Can Be an Effective Integrated Marketing Tool

    Science.gov (United States)

    Lauer, Larry D.

    2007-01-01

    Advertising will not undermine the critical thinking of consumers when it is combined with other communication media, and when it is truthful. In fact, it can provide clarity about the competitive advantage of individual institutions and aid an individual's ability to choose wisely. Advertising is just one of the tools in the integrated marketing…

  10. Useful tools for non-linear systems: Several non-linear integral inequalities

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mohammadpour, A.; Mesiar, Radko; Vaezpour, M. S.

    2013-01-01

    Roč. 49, č. 1 (2013), s. 73-80 ISSN 0950-7051 R&D Projects: GA ČR GAP402/11/0378 Institutional support: RVO:67985556 Keywords : Monotone measure * Comonotone functions * Integral inequalities * Universal integral Subject RIV: BA - General Mathematics Impact factor: 3.058, year: 2013 http://library.utia.cas.cz/separaty/2013/E/mesiar-useful tools for non-linear systems several non-linear integral inequalities.pdf

  11. SECIMTools: a suite of metabolomics data analysis tools.

    Science.gov (United States)

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high throughput technology for untargeted analysis of metabolites. Open access, easy to use, analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
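
    As a standalone illustration of two of the building blocks named above (principal component analysis and per-feature t-tests) on a samples-by-metabolites matrix, one might write the following; it does not call the SECIMTools packages themselves:

        # Illustrative PCA overview and per-metabolite t-tests on a synthetic
        # samples-by-metabolites matrix; not SECIMTools code.
        import numpy as np
        from scipy import stats
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        control = rng.normal(0.0, 1.0, size=(10, 50))  # 10 samples x 50 features
        treated = rng.normal(0.3, 1.0, size=(10, 50))
        data = np.vstack([control, treated])

        scores = PCA(n_components=2).fit_transform(data)  # sample-level overview
        print("PC1 range:", scores[:, 0].min(), scores[:, 0].max())

        t, p = stats.ttest_ind(control, treated, axis=0)  # one test per feature
        print("features with p < 0.05:", int((p < 0.05).sum()))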

  12. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

    In this grant, we have systematically investigated the integrated networks, which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of the integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open-source, they are available to download from Github, and they can be incorporated in the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner due to the conflation of multiple layers of information. Therefore, for Aim 2 of this grant, we have developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we have developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we have organized E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed
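
    One simple way to expose the direction of information flow in a regulatory network is to lay out its acyclic part by topological generations. The sketch below illustrates that idea with networkx and a toy network; it is not the algorithm developed in the grant:

        # Assign hierarchy levels to a toy regulatory DAG by topological
        # generations (requires networkx >= 2.6). Illustration only.
        import networkx as nx

        edges = [("tf_top", "tf_mid1"), ("tf_top", "tf_mid2"),
                 ("tf_mid1", "gene_a"), ("tf_mid2", "gene_a"),
                 ("tf_mid2", "gene_b")]
        network = nx.DiGraph(edges)

        for level, nodes in enumerate(nx.topological_generations(network)):
            print("level", level, sorted(nodes))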

  13. A Python tool to set up relative free energy calculations in GROMACS.

    Science.gov (United States)

    Klimovich, Pavel V; Mobley, David L

    2015-11-01

    Free energy calculations based on molecular dynamics (MD) simulations have seen a tremendous growth in the last decade. However, it is still difficult and tedious to set them up in an automated manner, as the majority of the present-day MD simulation packages lack that functionality. Relative free energy calculations are a particular challenge for several reasons, including the problem of finding a common substructure and mapping the transformation to be applied. Here we present a tool, alchemical-setup.py, that automatically generates all the input files needed to perform relative solvation and binding free energy calculations with the MD package GROMACS. When combined with Lead Optimization Mapper (LOMAP; Liu et al. in J Comput Aided Mol Des 27(9):755-770, 2013), recently developed in our group, alchemical-setup.py allows fully automated setup of relative free energy calculations in GROMACS. Taking a graph of the planned calculations and a mapping, both computed by LOMAP, our tool generates the topology and coordinate files needed to perform relative free energy calculations for a given set of molecules, and provides a set of simulation input parameters. The tool was validated by performing relative hydration free energy calculations for a handful of molecules from the SAMPL4 challenge (Mobley et al. in J Comput Aided Mol Des 28(4):135-150, 2014). Good agreement with previously published results and the straightforward way in which free energy calculations can be conducted make alchemical-setup.py a promising tool for automated setup of relative solvation and binding free energy calculations.
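
    The flavor of such a setup can be illustrated by writing one GROMACS .mdp file per lambda window, differing only in init-lambda-state. The template below is deliberately minimal and is not the actual output of alchemical-setup.py:

        # Simplified illustration: emit one .mdp per lambda window for a
        # GROMACS free energy calculation. A real setup needs many more
        # parameters (and the transformed topology that the tool generates).
        template = """\
        integrator        = sd
        nsteps            = 500000
        free-energy       = yes
        init-lambda-state = {state}
        fep-lambdas       = {lambdas}
        """

        lambdas = [i / 10 for i in range(11)]    # 11 evenly spaced windows
        lambda_str = " ".join("%.2f" % lam for lam in lambdas)

        for state in range(len(lambdas)):
            with open("lambda_%02d.mdp" % state, "w") as out:
                out.write(template.format(state=state, lambdas=lambda_str))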

  14. Integrating Computational Science Tools into a Thermodynamics Course

    Science.gov (United States)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, to represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired with computer simulations to implement these modules has a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  15. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    National Research Council Canada - National Science Library

    Petre, P; Visher, J; Shringarpure, R; Valley, F; Swaminathan, M

    2005-01-01

    Automated design tools and integrated design flow methodologies were developed that demonstrated more than an order-of-magnitude reduction in cycle time and cost for mixed signal (digital/analog/RF...

  16. On set-valued functionals: Multivariate risk measures and Aumann integrals

    Science.gov (United States)

    Ararat, Cagin

    In this dissertation, multivariate risk measures for random vectors and Aumann integrals of set-valued functions are studied. Both are set-valued functionals with values in a complete lattice of subsets of R^m. Multivariate risk measures are considered in a general d-asset financial market with trading opportunities in discrete time. Specifically, the following features of the market are incorporated in the evaluation of multivariate risk: convex transaction costs modeled by solvency regions, intermediate trading constraints modeled by convex random sets, and the requirement of liquidation into the first m ≤ d of the assets. It is assumed that the investor has a "pure" multivariate risk measure R on the space of m-dimensional random vectors which represents her risk attitude towards the assets but does not take into account the frictions of the market. Then, the investor with a d-dimensional position minimizes the set-valued functional R over all m-dimensional positions that she can reach by trading in the market subject to the frictions described above. The resulting functional R^mar on the space of d-dimensional random vectors is another multivariate risk measure, called the market-extension of R. A dual representation for R^mar that decomposes the effects of R and the frictions of the market is proved. Next, multivariate risk measures are studied in a utility-based framework. It is assumed that the investor has a complete risk preference towards each individual asset, which can be represented by a von Neumann-Morgenstern utility function. Then, an incomplete preference is considered for multivariate positions which is represented by the vector of the individual utility functions. Under this structure, multivariate shortfall and divergence risk measures are defined as the optimal values of set minimization problems. The dual relationship between the two classes of multivariate risk measures is constructed via a recent Lagrange duality for set optimization. In
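
    In symbols, one plausible reading of the market-extension construction (the notation here is assumed for illustration and is not quoted from the dissertation) is

        R^{\mathrm{mar}}(X) \;=\; \inf_{Y \in \mathcal{A}(X)} R(Y) \;=\; \operatorname{cl} \bigcup_{Y \in \mathcal{A}(X)} R(Y),

    where \mathcal{A}(X) denotes the set of m-dimensional positions reachable from the d-dimensional position X under the market frictions, and the infimum is taken in the complete lattice of upper subsets of R^m ordered by reverse inclusion, in which the infimum of a family is the closure of its union.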

  17. Metadata and Tools for Integration and Preservation of Cultural Heritage 3D Information

    Directory of Open Access Journals (Sweden)

    Achille Felicetti

    2011-12-01

    Full Text Available In this paper we investigate many of the various storage, portability and interoperability issues arising among archaeologists and cultural heritage people when dealing with 3D technologies. On the one side, the available digital repositories often look unable to guarantee affordable features for the management of 3D models and their metadata; on the other side, the nature of most of the available data formats for 3D encoding seems unsatisfactory for the portability required nowadays by 3D information across different systems. We propose a set of possible solutions to show how integration can be achieved through the use of well known and widely accepted standards for data encoding and data storage. Using a set of 3D models acquired during various archaeological campaigns and a number of open source tools, we have implemented a straightforward encoding process to generate meaningful semantic data and metadata. We will also present the interoperability process carried out to integrate the encoded 3D models and the geographic features produced by the archaeologists. Finally we will report the preliminary (and rather encouraging) development of a semantically enabled and persistent digital repository, where 3D models (but also any kind of digital data and metadata) can easily be stored, retrieved and shared with the content of other digital archives.

  18. Integrated Variable-Fidelity Tool Set for Modeling and Simulation of Aeroservothermoelasticity-Propulsion (ASTE-P) Effects for Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  19. Integrated Variable-Fidelity Tool Set For Modeling and Simulation of Aeroservothermoelasticity -Propulsion (ASTE-P) Effects For Aerospace Vehicles Ranging From Subsonic to Hypersonic Flight, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program aims at developing a variable-fidelity software tool set for aeroservothermoelastic-propulsive (ASTE-P) modeling that can be routinely...

  20. Evaluating online diagnostic decision support tools for the clinical setting.

    Science.gov (United States)

    Pryor, Marie; White, David; Potter, Bronwyn; Traill, Roger

    2012-01-01

    credibility of the clinical information. The evaluation methodology used here to assess the quality and comprehensiveness of clinical DDS tools was effective in identifying the most appropriate tool for the clinical setting. The use of clinical case scenarios is fundamental in determining the diagnostic accuracy and usability of the tools.

  1. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  2. Integration: valuing stakeholder input in setting priorities for socially sustainable egg production.

    Science.gov (United States)

    Swanson, J C; Lee, Y; Thompson, P B; Bawden, R; Mench, J A

    2011-09-01

    Setting directions and goals for animal production systems requires the integration of information achieved through internal and external processes. The importance of stakeholder input in setting goals for sustainable animal production systems should not be overlooked by the agricultural animal industries. Stakeholders play an integral role in setting the course for many aspects of animal production, from influencing consumer preferences to setting public policy. The Socially Sustainable Egg Production Project (SSEP) involved the development of white papers on various aspects of egg production, followed by a stakeholder workshop to help frame the issues for the future of sustainable egg production. Representatives from the environmental, food safety, food retail, consumer, animal welfare, and the general farm and egg production sectors participated with members of the SSEP coordination team in a 1.5-d workshop to explore socially sustainable egg production. This paper reviews the published literature on values integration methodologies and the lessons learned from animal welfare assessment models. The integration method used for the SSEP stakeholder workshop and its outcome are then summarized. The method used for the SSEP stakeholder workshop can be used to obtain stakeholder input on sustainable production in other farm animal industries.

  3. Data Center IT Equipment Energy Assessment Tools: Current State of Commercial Tools, Proposal for a Future Set of Assessment Tools

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, Ben D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); National Univ., San Diego, CA (United States). School of Engineering

    2012-06-30

    This research project, which was conducted during the Summer and Fall of 2011, investigated some commercially available assessment tools with a focus on IT equipment to see if such tools could round out the DC Pro tool suite. In this research, the assessment capabilities of the various tools were compiled to help make “non-biased” information available to the public. This research should not be considered exhaustive of all existing vendor tools, although a number of vendors were contacted. Large IT equipment OEMs like IBM and Dell provide their own proprietary internal automated software, which does not work on any other vendor's IT equipment. However, we found two companies with products that showed promise in performing automated assessments for IT equipment from different OEM vendors. This report documents the research and provides a list of software products reviewed, contacts and websites, product details, discussions with specific companies, a set of recommendations, and next steps. As a result of this research, a simple 3-level approach to an IT assessment tool is proposed, along with an example of an assessment using a simple IT equipment data collection tool (Level 1, spreadsheet). The tool has been reviewed with the Green Grid and LBNL staff. The initial feedback has been positive, although further refinement of the tool will be necessary. Proposed next steps include a field trial of at least two vendors' software in two different data centers, with an objective to prove the concept and ascertain the extent of energy and computational assessment, ease of installation, and opportunities for continuous improvement. Based on the discussions, field trials (or case studies) are proposed with two vendors: JouleX (expected to be completed in 2012) and Sentilla.

  4. A review of computer tools for analysing the integration of renewable energy into various energy systems

    DEFF Research Database (Denmark)

    Connolly, D.; Lund, Henrik; Mathiesen, Brian Vad

    2010-01-01

    This paper includes a review of the different computer tools that can be used to analyse the integration of renewable energy. Initially 68 tools were considered, but 37 were included in the final analysis, which was carried out in collaboration with the tool developers or recommended points of contact. The results in this paper provide the information necessary to identify a suitable energy tool for analysing the integration of renewable energy into various energy-systems under different objectives. It is evident from this paper that there is no energy tool that addresses all issues related to integrating renewable energy; instead, the ‘ideal’ energy tool is highly dependent on the specific objectives that must be fulfilled. The typical applications for the 37 tools reviewed (from analysing single-building systems to national energy-systems), combined with numerous other factors…

  5. Geoinformation Systems as a Tool of the Integrated Tourist Spaces Management

    Directory of Open Access Journals (Sweden)

    Kolesnikovich Victor

    2014-09-01

    Full Text Available Introduction. Tourist activity management currently requires special conditions for the development of integrated management tools built on a shared information and analytical base. Material and methods. The architecture and content of geoinformation and hybrid information systems are oriented toward Integrated Tourist Spaces Management (ITSM), which places specific demands on the features of the management model. The authors developed the concept of tourist space, and the information and analytical system is used to create an information model of that space. The information support of the ITSM system is a kind of hybrid system: an expert system constructed on the basis of GIS. Results and conclusions. By means of GIS, the collection, storage, analysis and graphic visualization of spatial data, and of the related information on the objects represented in the expert system, is provided. The offered approach leads to the formation of an information system and analytical support not only for human decision-making; it also promotes the creation of new tourist products based on increasingly differentiated client inquiries or on the ratio of price to quality (from the point of view of satisfying those inquiries).

  6. Integration of Web 2.0 Tools in Learning a Programming Course

    Science.gov (United States)

    Majid, Nazatul Aini Abd

    2014-01-01

    Web 2.0 tools are expected to assist students to acquire knowledge effectively in their university environment. However, the lack of effort from lecturers in planning the learning process can make it difficult for the students to optimize their learning experiences. The aim of this paper is to integrate Web 2.0 tools with learning strategy in…

  7. Social Work Student and Practitioner Roles in Integrated Care Settings.

    Science.gov (United States)

    Fraher, Erin P; Richman, Erica Lynn; Zerden, Lisa de Saxe; Lombardi, Brianna

    2018-06-01

    Social workers are increasingly being deployed in integrated medical and behavioral healthcare settings but information about the roles they fill in these settings is not well understood. This study sought to identify the functions that social workers perform in integrated settings and identify where they acquired the necessary skills to perform them. Master of social work students (n=21) and their field supervisors (n=21) who were part of a Health Resources and Services Administration-funded program to train and expand the behavioral health workforce in integrated settings were asked how often they engaged in 28 functions, where they learned to perform those functions, and the degree to which their roles overlapped with others on the healthcare team. The most frequent functions included employing cultural competency, documenting in the electronic health record, addressing patient social determinants of health, and participating in team-based care. Respondents were least likely to engage in case conferences; use Screening, Brief Intervention and Referral to Treatment; use stepped care to determine necessary level of treatment; conduct functional assessments of daily living skills; use behavioral activation; and use problem-solving therapy. A total of 80% of respondents reported that their roles occasionally, often, very often, or always overlapped with others on the healthcare team. Students reported learning the majority of skills (76%) in their Master of Social Work programs. Supervisors attributed the majority (65%) of their skill development to on-the-job training. Study findings suggest the need to redesign education, regulatory, and payment to better support the deployment of social workers in integrated care settings. This article is part of a supplement entitled The Behavioral Health Workforce: Planning, Practice, and Preparation, which is sponsored by the Substance Abuse and Mental Health Services Administration and the Health Resources and Services

  8. Integration between a sales support system and a simulation tool

    OpenAIRE

    Wahlström, Ola

    2005-01-01

    InstantPlanner is a sales support system for the material handling industry, visualizing and calculating designs faster and more correctly than other tools on the market. AutoMod is a world-leading simulation tool used in the material handling industry to optimize and calculate appropriate configuration designs. Both applications are favorable in their own areas and provide a great platform for integration, with the properties of fast designing, correct product calculations, great simulation capabi...

  9. The integrable case of Adler-van Moerbeke. Discriminant set and bifurcation diagram

    Science.gov (United States)

    Ryabov, Pavel E.; Oshemkov, Andrej A.; Sokolov, Sergei V.

    2016-09-01

    The Adler-van Moerbeke integrable case of the Euler equations on the Lie algebra so(4) is investigated. For the L-A pair found by Reyman and Semenov-Tian-Shansky for this system, we explicitly present a spectral curve and construct the corresponding discriminant set. The singularities of the Adler-van Moerbeke integrable case and its bifurcation diagram are discussed. We explicitly describe singular points of rank 0, determine their types, and show that the momentum mapping takes them to self-intersection points of the real part of the discriminant set. In particular, the described structure of singularities of the Adler-van Moerbeke integrable case shows that it is topologically different from the other known integrable cases on so(4).

  10. Knowledge Management tools integration within DLR's concurrent engineering facility

    Science.gov (United States)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

    The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage improved the Concurrent Engineering (CE) process. This paper also presents the lessons learned from the introduction of KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The results of the application of the Knowledge Management tools have shown the potential of merging the three software platforms and their functionalities as the next step towards the full integration of KM practices into the CE process. VirSat will remain the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source, and as an access point to simulation and calculation models. The use of KM tools in the CEF aims to become a basic practice during the CE process. Establishing this practice will result in a much broader exchange of knowledge and experience within the Concurrent Engineering environment and, consequently, in higher quality in the design of space systems.

  11. Upgrade and integration of the configuration and monitoring tools for the ATLAS Online farm

    CERN Document Server

    Ballestrero, S; The ATLAS collaboration; Darlea, G L; Dumitru, I; Scannicchio, DA; Twomey, M S; Valsan, M L; Zaytsev, A

    2012-01-01

    The ATLAS Online farm is a non-homogeneous cluster of nearly 3000 PCs which run the data acquisition, trigger and control of the ATLAS detector. The systems are configured and monitored by a combination of open-source tools, such as Quattor and Nagios, and tools developed in-house, such as ConfDB. We report on the ongoing introduction of new provisioning and configuration tools, Puppet and ConfDB v2, which are more flexible and allow automation for previously uncovered needs, and on the upgrade and integration of the monitoring and alerting tools, including the interfacing of these with the TDAQ Shifter Assistant software and their integration with configuration tools. We discuss the selection of the tools and the assessment of their functionality and performance, and how they enabled the introduction of virtualization for selected services.

  12. Upgrade and integration of the configuration and monitoring tools for the ATLAS Online farm

    International Nuclear Information System (INIS)

    Ballestrero, S; Darlea, G–L; Twomey, M S; Brasolin, F; Dumitru, I; Valsan, M L; Scannicchio, D A; Zaytsev, A

    2012-01-01

    The ATLAS Online farm is a non-homogeneous cluster of nearly 3000 systems which run the data acquisition, trigger and control of the ATLAS detector. The systems are configured and monitored by a combination of open-source tools, such as Quattor and Nagios, and tools developed in-house, such as ConfDB. We report on the ongoing introduction of new provisioning and configuration tools, Puppet and ConfDB v2, which are more flexible and allow automation for previously uncovered needs, and on the upgrade and integration of the monitoring and alerting tools, including the interfacing of these with the TDAQ Shifter Assistant software and their integration with configuration tools. We discuss the selection of the tools and the assessment of their functionality and performance, and how they enabled the introduction of virtualization for selected services.

  13. Integrated Network Analysis and Effective Tools in Plant Systems Biology

    Directory of Open Access Journals (Sweden)

    Atsushi Fukushima

    2014-11-01

    Full Text Available One of the ultimate goals in plant systems biology is to elucidate the genotype-phenotype relationship in plant cellular systems. Integrated network analysis that combines omics data with mathematical models has received particular attention. Here we focus on the latest cutting-edge computational advances that facilitate their combination. We highlight (1) network visualization tools, (2) pathway analyses, (3) genome-scale metabolic reconstruction, and (4) the integration of high-throughput experimental data and mathematical models. Multi-omics data covering the genome, transcriptome, proteome, and metabolome, combined with mathematical models, are expected to integrate and expand our knowledge of complex plant metabolism.

  14. Data, models, and views: towards integration of diverse numerical model components and data sets for scientific and public dissemination

    Science.gov (United States)

    Hofmeister, Richard; Lemmen, Carsten; Nasermoaddeli, Hassan; Klingbeil, Knut; Wirtz, Kai

    2015-04-01

    Data and models for describing coastal systems span a diversity of disciplines, communities, ecosystems, regions and techniques. Previous attempts at unifying data exchange, coupling interfaces, or metadata information have not been successful. We introduce the new Modular System for Shelves and Coasts (MOSSCO, http://www.mossco.de), a novel coupling framework that enables the integration of a diverse array of models and data from different disciplines relating to coastal research. In the MOSSCO concept, the integrating framework imposes very few restrictions on contributed data or models; in fact, no distinction is made between data and models. The few requirements are: (1) in-principle coupleability, i.e. access to I/O and timing information in submodels, which has recently been referred to as the Basic Model Interface (BMI); (2) open source/open data access and licensing; and (3) communication of metadata, such as spatiotemporal information, naming conventions, and physical units. These requirements suffice to integrate different models and data sets into the MOSSCO infrastructure and subsequently to build a modular integrated modeling tool that can span a diversity of processes and domains. We demonstrate how diverse coastal system constituents were integrated into this modular framework and how we deal with the diverging development of constituent data sets and models at external institutions. Finally, we show results from simulations with the fully coupled system using OGC WebServices in the WiMo geoportal (http://kofserver3.hzg.de/wimo), from where stakeholders can view the simulation results for further dissemination.
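
    The Basic Model Interface requirement above amounts to a small set of lifecycle and access hooks. Below is a minimal sketch of a BMI-style component; the class names, config file and placeholder dynamics are invented for illustration and are not MOSSCO code:

```python
# A minimal sketch of a BMI-style component: a submodel only needs to expose
# I/O and timing hooks (initialize/update/finalize, value access) to be
# integrable by a coupler, which is the "coupleability" requirement above.

class SedimentStub:
    """Hypothetical submodel used only for illustration."""
    def __init__(self):
        self.time = 0.0
        self.state = {"concentration": 0.0}

class BmiSediment:
    """BMI-like facade around the stub model."""
    def initialize(self, config_file: str) -> None:
        self.model = SedimentStub()

    def update(self) -> None:
        self.model.time += 1.0                    # advance one time step
        self.model.state["concentration"] += 0.1  # placeholder dynamics

    def finalize(self) -> None:
        self.model = None

    def get_current_time(self) -> float:
        return self.model.time

    def get_value(self, name: str) -> float:
        return self.model.state[name]

# A coupler needs only these hooks to orchestrate heterogeneous components:
component = BmiSediment()
component.initialize("sediment.cfg")  # hypothetical config file
while component.get_current_time() < 10.0:
    component.update()
print(component.get_value("concentration"))
```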

  15. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  16. Teaching Students How to Integrate and Assess Social Networking Tools in Marketing Communications

    Science.gov (United States)

    Schlee, Regina Pefanis; Harich, Katrin R.

    2013-01-01

    This research is based on two studies that focus on teaching students how to integrate and assess social networking tools in marketing communications. Study 1 examines how students in marketing classes utilize social networking tools and explores their attitudes regarding the use of such tools for marketing communications. Study 2 focuses on an…

  17. Indico central - events organisation, ergonomics and collaboration tools integration

    International Nuclear Information System (INIS)

    Gonzalez Lopez, Jose Benito; Ferreira, Jose Pedro; Baron, Thomas

    2010-01-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, the result of a careful maturation process, includes improvements that will set a new reference in its domain. The presentation will focus on the description of the new features of the tool and the user feedback process, which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration, e-payment, audiovisual recording, webcast, room booking, and videoconference support).

  18. Indico central - events organisation, ergonomics and collaboration tools integration

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez Lopez, Jose Benito; Ferreira, Jose Pedro; Baron, Thomas, E-mail: jose.benito.gonzalez@cern.ch, E-mail: jose.pedro.ferreira@cern.ch, E-mail: thomas.baron@cern.ch [CERN IT-UDS-AVC, 1211 Geneve 23 (Switzerland)

    2010-04-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, the result of a careful maturation process, includes improvements that will set a new reference in its domain. The presentation will focus on the description of the new features of the tool and the user feedback process, which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration, e-payment, audiovisual recording, webcast, room booking, and videoconference support).

  19. Indico Central - Events Organisation, Ergonomics and Collaboration Tools Integration

    CERN Document Server

    Gonzalez Lopez, J B; Baron, T; CERN. Geneva. IT Department

    2010-01-01

    While the remote collaboration services at CERN slowly aggregate around the Indico event management software, its new version, the result of a careful maturation process, includes improvements that will set a new reference in its domain. The presentation will focus on the description of the new features of the tool and the user feedback process, which resulted in a new record of usability. We will also describe the interactions with the worldwide community of users and server administrators and the impact this has had on our development process, as well as the tools set in place to streamline the work between the different collaborating sites. A last part will be dedicated to the use of Indico as a central hub for operating other local services around the event organisation (registration, e-payment, audiovisual recording, webcast, room booking, and videoconference support).

  20. Six sigma tools in integrating internal operations of a retail pharmacy: a case study.

    Science.gov (United States)

    Kumar, Sameer; Kwong, Anthony M

    2011-01-01

    This study was initiated to integrate information and enterprise-wide healthcare delivery system issues specifically within an inpatient retail pharmacy operation in a U.S. community hospital. Six Sigma tools were used to examine the effects to an inpatient retail pharmacy service process. Some of the tools used include service blueprints, cause-effect diagram, gap analysis derived from customer and employee surveys, mistake proofing was applied in various business situations and results were analyzed to identify and propose process improvements and integration. The research indicates that the Six Sigma tools in this discussion are very applicable and quite effective in helping to streamline and integrate the pharmacy process flow. Additionally, gap analysis derived from two different surveys was used to estimate the primary areas of focus to increase customer and employee satisfaction. The results of this analysis were useful in initiating discussions of how to effectively narrow these service gaps. This retail pharmaceutical service study serves as a framework for the process that should occur for successful process improvement tool evaluation and implementation. Pharmaceutical Service operations in the U.S. that use this integration framework must tailor it to their individual situations to maximize their chances for success.

  1. Integrated environmental decision support tool based on GIS technology

    International Nuclear Information System (INIS)

    Doctor, P.G.; O'Neil, T.K.; Sackschewsky, M.R.; Becker, J.M.; Rykiel, E.J.; Walters, T.B.; Brandt, C.A.; Hall, J.A.

    1995-01-01

    Environmental restoration and management decisions facing the US Department of Energy require balancing trade-offs between diverse land uses and impacts over multiple spatial and temporal scales. Many types of environmental data have been collected for the Hanford Site and the Columbia River in Washington State over the past fifty years. Pacific Northwest National Laboratory (PNNL) is integrating these data into a Geographic Information System (GIS) based computer decision support tool. This tool provides a comprehensive and concise description of the current environmental landscape that can be used to evaluate the ecological and monetary trade-offs between future land use, restoration and remediation options before action is taken. Ecological impacts evaluated include effects to individual species of concern and habitat loss and fragmentation. Monetary impacts include those associated with habitat mitigation. The tool is organized as both a browsing tool for educational purposes, and as a framework that leads a project manager through the steps needed to be in compliance with environmental requirements

  2. epsilon : A tool to find a canonical basis of master integrals

    Science.gov (United States)

    Prausa, Mario

    2017-10-01

    In 2013, Henn proposed a special basis for a certain class of master integrals, which are expressible in terms of iterated integrals. In this basis, the master integrals obey a differential equation, where the right hand side is proportional to ɛ in d = 4 - 2 ɛ space-time dimensions. An algorithmic approach to find such a basis was found by Lee. We present the tool epsilon, an efficient implementation of Lee's algorithm based on the Fermat computer algebra system as computational back end.
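
    In Henn's canonical basis, the statement that "the right hand side is proportional to ε" takes a compact form. The rendering below is an illustrative sketch; the notation (the vector of masters, the matrices A_k and singular points x_k) is assumed for exposition rather than quoted from the paper:

```latex
% Canonical (epsilon-form) differential equation for the master integrals:
\[
  \frac{\partial \vec{I}(x;\epsilon)}{\partial x}
  \;=\; \epsilon\, A(x)\, \vec{I}(x;\epsilon),
  \qquad
  A(x) \;=\; \sum_{k} \frac{A_k}{x - x_k},
\]
% with constant matrices A_k, so the solution is built from iterated
% integrals over d\log(x - x_k) kernels, order by order in epsilon.
```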

  3. Am I getting an accurate picture: a tool to assess clinical handover in remote settings?

    Directory of Open Access Journals (Sweden)

    Malcolm Moore

    2017-11-01

    Background: Good clinical handover is critical to safe medical care. Little research has investigated handover in rural settings. In a remote setting where nurses and medical students give telephone handover to an aeromedical retrieval service, we developed a tool by which the receiving clinician might assess the handover, and investigated factors impacting on the reliability and validity of that assessment. Methods: Researchers consulted with clinicians to develop an assessment tool, based on the ISBAR handover framework, combining validity evidence and the existing literature. The tool was applied 'live' by receiving clinicians and from recorded handovers by academic assessors. The tool's performance was analysed using generalisability theory. Receiving clinicians and assessors provided feedback. Results: Reliability for assessing a call was good (G = 0.73 with 4 assessments). The scale had a single factor structure with good internal consistency (Cronbach's alpha = 0.8). The group mean for the global score for nurses and students was 2.30 (SD 0.85) out of a maximum 3.0, with no difference between these sub-groups. Conclusions: We have developed and evaluated a tool to assess high-stakes handover in a remote setting. It showed good reliability and was easy for working clinicians to use. Further investigation and use is warranted beyond this setting.

  4. Simulation Tools and Techniques for Analyzing the Impacts of Photovoltaic System Integration

    Science.gov (United States)

    Hariri, Ali

    Solar photovoltaic (PV) energy integration in distribution networks is one of the fastest growing sectors of distributed energy integration. The growth in solar PV integration is incentivized by various clean power policies, global interest in solar energy, and reduction in manufacturing and installation costs of solar energy systems. The increase in solar PV integration has raised a number of concerns regarding the potential impacts that might arise as a result of high PV penetration. Some impacts have already been recorded in networks with high PV penetration such as in China, Germany, and the USA (Hawaii and California). Therefore, network planning is becoming more intricate as new technologies are integrated into the existing electric grid. The integrated new technologies pose certain compatibility concerns regarding the existing electric grid infrastructure. Therefore, PV integration impact studies are becoming more essential in order to have a better understanding of how to advance solar PV integration efforts without introducing adverse impacts into the network. PV impact studies are important for understanding the nature of the newly introduced phenomena. Understanding the nature of the potential impacts is a key factor for mitigating and accommodating for said impacts. Traditionally, electric power utilities relied on phasor-based power flow simulations for planning their electric networks. However, the conventional, commercially available, phasor-based simulation tools do not provide proper visibility across a wide spectrum of electric phenomena. Moreover, different types of simulation approaches are suitable for specific types of studies. For instance, power flow software cannot be used for studying time-varying phenomena. At the same time, it is not practical to use electromagnetic transient (EMT) tools to perform power flow solutions. Therefore, some electric phenomena caused by the variability of PV generation are not visible using conventional phasor-based simulation tools.

  5. West-Life, Tools for Integrative Structural Biology

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Structural biology is the part of molecular biology that focuses on determining the structure of macromolecules inside living cells and cell membranes. As macromolecules determine most of the functions of cells, this structural knowledge is very useful for further research in metabolism and physiology, through to applications in pharmacology, etc. As macromolecules are too small to be observed directly with a light microscope, other methods are used to determine their structure, including nuclear magnetic resonance (NMR), X-ray crystallography, cryo-electron microscopy and others. Each method has its advantages and disadvantages in terms of availability, sample preparation, and resolution. The West-Life project has the ambition to facilitate an integrative approach using the multiple techniques mentioned above. As there are already a lot of software tools to process the data produced by these techniques, the challenge is to integrate them in a way that they can be used by experts in one technique who are not experts in the others. One product ...

  6. On the Integration of Digital Design and Analysis Tools

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2006-01-01

    The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and if not, to look at what can be done to make them more available to architects and design...

  7. Integrated Reporting as a Tool for Communicating with Stakeholders - Advantages and Disadvantages

    Science.gov (United States)

    Matuszyk, Iwona; Rymkiewicz, Bartosz

    2018-03-01

    Financial and non-financial reporting has, from the beginning of its existence, been the primary means of communication between the company and a wide range of stakeholders. Over the decades it has adapted to the needs of a rapidly changing business and social environment. Currently, the latest link in the evolution of organizational reporting, integrated reporting, assumes integration and mutual connectivity of both financial and non-financial data. The main interest in the concept of integrated reporting comes from the value it contributes to the organization. Undoubtedly, the concept of integrated reporting is a milestone in the evolution of organizational reporting. It is, however, important to consider whether it adequately addresses the information needs of a wide range of stakeholders, and whether it is a universal tool for communication between the company and its stakeholders. The aim of the paper is to discuss the advantages and disadvantages of the concept of integrated reporting as a tool for communication with stakeholders, and to indicate further directions for its development. The article uses research methods such as literature analysis, content analysis of corporate publications, and comparative analysis.

  8. Instructor's Perceptions towards the Use of an Online Instructional Tool in an Academic English Setting in Kuwait

    Science.gov (United States)

    Erguvan, Deniz

    2014-01-01

    This study sets out to explore the faculty members' perceptions of a specific web-based instruction tool (Achieve3000) in a private higher education institute in Kuwait. The online tool provides highly differentiated instruction, which is initiated with a level set at the beginning of the term. The program is used in two consecutive courses as…

  9. Evaluating the Utility of Web-Based Consumer Support Tools Using Rough Sets

    Science.gov (United States)

    Maciag, Timothy; Hepting, Daryl H.; Slezak, Dominik; Hilderman, Robert J.

    On the Web, many popular e-commerce sites provide consumers with decision support tools to assist them in their commerce-related decision-making. Many consumers will rank the utility of these tools quite highly. Data obtained from web usage mining analyses, which may provide knowledge about a user's online experiences, could help indicate the utility of these tools. This type of analysis could provide insight into whether the provided tools are adequately assisting consumers in conducting their online shopping activities or whether new or additional enhancements need consideration. Although some research in this regard has been described in previous literature, there is still much that can be done. The authors of this paper hypothesize that a measurement of consumer decision accuracy, i.e. a measurement of how well decisions agree with stated preferences, could help indicate the utility of these tools. This paper describes a procedure developed towards this goal using elements of rough set theory. The authors evaluated the procedure using two support tools, one based on a tool developed by the US-EPA and the other, called cogito, developed by one of the authors. Results from the evaluation provided interesting insights into the utility of both support tools. Although the cogito tool obtained slightly higher decision accuracy, both tools could be improved with additional enhancements. Details of the procedure developed and results obtained from the evaluation are provided. Opportunities for future work are also discussed.
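
    As a sketch of the rough-set machinery such a procedure builds on (the attributes, objects and decision class below are invented, and this is not the authors' actual procedure), the quality of approximation of a decision class follows from its lower and upper approximations:

```python
# Rough-set lower/upper approximation of a decision class, as a toy sketch.
# Objects are grouped into indiscernibility classes by attribute values.
from collections import defaultdict

def indiscernibility_classes(objects, attributes):
    """Group object ids by their values on the chosen attributes."""
    groups = defaultdict(set)
    for oid, values in objects.items():
        groups[tuple(values[a] for a in attributes)].add(oid)
    return list(groups.values())

def approximations(classes, target):
    lower, upper = set(), set()
    for c in classes:
        if c <= target:       # class entirely inside the target set
            lower |= c
        if c & target:        # class overlaps the target set
            upper |= c
    return lower, upper

# Hypothetical consumers described by two attributes; `chose_best` marks
# those whose choice matched their stated preferences.
objects = {
    1: {"price_sensitive": 1, "eco_minded": 0},
    2: {"price_sensitive": 1, "eco_minded": 0},
    3: {"price_sensitive": 0, "eco_minded": 1},
    4: {"price_sensitive": 0, "eco_minded": 1},
}
chose_best = {1, 3, 4}

classes = indiscernibility_classes(objects, ["price_sensitive", "eco_minded"])
lower, upper = approximations(classes, chose_best)
print(len(lower) / len(objects))   # quality of approximation (gamma) -> 0.5
```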

  10. DR-Integrator: a new analytic tool for integrating DNA copy number and gene expression data.

    Science.gov (United States)

    Salari, Keyan; Tibshirani, Robert; Pollack, Jonathan R

    2010-02-01

    DNA copy number alterations (CNA) frequently underlie gene expression changes by increasing or decreasing gene dosage. However, only a subset of genes with altered dosage exhibit concordant changes in gene expression. This subset is likely to be enriched for oncogenes and tumor suppressor genes, and can be identified by integrating these two layers of genome-scale data. We introduce DNA/RNA-Integrator (DR-Integrator), a statistical software tool to perform integrative analyses on paired DNA copy number and gene expression data. DR-Integrator identifies genes with significant correlations between DNA copy number and gene expression, and implements a supervised analysis that captures genes with significant alterations in both DNA copy number and gene expression between two sample classes. DR-Integrator is freely available for non-commercial use from the Pollack Lab at http://pollacklab.stanford.edu/ and can be downloaded as a plug-in application to Microsoft Excel and as a package for the R statistical computing environment. The R package is available under the name 'DRI' at http://cran.r-project.org/. An example analysis using DR-Integrator is included as supplemental material. Supplementary data are available at Bioinformatics online.
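
    The integrative correlation step at the heart of such an analysis can be sketched in a few lines. This is a toy illustration of the underlying computation on synthetic data, not the DRI package's API:

```python
# Per-gene Pearson correlation between paired copy-number and expression
# profiles measured on the same samples - a toy version of the
# integrative step that flags dosage-driven genes.
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_samples = 100, 20
copy_number = rng.normal(0.0, 0.5, size=(n_genes, n_samples))
# Make expression partly dosage-driven so some genes correlate strongly:
expression = 0.8 * copy_number + rng.normal(0.0, 0.5, size=(n_genes, n_samples))

def rowwise_pearson(a, b):
    a = a - a.mean(axis=1, keepdims=True)
    b = b - b.mean(axis=1, keepdims=True)
    num = (a * b).sum(axis=1)
    den = np.sqrt((a**2).sum(axis=1) * (b**2).sum(axis=1))
    return num / den

r = rowwise_pearson(copy_number, expression)
candidates = np.argsort(-r)[:10]   # genes whose expression tracks dosage
print(candidates, r[candidates])
```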

  11. Using registries to integrate bioinformatics tools and services into workbench environments

    DEFF Research Database (Denmark)

    Ménager, Hervé; Kalaš, Matúš; Rapacki, Kristoffer

    2016-01-01

    The diversity and complexity of bioinformatics resources present significant challenges to their localisation, deployment and use, creating a need for reliable systems that address these issues. Meanwhile, users demand increasingly usable and integrated ways to access and analyse data, especially…, a software component that will ease the integration of bioinformatics resources in a workbench environment, using their description provided by the existing ELIXIR Tools and Data Services Registry.

  12. Identification and Management of Eating Disorders in Integrated Primary Care: Recommendations for Psychologists in Integrated Care Settings.

    Science.gov (United States)

    Buchholz, Laura J; King, Paul R; Wray, Laura O

    2017-06-01

    Eating disorders are associated with deleterious health consequences, increased risk of mortality, and psychosocial impairment. Although individuals with eating disorders are likely to seek treatment in general medical settings such as primary care (PC), these conditions are often under-detected by PC providers. However, psychologists in integrated PC settings are likely to see patients with eating disorders because of the mental health comorbidities associated with these conditions. Further, due to their training in identifying risk factors associated with eating disorders (i.e., comorbid mental health and medical disorders) and opportunities for collaboration with PC providers, psychologists are well-positioned to improve the detection and management of eating disorders in PC. This paper provides a brief overview of eating disorders and practical guidance for psychologists working in integrated PC settings to facilitate the identification and management of these conditions.

  13. SEPHYDRO: An Integrated Multi-Filter Web-Based Tool for Baseflow Separation

    Science.gov (United States)

    Serban, D.; MacQuarrie, K. T. B.; Popa, A.

    2017-12-01

    Knowledge of baseflow contributions to streamflow is important for understanding watershed scale hydrology, including groundwater-surface water interactions, impact of geology and landforms on baseflow, estimation of groundwater recharge rates, etc. Baseflow (or hydrograph) separation methods can be used as supporting tools in many areas of environmental research, such as the assessment of the impact of agricultural practices, urbanization and climate change on surface water and groundwater. Over the past few decades various digital filtering and graphically-based methods have been developed in an attempt to improve the assessment of the dynamics of the various sources of streamflow (e.g. groundwater, surface runoff, subsurface flow); however, these methods are not available under an integrated platform and, individually, often require significant effort for implementation. Here we introduce SEPHYDRO, an open access, customizable web-based tool, which integrates 11 algorithms allowing for separation of streamflow hydrographs. The streamlined interface incorporates a reference guide as well as additional information that allows users to import their own data, customize the algorithms, and compare, visualise and export results. The tool includes one-, two- and three-parameter digital filters as well as graphical separation methods and has been successfully applied in Atlantic Canada, in studies dealing with nutrient loading to fresh water and coastal water ecosystems. Future developments include integration of additional separation algorithms as well as incorporation of geochemical separation methods. SEPHYDRO has been developed through a collaborative research effort between the Canadian Rivers Institute, University of New Brunswick (Fredericton, New Brunswick, Canada), Agriculture and Agri-Food Canada and Environment and Climate Change Canada and is currently available at http://canadianriversinstitute.com/tool/
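
    Among the filters mentioned, the classic one-parameter recursive digital filter (commonly attributed to Lyne and Hollick) is simple to sketch. The code below is a generic illustration rather than SEPHYDRO's own implementation, and alpha = 0.925 is only a customary default:

```python
# One-parameter recursive digital filter for baseflow separation.
# q: streamflow series; alpha: filter parameter (0.9-0.98 is typical).
def baseflow_lyne_hollick(q, alpha=0.925):
    quick = [q[0] * 0.5]                 # quickflow initialisation (assumed)
    for t in range(1, len(q)):
        f = alpha * quick[-1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        quick.append(max(f, 0.0))        # quickflow cannot be negative
    # Baseflow is the remainder, constrained to the range [0, q]:
    return [min(max(q[t] - quick[t], 0.0), q[t]) for t in range(len(q))]

streamflow = [5.0, 5.2, 9.0, 14.0, 11.0, 8.0, 6.5, 5.8, 5.4, 5.2]
print(baseflow_lyne_hollick(streamflow))
```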

  14. A set of tools for determining the LAT performance in specific applications

    International Nuclear Information System (INIS)

    Lott, B.; Ballet, J.; Chiang, J.; Lonjou, V.; Funk, S.

    2007-01-01

    The poster presents a set of simple tools being developed to predict GLAST's performance for specific cases, like the accumulation time needed to reach a given significance or statistical accuracy for a particular source. Different examples are given, like the generation of a full-sky sensitivity map.

  15. Follow up: Compound data sets and software tools for chemoinformatics and medicinal chemistry applications: update and data transfer

    Science.gov (United States)

    Hu, Ye; Bajorath, Jürgen

    2014-01-01

    In 2012, we reported 30 compound data sets and/or programs developed in our laboratory in a data article and made them freely available to the scientific community to support chemoinformatics and computational medicinal chemistry applications. These data sets and computational tools were provided for download from our website. Since publication of this data article, we have generated 13 new data sets with which we further extend our collection of publicly available data and tools. Due to changes in web servers and website architectures, data accessibility has recently been limited at times. Therefore, we have also transferred our data sets and tools to a public repository to ensure full and stable accessibility. To aid in data selection, we have classified the data sets according to scientific subject areas. Herein, we describe new data sets, introduce the data organization scheme, summarize the database content and provide detailed access information in ZENODO (doi:10.5281/zenodo.8451 and doi:10.5281/zenodo.8455). PMID:25520777

  16. Older adult mistreatment risk screening: contribution to the validation of a screening tool in a domestic setting.

    Science.gov (United States)

    Lindenbach, Jeannette M; Larocque, Sylvie; Lavoie, Anne-Marise; Garceau, Marie-Luce

    2012-06-01

    The hidden nature of older adult mistreatment renders its detection in the domestic setting particularly challenging. A validated screening instrument that can provide a systematic assessment of risk factors can facilitate this detection. One such instrument, the "expanded Indicators of Abuse" tool, has been previously validated in the Hebrew language in a hospital setting. The present study has contributed to the validation of the "e-IOA" in an English-speaking community setting in Ontario, Canada. It consisted of two phases: (a) a content validity review and adaptation of the instrument by experts throughout Ontario, and (b) an inter-rater reliability assessment by home visiting nurses. The adaptation, the "Mistreatment of Older Adult Risk Factors" tool, offers a comprehensive tool for screening in the home setting. This instrument is significant to professional practice as practitioners working with older adults will be better equipped to assess for risk of mistreatment.

  17. Integration of distributed system simulation tools for a holistic approach to integrated building and system design

    NARCIS (Netherlands)

    Radosevic, M.; Hensen, J.L.M.; Wijsman, A.J.T.M.; Hensen, J.L.M.; Lain, M.

    2004-01-01

    Advanced architectural developments require an integrated approach to design, where the simulation tools available today deal only with a small subset of the overall problem. The aim of this study is to enable run-time exchange of the necessary data, at a suitable frequency, between different simulation tools.

  18. Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation Electricore, Inc.

    Energy Technology Data Exchange (ETDEWEB)

    Daye, Tony [Green Power Labs (GPL), San Diego, CA (United States)

    2013-09-30

    This project enables utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. The toolset focused on real-time simulation of distributed power generation within utility grids, with the emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing load and generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers, but extends across many core operations. The benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate, direct operational cost savings for energy marketing (day-ahead generation commitments), real-time operations, load forecasting (at an aggregate system level for day-ahead), demand response, long-term planning (asset management), distribution operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  19. Hypogeal geological survey in the "Grotta del Re Tiberio" natural cave (Apennines, Italy): a valid tool for reconstructing the structural setting

    Science.gov (United States)

    Ghiselli, Alice; Merazzi, Marzio; Strini, Andrea; Margutti, Roberto; Mercuriali, Michele

    2011-06-01

    As karst systems are natural windows to the underground, speleology, combined with geological surveys, can be a useful tool for helping understand the geological evolution of karst areas. In order to enhance the reconstruction of the structural setting in a gypsum karst area (Vena del Gesso, Romagna Apennines), a detailed analysis has been carried out on hypogeal data. Structural features (faults, fractures, tectonic foliations, bedding) have been mapped in the "Grotta del Re Tiberio" cave, in the nearby gypsum quarry tunnels and open pit benches. Five fracture systems and six fault systems have been identified. The fault systems have been further analyzed through stereographic projections and geometric-kinematic evaluations in order to reconstruct the relative chronology of these structures. This analysis led to the detection of two deformation phases. The results permitted linking of the hypogeal data with the surface data at both the local and regional scale. At the local scale, fracture data collected underground have been compared with previous authors' surface data coming from the quarry area. The two data sets show a very good correspondence, as every underground fracture system matches one of the surface fracture systems. Moreover, in the cave, a larger number of fractures belonging to each system could be mapped. At the regional scale, the two deformation phases detected can be integrated in the structural setting of the study area, thereby enhancing the tectonic interpretation of the area (e.g., structures belonging to a new deformation phase, not reported before, have been identified underground). The detailed structural hypogeal survey has thus provided very useful data, both by integrating the existing information and by revealing new data not detected at the surface. In particular, some small structures (e.g., displacement markers and short fractures) are better preserved in the hypogeal environment than on the surface where the outcropping

  20. Set-valued and fuzzy stochastic integral equations driven by semimartingales under Osgood condition

    Directory of Open Access Journals (Sweden)

    Malinowski Marek T.

    2015-01-01

    We analyze set-valued stochastic integral equations driven by continuous semimartingales and prove the existence and uniqueness of solutions to such equations in the framework of the hyperspace of nonempty, bounded, convex and closed subsets of the Hilbert space L2 (consisting of square integrable random vectors). The coefficients of the equations are assumed to satisfy an Osgood-type condition that is a generalization of the Lipschitz condition. Continuous dependence of solutions with respect to the data of the equation is also presented. We consider equations driven by a semimartingale Z and equations driven by the processes A, M from the decomposition of Z, where A is a process of finite variation and M is a local martingale. These equations are not equivalent. Finally, we show that the analysis of set-valued stochastic integral equations can be extended to the case of fuzzy stochastic integral equations driven by semimartingales under an Osgood-type condition. To obtain our results we use set-valued and fuzzy Maruyama-type approximations and Bihari's inequality.
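
    In symbols, the equations studied have the following shape. This rendering is an illustrative sketch (the set-valued sum notation and the use of the Hausdorff metric d_H are assumed for exposition), not a formula quoted from the paper:

```latex
% Set-valued stochastic integral equation driven by a semimartingale
% Z = A + M, with A of finite variation and M a local martingale:
\[
  X_t \;=\; X_0 \;\oplus\; \int_0^t f(s, X_s)\, \mathrm{d}A_s
        \;\oplus\; \int_0^t g(s, X_s)\, \mathrm{d}M_s ,
\]
% and an Osgood-type condition generalising Lipschitz continuity:
\[
  d_H\big(f(t,x),\, f(t,y)\big) \;\le\; \rho\big(d_H(x,y)\big),
  \qquad \int_{0^+} \frac{\mathrm{d}u}{\rho(u)} \;=\; \infty ,
\]
% where d_H is the Hausdorff distance on the hyperspace of sets.
```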

  1. Tools of integration of innovation-oriented machine-building enterprises in industrial park environment

    Directory of Open Access Journals (Sweden)

    К.О. Boiarynova

    2017-08-01

    The research is devoted to the development of tools for the integration of innovation-oriented mechanical engineering enterprises, as functional economic systems, into the environment of an industrial park, such that they can drive the development of resident enterprises on the basis of their own development. The article analyzes the opportunities for the development of mechanical engineering enterprises. The proposed mechanism for integrating mechanical engineering enterprises as functional economic systems into the industrial park environment is based on: (1) the development of programs for participation in the industrial park by the mechanical engineering enterprise as an innovation-oriented partner, covering both the development of the enterprise itself and that of the other residents; (2) the provision of high-tech equipment to resident enterprises of industrial parks; and (3) the creation of subsidiary spin-out enterprises of large mechanical engineering enterprises for high-tech production in the industrial park. The author proposes a road map that sets out the procedures for integration and functioning, through interaction both in the ecosystem of the industrial park and in the general ecosystem, and tools for providing economic functionality through economic and organizational measures at the preventive, partner and resident phases of integration. These tools allow innovation-oriented mechanical engineering enterprises to integrate into territorial structures such as industrial parks, which, taken together, allows them to fulfil their purpose in the development of the real sector of the economy.

  2. Automated prototyping tool-kit (APT)

    OpenAIRE

    Nada, Nader; Shing, M.; Berzins, V.; Luqi

    2002-01-01

    Automated prototyping tool-kit (APT) is an integrated set of software tools that generate source programs directly from real-time requirements. The APT system uses a fifth-generation prototyping language to model the communication structure, timing constraints, I/O control, and data buffering that comprise the requirements for an embedded software system. The language supports the specification of hard real-time systems with reusable components from domain specific component libraries. APT ha...

  3. Biological data integration: wrapping data and tools.

    Science.gov (United States)

    Lacroix, Zoé

    2002-06-01

    Nowadays scientific data is inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced data accessing, analyzing, and visualization tools. Building a digital library for scientific data requires accessing and manipulating data extracted from flat files or databases, documents retrieved from the Web, as well as data generated by software. We present an approach to wrapping web data sources, databases, flat files, or data generated by tools through a database view mechanism. Generally, a wrapper has two tasks: it first sends a query to the source to retrieve data and, second, builds the expected output with respect to the virtual structure. Our wrappers are composed of a retrieval component based on an intermediate object view mechanism called search views, mapping the source capabilities to attributes, and an eXtensible Markup Language (XML) engine, respectively, to perform these two tasks. The originality of the approach consists of: 1) a generic view mechanism to seamlessly access data sources with limited capabilities and 2) the ability to wrap data sources as well as the useful specific tools they may provide. Our approach has been developed and demonstrated as part of the multidatabase system supporting queries via uniform object protocol model (OPM) interfaces.
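
    The two wrapper tasks described (retrieve from the source, then build output matching a virtual structure) can be sketched as follows. The class, the tab-separated file layout and the XML shape are hypothetical, not the paper's search-view API:

```python
# Minimal wrapper sketch: fetch records from a source, then emit XML that
# matches a virtual structure expected by an integration layer.
import urllib.request
import xml.etree.ElementTree as ET

class FlatFileWrapper:
    """Wraps a tab-separated source behind a retrieve/build interface."""
    def __init__(self, url, fields):
        self.url = url
        self.fields = fields          # virtual attributes, e.g. ["id", "name"]

    def retrieve(self):
        """Task 1: query the source and yield records as dicts."""
        with urllib.request.urlopen(self.url) as resp:
            for line in resp.read().decode().splitlines():
                yield dict(zip(self.fields, line.split("\t")))

    def build(self, records):
        """Task 2: build XML output conforming to the virtual structure."""
        root = ET.Element("records")
        for rec in records:
            node = ET.SubElement(root, "record")
            for key, value in rec.items():
                ET.SubElement(node, key).text = value
        return ET.tostring(root, encoding="unicode")

# Usage (hypothetical URL, so left commented out):
# wrapper = FlatFileWrapper("https://example.org/genes.tsv", ["id", "name"])
# print(wrapper.build(wrapper.retrieve()))
```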

  4. Processing: A Python Framework for the Seamless Integration of Geoprocessing Tools in QGIS

    Directory of Open Access Journals (Sweden)

    Anita Graser

    2015-10-01

    Processing is an object-oriented Python framework for the popular open source Geographic Information System QGIS, which provides a seamless integration of geoprocessing tools from a variety of different software libraries. In this paper, we present the development history, software architecture and features of the Processing framework, which make it a versatile tool for the development of geoprocessing algorithms and workflows, as well as an efficient integration platform for algorithms from different sources. Using real-world application examples, we furthermore illustrate how the Processing architecture enables typical geoprocessing use cases in research and development, such as automating and documenting workflows, combining algorithms from different software libraries, as well as developing and integrating custom algorithms. Finally, we discuss how Processing can facilitate reproducible research and provide an outlook towards future development goals.
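
    For a flavour of what such integration looks like in practice, the sketch below chains two Processing algorithms from the QGIS Python console. The algorithm and parameter names follow QGIS 3 conventions but should be checked against the installed version, and the file paths are invented:

```python
# Inside the QGIS Python console: chain two Processing algorithms so a
# buffer feeds a clip - the kind of workflow the framework automates.
import processing

buffered = processing.run(
    "native:buffer",
    {
        "INPUT": "/data/roads.shp",      # hypothetical input layer
        "DISTANCE": 100.0,               # in the layer's CRS units
        "SEGMENTS": 8,
        "DISSOLVE": True,
        "OUTPUT": "memory:",             # keep intermediate result in memory
    },
)["OUTPUT"]

clipped = processing.run(
    "native:clip",
    {
        "INPUT": buffered,
        "OVERLAY": "/data/district.shp",  # hypothetical mask layer
        "OUTPUT": "/data/roads_buffer_clipped.shp",
    },
)["OUTPUT"]
```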

  5. Improving beam set-up using an online beam optics tool

    International Nuclear Information System (INIS)

    Richter, S.; Barth, W.; Franczak, B.; Scheeler, U.; Wilms, D.

    2004-01-01

    The GSI accelerator facility [1] consists of the Universal Linear Accelerator (Unilac), the heavy ion synchrotron SIS, and the Experimental Storage Ring (ESR). Two Unilac injectors with three ion source terminals provide ion species from the lightest, such as hydrogen, up to uranium. The High Current Injector (HSI) for low charge state ion beams mostly provides highly intense but short pulses, whereas the High Charge State Injector (HLI) supplies long pulses with a high duty factor of up to 27%. Before entering the Alvarez section of the Unilac, the ion beam from the HSI is stripped in a supersonic gas jet. Up to three different ion species can be accelerated for up to five experiments in a time-sharing mode. Frequent changes of beam energy and intensity during a single beam time period may result in time-consuming set-up and tuning, especially of the beam transport lines. To shorten these changeover times, an online optics tool (MIRKO EXPERT) has been developed. Based on online emittance measurements at well-defined locations, the beam envelopes are calculated using the actual magnet settings. With this input, improved calculated magnet settings can be sent directly to the magnet power supplies. The program reads profile grid measurements, so that an automated beam alignment is established and steering times are minimized. Experience with this tool is reported. At the Unilac a special focus is put on high current operation with short but intense beam pulses. Limitations like missing non-destructive beam diagnostics, insufficient longitudinal beam diagnostics, insufficient longitudinal beam matching, and the influence of the hard-edge model for magnetic fields are discussed. Special attention is paid to the limits due to high current effects with bunched beams. (author)
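
    Envelope calculations of the kind MIRKO performs rest on linear transfer matrices. Below is a minimal one-plane sketch with a thin-lens quadrupole and drifts; the beam parameters and element settings are invented for illustration, not GSI settings:

```python
# Sigma-matrix transport through beamline elements:
# sigma_out = M @ sigma_in @ M.T, and the envelope is sqrt(sigma[0, 0]).
import numpy as np

def drift(length):
    return np.array([[1.0, length],
                     [0.0, 1.0]])

def thin_quad(focal_length):
    return np.array([[1.0, 0.0],
                     [-1.0 / focal_length, 1.0]])

# Initial beam matrix from an emittance measurement (values invented):
emittance = 5e-6                       # m rad
beta, alpha = 10.0, -1.5               # Twiss parameters
sigma = emittance * np.array([[beta, -alpha],
                              [-alpha, (1 + alpha**2) / beta]])

# Transport line: 2 m drift, focusing quad (f = 3 m), 1.5 m drift
for m in (drift(2.0), thin_quad(3.0), drift(1.5)):
    sigma = m @ sigma @ m.T
    print("envelope [mm]:", 1e3 * np.sqrt(sigma[0, 0]))
```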

  6. setsApp: Set operations for Cytoscape Nodes and Edges [v1; ref status: indexed, http://f1000r.es/3ml]

    Directory of Open Access Journals (Sweden)

    John H. Morris

    2014-07-01

    setsApp (http://apps.cytoscape.org/apps/setsapp) is a relatively simple Cytoscape 3 app for users to handle groups of nodes and/or edges. It supports several important biological workflows and enables various set operations. setsApp provides basic tools to create sets of nodes or edges, import or export sets, and perform standard set operations (union, difference, intersection) on those sets. The sets functionality is also exposed to users and app developers in the form of a set of commands that can be used for scripting purposes or integrated in other Cytoscape apps.
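
    The operations exposed to users are ordinary set algebra over node or edge identifiers. A toy sketch with invented gene names (this is plain Python, not the app's command API):

```python
# Standard set operations over node identifiers, of the kind setsApp
# exposes for groups of Cytoscape nodes or edges.
upregulated = {"TP53", "MYC", "EGFR", "CDK1"}
in_pathway  = {"EGFR", "CDK1", "BRCA1"}

print(upregulated | in_pathway)   # union
print(upregulated & in_pathway)   # intersection
print(upregulated - in_pathway)   # difference
```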

  7. Mathematical tools for data mining set theory, partial orders, combinatorics

    CERN Document Server

    Simovici, Dan A

    2014-01-01

    Data mining essentially relies on several mathematical disciplines, many of which are presented in this second edition of the book. Topics include partially ordered sets, combinatorics, general topology, metric spaces, linear spaces, and graph theory. To motivate the reader, a significant number of applications of these mathematical tools are included, ranging from association rules and clustering algorithms to classification, data constraints, and logical data analysis. The book is intended as a reference for researchers and graduate students. The current edition is a significant expansion of the first edition.

  8. The Webinar Integration Tool: A Framework for Promoting Active Learning in Blended Environments

    Science.gov (United States)

    Lieser, Ping; Taf, Steven D.; Murphy-Hagan, Anne

    2018-01-01

    This paper describes a three-stage process of developing a webinar integration tool to enhance the interaction of teaching and learning in blended environments. In the context of medical education, we emphasize three factors of effective webinar integration in blended learning: fostering better solutions for faculty and students to interact…

  9. Using Plickers as an Assessment Tool in Health and Physical Education Settings

    Science.gov (United States)

    Chng, Lena; Gurvitch, Rachel

    2018-01-01

    Written tests are one of the most common assessment tools classroom teachers use today. Despite its popularity, administering written tests or surveys, especially in health and physical education settings, is time consuming. In addition to the time taken to type and print out the tests or surveys, health and physical education teachers must grade…

  10. Application of fuzzy set theory for integral assessment of agricultural products quality

    Science.gov (United States)

    Derkanosova, N. M.; Ponomareva, I. N.; Shurshikova, G. V.; Vasilenko, O. A.

    2018-05-01

    The methodology of integrated assessment of the quality and safety of agricultural products was developed and tested on indicators of wheat grain related to the consumer properties of bakery products. Determining the quality level of raw ingredients allows agricultural raw materials to be directed to food production with regard to the technology in use and the types of products, and thus supports rational use of the resource potential of the agricultural sector. The mathematical tool of the proposed method is fuzzy set theory. A fuzzy classifier to evaluate the properties of the grain is formed: a set of six indicators normalized by the national standard is determined; the values are ordered and represented by linguistic variables with a trapeziform membership function; and the rules for calculating the membership functions are presented. Specific criteria values for individual indicators in shaping the quality of the finished products are considered. For one of the samples of wheat grain, the values of the membership functions of the linguistic variable "level" for all indicators and of the linguistic variable "level of quality" were calculated. It is established that the studied sample of grain attains quality level 2 (average). Accordingly, it can be recommended for the production of bakery products with higher requirements for structural-mechanical properties (such as puff pastry and hearth bread) and of flour confectionery products of the hard-dough cookie and cracker group.
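
    For illustration, a trapeziform membership function of the kind used for these linguistic variables can be sketched as follows; the breakpoints and the "protein content" indicator are invented, not taken from the study:

```python
# Trapezoidal membership function mu(x) with breakpoints a <= b <= c <= d:
# 0 outside [a, d], 1 on [b, c], linear on the two shoulders.
def trapezoid(x, a, b, c, d):
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical "average protein content" term for wheat grain (percent):
for protein in (9.0, 10.5, 12.0, 13.5, 15.0):
    print(protein, trapezoid(protein, 9.5, 11.0, 13.0, 14.5))
```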

  11. Zebrafish Expression Ontology of Gene Sets (ZEOGS): A Tool to Analyze Enrichment of Zebrafish Anatomical Terms in Large Gene Sets

    Science.gov (United States)

    Marsico, Annalisa

    2013-01-01

    The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of the spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, it uses the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas only very few enriched terms to none were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene expression

  12. Zebrafish Expression Ontology of Gene Sets (ZEOGS): a tool to analyze enrichment of zebrafish anatomical terms in large gene sets.

    Science.gov (United States)

    Prykhozhij, Sergey V; Marsico, Annalisa; Meijsing, Sebastiaan H

    2013-09-01

    The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of the spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, it uses the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas only very few enriched terms to none were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene expression
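
    Term-enrichment analyses of this kind are commonly computed with a hypergeometric (one-sided Fisher) test. The sketch below is a generic illustration with invented counts, not ZEOGS's actual implementation:

```python
# Hypergeometric enrichment: is an anatomical term over-represented in the
# input gene set relative to the background of all annotated genes?
from math import comb

def hypergeom_pval(k, K, n, N):
    """P(X >= k) when n genes are drawn from N, of which K carry the term."""
    return sum(
        comb(K, i) * comb(N - K, n - i)
        for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

N = 20000   # annotated zebrafish genes (invented)
K = 300     # genes annotated to "pectoral fin" (invented)
n = 150     # genes in the input set
k = 12      # input genes annotated to the term

print(hypergeom_pval(k, K, n, N))   # small p-value => term enriched
```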

  13. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  14. Engineering a mobile health tool for resource-poor settings to assess and manage cardiovascular disease risk: SMARThealth study.

    Science.gov (United States)

    Raghu, Arvind; Praveen, Devarsetty; Peiris, David; Tarassenko, Lionel; Clifford, Gari

    2015-04-29

    The incidence of chronic diseases in low- and middle-income countries is rapidly increasing, both in urban and rural regions. A major challenge for health systems globally is to develop innovative solutions for the prevention and control of these diseases. This paper discusses the development and pilot testing of SMARTHealth, a mobile-based, point-of-care Clinical Decision Support (CDS) tool to assess and manage cardiovascular disease (CVD) risk in resource-constrained settings. Through pilot testing, the preliminary acceptability, utility, and efficiency of the CDS tool were assessed. The CDS tool was part of an mHealth system comprising a mobile application that consisted of an evidence-based risk prediction and management algorithm, and a server-side electronic medical record system. Through an agile development process and a user-centred design approach, key features of the mobile application that fitted the requirements of the end users and environment were identified. A comprehensive analytics framework facilitated a data-driven approach to investigate four areas, namely, system efficiency, end-user variability, manual data entry errors, and usefulness of point-of-care management recommendations to the healthcare worker. A four-point Likert scale was used at the end of every risk assessment to gauge ease-of-use of the system. The system was field-tested with eleven village healthcare workers and three Primary Health Centre doctors, who screened a total of 292 adults aged 40 years and above. 34% of participants screened by health workers were identified by the CDS tool to be at high CVD risk and referred to a doctor. In-depth analysis of user interactions found the CDS tool feasible for use and easily integrable into the workflow of healthcare workers. Following completion of the pilot, further technical enhancements were implemented to improve uptake of the mHealth platform. It will then be evaluated for effectiveness and cost-effectiveness in a cluster randomized controlled trial.
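
    At its core, a point-of-care CDS rule of this kind maps an assessed risk level to an action. The sketch below is deliberately simplified, with invented scoring and thresholds, and is not the SMARTHealth risk algorithm (which is built on validated clinical risk equations):

```python
# Toy decision-support rule: score a screening record, refer if high risk.
# The scoring here is invented for illustration; real tools use validated
# equations and clinical oversight.
def cvd_risk_score(age, systolic_bp, smoker, diabetic):
    score = 0
    score += 2 if age >= 60 else 1 if age >= 50 else 0
    score += 2 if systolic_bp >= 160 else 1 if systolic_bp >= 140 else 0
    score += 1 if smoker else 0
    score += 1 if diabetic else 0
    return score

def recommendation(score, high_risk_cutoff=4):
    return "refer to doctor" if score >= high_risk_cutoff else "lifestyle advice"

patient = dict(age=63, systolic_bp=158, smoker=True, diabetic=False)
print(recommendation(cvd_risk_score(**patient)))
```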

  15. Workplace wellness using online learning tools in a healthcare setting.

    Science.gov (United States)

    Blake, Holly; Gartshore, Emily

    2016-09-01

    The aim was to develop and evaluate an online learning tool for use with UK healthcare employees, healthcare educators and healthcare students, to increase knowledge of workplace wellness as an important public health issue. A 'Workplace Wellness' e-learning tool was developed and peer-reviewed by 14 topic experts. This focused on six key areas relating to workplace wellness: work-related stress, musculoskeletal disorders, diet and nutrition, physical activity, smoking and alcohol consumption. Each key area provided current evidence-based information on causes and consequences, access to UK government reports and national statistics, and guidance on actions that could be taken to improve health within a workplace setting. 188 users (93.1% female, age 18-60) completed online knowledge questionnaires before (n = 188) and after (n = 88) exposure to the online learning tool. Baseline knowledge of workplace wellness was poor (n = 188; mean accuracy 47.6%, s.d. 11.94). Knowledge significantly improved from baseline to post-intervention (mean accuracy = 77.5%, s.d. 13.71; t(75) = -14.801, p < 0.001). The findings support the use of online learning, indicating scope for development of further online packages relating to other important health parameters. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Practical use of the integrated reporting framework – an analysis of the content of integrated reports of selected companies

    Directory of Open Access Journals (Sweden)

    Monika Raulinajtys-Grzybek

    2017-09-01

    The purpose of the article is to provide a research tool for an initial assessment of whether a company's integrated reports meet the objectives set out in the IIRC Integrated Reporting Framework, together with its empirical verification. In particular, the research addresses whether the reports meet the goal of improving the quality of the information available and cover all factors that influence the organization's ability to create value. The article uses the theoretical output on the principles of preparing integrated reports and analyzes the content of selected integrated reports. Based on the source analysis, a research tool has been developed for an initial assessment of whether an integrated report fulfills its objectives. It consists of 42 questions that verify the coverage of the defined elements and the implementation of the guiding principles set by the IIRC. For empirical verification of the tool, a comparative analysis was carried out for reports prepared by selected companies operating in the utilities sector. Answering the questions from the research tool allows a researcher to formulate conclusions about the implementation of the guiding principles and the completeness of the presentation of the content elements. As a result of the analysis of selected integrated reports, it was found that the various elements of the report are presented with different levels of accuracy in different reports. Reports provide the most complete information on performance and strategy. The information about the business model and prospective data is in some cases presented without links to other parts of the report, e.g. risks and opportunities, financial data or capitals. The absence of such links limits the ability to claim that an integrated report meets its objectives, since a set of individual reports, each presenting

  17. Modelling Machine Tools using Structure Integrated Sensors for Fast Calibration

    Directory of Open Access Journals (Sweden)

    Benjamin Montavon

    2018-02-01

    Monitoring of the relative deviation between commanded and actual tool tip position, which limits the volumetric performance of the machine tool, enables the use of contemporary methods of compensation to reduce tolerance mismatch and the uncertainties of on-machine measurements. The development of a primarily optical sensor setup capable of being integrated into the machine structure without limiting its operating range is presented. The use of a frequency-modulating interferometer and photosensitive arrays in combination with a Gaussian laser beam allows fast and automated online measurements of the axes' motion errors and thermal conditions, with accuracy comparable to, and lower cost and smaller dimensions than, state-of-the-art optical measuring instruments for offline machine tool calibration. The development is tested through simulation of the sensor setup based on raytracing and Monte-Carlo techniques.
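
    The raytracing and Monte-Carlo simulation mentioned above lends itself to a compact illustration. The sketch below is not the authors' code: the beam, array, and noise parameters are invented, and it only reproduces the general idea of propagating sensor noise into the detected spot position on a photosensitive array.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def centroid_error_mc(n_trials=2000, n_pix=64, pitch_um=10.0,
                          beam_waist_um=120.0, noise_rms=0.01):
        """Monte-Carlo estimate of the centroid-detection error (um) for a
        Gaussian spot on a square photosensitive array with additive noise."""
        coords = (np.arange(n_pix) - n_pix / 2) * pitch_um
        X, Y = np.meshgrid(coords, coords)
        errors = []
        for _ in range(n_trials):
            x0, y0 = rng.uniform(-pitch_um, pitch_um, size=2)  # true spot center
            img = np.exp(-2 * ((X - x0) ** 2 + (Y - y0) ** 2) / beam_waist_um ** 2)
            img += rng.normal(0.0, noise_rms, img.shape)       # sensor noise
            img = np.clip(img, 0.0, None)
            total = img.sum()
            xc, yc = (img * X).sum() / total, (img * Y).sum() / total
            errors.append(np.hypot(xc - x0, yc - y0))
        return np.mean(errors), np.std(errors)

    print("centroid error, mean/std (um): %.3f / %.3f" % centroid_error_mc())
    ```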

  18. Validation of the TRUST tool in a Greek perioperative setting.

    Science.gov (United States)

    Chatzea, Vasiliki-Eirini; Sifaki-Pistolla, Dimitra; Dey, Nilanjan; Melidoniotis, Evangelos

    2017-06-01

    The aim of this study was to translate, culturally adapt and validate the TRUST questionnaire in a Greek perioperative setting. The TRUST questionnaire assesses the relationship between trust and performance. The study assessed the levels of trust and performance in the surgery and anaesthesiology department during a very stressful period for Greece (the economic crisis) and offered a user-friendly and robust assessment tool. The study concludes that the Greek version of the TRUST questionnaire is a reliable and valid instrument for measuring team performance among Greek perioperative teams. Copyright the Association for Perioperative Practice.

  19. Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.

    Science.gov (United States)

    Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A

    2013-01-01

    Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.

  20. Laughter Filled the Classroom: Outcomes of Professional Development in Arts Integration for Elementary Teachers in Inclusion Settings

    Science.gov (United States)

    Koch, Katherine A.; Thompson, Janna Chevon

    2017-01-01

    This qualitative study examined teachers' experiences with an arts integration curriculum. This study considered the teachers' perceptions of arts integrations before and after being introduced to the concepts of arts integration. The teachers were provided with knowledge and tools to integrate the arts into general education curriculum and…

  1. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

    The United States National Science Foundation funded PermaData project led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P) aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly speed up the time of manual processing needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage this data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets. Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative data manipulation and quality control process: visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open source frameworks for the workflow core and integration with a javascript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open source tool on GitHub, packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets …
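
    A widget chain of this kind maps naturally onto function composition. A minimal sketch follows; the widget names and file paths are hypothetical, since DIT's actual widget interface is not documented in this abstract.

    ```python
    from functools import reduce

    # Each "widget" is a small callable: data in, data out.
    def read_lines(path):
        with open(path) as f:
            return [float(line) for line in f if line.strip()]

    def multiply_by(constant):
        return lambda data: [x * constant for x in data]

    def sort_ascending(data):
        return sorted(data)

    def write_lines(path):
        def _write(data):
            with open(path, "w") as f:
                f.writelines(f"{x}\n" for x in data)
            return data
        return _write

    def run_workflow(data, *widgets):
        """Apply widgets left to right, like an ordered widget chain."""
        return reduce(lambda d, widget: widget(d), widgets, data)

    # Hypothetical usage:
    # run_workflow(read_lines("borehole_temps.txt"),
    #              multiply_by(0.1), sort_ascending, write_lines("out.txt"))
    ```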

  2. KAIKObase: An integrated silkworm genome database and data mining tool

    Directory of Open Access Journals (Sweden)

    Nagaraju Javaregowda

    2009-10-01

    Background: The silkworm, Bombyx mori, is one of the most economically important insects in many developing countries owing to its large-scale cultivation for silk production. With the development of genomic and biotechnological tools, B. mori has also become an important bioreactor for production of various recombinant proteins of biomedical interest. In 2004, two genome sequencing projects for B. mori were reported independently by Chinese and Japanese teams; however, the datasets were insufficient for building long genomic scaffolds, which are essential for unambiguous annotation of the genome. Now, both datasets have been merged and assembled through a joint collaboration between the two groups. Description: Integration of the two data sets of silkworm whole-genome-shotgun sequencing by the Japanese and Chinese groups, together with newly obtained fosmid- and BAC-end sequences, produced the best continuity (~3.7 Mb in N50 scaffold size) among the sequenced insect genomes and provided a high degree of nucleotide coverage (88%) of all 28 chromosomes. In addition, a physical map of BAC contigs constructed by fingerprinting BAC clones and a SNP linkage map constructed using BAC-end sequences were available. In parallel, proteomic data from two-dimensional polyacrylamide gel electrophoresis in various tissues and developmental stages were compiled into a silkworm proteome database. Finally, a Bombyx trap database was constructed for documenting insertion positions and expression data of transposon insertion lines. Conclusion: For efficient usage of genome information for functional studies, genomic sequences, physical and genetic map information and EST data were compiled into KAIKObase, an integrated silkworm genome database which consists of 4 map viewers, a gene viewer, and sequence, keyword and position search systems to display results and data at the level of nucleotide sequence, gene, scaffold and chromosome. Integration of the …

  3. HIFSuite: Tools for HDL Code Conversion and Manipulation

    Directory of Open Access Journals (Sweden)

    Bombieri Nicola

    2010-01-01

    HIFSuite is a set of tools and application programming interfaces (APIs) that provide support for modeling and verification of HW/SW systems. The core of HIFSuite is the HDL Intermediate Format (HIF) language, upon which a set of front-end and back-end tools have been developed to allow the conversion of HDL code into HIF code and vice versa. HIFSuite allows designers to manipulate and integrate heterogeneous components implemented by using different hardware description languages (HDLs). Moreover, HIFSuite includes tools, which rely on HIF APIs, for manipulating HIF descriptions in order to support code abstraction/refinement and post-refinement verification.

  4. Setting Up Decision-Making Tools toward a Quality-Oriented Participatory Maize Breeding Program

    Science.gov (United States)

    Alves, Mara L.; Brites, Cláudia; Paulo, Manuel; Carbas, Bruna; Belo, Maria; Mendes-Moreira, Pedro M. R.; Brites, Carla; Bronze, Maria do Rosário; Gunjača, Jerko; Šatović, Zlatko; Vaz Patto, Maria C.

    2017-01-01

    Previous studies have reported promising differences in the quality of kernels from farmers' maize populations collected in a Portuguese region known to produce maize-based bread. However, several limitations have been identified in the previous characterizations of those populations, such as a limited set of quality traits accessed and a missing accurate agronomic performance evaluation. The objectives of this study were to perform a more detailed quality characterization of Portuguese farmers' maize populations; to estimate their agronomic performance in a broader range of environments; and to integrate quality, agronomic, and molecular data in the setting up of decision-making tools for the establishment of a quality-oriented participatory maize breeding program. Sixteen farmers' maize populations, together with 10 other maize populations chosen for comparison purposes, were multiplied in a common-garden experiment for quality evaluation. Flour obtained from each population was used to study kernel composition (protein, fat, fiber), flour's pasting behavior, and bioactive compound levels (carotenoids, tocopherols, phenolic compounds). These maize populations were evaluated for grain yield and ear weight in nine locations across Portugal; the populations' adaptability and stability were evaluated using additive main effects and multiplication interaction (AMMI) model analysis. The phenotypic characterization of each population was complemented with a molecular characterization, in which 30 individuals per population were genotyped with 20 microsatellites. Almost all farmers' populations were clustered into the same quality-group characterized by high levels of protein and fiber, low levels of carotenoids, volatile aldehydes, α- and δ-tocopherols, and breakdown viscosity. Within this quality-group, variability on particular quality traits (color and some bioactive compounds) could still be found. Regarding the agronomic performance, farmers' maize populations

  5. Setting Up Decision-Making Tools toward a Quality-Oriented Participatory Maize Breeding Program

    Directory of Open Access Journals (Sweden)

    Mara L. Alves

    2017-12-01

    Previous studies have reported promising differences in the quality of kernels from farmers' maize populations collected in a Portuguese region known to produce maize-based bread. However, several limitations have been identified in the previous characterizations of those populations, such as a limited set of quality traits accessed and a missing accurate agronomic performance evaluation. The objectives of this study were to perform a more detailed quality characterization of Portuguese farmers' maize populations; to estimate their agronomic performance in a broader range of environments; and to integrate quality, agronomic, and molecular data in the setting up of decision-making tools for the establishment of a quality-oriented participatory maize breeding program. Sixteen farmers' maize populations, together with 10 other maize populations chosen for comparison purposes, were multiplied in a common-garden experiment for quality evaluation. Flour obtained from each population was used to study kernel composition (protein, fat, fiber), flour's pasting behavior, and bioactive compound levels (carotenoids, tocopherols, phenolic compounds). These maize populations were evaluated for grain yield and ear weight in nine locations across Portugal; the populations' adaptability and stability were evaluated using additive main effects and multiplication interaction (AMMI) model analysis. The phenotypic characterization of each population was complemented with a molecular characterization, in which 30 individuals per population were genotyped with 20 microsatellites. Almost all farmers' populations were clustered into the same quality-group characterized by high levels of protein and fiber, low levels of carotenoids, volatile aldehydes, α- and δ-tocopherols, and breakdown viscosity. Within this quality-group, variability on particular quality traits (color and some bioactive compounds) could still be found. Regarding the agronomic performance, farmers' …

  6. Impact of electronic medical record integration of a handoff tool on sign-out in a newborn intensive care unit

    Science.gov (United States)

    Palma, JP; Sharek, PJ; Longhurst, CA

    2016-01-01

    Objective To evaluate the impact of integrating a handoff tool into the electronic medical record (EMR) on sign-out accuracy, satisfaction and workflow in a neonatal intensive care unit (NICU). Study Design Prospective surveys of neonatal care providers in an academic children’s hospital 1 month before and 6 months following EMR integration of a standalone Microsoft Access neonatal handoff tool. Result Providers perceived sign-out information to be somewhat or very accurate at a rate of 78% with the standalone handoff tool and 91% with the EMR-integrated tool (P < 0.01). Before integration of neonatal sign-out into the EMR, 35% of providers were satisfied with the process of updating sign-out information and 71% were satisfied with the printed sign-out document; following EMR integration, 92% of providers were satisfied with the process of updating sign-out information (P < 0.01) and 98% were satisfied with the printed sign-out document (P < 0.01). Neonatal care providers reported spending a median of 11 to 15 min/day updating the standalone sign-out and 16 to 20 min/day updating the EMR-integrated sign-out (P = 0.026). The median percentage of total sign-out preparation time dedicated to transcribing information from the EMR was 25 to 49% before and <25% after EMR integration of the handoff tool (P < 0.01). Conclusion Integration of a NICU-specific handoff tool into an EMR resulted in improvements in perceived sign-out accuracy, provider satisfaction and at least one aspect of workflow. PMID:21273990

  7. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

    Development of more time-efficient and airborne geophysical data acquisition platforms (e.g. SkyTEM) has made large-scale mapping attractive and affordable in the planning and administration of groundwater resources. ... large areas with information from an optimum number of new investigation boreholes, existing boreholes, logs and water samples to get an integrated and detailed description of the groundwater resources and their vulnerability. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored …

  8. An Integrated Approach of Fuzzy Linguistic Preference Based AHP and Fuzzy COPRAS for Machine Tool Evaluation.

    Directory of Open Access Journals (Sweden)

    Huu-Tho Nguyen

    Globalization of business and competitiveness in manufacturing has forced companies to improve their manufacturing facilities to respond to market requirements. Machine tool evaluation involves an essential decision using imprecise and vague information, and plays a major role in improving productivity and flexibility in manufacturing. The aim of this study is to present an integrated approach for decision-making in machine tool selection. This paper is focused on the integration of a consistent fuzzy AHP (Analytic Hierarchy Process) and a fuzzy COmplex PRoportional ASsessment (COPRAS) for multi-attribute decision-making in selecting the most suitable machine tool. In this method, the fuzzy linguistic reference relation is integrated into AHP to handle the imprecise and vague information, and to simplify the data collection for the pair-wise comparison matrix of the AHP, which determines the weights of attributes. The output of the fuzzy AHP is imported into the fuzzy COPRAS method for ranking alternatives through the closeness coefficient. The application of the proposed model is illustrated with a numerical example based on data collected by questionnaire and from the literature. The results highlight the integration of the improved fuzzy AHP and the fuzzy COPRAS as a precise tool that provides effective multi-attribute decision-making for evaluating machine tools in an uncertain environment.
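
    For readers unfamiliar with COPRAS, the crisp (non-fuzzy) ranking step can be sketched in a few lines; the paper's fuzzy variant replaces the crisp matrix entries and weights with fuzzy numbers, and the machine-tool data below are invented for illustration.

    ```python
    import numpy as np

    def copras_utility(X, weights, benefit):
        """Crisp COPRAS utility degrees (%).
        X: (m alternatives x n criteria) decision matrix.
        weights: criterion weights (e.g., from AHP), summing to 1.
        benefit: boolean mask, True where larger values are better."""
        D = weights * X / X.sum(axis=0)            # weighted normalisation
        s_plus = D[:, benefit].sum(axis=1)         # benefit-criteria sums
        s_minus = D[:, ~benefit].sum(axis=1)       # cost-criteria sums
        q = s_plus + s_minus.sum() / (s_minus * (1.0 / s_minus).sum())
        return 100.0 * q / q.max()                 # 100% marks the best machine

    # Hypothetical machine tools rated on cost, power use, accuracy, flexibility:
    X = np.array([[52.0, 15.0, 0.80, 7.0],
                  [45.0, 11.0, 0.60, 6.0],
                  [60.0, 18.0, 0.90, 9.0]])
    w = np.array([0.4, 0.1, 0.3, 0.2])
    print(copras_utility(X, w, benefit=np.array([False, False, True, True])))
    ```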

  9. INTEGRATING CORPUS-BASED RESOURCES AND NATURAL LANGUAGE PROCESSING TOOLS INTO CALL

    Directory of Open Access Journals (Sweden)

    Pascual Cantos Gomez

    2002-06-01

    This paper aims at presenting a survey of computational linguistic tools presently available whose potential has been neither fully considered nor exploited to the full in modern CALL. It starts with a discussion of the rationale of DDL (data-driven learning) for language learning, presenting typical DDL activities, DDL software, and potential extensions of non-typical DDL software (electronic dictionaries and electronic dictionary facilities) to DDL. An extended section is devoted to describing NLP technology and how it can be integrated into CALL, within already existing software or as stand-alone resources. A range of NLP tools is presented (MT programs, taggers, lemmatizers, parsers and speech technologies), with special emphasis on tagged concordancing. The paper finishes with a number of reflections and ideas on how language technologies can be used efficiently within the language learning context and how extensive exploration and integration of these technologies might change and extend both modern CALL and the present language learning paradigm.

  10. Integrating declarative knowledge programming styles and tools for building expert systems

    Energy Technology Data Exchange (ETDEWEB)

    Barbuceanu, M; Trausan-Matu, S; Molnar, B

    1987-01-01

    The XRL system reported in this paper is an integrated knowledge programming environment whose major research theme is the investigation of declarative knowledge programming styles and features and of the way they can be effectively integrated and used to support AI programming. This investigation is carried out in the context of the structured-object representation paradigm which provides the glue keeping XRL components together. The paper describes several declarative programming styles and associated support tools available in XRL. These include an instantiation system supporting a generalized view of the ubiquitous frame installation process, a description based programming system providing a novel declarative programming style which embeds a mathematically oriented description language in the structured object environment and a transformational interpreter for using it, a semantics oriented programming framework which offers a specific semantic construct based approach supporting maintenance and evolution, and a self description and self generation tool which applies the latter approach to XRL itself. 29 refs., 16 figs.

  11. GAPIT: genome association and prediction integrated tool.

    Science.gov (United States)

    Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu

    2012-09-15

    Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.
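
    For context, the mixed linear model that CMLM compresses is the standard GWAS formulation; in the usual notation (a generic sketch, not copied from the GAPIT paper),

    ```latex
    y = X\beta + Zu + e, \qquad
    u \sim \mathcal{N}\!\bigl(0,\; 2K\sigma_a^2\bigr), \qquad
    e \sim \mathcal{N}\!\bigl(0,\; I\sigma_e^2\bigr),
    ```

    where $y$ holds the phenotypes, $X\beta$ the fixed effects (including the tested SNP), and $u$ the random polygenic effects with kinship matrix $K$. Compression clusters individuals into groups and replaces $K$ with the much smaller among-group kinship, which is what keeps the runtime manageable at the stated scale.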

  12. An integrated risk assessment tool for team-based periodontal disease management.

    Science.gov (United States)

    Thyvalikakath, Thankam P; Padman, Rema; Gupta, Sugandh

    2013-01-01

    Mounting evidence suggests a potential association of periodontal disease with systemic diseases such as diabetes, cardiovascular disease, cancer and stroke. The objective of this study is to develop an integrated risk assessment tool that displays a patients' risk for periodontal disease in the context of their systemic disease, social habits and oral health. Such a tool will be used by not just dental professionals but also by care providers who participate in the team-based care for chronic disease management. Displaying relationships between risk factors and its influence on the patient's general health could be a powerful educational and disease management tool for patients and clinicians. It may also improve the coordination of care provided by the provider-members of a chronic care team.

  13. Calculation of Coupled Vibroacoustics Response Estimates from a Library of Available Uncoupled Transfer Function Sets

    Science.gov (United States)

    Smith, Andrew; LaVerde, Bruce; Hunt, Ron; Fulcher, Clay; Towner, Robert; McDonald, Emmett

    2012-01-01

    The design and theoretical basis of a new database tool that quickly generates vibroacoustic response estimates using a library of transfer functions (TFs) is discussed. During the early stages of a launch vehicle development program, these response estimates can be used to provide vibration environment specification to hardware vendors. The tool accesses TFs from a database, combines the TFs, and multiplies these by input excitations to estimate vibration responses. The database is populated with two sets of uncoupled TFs; the first set representing vibration response of a bare panel, designated as H^s, and the second set representing the response of the free-free component equipment by itself, designated as H^c. For a particular configuration undergoing analysis, the appropriate H^s and H^c are selected and coupled to generate an integrated TF, designated as H^(s+c). This integrated TF is then used with the appropriate input excitations to estimate vibration responses. This simple yet powerful tool enables a user to estimate vibration responses without directly using finite element models, so long as suitable H^s and H^c sets are defined in the database libraries. The paper discusses the preparation of the database tool and provides the assumptions and methodologies necessary to combine H^s and H^c sets into an integrated H^(s+c). An experimental validation of the approach is also presented.
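
    The abstract does not spell out the coupling algebra, but one standard way to join two free-interface FRF sets is receptance coupling, which the sketch below assumes; the array shapes and the rigid-interface formula are assumptions, and the database tool's actual scheme may differ.

    ```python
    import numpy as np

    def couple_frfs(Hs, Hc):
        """Receptance-coupling sketch: combine bare-panel FRFs (Hs) with
        free-free component FRFs (Hc) at shared interface DOFs, one
        frequency line at a time. Hs, Hc: (n_freq, n_dof, n_dof) complex."""
        Hsc = np.empty_like(Hs)
        for k in range(Hs.shape[0]):
            # Rigid coupling at the interface gives H = Hs - Hs (Hs + Hc)^-1 Hs.
            Hsc[k] = Hs[k] - Hs[k] @ np.linalg.solve(Hs[k] + Hc[k], Hs[k])
        return Hsc

    def vibration_response(Hsc, excitation):
        """Response estimate: integrated FRF times the input excitation
        spectrum; excitation: (n_freq, n_dof) complex."""
        return np.einsum("fij,fj->fi", Hsc, excitation)
    ```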

  14. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  15. MatchingTools: A Python library for symbolic effective field theory calculations

    Science.gov (United States)

    Criado, Juan C.

    2018-06-01

    MatchingTools is a Python library for doing symbolic calculations in effective field theory. It provides the tools to construct general models by defining their field content and their interaction Lagrangian. Once a model is given, the heavy particles can be integrated out at the tree level to obtain an effective Lagrangian in which only the light particles appear. After integration, some of the terms of the resulting Lagrangian might not be independent. MatchingTools contains functions for transforming these terms to rewrite them in terms of any chosen set of operators.
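
    As a textbook illustration of the tree-level matching such a library automates (a generic example, not taken from the MatchingTools documentation), integrating out a heavy real scalar $S$ of mass $M$ coupled to a light fermion bilinear leaves a dimension-six contact operator:

    ```latex
    \mathcal{L} \;\supset\; \tfrac{1}{2}(\partial_\mu S)^2 - \tfrac{1}{2}M^2S^2 - g\,S\,\bar\psi\psi
    \quad\xrightarrow{\;S \text{ integrated out}\;}\quad
    \mathcal{L}_{\text{eff}} \;=\; \frac{g^2}{2M^2}\,(\bar\psi\psi)^2
    \;+\; \mathcal{O}\!\left(M^{-4}\right).
    ```

    Solving the heavy field's equation of motion at leading order, $S \approx -(g/M^2)\,\bar\psi\psi$, and substituting back gives the effective term; redundant operators produced this way are then rewritten in a chosen basis, which is the transformation step the abstract mentions.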

  16. Intelligent Assistants for Distributed Knowledge Acquisition, Integration, Validation, and Maintenance

    National Research Council Canada - National Science Library

    Tecuci, Gheorghe; Boicu, Mihai

    2008-01-01

    This research has developed an integrated set of tools, called Disciple 2008 learning agent shell, for continuous acquisition of knowledge directly from subject matter experts, and for the integration...

  17. On Models with Uncountable Set of Spin Values on a Cayley Tree: Integral Equations

    International Nuclear Information System (INIS)

    Rozikov, Utkir A.; Eshkobilov, Yusup Kh.

    2010-01-01

    We consider models with nearest-neighbor interactions and with the set [0, 1] of spin values, on a Cayley tree of order k ≥ 1. We reduce the problem of describing the 'splitting Gibbs measures' of the model to the description of the solutions of some nonlinear integral equation. For k = 1 we show that the integral equation has a unique solution. In case k ≥ 2 some models (with the set [0, 1] of spin values) which have a unique splitting Gibbs measure are constructed. Also for the Potts model with uncountable set of spin values it is proven that there is unique splitting Gibbs measure.

  18. Integrating Social Networking Tools into ESL Writing Classroom: Strengths and Weaknesses

    Science.gov (United States)

    Yunus, Melor Md; Salehi, Hadi; Chenzi, Chen

    2012-01-01

    With the rapid development of world and technology, English learning has become more important. Teachers frequently use teacher-centered pedagogy that leads to lack of interaction with students. This paper aims to investigate the advantages and disadvantages of integrating social networking tools into ESL writing classroom and discuss the ways to…

  19. Integrated tools for control-system analysis

    Science.gov (United States)

    Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.

    1989-01-01

    The basic functions embedded within a user-friendly software package (MATRIXx) are used to provide a high-level systems approach to the analysis of linear control systems. Various control system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations. A time response for random white noise disturbance is available. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
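
    Several of the listed evaluations reduce to a few lines of linear algebra. The sketch below is a numpy stand-in, not MATRIXx itself (a commercial package); the plant and gain matrices are hypothetical. It computes the singular-value loop transfer frequency response and the closed-loop eigenvalues.

    ```python
    import numpy as np

    def loop_singular_values(A, B, K, freqs):
        """Singular values of the loop transfer matrix L(jw) = K (jwI - A)^-1 B
        (loop broken at the plant input), plus closed-loop eigenvalues."""
        n = A.shape[0]
        sv = [np.linalg.svd(K @ np.linalg.solve(1j * w * np.eye(n) - A, B),
                            compute_uv=False) for w in freqs]
        eig_cl = np.linalg.eigvals(A - B @ K)   # closed-loop eigenvalues
        return np.array(sv), eig_cl

    # Hypothetical second-order plant under state feedback:
    A = np.array([[0.0, 1.0], [-2.0, -0.5]])
    B = np.array([[0.0], [1.0]])
    K = np.array([[3.0, 1.2]])
    sv, eig = loop_singular_values(A, B, K, np.logspace(-1, 2, 200))
    print("closed-loop eigenvalues:", eig)
    ```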

  20. AORN Ergonomic Tool 4: Solutions for Prolonged Standing in Perioperative Settings.

    Science.gov (United States)

    Hughes, Nancy L; Nelson, Audrey; Matz, Mary W; Lloyd, John

    2011-06-01

    Prolonged standing during surgical procedures poses a high risk of causing musculoskeletal disorders, including back, leg, and foot pain, which can be chronic or acute in nature. Ergonomic Tool 4: Solutions for Prolonged Standing in Perioperative Settings provides recommendations for relieving the strain of prolonged standing, including the use of antifatigue mats, supportive footwear, and sit/stand stools, that are based on well-accepted ergonomic safety concepts, current research, and access to new and emerging technology. Published by Elsevier Inc.

  1. Improved efficiency in clinical workflow of reporting measured oncology lesions via PACS-integrated lesion tracking tool.

    Science.gov (United States)

    Sevenster, Merlijn; Travis, Adam R; Ganesh, Rajiv K; Liu, Peng; Kose, Ursula; Peters, Joost; Chang, Paul J

    2015-03-01

    OBJECTIVE. Imaging provides evidence for the response to oncology treatment by the serial measurement of reference lesions. Unfortunately, the identification, comparison, measurement, and documentation of several reference lesions can be an inefficient process. We tested the hypothesis that optimized workflow orchestration and tight integration of a lesion tracking tool into the PACS and speech recognition system can result in improvements in oncologic lesion measurement efficiency. SUBJECTS AND METHODS. A lesion management tool tightly integrated into the PACS workflow was developed. We evaluated the effect of the use of the tool on measurement reporting time by means of a prospective time-motion study on 86 body CT examinations with 241 measurable oncologic lesions with four radiologists. RESULTS. Aggregated measurement reporting time per lesion was 11.64 seconds in standard workflow, 16.67 seconds if readers had to register measurements de novo, and 6.36 seconds for each subsequent follow-up study. Differences were statistically significant (p < 0.05). CONCLUSION. Measurement reporting was more efficient with the workflow-integrated lesion management tool, especially for patients with multiple follow-up examinations, reversing the one-time efficiency penalty at baseline registration.
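
    The reported timings imply a simple break-even calculation: the integrated tool costs extra time once, at de novo registration, and saves time on every follow-up. Using the paper's aggregated per-lesion figures:

    ```python
    def per_lesion_seconds(n_followups, standard=11.64, de_novo=16.67, followup=6.36):
        """Cumulative reporting time per lesion over a baseline exam plus
        n follow-ups, for the integrated tool vs. the standard workflow."""
        tool = de_novo + n_followups * followup
        std = standard * (1 + n_followups)
        return tool, std

    for n in range(4):
        tool, std = per_lesion_seconds(n)
        print(f"{n} follow-ups: tool {tool:.2f}s vs standard {std:.2f}s")
    # The one-time registration penalty (16.67s vs 11.64s) is already
    # recovered at the first follow-up (23.03s vs 23.28s).
    ```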

  2. TACT: A Set of MSC/PATRAN- and MSC/NASTRAN- based Modal Correlation Tools

    Science.gov (United States)

    Marlowe, Jill M.; Dixon, Genevieve D.

    1998-01-01

    This paper describes the functionality and demonstrates the utility of the Test Analysis Correlation Tools (TACT), a suite of MSC/PATRAN Command Language (PCL) tools which automate the process of correlating finite element models to modal survey test data. The initial release of TACT provides a basic yet complete set of tools for performing correlation totally inside the PATRAN/NASTRAN environment. Features include a step-by-step menu structure, pre-test accelerometer set evaluation and selection, analysis and test result export/import in Universal File Format, calculation of frequency percent difference and cross-orthogonality correlation results using NASTRAN, creation and manipulation of mode pairs, and five different ways of viewing synchronized animations of analysis and test modal results. For the PATRAN-based analyst, TACT eliminates the repetitive, time-consuming and error-prone steps associated with transferring finite element data to a third-party modal correlation package, which allows the analyst to spend more time on the more challenging task of model updating. The usefulness of this software is presented using a case history, the correlation for a NASA Langley Research Center (LaRC) low aspect ratio research wind tunnel model. To demonstrate the improvements that TACT offers the MSC/PATRAN- and MSC/NASTRAN-based structural analysis community, a comparison of the modal correlation process using TACT within PATRAN versus external third-party modal correlation packages is presented.

  3. setsApp for Cytoscape: Set operations for Cytoscape Nodes and Edges [v2; ref status: indexed, http://f1000r.es/5lz

    Directory of Open Access Journals (Sweden)

    John H. Morris

    2015-08-01

    setsApp (http://apps.cytoscape.org/apps/setsapp) is a relatively simple Cytoscape 3 app for users to handle groups of nodes and/or edges. It supports several important biological workflows and enables various set operations. setsApp provides basic tools to create sets of nodes or edges, import or export sets, and perform standard set operations (union, difference, intersection) on those sets. Automatic set partitioning and layout functions are also provided. The sets functionality is also exposed to users and app developers in the form of a set of commands that can be used for scripting purposes or integrated in other Cytoscape apps.
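
    The operations setsApp exposes on node and edge groups behave like ordinary set algebra; a minimal Python analogue (the gene names are hypothetical) makes the semantics concrete:

    ```python
    # Hypothetical node-name sets from two Cytoscape node groups:
    upregulated = {"TP53", "MDM2", "CDKN1A", "BAX"}
    in_pathway = {"TP53", "CDKN1A", "CCND1", "RB1"}

    union = upregulated | in_pathway          # nodes in either set
    intersection = upregulated & in_pathway   # nodes in both sets
    difference = upregulated - in_pathway     # upregulated, not in pathway

    print(sorted(union), sorted(intersection), sorted(difference), sep="\n")
    ```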

  4. Tool Integration: Experiences and Issues in Using XMI and Component Technology

    DEFF Research Database (Denmark)

    Damm, Christian Heide; Hansen, Klaus Marius; Thomsen, Michael

    2000-01-01

    of conflicting data models, and provide an architecture for doing so, based on component technology and XML Metadata Interchange. As an example, we discuss the implementation of an electronic whiteboard tool, Knight, which adds support for creative and collaborative object-oriented modeling to existing Computer-Aided Software Engineering tools through integration using our proposed architecture.

  5. A Prospective Validation Study of a Rainbow Model of Integrated Care Measurement Tool in Singapore.

    Science.gov (United States)

    Nurjono, Milawaty; Valentijn, Pim P; Bautista, Mary Ann C; Wei, Lim Yee; Vrijhoef, Hubertus Johannes Maria

    2016-04-08

    The conceptual ambiguity of the integrated care concept precludes a full understanding of what constitutes a well-integrated health system, posing a significant challenge in measuring the level of integrated care. Most available measures have been developed from a disease-specific perspective and only measure certain aspects of integrated care. Based on the Rainbow Model of Integrated Care, which provides a detailed description of the complex concept of integrated care, a measurement tool has been developed to assess integrated care within a care system as a whole gathered from healthcare providers' and managerial perspectives. This paper describes the methodology of a study seeking to validate the Rainbow Model of Integrated Care measurement tool within and across the Singapore Regional Health System. The Singapore Regional Health System is a recent national strategy developed to provide a better-integrated health system to deliver seamless and person-focused care to patients through a network of providers within a specified geographical region. The validation process includes the assessment of the content of the measure and its psychometric properties. If the measure is deemed to be valid, the study will provide the first opportunity to measure integrated care within Singapore Regional Health System with the results allowing insights in making recommendations for improving the Regional Health System and supporting international comparison.

  6. Tool set for distributed real-time machine control

    Science.gov (United States)

    Carrott, Andrew J.; Wright, Christopher D.; West, Andrew A.; Harrison, Robert; Weston, Richard H.

    1997-01-01

    Demands for increased control capabilities require next generation manufacturing machines to comprise intelligent building elements, physically located at the point where the control functionality is required. Networks of modular intelligent controllers are increasingly designed into manufacturing machines and usable standards are slowly emerging. To implement a control system using off-the-shelf intelligent devices from multi-vendor sources requires a number of well defined activities, including (a) the specification and selection of interoperable control system components, (b) device independent application programming and (c) device configuration, management, monitoring and control. This paper briefly discusses the support for the above machine lifecycle activities through the development of an integrated computing environment populated with an extendable software toolset. The toolset supports machine builder activities such as initial control logic specification, logic analysis, machine modeling, mechanical verification, application programming, automatic code generation, simulation/test, version control, distributed run-time support and documentation. The environment itself consists of system management tools and a distributed object-oriented database which provides storage for the outputs from machine lifecycle activities and specific target control solutions.

  7. Automated innovative diagnostic, data management and communication tool, for improving malaria vector control in endemic settings.

    Science.gov (United States)

    Vontas, John; Mitsakakis, Konstantinos; Zengerle, Roland; Yewhalaw, Delenasaw; Sikaala, Chadwick Haadezu; Etang, Josiane; Fallani, Matteo; Carman, Bill; Müller, Pie; Chouaïbou, Mouhamadou; Coleman, Marlize; Coleman, Michael

    2016-01-01

    Malaria is a life-threatening disease that caused more than 400,000 deaths in sub-Saharan Africa in 2015. Mass prevention of the disease is best achieved by vector control, which heavily relies on the use of insecticides. Monitoring mosquito vector populations is an integral component of control programs and a prerequisite for effective interventions. Several individual methods are used for this task; however, there are obstacles to their uptake, as well as challenges in organizing, interpreting and communicating vector population data. The Horizon 2020 project "DMC-MALVEC" consortium will develop a fully integrated and automated multiplex vector-diagnostic platform (LabDisk) for characterizing mosquito populations in terms of species composition, Plasmodium infections and biochemical insecticide resistance markers. The LabDisk will be interfaced with a Disease Data Management System (DDMS), custom-made data management software that will collate and manage data from routine entomological monitoring activities, providing information in a timely fashion, based on user needs, and in a standardized way. ResistanceSim, a serious game on a modern ICT platform that communicates guidelines interactively and exemplifies good practice in the optimal use of interventions in the health sector, will also be a key element. Use of the tool will teach operational end users the value of quality data (relevant, timely and accurate) for making informed decisions. The integrated system (LabDisk, DDMS & ResistanceSim) will be evaluated in four malaria-endemic countries (Cameroon, Ivory Coast, Ethiopia and Zambia), representative of sub-Saharan settings with different levels of endemicity and vector control challenges, to support informed decision-making in vector control and disease management.

  8. Modeling and evaluation of the influence of micro-EDM sparking state settings on the tool electrode wear behavior

    DEFF Research Database (Denmark)

    Puthumana, Govindan

    2017-01-01

    … materials characterized by considerable wear of the tool used for material removal. This paper presents an investigation involving modeling and estimation of the effect of settings for the generation of discharges in stable conditions of micro-EDM on the phenomenon of tool electrode wear. A stable sparking … a condition for the minimum tool wear for this micro-EDM process configuration.

  9. The Integration of Digital Tools during Strategic and Interactive Writing Instruction

    Science.gov (United States)

    Kilpatrick, Jennifer Renée; Saulsburry, Rachel; Dostal, Hannah M.; Wolbers, Kimberly A.; Graham, Steve

    2014-01-01

    The purpose of this chapter is to gain insight from the ways a group of elementary teachers of the deaf and hard of hearing chose to integrate digital tools into evidence-based writing instruction and the ways these technologies were used to support student learning. After professional development that exposed these teachers to twelve new digital…

  10. Constructions of contents and measures satisfying a prescribed set of integral inequalities

    DEFF Research Database (Denmark)

    Hoffmann-Jørgensen, Jørgen

    2006-01-01

    Let Ψ be a given set of real-valued functions on the set T and let β:Ψ→R be a given functional with values in the extended real line. The paper gives sufficient conditions for the existence of a content (or a measure) μ with good regularity and smoothness properties and with the property that β(ψ) is at most the outer μ-integral of ψ for all ψ in Ψ.
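
    In symbols, the existence statement takes the form below, where the star denotes the outer $\mu$-integral (notation assumed, following the abstract's wording):

    ```latex
    \exists\,\mu \quad\text{such that}\quad
    \beta(\psi) \;\le\; \int^{*} \psi \, d\mu
    \qquad \text{for all } \psi \in \Psi .
    ```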

  11. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    Science.gov (United States)

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  12. High-Performance Integrated Virtual Environment (HIVE Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  13. BEopt-CA (Ex): A Tool for Optimal Integration of EE, DR and PV in Existing California Homes

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, Craig [National Renewable Energy Lab. (NREL), Golden, CO (United States); Horowitz, Scott [National Renewable Energy Lab. (NREL), Golden, CO (United States); Maguire, Jeff [National Renewable Energy Lab. (NREL), Golden, CO (United States); Velasco, Paulo Tabrares [National Renewable Energy Lab. (NREL), Golden, CO (United States); Springer, David [Davis Energy Group, Davis, CA (United States); Coates, Peter [Davis Energy Group, Davis, CA (United States); Bell, Christy [Davis Energy Group, Davis, CA (United States); Price, Snuller [Energy & Environmental Economics, San Francisco, CA (United States); Sreedharan, Priya [Energy & Environmental Economics, San Francisco, CA (United States); Pickrell, Katie [Energy & Environmental Economics, San Francisco, CA (United States)

    2014-04-01

    This project targeted the development of a software tool, BEopt-CA (Ex) (Building Energy Optimization Tool for California Existing Homes), that aims to facilitate balanced integration of energy efficiency (EE), demand response (DR), and photovoltaics (PV) in the residential retrofit market. The intent is to provide utility program managers and contractors in the EE/DR/PV marketplace with a means of balancing the integration of EE, DR, and PV.

  14. COMSY- A Software Tool For Aging And Plant Life Management With An Integrated Documentation Tool

    International Nuclear Information System (INIS)

    Baier, Roman; Zander, Andre

    2008-01-01

    In aging and plant life management, the integrity of mechanical components and structures is one of the key objectives. Ensuring this integrity requires a comprehensive aging management program, applied to all safety-relevant mechanical systems and components, civil structures, electrical systems, and instrumentation and control (I and C). The following aspects should be covered: - Identification and assessment of relevant degradation mechanisms; - Verification and evaluation of the quality status of all safety-relevant systems, structures and components (SSCs); - Verification and modernization of I and C and electrical systems; - Reliable and up-to-date documentation. To support these tasks, AREVA NP GmbH has developed the computer program COMSY, which draws on more than 30 years of research and operational experience. The program can perform a plant-wide screening to identify system areas that are sensitive to specific degradation mechanisms. Another objective is the administration and evaluation of NDE measurements from different techniques. An integrated documentation tool makes document management and maintenance fast, reliable and independent of individual staff members. (authors)

  15. Self-organising maps and correlation analysis as a tool to explore patterns in excitation-emission matrix data sets and to discriminate dissolved organic matter fluorescence components.

    Science.gov (United States)

    Ejarque-Gonzalez, Elisabet; Butturini, Andrea

    2014-01-01

    Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) has been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method that clusters the input EEMs and reduces their dimensionality without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
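
    The SOM-plus-correlation workflow is easy to prototype. The sketch below uses synthetic data and the third-party MiniSom package; the map size, training schedule, and preprocessing are invented, not the authors' settings. High correlation between two component planes is the paper's cue that the corresponding wavelength pairs co-vary, i.e. belong to a common fluorescence component.

    ```python
    import numpy as np
    from minisom import MiniSom  # pip install minisom

    # eems: (n_samples, n_features) matrix of unfolded excitation-emission
    # spectra, each EEM flattened to a vector; synthetic stand-in here.
    rng = np.random.default_rng(1)
    eems = rng.random((300, 50))

    som = MiniSom(8, 8, eems.shape[1], sigma=1.5, learning_rate=0.5, random_seed=1)
    som.train_random(eems, 5000)

    # Component planes: one 8x8 weight map per input feature (wavelength pair).
    planes = som.get_weights().reshape(-1, eems.shape[1])  # (64 nodes, 50 features)

    # Pearson correlations between component planes, feature against feature.
    corr = np.corrcoef(planes.T)
    print("max off-diagonal correlation:", np.abs(corr - np.eye(corr.shape[0])).max())
    ```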

  16. The Vicinity of Program Documentation Tools

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    Program documentation plays a vital role in almost all programming processes. Program documentation flows between separate tools of a modularized environment, and between the components of an integrated development environment as well. In this paper we discuss the flow of program documentation between program development tools. In the central part of the paper we introduce a mapping of documentation flow between program development tools. In addition we discuss a set of locally developed tools which is related to program documentation. The use of test cases as examples in an interface documentation tool is a noteworthy and valuable contribution to the documentation flow. As an additional contribution we identify several circular relationships which illustrate feedback of documentation to the program editor from other tools in the development environment.

  17. Integrated Data Collection Analysis (IDCA) Program - RDX Standard Data Set 2

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Air Force Research Lab. (AFRL), Tyndall Air Force Base, FL (United States); Shelley, Timothy J. [Applied Research Associates, Tyndall Air Force Base, FL (United States); Reyes, Jose A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-02-20

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard from the second round of testing in the Proficiency Test. Compared with the first round (Set 1), this RDX testing (Set 2) showed about the same impact sensitivity, greater BAM friction sensitivity, lower ABL friction sensitivity, similar ESD sensitivity, and the same DSC response.

  18. ACE-it: a tool for genome-wide integration of gene dosage and RNA expression data

    NARCIS (Netherlands)

    van Wieringen, W.N.; Belien, J.A.M.; Vosse, S.; Achame, E.M.; Ylstra, B.

    2006-01-01

    Summary: We describe a tool, called ACE-it (Array CGH Expression integration tool). ACE-it links the chromosomal position of the gene dosage measured by array CGH to the genes measured by the expression array. ACE-it uses this link to statistically test whether gene dosage affects RNA expression.

  19. A Set of Web-based Tools for Integrating Scientific Research and Decision-Making through Systems Thinking

    Science.gov (United States)

    Currently, many policy and management decisions are made without considering the goods and services humans derive from ecosystems and the costs associated with protecting them. This approach is unlikely to be sustainable. Conceptual frameworks provide a tool for capturing, visual...

  20. Barriers to the Integration and Adoption of iPads in Schools

    DEFF Research Database (Denmark)

    Khalid, Md. Saifuddin; Kilic, Gökce; Christoffersen, Jeanette

    2015-01-01

    The iPad, with its Apple platform and interoperability-dependent material conditions, brings complex barriers to its adoption and integration as a learning tool in the secondary education system. In the school context it is an emerging educational technology, valued for affordances that support collaborative learning. This systematic literature review of the barriers to the integration of the iPad in primary and secondary schools is based on 13 peer-reviewed, full-text articles. The paper discusses the challenges of using the iPad as a learning tool in primary and secondary educational settings. The identified barriers are discussed in three broad categories, or stages of the innovation process in an educational context: basic challenges with the tool, barriers to the integration of the iPad in a school setting, and barriers in the use of the iPad.

  1. Bond graphs : an integrating tool for design of mechatronic systems

    International Nuclear Information System (INIS)

    Ould Bouamama, B.

    2011-01-01

    The bond graph is a powerful tool well known for dynamic modelling of multiphysical systems: it is the only modelling technique that automatically generates state-space or non-linear models using dedicated software tools (CAMP-G, 20-Sim, Symbols, Dymola, ...). Recently, several fundamental theories have been developed for using a bond graph model not only for modelling but also as a genuinely integrated tool, from conceptual ideas to the optimal practical realization of a mechatronic system. This keynote presents a synthesis of those new theories, which exploit particular properties (causal, structural and behavioral) of this graphical methodology. Based on a pedagogical example, it is shown how, starting from a physical system (not a transfer function or state equation) and using only one representation (the bond graph), the following results can be obtained: modelling (formal generation of state equations), control analysis (observability, controllability, structural I/O decouplability, dynamic decoupling, ...), diagnosis analysis (automatic generation of robust fault indicators, sensor placement, structural diagnosability) and, finally, sizing of actuators. The presentation is illustrated by real industrial applications. Limits and perspectives of bond graph theory conclude the keynote.

  2. Integration of tools for binding archetypes to SNOMED CT.

    Science.gov (United States)

    Sundvall, Erik; Qamar, Rahil; Nyström, Mikael; Forss, Mattias; Petersson, Håkan; Karlsson, Daniel; Ahlfeldt, Hans; Rector, Alan

    2008-10-27

    The Archetype formalism and the associated Archetype Definition Language have been proposed as an ISO standard for specifying models of components of electronic healthcare records as a means of achieving interoperability between clinical systems. This paper presents an archetype editor with support for manual or semi-automatic creation of bindings between archetypes and terminology systems. Lexical and semantic methods are applied in order to obtain automatic mapping suggestions. Information visualisation methods are also used to assist the user in exploration and selection of mappings. An integrated tool for archetype authoring, semi-automatic SNOMED CT terminology binding assistance and terminology visualization was created and released as open source. Finding the right terms to bind is a difficult task but the effort to achieve terminology bindings may be reduced with the help of the described approach. The methods and tools presented are general, but here only bindings between SNOMED CT and archetypes based on the openEHR reference model are presented in detail.
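
    A minimal sketch of the lexical half of such semi-automatic binding assistance, assuming a toy candidate list in place of a real SNOMED CT release: archetype node labels are ranked against concept descriptions by string similarity.

```python
# Hypothetical (concept id, description) pairs stand in for SNOMED CT.
from difflib import SequenceMatcher

candidates = {
    "271649006": "Systolic blood pressure",
    "271650006": "Diastolic blood pressure",
    "364075005": "Heart rate",
}

def suggest(archetype_label, terms, top=3):
    """Rank candidate descriptions by string similarity to the node label."""
    scored = [(SequenceMatcher(None, archetype_label.lower(), d.lower()).ratio(), cid, d)
              for cid, d in terms.items()]
    return sorted(scored, reverse=True)[:top]

for score, cid, desc in suggest("Systolic pressure", candidates):
    print(f"{score:.2f}  {cid}  {desc}")
```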

  3. Integrated Authoring Tool for Mobile Augmented Reality-Based E-Learning Applications

    Science.gov (United States)

    Lobo, Marcos Fermin; Álvarez García, Víctor Manuel; del Puerto Paule Ruiz, María

    2013-01-01

    Learning management systems are increasingly being used to complement classroom teaching and learning and in some instances even replace traditional classroom settings with online educational tools. Mobile augmented reality is an innovative trend in e-learning that is creating new opportunities for teaching and learning. This article proposes a…

  4. Picante: R tools for integrating phylogenies and ecology.

    Science.gov (United States)

    Kembel, Steven W; Cowan, Peter D; Helmus, Matthew R; Cornwell, William K; Morlon, Helene; Ackerly, David D; Blomberg, Simon P; Webb, Campbell O

    2010-06-01

    Picante is a software package that provides a comprehensive set of tools for analyzing the phylogenetic and trait diversity of ecological communities. The package calculates phylogenetic diversity metrics, performs trait comparative analyses, manipulates phenotypic and phylogenetic data, and performs tests for phylogenetic signal in trait distributions, community structure and species interactions. Picante is a package for the R statistical language and environment written in R and C, released under a GPL v2 open-source license, and freely available on the web (http://picante.r-forge.r-project.org) and from CRAN (http://cran.r-project.org).
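
    Picante itself is an R package; as a language-neutral illustration of one metric it calculates, the sketch below computes Faith's phylogenetic diversity (the total branch length spanned by a community's taxa) over a hypothetical three-taxon tree.

```python
# Tree stored as child -> parent with branch lengths; data are made up.
parent = {"A": "n1", "B": "n1", "C": "n2", "n1": "n2", "n2": None}
branch_len = {"A": 1.0, "B": 1.0, "C": 2.5, "n1": 0.5, "n2": 0.0}

def faith_pd(community):
    """Total branch length spanned by the taxa present in the community."""
    used = set()
    for taxon in community:
        node = taxon
        while node is not None:        # walk to the root, collecting branches
            used.add(node)
            node = parent[node]
    return sum(branch_len[n] for n in used)

print(faith_pd({"A", "B"}))       # 2.5
print(faith_pd({"A", "C"}))       # 1.0 + 0.5 + 2.5 = 4.0
```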

  5. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    Science.gov (United States)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with relevant analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it 1) does not require any licensing fees, and 2) will provide an all-in-one package for various analysis capabilities that normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK® contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
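
    The role of Brent's method in contact-window prediction can be sketched as follows: the method brackets the time at which elevation crosses the visibility mask. The elevation model and mask angle here are toy stand-ins, not SCaN asset parameters.

```python
# scipy's brentq implements Brent's method for bracketed root finding.
import math
from scipy.optimize import brentq

MASK_DEG = 5.0                                    # assumed visibility mask

def elevation_deg(t):                             # toy pass peaking at t = 300 s
    return 60.0 * math.sin(math.pi * t / 600.0) - 10.0

# elevation - mask changes sign on each half of the pass
rise = brentq(lambda t: elevation_deg(t) - MASK_DEG, 0.0, 300.0)
fall = brentq(lambda t: elevation_deg(t) - MASK_DEG, 300.0, 600.0)
print(f"contact window: {rise:.1f} s to {fall:.1f} s")
```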

  6. CaliBayes and BASIS: integrated tools for the calibration, simulation and storage of biological simulation models.

    Science.gov (United States)

    Chen, Yuhui; Lawless, Conor; Gillespie, Colin S; Wu, Jake; Boys, Richard J; Wilkinson, Darren J

    2010-05-01

    Dynamic simulation modelling of complex biological processes forms the backbone of systems biology. Discrete stochastic models are particularly appropriate for describing sub-cellular molecular interactions, especially when critical molecular species are thought to be present at low copy-numbers. For example, these stochastic effects play an important role in models of human ageing, where ageing results from the long-term accumulation of random damage at various biological scales. Unfortunately, realistic stochastic simulation of discrete biological processes is highly computationally intensive, requiring specialist hardware, and can benefit greatly from parallel and distributed approaches to computation and analysis. For these reasons, we have developed the BASIS system for the simulation and storage of stochastic SBML models together with associated simulation results. This system is exposed as a set of web services to allow users to incorporate its simulation tools into their workflows. Parameter inference for stochastic models is also difficult and computationally expensive. The CaliBayes system provides a set of web services (together with an R package for consuming these and formatting data) which addresses this problem for SBML models. It uses a sequential Bayesian MCMC method, which is powerful and flexible, providing very rich information. However this approach is exceptionally computationally intensive and requires the use of a carefully designed architecture. Again, these tools are exposed as web services to allow users to take advantage of this system. In this article, we describe these two systems and demonstrate their integrated use with an example workflow to estimate the parameters of a simple model of Saccharomyces cerevisiae growth on agar plates.
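
    A minimal stand-alone sketch of the kind of discrete stochastic simulation BASIS performs (here a Gillespie birth-death process with invented rates, not the BASIS web-service API itself):

```python
import random

def gillespie(birth=5.0, death=0.1, x0=0, t_end=100.0):
    """Exact stochastic simulation of a birth-death process."""
    t, x, path = 0.0, x0, [(0.0, x0)]
    while t < t_end:
        rates = (birth, death * x)
        total = rates[0] + rates[1]
        if total == 0.0:
            break
        t += random.expovariate(total)                 # exponential waiting time
        x += 1 if random.random() < rates[0] / total else -1
        path.append((t, x))
    return path

t_final, x_final = gillespie()[-1]
print(f"copy number at t={t_final:.1f}: {x_final} (steady state ~ birth/death = 50)")
```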

  7. atBioNet– an integrated network analysis tool for genomics and biomarker discovery

    Directory of Open Access Journals (Sweden)

    Ding Yijun

    2012-07-01

    Full Text Available Abstract Background Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. Results atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but this information can also be used to hypothesize novel biomarkers for future analysis. Conclusion atBioNet is a free web-based network analysis tool that provides a systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools

  8. atBioNet--an integrated network analysis tool for genomics and biomarker discovery.

    Science.gov (United States)

    Ding, Yijun; Chen, Minjun; Liu, Zhichao; Ding, Don; Ye, Yanbin; Zhang, Min; Kelly, Reagan; Guo, Li; Su, Zhenqiang; Harris, Stephen C; Qian, Feng; Ge, Weigong; Fang, Hong; Xu, Xiaowei; Tong, Weida

    2012-07-20

    Large amounts of mammalian protein-protein interaction (PPI) data have been generated and are available for public use. From a systems biology perspective, protein/gene interactions encode the key mechanisms distinguishing disease and health, and such mechanisms can be uncovered through network analysis. An effective network analysis tool should integrate different content-specific PPI databases into a comprehensive network format with a user-friendly platform to identify key functional modules/pathways and the underlying mechanisms of disease and toxicity. atBioNet integrates seven publicly available PPI databases into a network-specific knowledge base. Knowledge expansion is achieved by expanding a user-supplied protein/gene list with interactions from its integrated PPI network. The statistically significant functional modules are determined by applying a fast network-clustering algorithm (SCAN: a Structural Clustering Algorithm for Networks). The functional modules can be visualized either separately or together in the context of the whole network. Integration of pathway information enables enrichment analysis and assessment of the biological function of modules. Three case studies are presented using publicly available disease gene signatures as a basis to discover new biomarkers for acute leukemia, systemic lupus erythematosus, and breast cancer. The results demonstrated that atBioNet can not only identify functional modules and pathways related to the studied diseases, but this information can also be used to hypothesize novel biomarkers for future analysis. atBioNet is a free web-based network analysis tool that provides a systematic insight into protein/gene interactions through examining significant functional modules. The identified functional modules are useful for determining underlying mechanisms of disease and biomarker discovery. It can be accessed at: http://www.fda.gov/ScienceResearch/BioinformaticsTools/ucm285284.htm.
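
    The structural similarity that drives SCAN-style clustering can be sketched compactly: for each edge, the closed neighbourhoods of its endpoints are compared, and high values mark densely connected modules. The toy graph below is hypothetical, not one of atBioNet's PPI databases.

```python
import math

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]
adj = {}
for u, v in edges:
    adj.setdefault(u, {u}).add(v)      # closed neighbourhood: node plus neighbours
    adj.setdefault(v, {v}).add(u)

def sigma(u, v):
    """|N(u) & N(v)| / sqrt(|N(u)| * |N(v)|) over closed neighbourhoods."""
    return len(adj[u] & adj[v]) / math.sqrt(len(adj[u]) * len(adj[v]))

for u, v in edges:
    print(u, "-", v, round(sigma(u, v), 2))   # edges in the A-B-C triangle score 1.0
```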

  9. Integrative set enrichment testing for multiple omics platforms

    Directory of Open Access Journals (Sweden)

    Poisson Laila M

    2011-11-01

    Full Text Available Abstract Background Enrichment testing assesses the overall evidence of differential expression behavior of the elements within a defined set. When we have measured many molecular aspects, e.g. gene expression, metabolites, proteins, it is desirable to assess their differential tendencies jointly across platforms using an integrated set enrichment test. In this work we explore the properties of several methods for performing a combined enrichment test using gene expression and metabolomics as the motivating platforms. Results Using two simulation models we explored the properties of several enrichment methods including two novel methods: the logistic regression 2-degree of freedom Wald test and the 2-dimensional permutation p-value for the sum-of-squared statistics test. In relation to their univariate counterparts we find that the joint tests can improve our ability to detect results that are marginal univariately. We also find that joint tests improve the ranking of associated pathways compared to their univariate counterparts. However, there is a risk of Type I error inflation with some methods and self-contained methods lose specificity when the sets are not representative of underlying association. Conclusions In this work we show that consideration of data from multiple platforms, in conjunction with summarization via a priori pathway information, leads to increased power in detection of genomic associations with phenotypes.
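
    A sketch in the spirit of the sum-of-squared-statistics test with a permutation p-value, on simulated data (an illustration of the idea, not the authors' exact procedure): set-level statistics from the two platforms are squared, summed, and compared against phenotype-label permutations.

```python
import numpy as np

rng = np.random.default_rng(1)
labels = np.array([0] * 10 + [1] * 10)               # two phenotype groups
genes = rng.normal(size=(15, 20)) + 0.6 * labels     # one gene set, shifted in group 1
metabs = rng.normal(size=(6, 20)) + 0.4 * labels     # matching metabolite set

def set_stat(data, y):
    """Sum of squared two-sample t-like statistics over the set's elements."""
    a, b = data[:, y == 0], data[:, y == 1]
    t = (b.mean(1) - a.mean(1)) / np.sqrt(a.var(1) / a.shape[1] + b.var(1) / b.shape[1])
    return float((t ** 2).sum())

obs = set_stat(genes, labels) + set_stat(metabs, labels)
perms = [set_stat(genes, p) + set_stat(metabs, p)
         for p in (rng.permutation(labels) for _ in range(2000))]
print("joint p =", (sum(s >= obs for s in perms) + 1) / (len(perms) + 1))
```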

  10. Identification and implementation of end-user needs during development of a state-of-the-art modeling tool-set - 59069

    International Nuclear Information System (INIS)

    Seitz, Roger; Williamson, Mark; Gerdes, Kurt; Freshley, Mark; Dixon, Paul; Collazo, Yvette T.; Hubbard, Susan

    2012-01-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management, Technology Innovation and Development is supporting a multi-National Laboratory effort to develop the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is an emerging state-of-the-art scientific approach and software infrastructure for understanding and predicting contaminant fate and transport in natural and engineered systems. These modular and open-source high performance computing tools and user interfaces will facilitate integrated approaches that enable standardized assessments of performance and risk for EM cleanup and closure decisions. The ASCEM team recognized that engaging end-users in the ASCEM development process would lead to enhanced development and implementation of the ASCEM tool-sets in the user community. End-user involvement in ASCEM covers a broad spectrum of perspectives, including: performance assessment (PA) and risk assessment practitioners, research scientists, decision-makers, oversight personnel, and regulators engaged in the US DOE cleanup mission. End-users are primarily engaged in ASCEM via the ASCEM User Steering Committee (USC) and the 'user needs interface' task. Future plans also include user involvement in demonstrations of the ASCEM tools. This paper will describe the details of how end users have been engaged in the ASCEM program and will demonstrate how this involvement has strengthened both the tool development and community confidence. ASCEM tools requested by end-users specifically target modeling challenges associated with US DOE cleanup activities. The demonstration activities involve application of ASCEM tools and capabilities to representative problems at DOE sites. Selected results from the ASCEM Phase 1 demonstrations are discussed to illustrate how capabilities requested by end-users were implemented in prototype versions of the ASCEM tool. The ASCEM team engaged a variety of interested parties early in the development

  11. The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2015-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be targeted for…

  12. Integrating and analyzing medical and environmental data using ETL and Business Intelligence tools

    Science.gov (United States)

    Villar, Alejandro; Zarrabeitia, María T.; Fdez-Arroyabe, Pablo; Santurtún, Ana

    2018-06-01

    Processing data that originates from different sources (such as environmental and medical data) can prove to be a difficult task, due to the heterogeneity of variables, storage systems, and file formats that can be used. Moreover, once the amount of data reaches a certain threshold, conventional mining methods (based on spreadsheets or statistical software) become cumbersome or even impossible to apply. Data Extract, Transform, and Load (ETL) solutions provide a framework to normalize and integrate heterogeneous data into a local data store. Additionally, the application of Online Analytical Processing (OLAP), a set of Business Intelligence (BI) methodologies and practices for multidimensional data analysis, can be an invaluable tool for its examination and mining. In this article, we describe a solution based on an ETL + OLAP tandem used for the on-the-fly analysis of tens of millions of individual medical, meteorological, and air quality observations from 16 provinces in Spain provided by 20 different national and regional entities in a diverse array of file types and formats, with the intention of evaluating the effect of several environmental variables on human health in future studies. Our work shows how a sizable amount of data, spread across a wide range of file formats and structures, and originating from a number of different sources belonging to various business domains, can be integrated in a single system that researchers can use for global data analysis and mining.
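
    A compact sketch of the Extract-Transform-Load step under stated assumptions (hypothetical column names, SQLite standing in for the article's data store): two provider extracts with different conventions are normalised into one schema and loaded into a single fact table.

```python
import sqlite3
import pandas as pd

# Extract: two provider files with different conventions (inlined here)
med = pd.DataFrame({"fecha": ["2015-01-02"], "provincia": ["Cantabria"], "admissions": [12]})
met = pd.DataFrame({"DATE": ["2015-01-02"], "PROV": ["Cantabria"], "TMAX": [9.4]})

# Transform: normalise names and types into one schema
med = med.rename(columns={"fecha": "date", "provincia": "province"})
met = met.rename(columns={"DATE": "date", "PROV": "province", "TMAX": "t_max_c"})
for df in (med, met):
    df["date"] = pd.to_datetime(df["date"])

# Load: one integrated fact table ready for OLAP-style queries
fact = med.merge(met, on=["date", "province"], how="inner")
with sqlite3.connect("health_env.db") as conn:
    fact.to_sql("observations", conn, if_exists="replace", index=False)
```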

  13. Integrating and analyzing medical and environmental data using ETL and Business Intelligence tools.

    Science.gov (United States)

    Villar, Alejandro; Zarrabeitia, María T; Fdez-Arroyabe, Pablo; Santurtún, Ana

    2018-03-07

    Processing data that originates from different sources (such as environmental and medical data) can prove to be a difficult task, due to the heterogeneity of variables, storage systems, and file formats that can be used. Moreover, once the amount of data reaches a certain threshold, conventional mining methods (based on spreadsheets or statistical software) become cumbersome or even impossible to apply. Data Extract, Transform, and Load (ETL) solutions provide a framework to normalize and integrate heterogeneous data into a local data store. Additionally, the application of Online Analytical Processing (OLAP), a set of Business Intelligence (BI) methodologies and practices for multidimensional data analysis, can be an invaluable tool for its examination and mining. In this article, we describe a solution based on an ETL + OLAP tandem used for the on-the-fly analysis of tens of millions of individual medical, meteorological, and air quality observations from 16 provinces in Spain provided by 20 different national and regional entities in a diverse array of file types and formats, with the intention of evaluating the effect of several environmental variables on human health in future studies. Our work shows how a sizable amount of data, spread across a wide range of file formats and structures, and originating from a number of different sources belonging to various business domains, can be integrated in a single system that researchers can use for global data analysis and mining.

  14. Integrating and analyzing medical and environmental data using ETL and Business Intelligence tools

    Science.gov (United States)

    Villar, Alejandro; Zarrabeitia, María T.; Fdez-Arroyabe, Pablo; Santurtún, Ana

    2018-03-01

    Processing data that originates from different sources (such as environmental and medical data) can prove to be a difficult task, due to the heterogeneity of variables, storage systems, and file formats that can be used. Moreover, once the amount of data reaches a certain threshold, conventional mining methods (based on spreadsheets or statistical software) become cumbersome or even impossible to apply. Data Extract, Transform, and Load (ETL) solutions provide a framework to normalize and integrate heterogeneous data into a local data store. Additionally, the application of Online Analytical Processing (OLAP), a set of Business Intelligence (BI) methodologies and practices for multidimensional data analysis, can be an invaluable tool for its examination and mining. In this article, we describe a solution based on an ETL + OLAP tandem used for the on-the-fly analysis of tens of millions of individual medical, meteorological, and air quality observations from 16 provinces in Spain provided by 20 different national and regional entities in a diverse array of file types and formats, with the intention of evaluating the effect of several environmental variables on human health in future studies. Our work shows how a sizable amount of data, spread across a wide range of file formats and structures, and originating from a number of different sources belonging to various business domains, can be integrated in a single system that researchers can use for global data analysis and mining.

  15. Data Integration Tool: Permafrost Data Debugging

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Pulsifer, P. L.; Strawhacker, C.; Yarmey, L.; Basak, R.

    2017-12-01

    We developed a Data Integration Tool (DIT) to significantly speed up the manual processing needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the Global Terrestrial Network-Permafrost (GTN-P). The United States National Science Foundation funded this project through the National Snow and Ice Data Center (NSIDC), with the GTN-P, to improve permafrost data access and discovery. We leverage this data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps, or operations, called widgets (https://github.com/PermaData/DIT). Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs, incrementally interact with and evolve the widget workflows, and save those workflows for reproducibility. Taking ideas from visual programming found in the art and design domain, debugging and iterative design principles from software engineering, and the scientific data processing and analysis power of Fortran and Python, it was written for interactive, iterative data manipulation, quality control, processing, and analysis of inconsistent data in an easily installable application. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets (270 sites), and is scheduled to translate 10 more datasets (~1,000 sites) from the legacy inactive site data holdings of the Frozen Ground Data Center (FGDC). Iterative development has provided the permafrost and wider scientific community with an extendable tool designed specifically for the iterative process of translating unruly data.
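
    DIT's widget idea can be sketched as follows (a hypothetical API, not the PermaData code): each widget performs one operation, and an ordered list of widgets is a saved, reproducible workflow.

```python
def to_celsius(rows, col):
    for r in rows:
        r[col] = (float(r[col]) - 32.0) * 5.0 / 9.0   # convert deg F to deg C
    return rows

def sort_by(rows, col):
    return sorted(rows, key=lambda r: r[col])

def run(rows, workflow):
    for widget, kwargs in workflow:                    # apply widgets in order
        rows = widget(rows, **kwargs)
    return rows

rows = [{"site_id": "B2", "ground_temp": "30.2"},
        {"site_id": "A1", "ground_temp": "28.4"}]
workflow = [(to_celsius, {"col": "ground_temp"}), (sort_by, {"col": "site_id"})]
print(run(rows, workflow))
```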

  16. Servicing HEP experiments with a complete set of ready-integrated and configured common software components

    International Nuclear Information System (INIS)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves; Kruzelecki, Karol

    2010-01-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments, such as ROOT, POOL and COOL, which are developed in house, as well as a set of 70 'external' software packages which are needed in addition, such as Python, Boost, Qt and CLHEP. These packages target many different areas of HEP computing, such as data persistency, math, simulation, grid computing, databases and graphics. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner for different compilers, architectures and operating systems. The Software Process and Infrastructure project (SPI) [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI is the needs of the LHC experiments, but other HEP experiments can also profit from the set of consistent libraries provided and receive a stable and well-tested foundation on which to build their experiment software frameworks. This presentation first provides a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages, and then focuses on a second set of tools provided for experiments outside the LHC to deploy a stable set of HEP-related software packages, either as a binary distribution or from source.

  17. Servicing HEP experiments with a complete set of ready-integrated and configured common software components

    Energy Technology Data Exchange (ETDEWEB)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves [CERN, CH-1211 Geneva 23, PH Department, SFT Group (Switzerland); Kruzelecki, Karol, E-mail: stefan.roiser@cern.c, E-mail: ana.gaspar@cern.c, E-mail: yves.perrin@cern.c, E-mail: karol.kruzelecki@cern.c [CERN, CH-1211 Geneva 23, PH Department, LBC Group (Switzerland)

    2010-04-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments, such as ROOT, POOL and COOL, which are developed in house, as well as a set of 70 'external' software packages which are needed in addition, such as Python, Boost, Qt and CLHEP. These packages target many different areas of HEP computing, such as data persistency, math, simulation, grid computing, databases and graphics. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner for different compilers, architectures and operating systems. The Software Process and Infrastructure project (SPI) [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI is the needs of the LHC experiments, but other HEP experiments can also profit from the set of consistent libraries provided and receive a stable and well-tested foundation on which to build their experiment software frameworks. This presentation first provides a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages, and then focuses on a second set of tools provided for experiments outside the LHC to deploy a stable set of HEP-related software packages, either as a binary distribution or from source.

  18. Effect of cutting fluids and cutting conditions on surface integrity and tool wear in turning of Inconel 713C

    Science.gov (United States)

    Hikiji, R.

    2018-01-01

    The trend toward downsizing of engines is increasing the number of turbochargers across Europe. In the turbocharger, the exhaust-gas temperature is so high that parts made of the nickel-base superalloy Inconel 713C are used for their high-temperature strength. External turning of Inconel 713C, as used in actual automotive parts, was carried out. The effect of the cutting fluids and cutting conditions on surface integrity and tool wear was investigated, with the global environment and cost performance in mind. As a result, within the range of cutting conditions used here, good surface integrity and tool life were obtained when the depth of cut was small. In the case of a large corner radius, however, tool wear increased as the cutting length increased, and at large cutting lengths both surface integrity and tool life deteriorated. As for the cutting fluids, the synthetic type performed better than the conventional emulsion in terms of surface integrity and tool life. It was also clear that a large corner radius improved surface roughness and tool life but affected the size error, among other things, when machining a workpiece held in a cantilever style.

  19. Electronic Dictionary as a Tool for Integration of Additional Learning Content

    Directory of Open Access Journals (Sweden)

    Stefka Kovacheva

    2015-12-01

    Full Text Available This article discusses an electronic dictionary as an element of the „Bulgarian cultural and historical heritage under the protection of UNESCO” database developed at IMI (BAS) that will be used to integrate additional learning content. The electronic dictionary is described as an easily accessible reference work, offering information on the form, meaning, usage and origin of words connected to the cultural and historical heritage sites in Bulgaria protected by UNESCO. The dictionary targets 9–11-year-old students in Bulgarian schools, who study the subjects “Man and Society” in 4th grade and “History and Civilization” in 5th grade.

  20. Integrating cultural community psychology: activity settings and the shared meanings of intersubjectivity.

    Science.gov (United States)

    O'Donnell, Clifford R; Tharp, Roland G

    2012-03-01

    Cultural and community psychology share a common emphasis on context, yet their leading journals rarely cite each other's articles. Greater integration of the concepts of culture and community within and across their disciplines would enrich and facilitate the viability of cultural community psychology. The contextual theory of activity settings is proposed as one means to integrate the concepts of culture and community in cultural community psychology. Through shared activities, participants develop common experiences that affect their psychological being, including their cognitions, emotions, and behavioral development. The psychological result of these experiences is intersubjectivity. Culture is defined as the shared meanings that people develop through their common historic, linguistic, social, economic, and political experiences. The shared meanings of culture arise through the intersubjectivity developed in activity settings. Cultural community psychology presents formidable epistemological challenges, but overcoming these challenges could contribute to the transformation and advancement of community psychology.

  1. An Exploration of the Effectiveness of an Audit Simulation Tool in a Classroom Setting

    Science.gov (United States)

    Zelin, Robert C., II

    2010-01-01

    The purpose of this study was to examine the effectiveness of using an audit simulation product in a classroom setting. Many students and professionals feel that a disconnect exists between learning auditing in the classroom and practicing auditing in the workplace. It was hoped that the introduction of an audit simulation tool would help to…

  2. Exploring Challenges and Opportunities of Coproduction: USDA Climate Hub Efforts to Integrate Coproduction with Applied Research and Decision Support Tool Development in the Northwest

    Science.gov (United States)

    Roesch-McNally, G.; Prendeville, H. R.

    2017-12-01

    A lack of coproduction, the joint production of new technologies or knowledge among technical experts and other groups, is arguably one of the reasons why much scientific information and the resulting decision support systems are not very usable. Increasingly, public agencies and academic institutions are emphasizing the importance of coproduction of scientific knowledge and decision support systems in order to facilitate greater engagement between the scientific community and key stakeholder groups. Coproduction has been embraced as a way for the scientific community to develop actionable scientific information that will assist end users in solving real-world problems. Increasing the level of engagement and stakeholder buy-in to the scientific process is increasingly necessary, particularly in the context of growing politicization of science and the scientific process. Coproduction can be an effective way to build trust and can build on and integrate local and traditional knowledge. Employing coproduction strategies may enable the development of more relevant and useful information and decision support tools that address stakeholder challenges at relevant scales. The USDA Northwest Climate Hub has increasingly sought ways to integrate coproduction into the development of both applied research projects and decision support systems. Integrating coproduction within existing institutions, however, is not always simple, given that coproduction is often more focused on process than products, and products are, for better or worse, often the primary focus of applied research and tool development projects. The USDA Northwest Climate Hub sought to integrate coproduction into our FY2017 call for proposals. As a result we have a set of proposals and fledgling projects that fall along the engagement continuum (see Figure 1). We will share the challenges and opportunities that emerged from this purposeful integration of coproduction into the work

  3. Developmental screening tools: feasibility of use at primary healthcare level in low- and middle-income settings.

    Science.gov (United States)

    Fischer, Vinicius Jobim; Morris, Jodi; Martines, José

    2014-06-01

    An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization, to allow action to reduce impairments through its mental health Gap Action Programme. The study assessed the feasibility of using developmental screening and monitoring tools for children aged 0–3 years by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducted to identify the tools and assess their psychometric properties and feasibility of use in low- and middle-income countries (LMICs). Key indicators for examining feasibility in LMICs were derived from a consultation with 23 international experts. We identified 426 studies, from which 14 tools used in LMICs were extracted for further examination. Three tools reported adequate psychometric properties, met most of the feasibility criteria, and appear promising for use in identifying and monitoring young children with disabilities at the primary healthcare level in LMICs. Further research and development are needed to optimize these tools.

  4. Application of a faith-based integration tool to assess mental and physical health interventions.

    Science.gov (United States)

    Saunders, Donna M; Leak, Jean; Carver, Monique E; Smith, Selina A

    2017-01-01

    To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed.

  5. Development of Application Programming Tool for Safety Grade PLC (POSAFE-Q)

    International Nuclear Information System (INIS)

    Koo, Kyungmo; You, Byungyong; Kim, Tae-Wook; Cho, Sengjae; Lee, Jin S.

    2006-01-01

    The pSET (POSAFE-Q Software Engineering Tool) is an application programming tool for the POSAFE-Q, a safety-grade programmable logic controller (PLC) developed for the reactor protection system of nuclear power plants. The pSET provides an integrated development environment (IDE) which includes editors, a compiler, a simulator, a downloader, a debugger, and a monitor. The pSET supports the IEC 61131-3 standard software model and languages such as LD (ladder diagram) and FBD (function block diagram), two of the most widely used PLC programming languages in industry. The pSET will also support the SFC (sequential function chart) language. The pSET was developed as part of the Korea Nuclear Instrumentation and Control System (KNICS) project

  6. An introduction to Space Weather Integrated Modeling

    Science.gov (United States)

    Zhong, D.; Feng, X.

    2012-12-01

    The need for a software toolkit that integrates space weather models and data is one of the many challenges we face when applying models to space weather forecasting. To meet this challenge, we have developed Space Weather Integrated Modeling (SWIM), which is capable of analysing and visualizing the results of a diverse set of space weather models. SWIM has a modular design and is written in Python, using NumPy, matplotlib, and the Visualization Toolkit (VTK). SWIM provides a data management module that reads a variety of spacecraft data products as well as the specific data format of the Solar-Interplanetary Conservation Element/Solution Element MHD model (SIP-CESE MHD model) for the study of solar-terrestrial phenomena. Data analysis, visualization and graphical user interface modules are also provided in a user-friendly way to run the integrated models and visualize 2-D and 3-D data sets interactively. With these tools we can analyse model results rapidly, locally or remotely: extracting data at specific locations in time-sequence data sets, plotting interplanetary magnetic field lines, multi-slicing solar wind speed, volume rendering solar wind density, animating time-sequence data sets, and comparing model results with observational data. To speed up the analysis, an in-situ visualization interface supports visualizing the data on the fly. We have also accelerated some critical, time-consuming analysis and visualization methods with GPUs and multi-core CPUs. We have used this tool to visualize SIP-CESE MHD model data in real time, and have integrated the Database Model of shock arrival, the Shock Propagation Model, the Dst forecasting model and the SIP-CESE MHD model developed by the SIGMA Weather Group at the State Key Laboratory of Space Weather/CAS.
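
    One such analysis step, extracting the time series at a fixed grid location from a time sequence of model output, can be sketched with the same libraries SWIM builds on (the data here are synthetic, not SIP-CESE MHD results).

```python
import numpy as np
import matplotlib.pyplot as plt

times = np.linspace(0.0, 48.0, 97)                          # hours
speed = 400.0 + 80.0 * np.sin(2.0 * np.pi * times / 24.0)   # km/s, synthetic
grid = np.broadcast_to(speed[:, None, None], (97, 16, 16))  # (time, y, x)

i, j = 8, 8                                                 # probe location
series = grid[:, i, j]                                      # time series at one point

plt.plot(times, series)
plt.xlabel("time (h)")
plt.ylabel("solar wind speed (km/s)")
plt.savefig("probe_series.png")
```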

  7. Integrating mHealth at point of care in low- and middle-income settings: the system perspective.

    Science.gov (United States)

    Wallis, Lee; Blessing, Paul; Dalwai, Mohammed; Shin, Sang Do

    2017-06-01

    While the field represents a wide spectrum of products and services, many aspects of mHealth have great promise within resource-poor settings: there is an extensive range of cheap, widely available tools which can be used at the point of care delivery. However, there are a number of conditions which need to be met if such solutions are to be adequately integrated into existing health systems; we consider these from regulatory, technological and user perspectives. We explore the need for an appropriate legislative and regulatory framework, to avoid 'work around' solutions, which threaten patient confidentiality (such as the extensive use of instant messaging services to deliver sensitive clinical information and seek diagnostic and management advice). In addition, we will look at other confidentiality issues such as the need for applications to remove identifiable information (such as photos) from users' devices. Integration is dependent upon multiple technological factors, and we illustrate these using examples such as products made available specifically for adoption in low- and middle-income countries. Issues such as usability of the application, signal loss, data volume utilization, need to enter passwords, and the availability of automated or in-app context-relevant clinical advice will be discussed. From a user perspective, there are three groups to consider: experts, front-line clinicians, and patients. Each will accept, to different degrees, the use of technology in care - often with cultural or regional variation - and this is central to integration and uptake. For clinicians, ease of integration into daily work flow is critical, as are familiarity and acceptability of other technology in the workplace. Front-line staff tend to work in areas with more challenges around cell phone signal coverage and data availability than 'back-end' experts, and the effect of this is discussed.

  8. Analysis of metabolomic data: tools, current strategies and future challenges for omics data integration.

    Science.gov (United States)

    Cambiaghi, Alice; Ferrario, Manuela; Masseroli, Marco

    2017-05-01

    Metabolomics is a rapidly growing field consisting of the analysis of a large number of metabolites at a system scale. The two major goals of metabolomics are the identification of the metabolites characterizing each organism state and the measurement of their dynamics under different situations (e.g. pathological conditions, environmental factors). Knowledge about metabolites is crucial for the understanding of most cellular phenomena, but this information alone is not sufficient to gain a comprehensive view of all the biological processes involved. Integrated approaches combining metabolomics with transcriptomics and proteomics are thus required to obtain much deeper insights than any of these techniques alone. Although this information is available, multilevel integration of different 'omics' data is still a challenge. The handling, processing, analysis and integration of these data require specialized mathematical, statistical and bioinformatics tools, and several technical problems hampering rapid progress in the field exist. Here, out of the several tools available for metabolomic data analysis and integration with other 'omics' data, we review four main ones, selected by number of users or provided features (MetaCore™, MetaboAnalyst, InCroMAP and 3Omics), highlighting their strong and weak aspects; a number of related issues affecting data analysis and integration are also identified and discussed. Overall, we provide an objective description of how some of the main currently available software packages work, which may help the experimental practitioner in the choice of a robust pipeline for metabolomic data analysis and integration. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Lessons learned developing a diagnostic tool for HIV-associated dementia feasible to implement in resource-limited settings: pilot testing in Kenya.

    Directory of Open Access Journals (Sweden)

    Judith Kwasa

    Full Text Available To conduct a preliminary evaluation of the utility and reliability of a diagnostic tool for HIV-associated dementia (HAD) for use by primary health care workers (HCW), which would be feasible to implement in resource-limited settings. In resource-limited settings, HAD is an indication for anti-retroviral therapy regardless of CD4 T-cell count. Anti-retroviral therapy, the treatment for HAD, is now increasingly available in resource-limited settings. Nonetheless, HAD remains under-diagnosed, likely because of limited clinical expertise and availability of diagnostic tests. Thus, a simple diagnostic tool which is practical to implement in resource-limited settings is an urgent need. A convenience sample of 30 HIV-infected outpatients was enrolled in Western Kenya. We assessed the sensitivity and specificity of a diagnostic tool for HAD as administered by a primary HCW. This was compared to an expert clinical assessment which included examination by a physician, neuropsychological testing, and in selected cases, brain imaging. Agreement between HCW and an expert examiner on certain tool components was measured using the Kappa statistic. The sample was 57% male, mean age was 38.6 years, mean CD4 T-cell count was 323 cells/µL, and 54% had less than a secondary school education. Six (20%) of the subjects were diagnosed with HAD by expert clinical assessment. The diagnostic tool was 63% sensitive and 67% specific for HAD. Agreement between HCW and expert examiners was poor for many individual items of the diagnostic tool (K = .03-.65). This diagnostic tool had moderate sensitivity and specificity for HAD. However, reliability was poor, suggesting that substantial training and formal evaluations of training adequacy will be critical to enable HCW to reliably administer a brief diagnostic tool for HAD.

  10. INSIGHT: an integrated scoping analysis tool for in-core fuel management of PWR

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Noda, Hidefumi; Ito, Nobuaki; Maruyama, Taiji.

    1997-01-01

    An integrated software tool for scoping analysis of in-core fuel management, INSIGHT, has been developed to automate the scoping analysis and to improve the fuel cycle cost using advanced optimization techniques. INSIGHT is an interactive software tool executed on UNIX-based workstations equipped with the X Window System. INSIGHT incorporates the GALLOP loading pattern (LP) optimization module, which utilizes hybrid genetic algorithms, the PATMAKER interactive LP design module, the MCA multicycle analysis module, an integrated database, and other utilities. Two benchmark problems were analyzed to confirm the key capabilities of INSIGHT: LP optimization and multicycle analysis. The first was a single-cycle LP optimization problem that included various constraints. The second was a multicycle LP optimization problem that included the assembly burnup limitation at rod cluster control (RCC) positions. The results for these problems showed the feasibility of INSIGHT for practical scoping analysis, which consists mostly of LP generation and multicycle analysis. (author)
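
    A toy sketch of the genetic-algorithm idea behind a loading-pattern optimizer such as GALLOP, with a deliberately crude power-peaking surrogate in place of core physics: patterns are permutations of assembly reactivities over core positions, and swap mutation preserves the permutation property.

```python
import random

reactivity = [1.30, 1.20, 1.10, 1.00, 0.90, 0.80, 0.70, 0.60]   # 8 assemblies
weight = [1.4, 1.3, 1.2, 1.1, 1.0, 0.9, 0.8, 0.7]               # centre to edge

def peaking(pattern):
    powers = [r * w for r, w in zip(pattern, weight)]
    return max(powers) / (sum(powers) / len(powers))             # peak-to-average

def mutate(pattern):
    child = pattern[:]
    i, j = random.sample(range(len(child)), 2)
    child[i], child[j] = child[j], child[i]                      # swap keeps a permutation
    return child

pop = [random.sample(reactivity, len(reactivity)) for _ in range(30)]
for _ in range(200):
    pop.sort(key=peaking)                                        # elitist selection
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
print("best peaking factor:", round(peaking(min(pop, key=peaking)), 3))
```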

  11. A Conceptual Framework for Integration of Evidence-Based Design with Lighting Simulation Tools

    Directory of Open Access Journals (Sweden)

    Anahita Davoodi

    2017-09-01

    Full Text Available The use of lighting simulation tools has been growing over the past years, which has improved lighting analysis. While computer simulations have proven to be a viable tool for analyzing lighting in physical environments, they have difficulty in assessing the effects of light on occupants' perception. Evidence-based design (EBD) is a design method that is gaining traction in building design due to its strength in providing means to assess the effects of built environments on humans. The aim of this study was to develop a conceptual framework for integrating EBD with lighting simulation tools. Based on a literature review, it was investigated how EBD and lighting simulation can be combined to provide a holistic lighting performance evaluation method. The results show that they can mutually benefit from each other. EBD makes it possible to evaluate and/or improve performance metrics by utilizing user feedback. On the other hand, performance metrics can be used for a better description of evidence, and to analyze the effects of lighting in more detail. The results also show that EBD can be used to evaluate light simulations to better understand when and how they should be performed. A framework is presented for the integration of lighting simulation and EBD.

  12. Reducing Information Overload in Large Seismic Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Hampton, Jeffery W.; Young, Christopher J.; Merchant, Bion J.; Carr, Dorthe B.; Aguilar-Chang, Julio

    2000-08-02

    into how to winnow the reference events in these large reconciled event sets, additional database query approaches have been developed to provide windows into these datasets. These custom-built content analysis tools help identify dataset characteristics that can potentially provide a basis for comparing similar reference events in these large reconciled event sets. Once these characteristics are identified, algorithms can be developed to create and add to the reduced set of events used by the Event Search Engine. These content analysis tools have already been useful in providing information on station coverage of the referenced events and basic statistical information on events in the research datasets. The tools can also provide researchers with a quick way to find interesting and useful events within the research datasets. The tools could also be used as a means to review reference event datasets as part of a dataset delivery verification process. There has also been an effort to explore the usefulness of commercially available web-based software to help with this problem. The advantages of using off-the-shelf software applications, such as Oracle's WebDB, to manipulate, customize and manage research data are being investigated. These types of applications are being examined to provide access to large integrated data sets for regional seismic research in Asia. All of these software tools would provide the researcher with unprecedented power without having to learn the intricacies and complexities of relational database systems.

  13. Barriers to the Integration of Computers in Early Childhood Settings: Teachers' Perceptions

    Science.gov (United States)

    Nikolopoulou, Kleopatra; Gialamas, Vasilis

    2015-01-01

    This study investigated teachers' perceptions of barriers to using - integrating computers in early childhood settings. A 26-item questionnaire was administered to 134 early childhood teachers in Greece. Lack of funding, lack of technical and administrative support, as well as inadequate training opportunities were among the major perceived…

  14. Integration at the round table: marine spatial planning in multi-stakeholder settings.

    Science.gov (United States)

    Olsen, Erik; Fluharty, David; Hoel, Alf Håkon; Hostens, Kristian; Maes, Frank; Pecceu, Ellen

    2014-01-01

    Marine spatial planning (MSP) is often considered as a pragmatic approach to implement an ecosystem based management in order to manage marine space in a sustainable way. This requires the involvement of multiple actors and stakeholders at various governmental and societal levels. Several factors affect how well the integrated management of marine waters will be achieved, such as different governance settings (division of power between central and local governments), economic activities (and related priorities), external drivers, spatial scales, incentives and objectives, varying approaches to legislation and political will. We compared MSP in Belgium, Norway and the US to illustrate how the integration of stakeholders and governmental levels differs among these countries along the factors mentioned above. Horizontal integration (between sectors) is successful in all three countries, achieved through the use of neutral 'round-table' meeting places for all actors. Vertical integration between government levels varies, with Belgium and Norway having achieved full integration while the US lacks integration of the legislature due to sharp disagreements among stakeholders and unsuccessful partisan leadership. Success factors include political will and leadership, process transparency and stakeholder participation, and should be considered in all MSP development processes.

  15. Integration at the round table: marine spatial planning in multi-stakeholder settings.

    Directory of Open Access Journals (Sweden)

    Erik Olsen

    Full Text Available Marine spatial planning (MSP) is often considered as a pragmatic approach to implement an ecosystem based management in order to manage marine space in a sustainable way. This requires the involvement of multiple actors and stakeholders at various governmental and societal levels. Several factors affect how well the integrated management of marine waters will be achieved, such as different governance settings (division of power between central and local governments), economic activities (and related priorities), external drivers, spatial scales, incentives and objectives, varying approaches to legislation and political will. We compared MSP in Belgium, Norway and the US to illustrate how the integration of stakeholders and governmental levels differs among these countries along the factors mentioned above. Horizontal integration (between sectors) is successful in all three countries, achieved through the use of neutral 'round-table' meeting places for all actors. Vertical integration between government levels varies, with Belgium and Norway having achieved full integration while the US lacks integration of the legislature due to sharp disagreements among stakeholders and unsuccessful partisan leadership. Success factors include political will and leadership, process transparency and stakeholder participation, and should be considered in all MSP development processes.

  16. Integration at the Round Table: Marine Spatial Planning in Multi-Stakeholder Settings

    Science.gov (United States)

    Olsen, Erik; Fluharty, David; Hoel, Alf Håkon; Hostens, Kristian; Maes, Frank; Pecceu, Ellen

    2014-01-01

    Marine spatial planning (MSP) is often considered as a pragmatic approach to implement an ecosystem based management in order to manage marine space in a sustainable way. This requires the involvement of multiple actors and stakeholders at various governmental and societal levels. Several factors affect how well the integrated management of marine waters will be achieved, such as different governance settings (division of power between central and local governments), economic activities (and related priorities), external drivers, spatial scales, incentives and objectives, varying approaches to legislation and political will. We compared MSP in Belgium, Norway and the US to illustrate how the integration of stakeholders and governmental levels differs among these countries along the factors mentioned above. Horizontal integration (between sectors) is successful in all three countries, achieved through the use of neutral ‘round-table’ meeting places for all actors. Vertical integration between government levels varies, with Belgium and Norway having achieved full integration while the US lacks integration of the legislature due to sharp disagreements among stakeholders and unsuccessful partisan leadership. Success factors include political will and leadership, process transparency and stakeholder participation, and should be considered in all MSP development processes. PMID:25299595

  17. OpenMeetings as a browser-based teleconferencing tool for EFDA laboratories

    Energy Technology Data Exchange (ETDEWEB)

    Santos, B., E-mail: bsantos@ipfn.ist.utl.pt [Associacao EURATOM/IST, Instituto de Plasmas e Fusao Nuclear - Laboratorio Associado, Instituto Superior Tecnico, P-1049-001 Lisboa (Portugal); Castro, R. [Asociacion EURATOM/CIEMAT Para Fusion, 28040 Madrid (Spain); Santos, J.H.; Gomes, D.; Fernandes, H.; Sousa, J.; Varandas, C.A.F. [Associacao EURATOM/IST, Instituto de Plasmas e Fusao Nuclear - Laboratorio Associado, Instituto Superior Tecnico, P-1049-001 Lisboa (Portugal)

    2011-10-15

    Remote participation is a key success factor for worldwide organizations, and video-conferencing is an important aspect when laboratory collaborators are geographically dispersed. Several tools for video-conferencing are available worldwide. However, there is a need for a default integrated set of tools which provides all the features needed for remote participation between cooperating laboratories. Moreover, laboratories currently use different video-conferencing tools to communicate, leading to incompatibility among them. OpenMeetings is open-source software which provides video-conferencing through web browsing. It is operating-system independent and requires no computer installation, providing easy integration with existing tools and customization to requirements. This paper gives an overview of OpenMeetings as a video-conferencing application between worldwide laboratories and demonstrates its integration with the European Fusion Development Agreement (EFDA) Federation authentication and authorization features. The EFDA Federation is a multi-organization security infrastructure whose purpose is to provide sharing and provision of a wide set of resources so that authenticated users may access them in a transparent and secure mode. This infrastructure is based on a distributed security technology called PAPI (Point of Access to Providers of Information). OpenMeetings can be a valuable EFDA Federation resource and a good approach as a method to provide a worldwide laboratory video-conferencing application, based on standard browsers.

  18. OpenMeetings as a browser-based teleconferencing tool for EFDA laboratories

    International Nuclear Information System (INIS)

    Santos, B.; Castro, R.; Santos, J.H.; Gomes, D.; Fernandes, H.; Sousa, J.; Varandas, C.A.F.

    2011-01-01

Remote participation is a key success factor for worldwide organizations, and video-conferencing is an important aspect when laboratory collaborators are geographically dispersed. Several tools for video-conferencing are available worldwide. However, there is a need for a default integrated set of tools providing all the features needed for remote participation between cooperating laboratories. Moreover, laboratories currently use different video-conference tools to communicate, leading to incompatibilities among them. OpenMeetings is open-source software which provides video-conferencing through web browsing. It is operating-system independent and requires no local installation, allowing easy integration with existing tools and customization to requirements. This paper gives an overview of OpenMeetings as a video-conferencing application between worldwide laboratories and demonstrates its integration with the European Fusion Development Agreement (EFDA) Federation authentication and authorization features. The EFDA Federation is a multi-organization security infrastructure whose purpose is to provide sharing and provision of a wide set of resources so that authenticated users may access them in a transparent and secure mode. This infrastructure is based on a distributed security technology called PAPI (Point of Access to Providers of Information). OpenMeetings can be a valuable EFDA Federation resource and a good approach to providing a worldwide laboratory video-conferencing application based on standard browsers.

  19. Integrating Wikis as Educational Tools for the Development of a Community of Inquiry

    Science.gov (United States)

    Eteokleous, Nikleia; Ktoridou, Despo; Orphanou, Maria

    2014-01-01

This article describes a study that evaluated the integration of wikis as an educational tool for achieving the learning objectives of a fifth-grade linguistics and literature course. A mixed-method approach was employed: data were collected via questionnaires, reflective journals, observations, and interviews. The results…

  20. An efficient tool for metabolic pathway construction and gene integration for Aspergillus niger.

    Science.gov (United States)

    Sarkari, Parveen; Marx, Hans; Blumhoff, Marzena L; Mattanovich, Diethard; Sauer, Michael; Steiger, Matthias G

    2017-12-01

Metabolic engineering requires functional genetic tools for easy and quick generation of multiple pathway variants. A genetic engineering toolbox for A. niger is presented, which facilitates the generation of strains carrying heterologous expression cassettes at a defined genetic locus. The system is compatible with Golden Gate cloning, which simplifies the DNA construction process and provides high design flexibility. The integration process is mediated by a CRISPR/Cas9 strategy involving the cutting of both the genetic integration locus (pyrG) and the integrating plasmid. Only transient expression of Cas9 is necessary, and the carrying plasmid is readily lost by using a size-reduced AMA1 variant. An integration efficiency into the fungal genome of up to 100% can be achieved, thus reducing the screening effort significantly. The feasibility of the approach was demonstrated by the integration of an expression cassette enabling the production of aconitic acid in A. niger. Copyright © 2017 Elsevier Ltd. All rights reserved.

1. Ssecrett and NeuroTrace: Interactive visualization and analysis tools for large-scale neuroscience data sets

    KAUST Repository

Jeong, Wonki; Beyer, Johanna; Hadwiger, Markus; Blue, Rusty; Law, Charles; Vázquez Reina, Amelio; Reid, Rollie Clay; Lichtman, Jeff W M D; Pfister, Hanspeter

    2010-01-01

    Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range between tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at reasonably interactive rates. To provide neuroscientists flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system. © 2010 IEEE.
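At these data sizes, interactivity hinges on loading only the data currently on screen. The following is a minimal Python sketch of that out-of-core access pattern, not code from Ssecrett or NeuroTrace; the file name, shape, and voxel type are illustrative assumptions.

```python
import numpy as np

# Hypothetical raw EM volume on disk (z, y, x order, one byte per voxel);
# at ~4 TB it cannot be loaded into RAM, so it is memory-mapped instead.
SHAPE = (4096, 32768, 32768)
volume = np.memmap("em_volume.raw", dtype=np.uint8, mode="r", shape=SHAPE)

def fetch_tile(z, y0, x0, size=512):
    """Read only the on-screen tile; the OS pages in just these bytes."""
    return np.asarray(volume[z, y0:y0 + size, x0:x0 + size])

tile = fetch_tile(z=100, y0=8192, x0=8192)
print(tile.shape, tile.mean())
```

Viewers like the ones described add caching and multi-resolution pyramids on top of this basic idea so that panning and zooming stay interactive.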

2. Ssecrett and NeuroTrace: Interactive visualization and analysis tools for large-scale neuroscience data sets

    KAUST Repository

    Jeong, Wonki

    2010-05-01

    Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range between tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at reasonably interactive rates. To provide neuroscientists flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system. © 2010 IEEE.

  3. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.
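To illustrate the idea of a one-stop search over many per-category viewers, here is a small, self-contained Python sketch of a two-tier (category, then record) search; the table names and records are invented for illustration and do not reflect the actual IRMIS schema.

```python
# Toy stand-in for per-category tables (IOC, PLC, process variables, ...).
TABLES = {
    "IOC": [{"name": "iocbi1", "desc": "beamline input/output controller"}],
    "PLC": [{"name": "plc-ps2", "desc": "power supply interlock"}],
    "Process Variables": [{"name": "S35:currentCC", "desc": "stored beam current"}],
}

def global_search(term):
    """Return a two-tier result: category -> list of matching records."""
    term = term.lower()
    hits = {}
    for category, records in TABLES.items():
        matches = [r for r in records
                   if term in r["name"].lower() or term in r["desc"].lower()]
        if matches:
            hits[category] = matches
    return hits

print(global_search("current"))   # {'Process Variables': [...]}
```

In the real tool the matching runs server-side against the relational database, with AJAX requests keeping the page responsive while results arrive.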

4. PEAC – A set of tools to quickly enable PROOF on a cluster

    International Nuclear Information System (INIS)

    Ganis, G; Vala, M

    2012-01-01

With the advent of the analysis phase of LHC data processing, interest in PROOF technology has considerably increased. While setting up a simple PROOF cluster for basic usage is reasonably straightforward, exploiting the several new functionalities added in recent times may be complicated. PEAC, standing for PROOF-Enabled Analysis Cluster, is a set of tools aiming to facilitate the setup and management of a PROOF cluster. PEAC is based on the experience gained by setting up PROOF for the ALICE analysis facilities. It allows one to easily build and configure ROOT and the additional software needed on the cluster, and may serve as a distributor of binaries via XRootD. PEAC uses PROOF on Demand (PoD) for resource management (start and stop of daemons). Finally, PEAC sets up and configures dataset management (using the afdsmgrd daemon), as well as cluster monitoring (machine status and PROOF query summaries) using MonALISA. In this respect, a MonALISA page has been dedicated to PEAC users, so that a cluster managed by PEAC can be automatically monitored. In this paper we present and describe the status and main components of PEAC and show details about its usage.

  5. Stakeholder views of management and decision support tools to integrate climate change into Great Lakes Lake Whitefish management

    Science.gov (United States)

    Lynch, Abigail J.; Taylor, William W.; McCright, Aaron M.

    2016-01-01

    Decision support tools can aid decision making by systematically incorporating information, accounting for uncertainties, and facilitating evaluation between alternatives. Without user buy-in, however, decision support tools can fail to influence decision-making processes. We surveyed fishery researchers, managers, and fishers affiliated with the Lake Whitefish Coregonus clupeaformis fishery in the 1836 Treaty Waters of Lakes Huron, Michigan, and Superior to assess opinions of current and future management needs to identify barriers to, and opportunities for, developing a decision support tool based on Lake Whitefish recruitment projections with climate change. Approximately 64% of 39 respondents were satisfied with current management, and nearly 85% agreed that science was well integrated into management programs. Though decision support tools can facilitate science integration into management, respondents suggest that they face significant implementation barriers, including lack of political will to change management and perceived uncertainty in decision support outputs. Recommendations from this survey can inform development of decision support tools for fishery management in the Great Lakes and other regions.

  6. Extending the Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2016-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power of the model for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be…

  7. The Berry Amendment: A Comprehensive Look at the History and Implications for Program Managers of Hand- and Measuring-Tool-Intensive Programs

    Science.gov (United States)

    2014-09-01

trailer-mounted maintenance systems. Soldier-portable maintenance kits are tool assemblages integrated into storage cases that can be readily... SKOT maintenance system procurement strategies rely on acquiring commercial off-the-shelf (COTS) tool loads for sets and kits and integrating those tool... footwear,” 2014). Acquisition professionals must understand the Berry Amendment because it is here to stay. Over the years there have been attempts to

  8. Parallel analysis tools and new visualization techniques for ultra-large climate data set

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, Don [National Center for Atmospheric Research, Boulder, CO (United States); Haley, Mary [National Center for Atmospheric Research, Boulder, CO (United States)

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  9. A database and tool, IM Browser, for exploring and integrating emerging gene and protein interaction data for Drosophila

    Directory of Open Access Journals (Sweden)

    Parrish Jodi R

    2006-04-01

Background: Biological processes are mediated by networks of interacting genes and proteins. Efforts to map and understand these networks are resulting in the proliferation of interaction data derived from both experimental and computational techniques for a number of organisms. The volume of this data, combined with the variety of specific forms it can take, has created a need for comprehensive databases that include all of the available data sets, and for exploration tools to facilitate data integration and analysis. One powerful paradigm for the navigation and analysis of interaction data is an interaction graph or map that represents proteins or genes as nodes linked by interactions. Several programs have been developed for graphical representation and analysis of interaction data, yet there remains a need for alternative programs that can provide casual users with rapid, easy access to many existing and emerging data sets. Description: Here we describe a comprehensive database of Drosophila gene and protein interactions collected from a variety of sources, including low- and high-throughput screens, genetic interactions, and computational predictions. We also present a program for exploring multiple interaction data sets and for combining data from different sources. The program, referred to as the Interaction Map (IM) Browser, is a web-based application for searching and visualizing interaction data stored in a relational database system. Use of the application requires no downloads and minimal user configuration or training, thereby enabling rapid initial access to interaction data. IM Browser was designed to readily accommodate and integrate new types of interaction data as they become available. Moreover, all information associated with interaction measurements or predictions and the genes or proteins involved are accessible to the user. This allows combined searches and analyses based on either common or technique-specific attributes
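The node-and-edge paradigm described above maps directly onto a graph library. Below is a hedged Python sketch (using networkx, not the IM Browser code) of storing interactions with per-edge source/technique attributes and filtering on them; the gene names and annotations are illustrative.

```python
import networkx as nx

# Toy interaction map: nodes are genes/proteins, edges carry the source and
# technique of each interaction (attribute values here are invented).
G = nx.Graph()
G.add_edge("dpp", "Mad", source="screen", technique="yeast two-hybrid")
G.add_edge("Mad", "Medea", source="literature", technique="co-IP")
G.add_edge("dpp", "tkv", source="prediction", technique="interolog mapping")

# Combine or filter data sets via technique-specific attributes.
experimental = [(u, v) for u, v, d in G.edges(data=True)
                if d["technique"] != "interolog mapping"]
print("experimental edges:", experimental)
print("neighbors of Mad:", list(G.neighbors("Mad")))
```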

  10. Web-Enabled Mechanistic Case Diagramming: A Novel Tool for Assessing Students' Ability to Integrate Foundational and Clinical Sciences.

    Science.gov (United States)

    Ferguson, Kristi J; Kreiter, Clarence D; Haugen, Thomas H; Dee, Fred R

    2018-02-20

    As medical schools move from discipline-based courses to more integrated approaches, identifying assessment tools that parallel this change is an important goal. The authors describe the use of test item statistics to assess the reliability and validity of web-enabled mechanistic case diagrams (MCDs) as a potential tool to assess students' ability to integrate basic science and clinical information. Students review a narrative clinical case and construct an MCD using items provided by the case author. Students identify the relationships among underlying risk factors, etiology, pathogenesis and pathophysiology, and the patients' signs and symptoms. They receive one point for each correctly-identified link. In 2014-15 and 2015-16, case diagrams were implemented in consecutive classes of 150 medical students. The alpha reliability coefficient for the overall score, constructed using each student's mean proportion correct across all cases, was 0.82. Discrimination indices for each of the case scores with the overall score ranged from 0.23 to 0.51. In a G study using those students with complete data (n = 251) on all 16 cases, 10% of the variance was true score variance, and systematic case variance was large. Using 16 cases generated a G coefficient (relative score reliability) equal to .72 and a Phi equal to .65. The next phase of the project will involve deploying MCDs in higher-stakes settings to determine whether similar results can be achieved. Further analyses will determine whether these assessments correlate with other measures of higher-order thinking skills.
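For orientation, the overall-score reliability quoted above is a Cronbach's alpha over per-case scores. Here is a minimal sketch of that computation, assuming a students-by-cases matrix of proportion-correct scores (the data below are synthetic, not the study's).

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: students x cases matrix of proportion-correct values."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                      # number of cases (items)
    item_vars = scores.var(axis=0, ddof=1)   # variance of each case score
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
demo = rng.uniform(0.3, 0.9, size=(150, 16))   # 150 students, 16 cases
print(round(cronbach_alpha(demo), 2))
```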

  11. Potentials for the use of tool-integrated in-line data acquisition systems in press shops

    Science.gov (United States)

    Maier, S.; Schmerbeck, T.; Liebig, A.; Kautz, T.; Volk, W.

    2017-09-01

Robust in-line data acquisition systems are required for the realization of process monitoring and control systems in press shops. A promising approach is the integration of sensors in the following press tools, where they can be easily integrated and maintained. This also achieves the necessary robustness for the harsh press environment. Such concepts have already been investigated for the measurement of geometrical accuracy as well as for the material flow of inner part areas. They enable the monitoring of each produced part’s quality. An important success factor is a practical approach to the use of this new process information in press shops. This work presents various applications of these measuring concepts, based on real car-body components of the BMW Group. For example, the procedure of retroactive error analysis is explained for a side frame. It also shows how this data acquisition can be used for the optimization of drawing tools in tool shops. With the skid line, there is a continuous value that can be monitored from planning to serial production.

  12. Integration of ROOT Notebooks as an ATLAS analysis web-based tool in outreach and public data release

    CERN Document Server

    Sanchez, Arturo; The ATLAS collaboration

    2016-01-01

The integration of the ROOT data analysis framework with the Jupyter Notebook technology presents incredible potential for the enhancement and expansion of educational and training programs: from university students in their early years, through new ATLAS PhD students and postdoctoral researchers, to senior analysers and professors who want to renew their contact with data analysis or to include a friendly yet very powerful open-source tool in the classroom. Such tools have already been tested in several environments, and a fully web-based integration together with Open Access Data repositories brings the possibility to go a step forward in the ATLAS quest for integration between several CERN projects in the field of education and training, developing new computing solutions along the way.
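As a flavor of what such notebook-based material builds on, here is a minimal PyROOT cell; it requires a local ROOT installation with Python bindings, and the histogram contents are random demo data rather than ATLAS open data.

```python
import ROOT

# Book a histogram and fill it with a Gaussian toy sample.
h = ROOT.TH1F("h_mass", "Toy invariant mass;m [GeV];events", 100, 0.0, 10.0)
h.FillRandom("gaus", 5000)

c = ROOT.TCanvas("c", "demo", 800, 600)
h.Draw()
c.SaveAs("toy_mass.png")   # in a Jupyter notebook, c.Draw() renders inline
```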

  13. Application of the NCSA Habanero tool for collaboration on structural integrity assessments

    International Nuclear Information System (INIS)

    Bass, B.R.; Kruse, K.; Dodds, R.H. Jr.; Malik, S.N.M.

    1998-11-01

The Habanero software was developed by the National Center for Supercomputing Applications at the University of Illinois, Urbana-Champaign, as a framework for the collaborative sharing of Java applications. The Habanero tool extends single-user software interactions to a multiuser collaborative environment through distributed communication. An investigation was conducted to evaluate the capabilities of the Habanero tool in providing an Internet-based collaborative framework for researchers located at different sites and operating on different workstations. These collaborative sessions focused on the sharing of test data and analysis results from materials engineering areas (i.e., fracture mechanics and structural integrity evaluations) related to reactor pressure vessel safety research sponsored by the US Nuclear Regulatory Commission. This report defines collaborative-system requirements for engineering applications and provides an overview of collaborative systems within the project. The installation, application, and detailed evaluation of the performance of the Habanero collaborative tool are compared to those of another commercially available collaborative product. Recommendations are given for future work in collaborative communications

  14. Integrative Functional Genomics for Systems Genetics in GeneWeaver.org.

    Science.gov (United States)

    Bubier, Jason A; Langston, Michael A; Baker, Erich J; Chesler, Elissa J

    2017-01-01

    The abundance of existing functional genomics studies permits an integrative approach to interpreting and resolving the results of diverse systems genetics studies. However, a major challenge lies in assembling and harmonizing heterogeneous data sets across species for facile comparison to the positional candidate genes and coexpression networks that come from systems genetic studies. GeneWeaver is an online database and suite of tools at www.geneweaver.org that allows for fast aggregation and analysis of gene set-centric data. GeneWeaver contains curated experimental data together with resource-level data such as GO annotations, MP annotations, and KEGG pathways, along with persistent stores of user entered data sets. These can be entered directly into GeneWeaver or transferred from widely used resources such as GeneNetwork.org. Data are analyzed using statistical tools and advanced graph algorithms to discover new relations, prioritize candidate genes, and generate function hypotheses. Here we use GeneWeaver to find genes common to multiple gene sets, prioritize candidate genes from a quantitative trait locus, and characterize a set of differentially expressed genes. Coupling a large multispecies repository curated and empirical functional genomics data to fast computational tools allows for the rapid integrative analysis of heterogeneous data for interpreting and extrapolating systems genetics results.
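The gene set-centric operations described above reduce, at their core, to set algebra over curated and user-supplied sets. A minimal Python sketch follows (the gene symbols are illustrative, not GeneWeaver output):

```python
from collections import Counter

gene_sets = {
    "QTL_candidates": {"Gabra2", "Chrna7", "Drd2", "Bdnf"},
    "diff_expressed": {"Drd2", "Bdnf", "Fos", "Arc"},
    "GO_annotation":  {"Drd2", "Chrna7", "Fos"},
}

# Genes common to every set.
print("in every set:", set.intersection(*gene_sets.values()))

# Rank candidates by how many independent sets support them.
support = Counter(g for genes in gene_sets.values() for g in genes)
for gene, n in support.most_common(3):
    print(f"{gene}: appears in {n} sets")
```

GeneWeaver layers graph algorithms and statistics on top of such operations, but this intersect-and-rank pattern is the heart of candidate prioritization.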

  15. Application of eco-friendly tools and eco-bio-social strategies to control dengue vectors in urban and peri-urban settings in Thailand.

    Science.gov (United States)

    Kittayapong, Pattamaporn; Thongyuan, Suporn; Olanratmanee, Phanthip; Aumchareoun, Worawit; Koyadun, Surachart; Kittayapong, Rungrith; Butraporn, Piyarat

    2012-12-01

Dengue is considered one of the most important vector-borne diseases in Thailand, and its incidence is increasing despite routine implementation of national dengue control programmes. This study, conducted during 2010, aimed to demonstrate an application of integrated, community-based, eco-bio-social strategies in combination with locally produced eco-friendly vector control tools in the dengue control programme, emphasizing urban and peri-urban settings in eastern Thailand. Three different community settings were selected and randomly assigned to intervention and control clusters. Key community leaders and relevant governmental authorities were approached to participate in this intervention programme. Ecohealth volunteers were identified and trained in each study community. They were selected among active community health volunteers and were trained by public health experts to conduct vector control activities in their own communities, using environmental management in combination with eco-friendly vector control tools. These trained ecohealth volunteers carried out outreach health education and vector control during household visits. Management of public spaces and public properties, especially solid waste management, was efficiently carried out by local municipalities. A significant reduction in the pupae-per-person index in the intervention clusters compared to the control ones was used as a proxy to determine the impact of this programme. Our community-based dengue vector control programme demonstrated a significant reduction in the pupae-per-person index during entomological surveys, which were conducted at two-month intervals from May 2010 for a total of six months in the intervention and control clusters. The programme also raised awareness in applying eco-friendly vector control approaches and increased intersectoral and household participation in dengue control activities. An eco-friendly dengue vector control programme was successfully implemented in

  16. Application of eco-friendly tools and eco-bio-social strategies to control dengue vectors in urban and peri-urban settings in Thailand

    Science.gov (United States)

    Kittayapong, Pattamaporn; Thongyuan, Suporn; Olanratmanee, Phanthip; Aumchareoun, Worawit; Koyadun, Surachart; Kittayapong, Rungrith; Butraporn, Piyarat

    2012-01-01

Background: Dengue is considered one of the most important vector-borne diseases in Thailand. Its incidence is increasing despite routine implementation of national dengue control programmes. This study, conducted during 2010, aimed to demonstrate an application of integrated, community-based, eco-bio-social strategies in combination with locally produced eco-friendly vector control tools in the dengue control programme, emphasizing urban and peri-urban settings in eastern Thailand. Methodology: Three different community settings were selected and randomly assigned to intervention and control clusters. Key community leaders and relevant governmental authorities were approached to participate in this intervention programme. Ecohealth volunteers were identified and trained in each study community. They were selected among active community health volunteers and were trained by public health experts to conduct vector control activities in their own communities, using environmental management in combination with eco-friendly vector control tools. These trained ecohealth volunteers carried out outreach health education and vector control during household visits. Management of public spaces and public properties, especially solid waste management, was efficiently carried out by local municipalities. A significant reduction in the pupae-per-person index in the intervention clusters compared to the control ones was used as a proxy to determine the impact of this programme. Results: Our community-based dengue vector control programme demonstrated a significant reduction in the pupae-per-person index during entomological surveys, which were conducted at two-month intervals from May 2010 for a total of six months in the intervention and control clusters. The programme also raised awareness in applying eco-friendly vector control approaches and increased intersectoral and household participation in dengue control activities. Conclusion: An eco-friendly dengue vector control

  17. Simultaneous Scheduling of Jobs, AGVs and Tools Considering Tool Transfer Times in Multi Machine FMS By SOS Algorithm

    Science.gov (United States)

    Sivarami Reddy, N.; Ramamurthy, D. V., Dr.; Prahlada Rao, K., Dr.

    2017-08-01

This article addresses the simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share tools, considering transfer times of jobs and tools between machines, to generate optimal sequences that minimize makespan in a multi-machine Flexible Manufacturing System (FMS). The performance of an FMS is expected to improve through effective utilization of its resources, by proper integration and synchronization of their scheduling. The Symbiotic Organisms Search (SOS) algorithm is a potent alternative for solving optimization problems like scheduling and has proven itself. The proposed SOS algorithm is tested on 22 job sets with makespan as the objective for scheduling of machines and tools, where machines are allowed to share tools without considering transfer times of jobs and tools, and the results are compared with those of existing methods. The results show that SOS outperforms them. The same SOS algorithm is then used for simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share tools, considering transfer times of jobs and tools, to determine the best sequences that minimize makespan.
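For readers unfamiliar with SOS, the sketch below shows its mutualism phase on a simple continuous objective; real scheduling applications use a permutation encoding of operations, and all parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):                    # stand-in for a makespan evaluation
    return float(np.sum(x ** 2))

pop = rng.uniform(-5, 5, size=(20, 4))
fitness = np.array([objective(x) for x in pop])

# Mutualism phase: organisms i and j both benefit by moving toward the best.
for i in range(len(pop)):
    j = rng.choice([k for k in range(len(pop)) if k != i])
    best = pop[fitness.argmin()]
    mutual = (pop[i] + pop[j]) / 2.0
    bf1, bf2 = rng.integers(1, 3), rng.integers(1, 3)   # benefit factors in {1, 2}
    new_i = pop[i] + rng.random(4) * (best - mutual * bf1)
    new_j = pop[j] + rng.random(4) * (best - mutual * bf2)
    for k, cand in ((i, new_i), (j, new_j)):            # greedy replacement
        if objective(cand) < fitness[k]:
            pop[k], fitness[k] = cand, objective(cand)

print("best objective after one mutualism pass:", fitness.min())
```

The commensalism and parasitism phases follow the same population-update pattern with different interaction rules.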

  18. Model-based setup assistant for progressive tools

    Science.gov (United States)

    Springer, Robert; Gräler, Manuel; Homberg, Werner; Henke, Christian; Trächtler, Ansgar

    2018-05-01

In the field of production systems, globalization and technological progress lead to increasing requirements regarding part quality, delivery time and costs. Hence, today's production is challenged much more than a few years ago: it has to be very flexible and produce small batch sizes economically to satisfy consumers' demands and avoid unnecessary stock. Furthermore, a continuing trend towards increasing functional integration leads to an ongoing miniaturization of sheet metal components. In the electric connectivity industry, for example, miniaturized connectors are manufactured by progressive tools, which are usually used for very large batches. These tools are installed in mechanical presses and then set up by a technician, who has to manually adjust a wide range of punch-bending operations. Disturbances like material thickness, temperatures, lubrication or tool wear complicate the setup procedure. In view of the increasing demand for production flexibility, this time-consuming process has to be handled more and more often. In this paper, a new approach for a model-based setup assistant is proposed as a solution, exemplarily applied in combination with a progressive tool. First, progressive tools, and more specifically their setup process, are described, and based on that, the challenges are pointed out. As a result, a systematic process to set up the machines is introduced. Next, the process is investigated with an FE analysis regarding the effects of the disturbances. Design of experiments is then used to systematically develop a regression model of the system's behaviour. This model is integrated within an optimization in order to calculate optimal machine parameters and the necessary subsequent adjustment of the progressive tool due to the disturbances. Finally, the assistant is tested in a production environment and the results are discussed.
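The regression-plus-optimization step can be pictured with a small Python sketch: fit a quadratic response surface to DoE data, then invert it numerically to propose setup parameters. Everything below (two normalized parameters, the synthetic response, the target value) is an illustrative assumption, not data from the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(30, 2))      # DoE points: two machine parameters
y = 0.4 * X[:, 0] + 0.2 * X[:, 1] ** 2 + rng.normal(0, 0.01, 30)

# Quadratic response surface fitted by least squares.
A = np.column_stack([np.ones(30), X, X ** 2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(p):
    a, b = p
    return coef @ np.array([1.0, a, b, a * a, b * b, a * b])

target = 0.30                                 # desired part dimension
res = minimize(lambda p: (predict(p) - target) ** 2,
               x0=[0.5, 0.5], bounds=[(0, 1), (0, 1)])
print("suggested setup parameters:", res.x)
```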

  19. Qualification of integrated tool environments (QUITE) for the development of computer-based safety systems in NPP

    International Nuclear Information System (INIS)

    Miedl, Horst

    2004-01-01

In NPPs, I&C systems are increasingly being backfitted with computer-based systems (I&C platforms). The corresponding safety functions are implemented in software, and this software is developed, configured and administered with the help of integrated tool environments (ITEs). An ITE offers a set of services which are used to construct an I&C system and typically consists of software packages for project control and documentation, specification and design, automatic code generation, and so on. Commercial ITEs are not necessarily conceived and qualified (type-tested) for nuclear-specific applications but are used - and will increasingly be used - for the implementation of nuclear safety-related I&C systems. Therefore, it is necessary to qualify commercial ITEs with respect to their influence on the quality of the target system for each I&C platform (dependent on the safety category of the target system). Examples of commercial ITEs are those of I&C platforms like SPINLINE 3, TELEPERM XP, Common Q, TRICON, etc. (Author)

  20. Collaborative Digital Games as Mediation Tool to Foster Intercultural Integration in Primary Dutch Schools

    NARCIS (Netherlands)

    A. Paz Alencar (Amanda); T. de la Hera Conde-Pumpido (Teresa)

    2015-01-01

In the Netherlands, the growing presence of immigrant children in schools has fueled scholarly interest in, and concern for, examining the process of integration in school environments. The use of digital games has been found to be an effective tool to reinforce teaching/learning practices.

  1. Integrated waste management and the tool of life cycle inventory : a route to sustainable waste management

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, F.R.; White, P.R. [Procter and Gamble Newcastle Technical Centre, Newcastle (United Kingdom). Corporate Sustainable Development

    2000-07-01

An overall approach to municipal waste management which integrates sustainable development principles was discussed. The three elements of sustainability which have to be balanced are environmental effectiveness, economic affordability and social acceptability. An integrated waste management (IWM) system considers different treatment options and deals with the entire waste stream. A life cycle inventory (LCI) and life cycle assessment (LCA) are used to determine the environmental burdens associated with IWM systems. LCIs for waste management are currently available for use in Europe, the United States, Canada and elsewhere. LCI is being used by waste management companies to assess the environmental attributes of future contract tenders. The models are used as benchmarking tools to assess the current environmental profile of a waste management system. They are also a comparative planning and communication tool. The authors are currently looking into publishing, at a future date, the experience of users of this LCI environmental management tool. 12 refs., 3 figs.

  2. Experimental investigation into effect of cutting parameters on surface integrity of hardened tool steel

    Science.gov (United States)

    Bashir, K.; Alkali, A. U.; Elmunafi, M. H. S.; Yusof, N. M.

    2018-04-01

The recent trend of turning hardened materials has gained popularity because of its immense machinability benefits. However, several machining processes, such as thermally assisted machining and cryogenic machining, have revealed machinability benefits superior to conventional dry turning of hardened materials. Various engineering materials have been studied; however, investigations on AISI O1 tool steel have not been widely reported. In this paper, the surface finish and surface integrity obtained when hard turning AISI O1 tool steel are analysed. The study focuses on the performance of a wiper coated ceramic tool with respect to the surface roughness and surface integrity of the hardened tool steel. The hardened tool steel was machined at varying cutting speeds of 100, 155 and 210 m/min and feed rates of 0.05, 0.125 and 0.20 mm/rev. A depth of cut of 0.2 mm was kept constant throughout the machining trials. Machining was conducted by dry turning on a 200E-axis CNC lathe. The experimental study revealed that the surface finish is relatively superior at the higher cutting speed of 210 m/min. Surface finish improves as cutting speed increases, and it is generally better at the lower feed rate of 0.05 mm/rev. The study also revealed that phenomena such as workpiece vibration, due to poor or improper mounting on the spindle, contributed to the higher surface roughness value of 0.66 Ra during turning at 0.2 mm/rev. Traces of white layer were observed under the optical microscope, showing evidence of the cutting effects on the turned work material at a feed rate of 0.2 mm/rev

  3. Assess the flood resilience tools integration in the landuse projects

    Science.gov (United States)

    Moulin, E.; Deroubaix, J.-F.

    2012-04-01

Despite strict regulations on building in flood-prone areas, 80% of these areas are already built up in Greater Paris (Paris, Val-de-Marne, Hauts-de-Seine and Seine-Saint-Denis). Land use in flood-prone areas is presented as one of the main solutions to the ongoing real-estate pressure. For instance, some of the industrial wastelands located along the river are currently being redeveloped, with residential buildings planned. Land use in flood-prone areas is therefore a key issue in the development of the Greater Paris area. To deal with floods there are resilience tools, whether structural (such as perimeter barriers or building aperture barriers) or non-structural (such as warning systems). The technical solutions are available and most of the time efficient. Still, we notice that these tools are not widely implemented; the stakeholders and inhabitants seem largely uninterested. This paper focuses on the integration of resilience tools in urban projects. Indeed, one of the blockages in the implementation of an efficient flood-risk prevention policy is the lack of concern of land-use stakeholders and inhabitants for the risk. We conducted a large number of interviews with stakeholders involved in various urban projects, and in this communication we assess to what extent improving flood resilience is considered a main issue in the execution of an urban project, how this concern is or could be maintained throughout the project, and whether it becomes diluted. To develop this topic we rely on a case study. The "Ardoines" is a project aiming at redeveloping an industrial site (south-east Paris) into a district including residential and office buildings and other amenities. In order to elaborate the master plan, the urban planning authority brought together flood-risk experts. According to the comments of the experts, the architect in charge of the

  4. Integrating data from the Investigational Medicinal Product Dossier/investigator's brochure. A new tool for translational integration of preclinical effects.

    Science.gov (United States)

    van Gerven, Joop; Cohen, Adam

    2018-01-30

The first administration of a new compound in humans is an important milestone. A major source of information for the researcher is the investigator's brochure (IB). Such a document typically has a size of several hundred pages. The IB should enable investigators or regulators to independently assess the risk-benefit of the proposed trial, but its size and complexity make this difficult. This article offers a practical tool for the integration and subsequent communication of the complex information from the IB or other relevant data sources. This paper is accompanied by an accessible software tool to construct a single-page colour-coded overview of preclinical and clinical data. © 2018 The British Pharmacological Society.

  5. Methylation Integration (Mint) | Informatics Technology for Cancer Research (ITCR)

    Science.gov (United States)

A comprehensive software pipeline and set of Galaxy tools/workflows for the integrative analysis of genome-wide DNA methylation and hydroxymethylation data. Data can come from bisulfite sequencing and/or pull-down methods.

  6. CAsubtype: An R Package to Identify Gene Sets Predictive of Cancer Subtypes and Clinical Outcomes.

    Science.gov (United States)

    Kong, Hualei; Tong, Pan; Zhao, Xiaodong; Sun, Jielin; Li, Hua

    2018-03-01

    In the past decade, molecular classification of cancer has gained high popularity owing to its high predictive power on clinical outcomes as compared with traditional methods commonly used in clinical practice. In particular, using gene expression profiles, recent studies have successfully identified a number of gene sets for the delineation of cancer subtypes that are associated with distinct prognosis. However, identification of such gene sets remains a laborious task due to the lack of tools with flexibility, integration and ease of use. To reduce the burden, we have developed an R package, CAsubtype, to efficiently identify gene sets predictive of cancer subtypes and clinical outcomes. By integrating more than 13,000 annotated gene sets, CAsubtype provides a comprehensive repertoire of candidates for new cancer subtype identification. For easy data access, CAsubtype further includes the gene expression and clinical data of more than 2000 cancer patients from TCGA. CAsubtype first employs principal component analysis to identify gene sets (from user-provided or package-integrated ones) with robust principal components representing significantly large variation between cancer samples. Based on these principal components, CAsubtype visualizes the sample distribution in low-dimensional space for better understanding of the distinction between samples and classifies samples into subgroups with prevalent clustering algorithms. Finally, CAsubtype performs survival analysis to compare the clinical outcomes between the identified subgroups, assessing their clinical value as potentially novel cancer subtypes. In conclusion, CAsubtype is a flexible and well-integrated tool in the R environment to identify gene sets for cancer subtype identification and clinical outcome prediction. Its simple R commands and comprehensive data sets enable efficient examination of the clinical value of any given gene set, thus facilitating hypothesis generating and testing in biological and
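A Python analogue of the workflow just described (the package itself is in R) might look as follows; the expression matrix and survival times are synthetic, and the log-rank test assumes the lifelines package is available.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
expr = rng.normal(size=(120, 500))        # 120 patients x 500 genes
expr[:60, :20] += 2.0                     # planted subtype signal

# PCA on expression, then clustering on the leading components.
pcs = PCA(n_components=2).fit_transform(expr)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pcs)

# Compare clinical outcomes between the putative subtypes.
time = rng.exponential(scale=np.where(labels == 0, 30.0, 60.0))
event = rng.integers(0, 2, size=120).astype(bool)
res = logrank_test(time[labels == 0], time[labels == 1],
                   event_observed_A=event[labels == 0],
                   event_observed_B=event[labels == 1])
print("log-rank p-value:", res.p_value)
```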

  7. EPR design tools. Integrated data processing tools

    International Nuclear Information System (INIS)

    Kern, R.

    1997-01-01

In all technical areas, planning and design have been supported by electronic data processing for many years. New data processing tools had to be developed for the European Pressurized Water Reactor (EPR). The work to be performed was split between KWU and Framatome and laid down in the Basic Design contract. The entire plant was reduced to a logical data structure; the circuit diagrams and flowsheets of the systems were drafted, the central data pool was established, the outlines of building structures were defined, the layout of plant components was planned, and the electrical systems were documented. Building construction engineering was also supported by data processing. The tasks laid down in the Basic Design were completed as so-called milestones. Additional data processing tools, also based on the central data pool, are required for the phases following the Basic Design phase, i.e. Basic Design Optimization; Detailed Design; Management; Construction; and Commissioning. (orig.) [de

  8. Wilmar Planning Tool, user guide

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Helge V.

    2006-01-15

This is a short user guide to the Wilmar Planning Tool developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR), supported by the EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. In the User Shell, various scenario variables and control parameters are set; export of model data from the input database, activation of the models, and import of model results to the output database are all triggered from the shell. (au)

  9. Wilmar Planning Tool, user guide

    International Nuclear Information System (INIS)

    Larsen, Helge V.

    2006-01-01

This is a short user guide to the Wilmar Planning Tool developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR), supported by the EU (Contract No. ENK5-CT-2002-00663). A User Shell implemented in an Excel workbook controls the Wilmar Planning Tool. All data are contained in Access databases that communicate with various sub-models through text files that are exported from or imported to the databases. In the User Shell, various scenario variables and control parameters are set; export of model data from the input database, activation of the models, and import of model results to the output database are all triggered from the shell. (au)

  10. Integration of numerical analysis tools for automated numerical optimization of a transportation package design

    International Nuclear Information System (INIS)

    Witkowski, W.R.; Eldred, M.S.; Harding, D.C.

    1994-01-01

    The use of state-of-the-art numerical analysis tools to determine the optimal design of a radioactive material (RAM) transportation container is investigated. The design of a RAM package's components involves a complex coupling of structural, thermal, and radioactive shielding analyses. The final design must adhere to very strict design constraints. The current technique used by cask designers is uncoupled and involves designing each component separately with respect to its driving constraint. With the use of numerical optimization schemes, the complex couplings can be considered directly, and the performance of the integrated package can be maximized with respect to the analysis conditions. This can lead to more efficient package designs. Thermal and structural accident conditions are analyzed in the shape optimization of a simplified cask design. In this paper, details of the integration of numerical analysis tools, development of a process model, nonsmoothness difficulties with the optimization of the cask, and preliminary results are discussed
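An illustrative (not the authors') Python sketch of the coupled formulation: surrogate functions stand in for the structural, thermal, and shielding analyses, and a constrained optimizer works on them simultaneously instead of component by component.

```python
from scipy.optimize import minimize

def mass(x):                 # objective: minimize package mass (surrogate)
    t, s = x                 # wall thickness t, shield thickness s
    return 4.0 * t + 9.0 * s

def stress_margin(x):        # must stay >= 0 (structural accident case)
    t, s = x
    return t - 0.05 * (1.0 + 0.3 * s)

def dose_margin(x):          # must stay >= 0 (shielding dose limit)
    t, s = x
    return s - 0.12 / (1.0 + 2.0 * t)

res = minimize(mass, x0=[0.1, 0.1],
               constraints=[{"type": "ineq", "fun": stress_margin},
                            {"type": "ineq", "fun": dose_margin}],
               bounds=[(0.01, 0.5), (0.01, 0.5)])
print("optimal thicknesses:", res.x, "mass:", res.fun)
```

In the paper's setting the surrogates are replaced by finite-element structural and thermal analyses, which is what makes nonsmoothness an issue for the optimizer.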

  11. BIG: a large-scale data integration tool for renal physiology.

    Science.gov (United States)

    Zhao, Yue; Yang, Chin-Rang; Raghuram, Viswanathan; Parulekar, Jaya; Knepper, Mark A

    2016-10-01

    Due to recent advances in high-throughput techniques, we and others have generated multiple proteomic and transcriptomic databases to describe and quantify gene expression, protein abundance, or cellular signaling on the scale of the whole genome/proteome in kidney cells. The existence of so much data from diverse sources raises the following question: "How can researchers find information efficiently for a given gene product over all of these data sets without searching each data set individually?" This is the type of problem that has motivated the "Big-Data" revolution in Data Science, which has driven progress in fields such as marketing. Here we present an online Big-Data tool called BIG (Biological Information Gatherer) that allows users to submit a single online query to obtain all relevant information from all indexed databases. BIG is accessible at http://big.nhlbi.nih.gov/.
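The one-query-over-everything idea can be pictured as a federated lookup keyed by gene symbol; the data sets and values below are invented placeholders, not BIG's actual contents.

```python
DATASETS = {
    "proteome_abundance": {"AQP2": {"copies_per_cell": 1.2e6}},
    "transcriptome":      {"AQP2": {"tpm": 85.4}},
    "phosphoproteome":    {"AQP2": {"sites": ["S256", "S261"]}},
}

def big_query(gene):
    """Gather every record for `gene` across all indexed data sets."""
    return {name: data[gene] for name, data in DATASETS.items() if gene in data}

print(big_query("AQP2"))
```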

  12. FunGeneNet: a web tool to estimate enrichment of functional interactions in experimental gene sets.

    Science.gov (United States)

    Tiys, Evgeny S; Ivanisenko, Timofey V; Demenkov, Pavel S; Ivanisenko, Vladimir A

    2018-02-09

Estimation of functional connectivity in gene sets derived from genome-wide or other biological experiments is one of the essential tasks of bioinformatics. A promising approach for solving this problem is to compare gene networks built using experimental gene sets with random networks. One of the resources that make such an analysis possible is CrossTalkZ, which uses the FunCoup database. However, existing methods, including CrossTalkZ, do not take into account individual types of interactions, such as protein/protein interactions, expression regulation, transport regulation, catalytic reactions, etc., but rather work with generalized types characterizing the existence of any connection between network members. We developed the online tool FunGeneNet, which utilizes the ANDSystem and STRING to reconstruct gene networks from experimental gene sets and to estimate their difference from random networks. To compare the reconstructed networks with random ones, the node-permutation algorithm implemented in CrossTalkZ was taken as a basis. To study the applicability of FunGeneNet, we conducted a functional connectivity analysis of networks constructed for gene sets involved in Gene Ontology biological processes. We showed that the method's sensitivity exceeds 0.8 at a specificity of 0.95. We found that the significance level of the difference between gene networks of biological processes and random networks is determined by the types of connections considered between objects. At the same time, the highest reliability is achieved for the generalized form of connections that takes into account all the individual types. Taking the thyroid cancer networks and the apoptosis network as examples, we demonstrate that key participants in these processes are involved in interactions of exactly those types by which these networks differ from random ones. FunGeneNet is a web tool aimed at proving the functionality of networks in a wide range of sizes of
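The node-permutation comparison against random networks can be sketched in a few lines of Python (networkx); the reference network and gene set below are synthetic stand-ins for the ANDSystem/STRING-derived networks.

```python
import networkx as nx
import numpy as np

ref = nx.gnm_random_graph(200, 800, seed=1)   # stand-in reference network
gene_set = list(range(25))                    # stand-in experimental gene set

def induced_edges(nodes):
    return ref.subgraph(nodes).number_of_edges()

observed = induced_edges(gene_set)
rng = np.random.default_rng(0)
null = [induced_edges(rng.choice(200, size=25, replace=False))
        for _ in range(1000)]
z = (observed - np.mean(null)) / np.std(null)
print(f"observed edges = {observed}, z-score vs. random node sets = {z:.2f}")
```

A gene set whose induced network is significantly denser than the permutation null is judged functionally connected; scoring per interaction type refines this test.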

  13. Wilmar Planning Tool, VBA documentation

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Helge V.

    2006-01-15

This document describes the VBA (Visual Basic for Applications) code in the Wilmar Planning Tool. VBA is used in the Wilmar User Shell (an Excel workbook) and in the three Access databases that hold input, scenario and output data. The Wilmar Planning Tool was developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR), supported by the EU (contract ENK5-CT-2002-00663). The User Shell controls the operation of the Wilmar Planning Tool. In the User Shell, various control parameters are set, and a macro in the Input Database is then run that writes input files for the Joint Market Model and the Long Term Model. Afterwards these models can be started from the User Shell. Finally, the User Shell can start a macro in the Output Database that imports the output files from the models. (LN)

  14. Wilmar Planning Tool, VBA documentation

    International Nuclear Information System (INIS)

    Larsen, Helge V.

    2006-01-01

This document describes the VBA (Visual Basic for Applications) code in the Wilmar Planning Tool. VBA is used in the Wilmar User Shell (an Excel workbook) and in the three Access databases that hold input, scenario and output data. The Wilmar Planning Tool was developed in the project Wind Power Integration in Liberalised Electricity Markets (WILMAR), supported by the EU (contract ENK5-CT-2002-00663). The User Shell controls the operation of the Wilmar Planning Tool. In the User Shell, various control parameters are set, and a macro in the Input Database is then run that writes input files for the Joint Market Model and the Long Term Model. Afterwards these models can be started from the User Shell. Finally, the User Shell can start a macro in the Output Database that imports the output files from the models. (LN)

15. Evaluating quality of patient care communication in integrated care settings: a mixed methods approach

    NARCIS (Netherlands)

    Gulmans, J.; Gulmans, J.; Vollenbroek-Hutten, Miriam Marie Rosé; van Gemert-Pijnen, Julia E.W.C.; van Harten, Willem H.

    2007-01-01

    Background. Owing to the involvement of multiple professionals from various institutions, integrated care settings are prone to suboptimal patient care communication. To assure continuity, communication gaps should be identified for targeted improvement initiatives. However, available assessment

  16. Decision support tool to evaluate alternative policies regulating wind integration into autonomous energy systems

    International Nuclear Information System (INIS)

    Zouros, N.; Contaxis, G.C.; Kabouris, J.

    2005-01-01

The integration of wind power into autonomous electricity systems strongly depends on the specific technical characteristics of these systems; the regulations applied should take physical system constraints into account. The introduction of market rules makes the issue even more complicated, since the interests of the market participants often conflict with one another. In this paper, an integrated tool for the comparative assessment of alternative regulatory policies is presented, along with a methodology for decision-making based on the analysis of alternative scenarios. The social welfare concept is followed instead of traditional least-cost planning.

  17. Integration of ROOT notebook as an ATLAS analysis web-based tool in outreach and public data release projects

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00237353; The ATLAS collaboration

    2017-01-01

Integration of the ROOT data analysis framework with the Jupyter Notebook technology offers the potential to enhance and expand educational and training programs. It can be beneficial for university students in their early years, new PhD students and postdoctoral researchers, as well as for senior researchers and teachers who want to refresh their data analysis skills or introduce a friendly yet very powerful open-source tool in the classroom. Such tools have already been tested in several environments. A fully web-based integration of the tools and the Open Access Data repositories brings the possibility to go a step forward in the ATLAS quest of making use of several CERN projects in the field of education and training, developing new computing solutions along the way.

  18. NEON's Mobile Deployment Platform: A research tool for integrating ecological processes across scales

    Science.gov (United States)

    Sanclements, M.

    2016-12-01

Here we provide an update on the construction of the five NEON Mobile Deployment Platforms (MDPs), as well as a description of the infrastructure and sensors available to researchers in the near future. Additionally, we include information (i.e., timelines and procedures) on requesting MDPs for PI-led projects. The MDPs will provide the means to observe stochastic or spatially important events, gradients, or quantities that cannot be reliably observed using fixed-location sampling (e.g., fires and floods). Due to the transient temporal and spatial nature of such events, the MDPs are designed to accommodate rapid deployment for periods of up to one year. Broadly, the MDPs comprise infrastructure and instrumentation capable of functioning individually or in conjunction with one another to support observations of ecological change, as well as education, training and outreach. More specifically, the MDPs include the capability to make tower-based measurements of ecosystem exchange, radiation, and precipitation in conjunction with baseline soils data such as CO2 flux and soil temperature and moisture. An aquatics module is also available to facilitate research integrating terrestrial and aquatic processes. Ultimately, the NEON MDPs provide a tool for linking PI-led research to the continental-scale data sets collected by NEON.

  19. U.S. Geological Survey community for data integration: data upload, registry, and access tool

    Science.gov (United States)

    ,

    2012-01-01

    As a leading science and information agency and in fulfillment of its mission to provide reliable scientific information to describe and understand the Earth, the U.S. Geological Survey (USGS) ensures that all scientific data are effectively hosted, adequately described, and appropriately accessible to scientists, collaborators, and the general public. To succeed in this task, the USGS established the Community for Data Integration (CDI) to address data and information management issues affecting the proficiency of earth science research. Through the CDI, the USGS is providing data and metadata management tools, cyber infrastructure, collaboration tools, and training in support of scientists and technology specialists throughout the project life cycle. One of the significant tools recently created to contribute to this mission is the Uploader tool. This tool allows scientists with limited data management resources to address many of the key aspects of the data life cycle: the ability to protect, preserve, publish and share data. By implementing this application inside ScienceBase, scientists also can take advantage of other collaboration capabilities provided by the ScienceBase platform.

  20. Integrated Data Collection Analysis (IDCA) Program — RDX Standard Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms, Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-04

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard, performed for a third and fourth time in the Proficiency Test and averaged with the analysis results from the first and second rounds. The results from averaging all four data sets (1, 2, 3, and 4) suggest a material with slightly more impact sensitivity, more BAM friction sensitivity, less ABL friction sensitivity, similar ESD sensitivity, and the same DSC sensitivity compared with the results from Set 1, which was previously used as the reference values for the RDX standard in IDCA Analysis Reports.

  1. A Pragmatic Guide to the Setting up of Integrated Hypnotherapy Services in Primary Care and Clinical Settings.

    Science.gov (United States)

    Entwistle, Paul Andrew

    2017-01-01

    Despite the continued debate and lack of a clear consensus about the true nature of the hypnotic phenomenon, hypnosis is increasingly being utilized successfully in many medical, health, and psychological spheres as a research method, motivational tool, and therapeutic modality. Significantly, however, although hypnotherapy is widely advertised, advocated, and employed in the private medical arena for the management and treatment of many physical and emotional disorders, too little appears to be being done to integrate hypnosis into primary care and national health medical services. This article discusses some of the reasons for the apparent reluctance of medical and scientific health professionals to consider incorporating hypnosis into their medical practice, including the practical problems inherent in using hypnosis in a medical context and some possible solutions.

  2. Analytic tools for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, Vladimir A.

    2012-01-01

    The most powerful methods of evaluating Feynman integrals are presented, with numerous examples, so that readers will be able to apply them in practice. The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice. This book supersedes the author's previous Springer book "Evaluating Feynman Integrals" and its textbook version "Feynman Integral Calculus." Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added: one on sector decomposition, a second describing a new method by Lee, and a third concerning the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, "Applied Asymptotic Expansions in Momenta and Masses," by the author. This chapter describes, on the basis of papers that appeared after the publication of said book, how to algorithmically discover the regions relevant to a given limit within the strategy of expansion by regions. In addition, the chapters on the method of Mellin-Barnes representation and on the method of integration by parts have been substantially rewritten, with an emphasis on the corresponding algorithms and computer codes.

  3. Optically Driven Mobile Integrated Micro-Tools for a Lab-on-a-Chip

    Directory of Open Access Journals (Sweden)

    Yi-Jui Liu

    2013-04-01

    This study proposes an optically driven complex micromachine with an Archimedes microscrew as the mechanical power source, a sphere as a coupler, and three knives as the mechanical tools. The micromachine is fabricated by two-photon polymerization and is portably driven by optical tweezers. Because the microscrew can be optically trapped and rotates spontaneously, it provides driving power for the complex micro-tools. In other words, when a laser beam focuses on the micromachine, the microscrew is trapped toward the focal point and simultaneously rotates. A demonstration showed that the integrated micromachines are grasped by the optical tweezers and rotated by the Archimedes screw. The rotation efficiencies of the microrotors with and without knives are 1.9 rpm/mW and 13.5 rpm/mW, respectively. The micromachine can also be portably dragged along planned routes. Such Archimedes screw-based optically driven complex mechanical micro-tools enable rotation similar to moving machines or mixers, which could contribute to applications in biological microfluidic chips or labs-on-a-chip.

  4. Climate Change Adaptation Tools at the Community Level: An Integrated Literature Review

    Directory of Open Access Journals (Sweden)

    Elvis Modikela Nkoana

    2018-03-01

    The negative impacts of climate change are experienced at the global, regional and local levels; rural communities in sub-Saharan Africa, however, face socio-political, cultural and economic challenges in addition to climate change. Decision support tools have been developed and applied to assist rural communities to cope with and adapt to climate change. However, poorly planned participatory processes and the lack of context-specific approaches in these tools are obstacles when aiming at strengthening the resilience of these rural communities. This paper uses an integrated literature review to identify best practices for involving rural communities in climate change adaptation efforts through the application of context-specific and culturally-sensitive climate change adaptation tools. These best practices include the use of a livelihoods approach to engage communities; the explicit acknowledgement of the local cultural do's and don'ts; the recognition of local champions appointed from within the local community; the identification and prioritisation of vulnerable stakeholders; and the implementation of two-way climate change risk communication instead of a one-sided information sharing approach.

  5. Multi-site risk-based project planning, optimization, sequencing and budgeting process and tool for the integrated facility disposition project - 59394

    International Nuclear Information System (INIS)

    Nelson, Jerel; Castillo, Carlos; Huntsman, Julie; Lucek, Heather; Marks, Tim

    2012-01-01

    Document available in abstract form only. Full text of publication follows: Faced with the DOE Complex Transformation, NNSA was tasked with developing an integrated plan for the decommissioning of over 400 facilities and 300 environmental remediation units, as well as the many reconfiguration and modernization projects at the Oak Ridge National Laboratory (ORNL) and Y-12 Complex. Manual scheduling of remediation activities is time-consuming and labor intensive, and it inherently introduces the scheduler's or organization's biases and unaccounted-for assumptions into the process. Clearly a tool was needed to develop an objective, unbiased, optimized baseline project sequence and schedule with a sound technical foundation for the Integrated Facility Disposition Project (IFDP). In generating an integrated disposition schedule, each project (including facilities, environmental sites, and remedial action units) was identified, characterized, and then ranked relative to other projects. Risk matrices allowed core project data to be extrapolated into probable contamination levels, relative risks to the public, and other technical and risk parameters used in the development of an overall ranking. These matrices ultimately generated a complete data set that was used in the Ranking and Sequencing Model (RSM), commonly referred to as the SUPER model for its numerous abilities to support D and D planning, prioritization, and sequencing.

  6. Virus-Clip: a fast and memory-efficient viral integration site detection tool at single-base resolution with annotation capability.

    Science.gov (United States)

    Ho, Daniel W H; Sze, Karen M F; Ng, Irene O L

    2015-08-28

    Viral integration into the human genome upon infection is an important risk factor for various human malignancies. We developed a viral integration site detection tool called Virus-Clip, which makes use of information extracted from soft-clipped sequencing reads to identify the exact positions of the human and virus breakpoints of integration events. With initial read alignment to the virus reference genome and streamlined procedures, Virus-Clip delivers a simple, fast and memory-efficient solution to viral integration site detection. Moreover, it can automatically annotate integration events with the corresponding affected human genes. Virus-Clip has been verified using whole-transcriptome sequencing data, and its detection was validated to have satisfactory sensitivity and specificity, with marked performance advantages over existing tools. It is applicable to versatile types of data including whole-genome sequencing, whole-transcriptome sequencing, and targeted sequencing. Virus-Clip is available at http://web.hku.hk/~dwhho/Virus-Clip.zip.
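
    The core idea (align reads to the virus genome, then treat long soft-clipped read ends as candidate breakpoints) can be sketched in a few lines. The following is a minimal illustration, not Virus-Clip itself; the BAM file name and the clip-length threshold are assumptions.

```python
# Sketch: report long soft-clips in virus-aligned reads as candidate breakpoints.
import pysam  # common BAM/SAM parsing library

MIN_CLIP = 20  # assumed minimum soft-clip length worth reporting

with pysam.AlignmentFile("reads_vs_virus.bam", "rb") as bam:
    for read in bam:
        if read.is_unmapped or not read.cigartuples:
            continue
        # CIGAR operation 4 == soft clip; a long clip at either alignment edge
        # marks the virus-side breakpoint, and the clipped bases can then be
        # re-aligned to the human genome to locate the human-side breakpoint.
        first_op, first_len = read.cigartuples[0]
        last_op, last_len = read.cigartuples[-1]
        if first_op == 4 and first_len >= MIN_CLIP:
            print(read.reference_name, read.reference_start, "left-clip",
                  read.query_sequence[:first_len])
        if last_op == 4 and last_len >= MIN_CLIP:
            print(read.reference_name, read.reference_end, "right-clip",
                  read.query_sequence[-last_len:])
```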

  7. Delight2 Daylighting Analysis in Energy Plus: Integration and Preliminary User Results

    Energy Technology Data Exchange (ETDEWEB)

    Carroll, William L.; Hitchcock, Robert J.

    2005-04-26

    DElight is a simulation engine for daylight and electric lighting system analysis in buildings. DElight calculates interior illuminance levels from daylight, and the subsequent contribution required from electric lighting to meet a desired interior illuminance. DElight has been specifically designed to integrate with building thermal simulation tools. This paper updates the DElight capability set and the status of its integration into the simulation tool EnergyPlus, and describes a sample analysis of a simple model from the user perspective.
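
    The daylight/electric split that DElight computes at each reference point reduces, at its simplest, to topping daylight up to a setpoint. A toy sketch of that balance follows; the setpoint and illuminance values are invented, and a real engine evaluates this per reference point and timestep with dimming controls.

```python
def electric_contribution(setpoint_lux: float, daylight_lux: float) -> float:
    """Electric lighting needed to top daylight up to a target illuminance."""
    return max(0.0, setpoint_lux - daylight_lux)

# e.g. a 500 lux office setpoint with 320 lux of daylight at the reference point
print(electric_contribution(500.0, 320.0))  # -> 180.0 lux from electric lighting
```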

  8. Implementation of a competency assessment tool for agency nurses working in an acute paediatric setting.

    LENUS (Irish Health Repository)

    Hennerby, Cathy

    2012-02-01

    AIM: This paper reports on the implementation of a competency assessment tool for registered general agency nurses working in an acute paediatric setting, using a change management framework. BACKGROUND: The increased number of registered general agency nurses working in an acute children's hospital alerted concerns around their competency in working with children. These concerns were initially raised via informal complaints about 'near misses', parental dissatisfaction, perceived competency weaknesses and rising cost associated with their use.

  9. Use of a tool-set by Pan troglodytes troglodytes to obtain termites (Macrotermes) in the periphery of the Dja Biosphere Reserve, southeast Cameroon.

    Science.gov (United States)

    Deblauwe, Isra; Guislain, Patrick; Dupain, Jef; Van Elsacker, Linda

    2006-12-01

    At the northern periphery of the Dja Biosphere Reserve (southeastern Cameroon) we recorded a new use of a tool-set by Pan troglodytes troglodytes to prey on Macrotermes muelleri, M. renouxi, M. lilljeborgi, and M. nobilis. We recovered 79 puncturing sticks and 47 fishing probes at 17 termite nests between 2002 and 2005. The mean length of the puncturing sticks (n = 77) and fishing probes (n = 45) was 52 cm and 56 cm, respectively, and the mean diameter was 9 mm and 4.5 mm, respectively. Sixty-eight percent of 138 chimpanzee fecal samples contained major soldiers of four Macrotermes species. The chimpanzees in southeastern Cameroon appeared to be selective in their choice of plant material to make their tools. The tools found at our study site resemble those from other sites in this region. However, in southeastern Cameroon only one tool-set type was found, whereas two tool-set types have been reported in Congo. Our study suggests that, along with the different vegetation types and the availability of plant material around termite nests, the nest and gallery structure and foraging behavior of the different Macrotermes spp. at all Central African sites must be investigated before we can attribute differences in tool-use behavior to culture. (c) 2006 Wiley-Liss, Inc.

  10. Module Testing Techniques for Nuclear Safety Critical Software Using LDRA Testing Tool

    International Nuclear Information System (INIS)

    Moon, Kwon-Ki; Kim, Do-Yeon; Chang, Hoon-Seon; Chang, Young-Woo; Yun, Jae-Hee; Park, Jee-Duck; Kim, Jae-Hack

    2006-01-01

    The safety critical software in the I and C systems of nuclear power plants requires high functional integrity and reliability. To achieve these goals, the safety critical software should be verified and tested according to the related codes and standards through verification and validation (V and V) activities. Safety critical software testing is performed at various stages during the development of the software, and is generally classified into three major activities: module testing, system integration testing, and system validation testing. Module testing involves the evaluation of module-level functions of hardware and software. System integration testing investigates the characteristics of a collection of modules and aims at establishing their correct interactions. System validation testing demonstrates that the complete system satisfies its functional requirements. In order to generate reliable software and reduce high maintenance cost, it is important that software testing is carried out at module level. Module testing for nuclear safety critical software has rarely been performed by formal and proven testing tools because of its various constraints. The LDRA testing tool is a widely used and proven tool set that provides powerful source code testing and analysis facilities for the V and V of general purpose software and safety critical software. Use of the tool set is indispensable where software is required to be reliable and as error-free as possible, and its use brings substantial time and cost savings and efficiency.
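
    To make the distinction concrete, a module test exercises one unit of functionality in isolation against its specification. The toy trip-logic function below is invented purely for illustration and is unrelated to any real plant software; the test style shown is pytest, standing in for the kind of module-level checks a tool such as LDRA automates for C code.

```python
# Module under test: a single, isolated function with a clear specification.
def overtemperature_trip(temp_c: float, setpoint_c: float = 350.0) -> bool:
    """Return True when an overtemperature trip must be signalled."""
    return temp_c >= setpoint_c

# Module-level tests: exercise boundary and nominal cases of this one module,
# independently of any integration with the rest of the system.
def test_trip_at_setpoint():
    assert overtemperature_trip(350.0) is True

def test_no_trip_just_below_setpoint():
    assert overtemperature_trip(349.9) is False

def test_custom_setpoint():
    assert overtemperature_trip(300.0, setpoint_c=280.0) is True
```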

  11. Analytic tools for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, Vladimir A. [Moscow State Univ. (Russian Federation). Skobeltsyn Inst. of Nuclear Physics

    2012-07-01

    The most powerful methods of evaluating Feynman integrals are presented, with numerous examples, so that readers will be able to apply them in practice. The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice. This book supersedes the author's previous Springer book "Evaluating Feynman Integrals" and its textbook version "Feynman Integral Calculus." Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added: one on sector decomposition, a second describing a new method by Lee, and a third concerning the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, "Applied Asymptotic Expansions in Momenta and Masses," by the author. This chapter describes, on the basis of papers that appeared after the publication of said book, how to algorithmically discover the regions relevant to a given limit within the strategy of expansion by regions. In addition, the chapters on the method of Mellin-Barnes representation and on the method of integration by parts have been substantially rewritten, with an emphasis on the corresponding algorithms and computer codes.

  12. Mixed-Dimensionality VLSI-Type Configurable Tools for Virtual Prototyping of Biomicrofluidic Devices and Integrated Systems

    Science.gov (United States)

    Makhijani, Vinod B.; Przekwas, Andrzej J.

    2002-10-01

    This report presents results of a DARPA/MTO Composite CAD Project aimed to develop a comprehensive microsystem CAD environment, CFD-ACE+ Multiphysics, for bio- and microfluidic devices and complete microsystems. The project began in July 1998, and was a three-year team effort between CFD Research Corporation, California Institute of Technology (CalTech), University of California, Berkeley (UCB), and Tanner Research, with Mr. Don Verlee from Abbott Labs participating as a consultant on the project. The overall objective of this project was to develop, validate and demonstrate several applications of a user-configurable VLSI-type mixed-dimensionality software tool for the design of biomicrofluidic devices and integrated systems. The developed tool would provide high-fidelity 3-D multiphysics modeling capability, 1-D fluidic circuit modeling, a SPICE interface for system-level simulations, and mixed-dimensionality design. It would combine tools for layouts and process fabrication, geometric modeling, and automated grid generation, and interfaces to EDA tools (e.g. Cadence) and MCAD tools (e.g. ProE).

  13. The need for integration of drought monitoring tools for proactive food security management in sub-Saharan Africa

    Science.gov (United States)

    Tadesse, T.; Haile, M.; Senay, G.; Wardlow, B.D.; Knutson, C.L.

    2008-01-01

    Reducing the impact of drought and famine remains a challenge in sub-Saharan Africa despite ongoing drought relief assistance in recent decades. This is because drought and famine are primarily addressed through a crisis management approach when a disaster occurs, rather than stressing preparedness and risk management. Moreover, drought planning and food security efforts have been hampered by a lack of integrated drought monitoring tools, inadequate early warning systems (EWS), and insufficient information flow within and between levels of government in many sub-Saharan countries. The integration of existing drought monitoring tools for sub-Saharan Africa is essential for improving food security systems to reduce the impacts of drought and famine on society in this region. A proactive approach emphasizing integration requires the collective use of multiple tools, which can be used to detect trends in food availability and provide early indicators at local, national, and regional scales on the likely occurrence of food crises. In addition, improving the ability to monitor and disseminate critical drought-related information using available modern technologies (e.g., satellites, computers, and modern communication techniques) may help trigger timely and appropriate preventive responses and, ultimately, contribute to food security and sustainable development in sub-Saharan Africa. © 2008 United Nations.

  14. Tools for the Validation of Genomes and Transcriptomes with Proteomics data

    DEFF Research Database (Denmark)

    Pang, Chi Nam Ignatius; Aya, Carlos; Tay, Aidan

    …data generated from protein mass spectrometry. We are developing a set of tools which allow users to: (i) co-visualise genomics, transcriptomics, and proteomics data using the Integrated Genomics Viewer (IGV) [1]; and (ii) validate the existence of genes and mRNAs using peptides identified from mass spectrometry…

  15. Integrated digital planning and management tools - the example of buildings; Integrierte digitale Planungs- und Management-Werkzeuge - am Beispiel von Gebaeuden

    Energy Technology Data Exchange (ETDEWEB)

    Gauchel, J. [WIBc, Objektorientierte Systeme fuer Gebaeudeplanung und -management, Karlsruhe (Germany)

    1995-12-31

    In chapter 4 of the anthology about building control, integrated digital planning and management tools are described. Integrated planning and management tools are understood here as task-oriented data processing applications which are integrated into a common data management layer and a common user interface. The following aspects are discussed: realistic modelling, CAD systems, object-oriented system architectures, and building management. (BWI)

  16. Systematic Optimization-Based Integrated Chemical Product–Process Design Framework

    DEFF Research Database (Denmark)

    Cignitti, Stefano; Mansouri, Seyed Soheil; Woodley, John M.

    2018-01-01

    An integrated optimization-based framework for product and process design is proposed. The framework uses a set of methods and tools to obtain the optimal product–process design solution given a set of economic and environmental sustainability targets. The methods and tools required are property prediction through group contributions (unless supported with a database), computer-aided molecular and mixture/blend design for generation of novel as well as existing products, and mathematical programming for formulating and solving multiscale integrated process–product design problems. The application of the framework is demonstrated through three case studies: (i) a refrigeration cycle unit for R134a replacement, (ii) a mixed working fluid design problem for R134a replacement, and (iii) pure solvent design for water-acetic acid LLE extraction. Through the application of the framework it is demonstrated that all…

  17. Google Sets, Google Suggest, and Google Search History: Three More Tools for the Reference Librarian's Bag of Tricks

    OpenAIRE

    Cirasella, Jill

    2008-01-01

    This article examines the features, quirks, and uses of Google Sets, Google Suggest, and Google Search History and argues that these three lesser-known Google tools warrant inclusion in the resourceful reference librarian’s bag of tricks.

  18. Towards an integrated petrophysical tool for multiphase flow properties of core samples

    Energy Technology Data Exchange (ETDEWEB)

    Lenormand, R. [Institut Francais du Petrole, Rueil Malmaison (France)

    1997-08-01

    This paper describes the first use of an Integrated Petrophysical Tool (IPT) on reservoir rock samples. The IPT simultaneously measures the following petrophysical properties: (1) the complete capillary pressure cycle: primary drainage, spontaneous and forced imbibition, and secondary drainage (the cycle yields the wettability of the core via the USBM index); (2) end-points and parts of the relative permeability curves; (3) formation factor and resistivity index. The IPT is based on the steady-state injection of one fluid through the sample placed in a Hassler cell. The experiment leading to the whole Pc cycle on two reservoir sandstones consists of about 30 steps at various oil or water flow rates; it takes about four weeks and is operated at room conditions. Relative permeabilities are in line with standard steady-state measurements. Capillary pressures are in accordance with standard centrifuge measurements. There is no comparison for the resistivity index, but the results are in agreement with literature data. However, the accurate determination of saturation remains the main difficulty, and some improvements are proposed. In conclusion, the Integrated Petrophysical Tool is as accurate as standard methods and has the advantage of providing the various parameters on the same sample during a single experiment. The IPT is easy to use and can be automated. In addition, it can be operated at reservoir conditions.
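
    Since the abstract leans on the USBM index, a small numerical sketch may help: the index is the base-10 logarithm of the ratio of the area under the secondary drainage curve (Pc > 0) to the area above the forced imbibition curve (Pc < 0). The capillary pressure branches below are invented stand-ins, not measured data.

```python
# Sketch: USBM wettability index from synthetic capillary pressure branches.
import numpy as np

def area(y, x):
    """Trapezoidal area of y(x)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

sw = np.linspace(0.2, 0.8, 25)          # water saturation grid
pc_drainage = 2.0 * (0.8 - sw)          # assumed Pc > 0 branch
pc_imbibition = -1.5 * (sw - 0.2)       # assumed Pc < 0 branch

a1 = area(pc_drainage, sw)       # area under the secondary drainage curve
a2 = -area(pc_imbibition, sw)    # area above the forced imbibition curve
usbm = np.log10(a1 / a2)         # > 0: water-wet, < 0: oil-wet
print(round(usbm, 3))            # -> 0.125 for these made-up curves
```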

  19. Regional Energy Planning Tool for Renewable Integrated Low-Energy District Heating Systems

    DEFF Research Database (Denmark)

    Tol, Hakan; Dincer, Ibrahim; Svendsen, Svend

    2013-01-01

    Low-energy district heating systems, operating at a low supply temperature of 55 °C and a return temperature of 25 °C, can be the energy solution for the prevailing heating infrastructure in urban areas, considering future energy schemes aiming at increased exploitation of renewable energy sources together with low-energy houses and intensified energy efficiency measures. Low-temperature operation makes it easy to exploit not only any type of heat source but also low-grade sources, i.e., renewable and industrial waste heat, which would otherwise be lost. In this chapter, a regional energy planning tool is described that considers various energy conversion systems based on renewable energy sources supplying an integrated energy infrastructure involving a low-energy district heating network, a district cooling network, and an electricity grid. The developed tool is applied to two case studies.

  20. Weight Estimation Tool for Children Aged 6 to 59 Months in Limited-Resource Settings.

    Science.gov (United States)

    Ralston, Mark E; Myatt, Mark A

    2016-01-01

    Background: A simple, reliable anthropometric tool for rapid estimation of weight in children would be useful in limited-resource settings, where current weight estimation tools are not uniformly reliable, nearly all global under-five mortality occurs, severe acute malnutrition is a significant contributor in approximately one-third of under-five mortality, and a weight scale may not be immediately available in emergencies to first-response providers. Objective: To determine the accuracy and precision of mid-upper arm circumference (MUAC) and height as weight estimation tools in children under five years of age in low-to-middle income countries. Design: This was a retrospective observational study. Data were collected in 560 nutritional surveys during 1992-2006 using a modified Expanded Program of Immunization two-stage cluster sample design. Setting: Locations with high prevalence of acute and chronic malnutrition. Participants: A total of 453,990 children met the inclusion criteria (age 6-59 months; weight ≤ 25 kg; MUAC 80-200 mm) and exclusion criteria (bilateral pitting edema; biologically implausible weight-for-height z-score (WHZ), weight-for-age z-score (WAZ), and height-for-age z-score (HAZ) values). Methods: Weight was estimated using the Broselow Tape, the Hong Kong formula, and database MUAC alone, height alone, and height and MUAC combined. Mean percentage difference between true and estimated weight, proportion of estimates accurate to within ± 25% and ± 10% of true weight, weighted Kappa statistic, and Bland-Altman bias were reported as measures of tool accuracy. Standard deviation of mean percentage difference and Bland-Altman 95% limits of agreement were reported as measures of tool precision. Results: Database height was a more accurate and precise predictor of weight compared to Broselow Tape 2007 [B], Broselow Tape 2011 [A], and MUAC. Mean percentage difference between true and estimated weight was +0.49% (SD = 10.33%); proportion of estimates accurate to within ± 25% of true weight was 97.36% (95% CI 97.40%, 97.46%); and
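
    The accuracy and precision measures used above are straightforward to compute. A minimal sketch with made-up weights follows (the study itself used 453,990 survey records):

```python
# Sketch: accuracy/precision metrics for a weight-estimation tool.
import numpy as np

true_kg = np.array([9.8, 12.4, 14.0, 10.5, 16.2])   # invented "scale" weights
est_kg = np.array([10.1, 11.9, 14.6, 10.0, 15.8])   # invented tool estimates

pct_diff = 100.0 * (est_kg - true_kg) / true_kg
print("mean % difference (accuracy/bias):", round(pct_diff.mean(), 2))
print("SD of % difference (precision):", round(pct_diff.std(ddof=1), 2))
for tol in (10, 25):
    within = np.abs(pct_diff) <= tol
    print(f"estimates within ±{tol}% of true weight:", 100.0 * within.mean(), "%")
```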

  1. Using a blog as an integrated eLearning tool and platform.

    Science.gov (United States)

    Goh, Poh Sun

    2016-06-01

    Technology enhanced learning or eLearning allows educators to expand access to educational content, promotes engagement with students and makes it easier for students to access educational material at a time, place and pace which suits them. The challenge for educators beginning their eLearning journey is to decide where to start, which includes the choice of an eLearning tool and platform. This article will share one educator's decision making process, and experience using blogs as a flexible and versatile integrated eLearning tool and platform. Apart from being a cost effective/free tool and platform, blogs offer the possibility of creating a hyperlinked indexed content repository, for both created and curated educational material; as well as a distribution and engagement tool and platform. Incorporating pedagogically sound activities and educational practices into a blog promote a structured templated teaching process, which can be reproduced. Moving from undergraduate to postgraduate training, educational blogs supported by a comprehensive online case-based repository offer the possibility of training beyond competency towards proficiency and expert level performance through a process of deliberate practice. By documenting educational content and the student engagement and learning process, as well as feedback and personal reflection of educational sessions, blogs can also form the basis for a teaching portfolio, and provide evidence and data of scholarly teaching and educational scholarship. Looking into the future, having a collection of readily accessible indexed hyperlinked teaching material offers the potential to do on the spot teaching with illustrative material called up onto smart surfaces, and displayed on holographic interfaces.

  2. Test Review for Preschool-Wide Evaluation Tool (PreSET) Manual: Assessing Universal Program-Wide Positive Behavior Support in Early Childhood

    Science.gov (United States)

    Rodriguez, Billie Jo

    2013-01-01

    The Preschool-Wide Evaluation Tool (PreSET; Steed & Pomerleau, 2012) is published by Paul H. Brookes Publishing Company in Baltimore, MD. The PreSET purports to measure universal and program-wide features of early childhood programs' implementation fidelity of program-wide positive behavior intervention and support (PW-PBIS) and is,…

  3. Interaction Between the Environment and Animals in Urban Settings: Integrated and Participatory Planning

    Science.gov (United States)

    Tarsitano, Elvira

    2006-11-01

    In urban ecosystems, the ecological system has become completely unbalanced; this, in turn, has led to an increase in well-known problems such as air pollution, ground pollution, and water pollution. This imbalance has also led to the growth and spread of pathogens harmful to man, animals, and plants. Urban sustainability indicators, both global and local, also “indicate” the percentage of population, but these refer only to the human population, not the animal population. Cities need good waste, water, and air management, effective traffic planning, and good zoning of businesses, crafts, and services; over and above these activities, cities also need planning that takes into account the existence of pets (dogs, cats, etc.) and nonpet animals (insects, birds, mice, etc.). Cities tend to be designed around humans and “on a human scale,” without taking into account the fact that a huge animal population is living side by side with people. That explains why overcrowding tends to go hand in hand with urbanization; all these populations, including humans, need to adapt to new spaces and often need to drastically change their behavior. This is a fact that must be included when drafting sustainable city plans. The proposed strategy is that of “integrated-participatory” control of the interactions between the environment and animals in cities. The strategy will focus on the development of integrated approaches and tools for environment and animal management in the context of urban settings. This will require such specific methods as ecological balance sheets and ecoplans for the planning, management, and control of the interrelation among environment, animals, and public health. The objective is to develop a better understanding of urban biodiversity and of urban ecosystem functioning, in order to understand and minimize the negative impacts of human activities on them. The research will focus on assessing and forecasting changes in urban biodiversity.

  4. Integrated Berth Allocation and Quay Crane Assignment Problem: Set partitioning models and computational results

    DEFF Research Database (Denmark)

    Iris, Cagatay; Pacino, Dario; Røpke, Stefan

    2015-01-01

    Most of the operational problems in container terminals are strongly interconnected. In this paper, we study the integrated Berth Allocation and Quay Crane Assignment Problem in seaport container terminals. We extend the current state-of-the-art by proposing novel set partitioning models. To improve the performance of the set partitioning formulations, a number of variable reduction techniques are proposed. Furthermore, we analyze the effects of different discretization schemes and the impact of using a time-variant/invariant quay crane allocation policy. Computational experiments show…
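
    For readers unfamiliar with the modelling style, a set partitioning formulation selects, from pre-generated feasible assignment columns, exactly one covering column per vessel at minimum total cost. The toy model below, built with the PuLP library, uses invented columns and costs; it sketches only the general form, not the paper's formulation.

```python
# Sketch: a tiny set partitioning model — pick columns so that each vessel is
# covered exactly once at minimum total cost.
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

vessels = ["V1", "V2"]
# columns: (id, vessels covered, cost) — e.g. a (berth window, crane profile)
columns = [
    ("c1", {"V1"}, 10.0),
    ("c2", {"V2"}, 12.0),
    ("c3", {"V1", "V2"}, 19.0),  # a combined plan serving both vessels
]

prob = LpProblem("berth_qc_set_partitioning", LpMinimize)
x = {cid: LpVariable(cid, cat=LpBinary) for cid, _, _ in columns}
prob += lpSum(cost * x[cid] for cid, _, cost in columns)  # objective
for v in vessels:  # partitioning: each vessel is in exactly one chosen column
    prob += lpSum(x[cid] for cid, covered, _ in columns if v in covered) == 1
prob.solve()
print({cid: int(var.value()) for cid, var in x.items()})  # -> c3 selected
```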

  5. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, and management of artifacts developed in a distributed environment. In this paper, we argue the need for a cloud-enabled platform for supporting GSD and propose a reference architecture of a cloud-based Platform for provisioning an ecosystem of Tools as a Service (PTaaS).

  6. Implementation of a competency assessment tool for agency nurses working in an acute paediatric setting.

    Science.gov (United States)

    Hennerby, Cathy; Joyce, Pauline

    2011-03-01

    This paper reports on the implementation of a competency assessment tool for registered general agency nurses working in an acute paediatric setting, using a change management framework. The increased number of registered general agency nurses working in an acute children's hospital alerted concerns around their competency in working with children. These concerns were initially raised via informal complaints about 'near misses', parental dissatisfaction, perceived competency weaknesses and rising cost associated with their use. Young's (2009) nine-stage change framework (Journal of Organisational Change, 22, 524-548) was used to guide the implementation of the competency assessment tool within a paediatric acute care setting. The ongoing success of the initiative, from a nurse manager's perspective, relies on structured communication with the agency provider before employing competent agency nurses. Sustainability of the change will depend on nurse managers' persistence in attending to the concerns of those resisting the change while simultaneously supporting those championing it. These key communication and supporting roles highlight the pivotal role held by nurse managers, as gatekeepers, in safeguarding children while in hospital. The leadership qualities of nurse managers will also be challenged in continuing to manage and drive the change where resistance might prevail. © 2011 The Authors. Journal compilation © 2011 Blackwell Publishing Ltd.

  7. An Innovative Model of Integrated Behavioral Health: School Psychologists in Pediatric Primary Care Settings

    Science.gov (United States)

    Adams, Carolyn D.; Hinojosa, Sara; Armstrong, Kathleen; Takagishi, Jennifer; Dabrow, Sharon

    2016-01-01

    This article discusses an innovative example of integrated care in which doctoral level school psychology interns and residents worked alongside pediatric residents and pediatricians in the primary care settings to jointly provide services to patients. School psychologists specializing in pediatric health are uniquely trained to recognize and…

  8. Developing Indicators for a Classroom Observation Tool on Pedagogy and Technology Integration: A Delphi Study

    Science.gov (United States)

    Elmendorf, Douglas C.; Song, Liyan

    2015-01-01

    Rapid advances in technology and increased access to technology tools have created new instructional demands and expectations on teachers. Due to the ubiquitous presence of technology in K-12 schools, teachers are being observed on both their pedagogical and technology integration practices. Applying the technological pedagogical and content…

  9. Integration at the Round Table: Marine Spatial Planning in Multi-Stakeholder Settings

    OpenAIRE

    Olsen, Erik; Fluharty, David; Hoel, Alf Håkon; Hostens, Kristian; Maes, Frank; Pecceu, Ellen

    2014-01-01

    Marine spatial planning (MSP) is often considered a pragmatic approach to implementing ecosystem-based management in order to manage marine space in a sustainable way. This requires the involvement of multiple actors and stakeholders at various governmental and societal levels. Several factors affect how well integrated management of marine waters can be achieved, such as different governance settings (division of power between central and local governments), economic activities (and ...

  10. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history already. Powered by the advances in the Internet technologies, the current generation of web systems are starting to expand into areas, traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web is compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms, offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.
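
    As a flavour of what such platforms automate, predictive toxicology model building typically means fitting a classifier to molecular descriptors and evaluating it with cross-validation. The sketch below uses scikit-learn with random placeholder descriptors and labels; it illustrates only the workflow shape, not any real model or platform from the review.

```python
# Sketch: descriptor-based model building with cross-validated evaluation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))        # 8 assumed molecular descriptors/compound
y = rng.integers(0, 2, size=200)     # placeholder toxic / non-toxic labels

model = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validation
print("mean CV accuracy:", round(scores.mean(), 3))
```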

  11. Tool Indicates Contact Angles In Bearing Raceways

    Science.gov (United States)

    Akian, Richard A.; Butner, Myles F.

    1995-01-01

    Tool devised for use in measuring contact angles between balls and races in previously operated ball bearings. Used on both inner and outer raceways of bearings having cross-sectional widths between approximately 0.5 and 2.0 in. Consists of integral protractor mounted in vertical plane on bracket equipped with leveling screws and circular level indicator. Protractor includes rotatable indicator needle and set of disks of various sizes to fit various raceway curvatures.

  12. Setting clear expectations for safety basis development

    International Nuclear Information System (INIS)

    MORENO, M.R.

    2003-01-01

    DOE-RL has set clear expectations for a cost-effective approach to achieving compliance with the Nuclear Safety Management requirements (10 CFR 830, Nuclear Safety Rule) which will ensure long-term benefit to Hanford. To facilitate implementation of these expectations, tools were developed to streamline and standardize safety analysis and safety document development, resulting in a shorter and more predictable DOE approval cycle. A Hanford Safety Analysis and Risk Assessment Handbook (SARAH) was issued to standardize methodologies for the development of safety analyses. A Microsoft Excel spreadsheet (RADIDOSE) was issued for the evaluation of radiological consequences for accident scenarios often postulated for Hanford. A standard Site Documented Safety Analysis (DSA) detailing the safety management programs was issued for use as a means of compliance with a majority of 3009 Standard chapters. An in-process review was developed between DOE and the Contractor to facilitate DOE approval and provide early course correction. As a result of setting expectations and providing safety analysis tools, the four Hanford Site waste management nuclear facilities were able to integrate into one Master Waste Management Documented Safety Analysis (WM-DSA).

  13. SAGES: a suite of freely-available software tools for electronic disease surveillance in resource-limited settings.

    Directory of Open Access Journals (Sweden)

    Sheri L Lewis

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  14. Imprecision and uncertainty in information representation and processing new tools based on intuitionistic fuzzy sets and generalized nets

    CERN Document Server

    Sotirov, Sotir

    2016-01-01

    The book offers a comprehensive and timely overview of advanced mathematical tools for both uncertainty analysis and the modeling of parallel processes, with a special emphasis on intuitionistic fuzzy sets and generalized nets. The different chapters, written by active researchers in their respective areas, are structured to provide a coherent picture of this interdisciplinary yet still evolving field of science. They describe key tools and give practical insights into, and research perspectives on, the use of Atanassov's intuitionistic fuzzy sets and logic, and generalized nets, for describing and dealing with uncertainty in different areas of science, technology and business, in a single book that is unique to date. Here, readers find theoretical chapters dealing with intuitionistic fuzzy operators, membership functions and algorithms, among other topics, as well as application-oriented chapters reporting on the implementation of methods and relevant case studies in management science, the IT industry, medicine and/or ...

  15. A Standardized Needs Assessment Tool to Inform the Curriculum Development Process for Pediatric Resuscitation Simulation-Based Education in Resource-Limited Settings

    Directory of Open Access Journals (Sweden)

    Nicole Shilkofski

    2018-02-01

    Introduction: Under-five mortality rates (UFMR) remain high for children in low- and middle-income countries (LMICs) in the developing world. Education for practitioners in these environments is a key factor to improve outcomes that will address United Nations Sustainable Development Goals 3 and 10 (good health and well-being; reduced inequalities). In order to appropriately contextualize a curriculum using simulation, it is necessary to first conduct a needs assessment of the target learner population. The World Health Organization (WHO) has published a tool to assess capacity for emergency and surgical care in LMICs that is adaptable to this goal. Materials and methods: The WHO Tool for Situational Analysis to Assess Emergency and Essential Surgical Care was modified to assess pediatric resuscitation capacity in clinical settings in two LMICs: Uganda and Myanmar. Modifications included assessment of self-identified learning needs, current practices, and perceived epidemiology of disease burden in each clinical setting, in addition to assessment of pediatric resuscitation capacity in regard to infrastructure, procedures, equipment, and supplies. The modified tool was administered to 94 respondents from the two settings who were target learners of a proposed simulation-based curriculum in pediatric and neonatal resuscitation. Results: Infectious diseases (respiratory illnesses and diarrheal disease) were cited as the most common causes of pediatric deaths in both countries. Self-identified learning needs included knowledge and skill development in pediatric airway/breathing topics, as well as general resuscitation topics such as CPR and fluid resuscitation in shock. Equipment and supply availability varied substantially between settings, and critical shortages were identified in each setting. Current practices and procedures were often limited by equipment availability or infrastructural considerations. Discussion and conclusion: Epidemiology of disease…

  16. Teaching Process Design through Integrated Process Synthesis

    Science.gov (United States)

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  17. The Virtual UNICOS Process Expert: integration of Artificial Intelligence tools in Control Systems

    CERN Multimedia

    Vilches Calvo, I; Barillere, R

    2009-01-01

    UNICOS is a CERN framework for producing control applications. It provides operators with ways to interact with all process items, from the simplest (e.g. I/O channels) to the most abstract objects (e.g. a part of the plant). This possibility of fine-grained operation is particularly useful for recovering from abnormal situations, provided operators have the required knowledge. The Virtual UNICOS Process Expert project aims at providing operators with means to handle difficult operation cases for which the intervention of process experts is usually requested. The main idea of the project is to use the openness of UNICOS-based applications to integrate tools (e.g. Artificial Intelligence tools) which will act as Process Experts to analyze complex situations, and to propose and execute smooth recovery procedures.

  18. Developing an integration tool for soil contamination assessment

    Science.gov (United States)

    Anaya-Romero, Maria; Zingg, Felix; Pérez-Álvarez, José Miguel; Madejón, Paula; Kotb Abd-Elmabod, Sameh

    2015-04-01

    In recent decades, huge soil areas have been negatively influenced or altered in multiple ways. Soils and, consequently, underground water have been contaminated by the accumulation of contaminants from agricultural activities (fertilizers and pesticides), industrial activities (harmful material dumping, sludge, fly ash) and urban activities (hydrocarbons, metals from vehicle traffic, urban waste dumping). In the framework of the RECARE project, local partners across Europe are focusing on a wide range of soil threats, such as soil contamination, and aiming to develop effective prevention, remediation and restoration measures by designing and applying targeted land management strategies (van Lynden et al., 2013). In this context, the Guadiamar Green Corridor (Southern Spain) was used as a case study, aiming to obtain soil data and new information in order to assess soil contamination. The main threat in the Guadiamar valley is soil contamination after a mine spill that occurred in April 1998. About four hm3 of acid water and two hm3 of mud, rich in heavy metals, were released into the Agrio and Guadiamar rivers, affecting more than 4,600 ha of agricultural and pasture land. The main trace elements contaminating soil and water were As, Cd, Cu, Pb, Tl and Zn. The objective of the present research is to develop informatics tools that integrate soil databases, models and interactive platforms for soil contamination assessment. Preliminary results relate to the compilation of harmonized databases including geographical, hydro-meteorological, soil and socio-economic variables, based on spatial analysis and stakeholder consultation. Further research will address modelling and upscaling to the European level, in order to obtain a scientific-technical predictive tool for the assessment of soil contamination.

  19. Integrated management tool for controls software problems, requests and project tasking at SLAC

    International Nuclear Information System (INIS)

    Rogind, D.; Allen, W.; Colocho, W.; DeContreras, G.; Gordon, J.; Pandey, P.; Shoaee, H.

    2012-01-01

    The Accelerator Directorate (AD) Instrumentation and Controls (ICD) Software (SW) Department at SLAC, with its service center model, continuously receives engineering requests to design, build and support controls for accelerator systems lab-wide. Each customer request can vary in complexity from a small software engineering change to a major enhancement. SLAC's Accelerator Improvement Projects (AIPs), along with DOE construction projects, also contribute heavily to the workload. The various customer requests and projects, paired with ongoing operational maintenance and problem reports, place a demand on the department that consistently exceeds the capacity of available resources. A centralized repository - comprising all requests, project tasks, and problems - available to physicists, operators, managers, and engineers alike, is essential to capture, communicate, prioritize, assign, schedule, track, and finally commission all work components. The Software Department has recently integrated request and project tasking into SLAC's custom online problem-tracking tool, the 'Comprehensive Accelerator Tool for Enhancing Reliability' (CATER). This paper discusses the newly implemented software request management tool - the workload it helps to track, its structure, features, reports, workflow and its many usages. (authors)

  20. Decision support and data warehousing tools boost competitive advantage.

    Science.gov (United States)

    Waldo, B H

    1998-01-01

    The ability to communicate across the care continuum is fast becoming an integral component of the successful health enterprise. As integrated delivery systems are formed and patient care delivery is restructured, health care professionals must be able to distribute, access, and evaluate information across departments and care settings. The Aberdeen Group, a computer and communications research and consulting organization, believes that "the single biggest challenge for next-generation health care providers is to improve on how they consolidate and manage information across the continuum of care. This involves building a strategic warehouse of clinical and financial information that can be shared and leveraged by health care professionals, regardless of the location or type of care setting" (Aberdeen Group, Inc., 1997). The value and importance of data and systems integration are growing. Organizations that create a strategy and implement DSS tools to provide decision-makers with the critical information they need to face the competition and maintain quality and costs will have the advantage.

  1. Piloting a programme tool to evaluate malaria case investigation and reactive case detection activities: results from 3 settings in the Asia Pacific.

    Science.gov (United States)

    Cotter, Chris; Sudathip, Prayuth; Herdiana, Herdiana; Cao, Yuanyuan; Liu, Yaobao; Luo, Alex; Ranasinghe, Neil; Bennett, Adam; Cao, Jun; Gosling, Roly D

    2017-08-22

    Case investigation and reactive case detection (RACD) activities are widely used in low-transmission settings to determine the suspected origin of infection and to identify and treat malaria infections near the index patient household. Case investigation and RACD activities are time and resource intensive, include methodologies that vary across eliminating settings, and have no standardized metrics or tools available to monitor and evaluate them. In response to this gap, a simple programme tool was developed for monitoring and evaluating (M&E) RACD activities and piloted by national malaria programmes. During the development phase, four modules of the RACD M&E tool were created to assess and evaluate key case investigation and RACD activities and costs. A pilot phase was then carried out by programme implementers between 2013 and 2015, during which malaria surveillance teams in three different settings (China, Indonesia, Thailand) piloted the tool over a period of 3 months each. This study describes summary results of the pilots and the feasibility and impact of the tool on programmes. All three study areas implemented the RACD M&E tool modules, and pilot users reported that the tool and evaluation process were helpful for identifying gaps in RACD programme activities. In the 45 health facilities evaluated, 71.8% (97/135; min 35.3-max 100.0%) of the proper notification and reporting forms and 20.0% (27/135; min 0.0-max 100.0%) of standard operating procedures (SOPs) were available to support malaria elimination activities. The tool highlighted gaps in key data indicators on completeness of reporting for malaria cases (98.8%; min 93.3-max 100.0%), case investigations (65.6%; min 61.8-max 78.4%) and RACD activities (70.0%; min 64.7-max 100.0%). Evaluation of the SOPs showed that knowledge and practices of malaria personnel varied within and between study areas. Average monthly costs for conducting case investigation and RACD activities showed variation between study areas.

  2. Integrating the nursing management minimum data set into the logical observation identifier names and codes system.

    Science.gov (United States)

    Subramanian, Amarnath; Westra, Bonnie; Matney, Susan; Wilson, Patricia S; Delaney, Connie W; Huff, Stanley M; Huber, Diane

    2008-11-06

    This poster describes the process used to integrate the Nursing Management Minimum Data Set (NMMDS), an instrument to measure the nursing context of care, into the Logical Observation Identifier Names and Codes (LOINC) system to facilitate contextualization of quality measures. Integration of the first three of 18 elements resulted in 48 new codes including five panels. The LOINC Clinical Committee has approved the presented mapping for their next release.

  3. ATD-1 ATM Technology Demonstration-1 and Integrated Scheduling

    Science.gov (United States)

    Quon, Leighton

    2014-01-01

    The goal is to enable efficient arrivals for the NextGen Air Traffic Management System and to develop a set of integrated decision support tools that reduce the high cognitive workload, so that controllers are able to simultaneously achieve safe, efficient, and expedient operations at high traffic demand levels.

  4. Onwards and upwards: the evolution of integrated UAV solutions

    Directory of Open Access Journals (Sweden)

    Jean-Christophe Zufferey

    2017-12-01

    Tools need to be integrated, complete solutions, marking a move away from users seeing drones in isolation. Throughout 2018, this investment in end-to-end solutions is set to address businesses' key operational challenges, deliver a strong return on investment and streamline adherence to emerging regulations.

  5. Molecular dynamics simulation of subnanometric tool-workpiece contact on a force sensor-integrated fast tool servo for ultra-precision microcutting

    International Nuclear Information System (INIS)

    Cai, Yindi; Chen, Yuan-Liu; Shimizu, Yuki; Ito, So; Gao, Wei; Zhang, Liangchi

    2016-01-01

    Highlights:
    • Subnanometric contact between a diamond tool and a copper workpiece surface is investigated by MD simulation.
    • A multi-relaxation time technique is proposed to eliminate the influence of the atom vibrations.
    • The accuracy of the elastic-plastic transition contact depth estimation is improved by observing the residual defects.
    • The simulation results are beneficial for optimization of the next-generation microcutting instruments.
    Abstract: This paper investigates the contact characteristics between a copper workpiece and a diamond tool in a force sensor-integrated fast tool servo (FS-FTS) for single point diamond microcutting and in-process measurement of ultra-precision surface forms of the workpiece. Molecular dynamics (MD) simulations are carried out to identify the subnanometric elastic-plastic transition contact depth, at which the plastic deformation in the workpiece is initiated. This critical depth can be used to optimize the FS-FTS as well as the cutting/measurement process. It is clarified that the vibrations of the copper atoms in the MD model have a great influence on the subnanometric MD simulation results. A multi-relaxation time method is then proposed to reduce the influence of the atom vibrations, based on the fact that the dominant vibration component has a certain period determined by the size of the MD model. It is also identified that, for a subnanometric contact depth, the position of the tool tip at which the contact force is zero during the retraction of the tool does not correspond to the final depth of the permanent contact impression on the workpiece surface. The accuracy of the identification of the transition contact depth is then improved by observing the residual defects on the workpiece surface after tool retraction.
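
    The multi-relaxation time idea (average the simulated signal over the dominant vibration period, which is set by the MD model size, so that atomic vibrations do not mask the contact response) can be pictured as a one-period moving average. The signal and period in the sketch below are synthetic assumptions, not MD output.

```python
# Sketch: suppress a known-period vibration with a one-period moving average.
import numpy as np

dt = 1e-3                 # assumed sampling step (arbitrary units)
period = 0.05             # assumed dominant vibration period
t = np.arange(0.0, 1.0, dt)
force = 0.2 * t + 0.05 * np.sin(2 * np.pi * t / period)  # drift + vibration

window = int(round(period / dt))          # samples per vibration period
kernel = np.ones(window) / window
smoothed = np.convolve(force, kernel, mode="same")

# Away from the edges the vibration largely cancels, leaving the drift term.
residual = np.abs(smoothed - 0.2 * t)[window:-window].max()
print("max residual ripple:", float(residual))
```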

  6. DESIGN ANALYSIS OF ELECTRICAL MACHINES THROUGH INTEGRATED NUMERICAL APPROACH

    Directory of Open Access Journals (Sweden)

    ARAVIND C.V.

    2016-02-01

    An integrated design platform for newer types of machines is presented in this work. The machine parameters are evaluated using the developed modelling tool. With these parameters, the machine is modelled using a computer-aided tool. The designed machine is then brought into a simulation tool to perform electromagnetic and electromechanical analysis. In the simulation, condition settings are performed to set up the materials, meshes, rotational speed and the excitation circuit. Electromagnetic analysis is carried out to predict the behavior of the machine based on the movement of flux in the machine. In addition, electromechanical analysis is carried out to analyse the speed-torque characteristic, the current-torque characteristic and the phase angle-torque characteristic. After all the results are analysed, the designed machine is used to generate an S-block function compatible with the MATLAB/SIMULINK tool for the dynamic operational characteristics. This allows the integration of an existing drive system into the new machines designed in the modelling tool. An example machine design is presented to validate the usage of such a tool.

  7. State-of-the-art Review : Vol. 2B. Methods and Tools for Designing Integrated Building Concepts

    DEFF Research Database (Denmark)

    van der Aa, Ad; Andresen, Inger; Asada, Hideo

    of integrated building concepts and responsive building elements. Finally, the report gives a description of uncertainty modelling in building performance assessment. The descriptions of the design methods and tools include an explanation of how the methods may be applied, any experiences gained by using

  8. Integrating Web-Based Teaching Tools into Large University Physics Courses

    Science.gov (United States)

    Toback, David; Mershin, Andreas; Novikova, Irina

    2005-12-01

    Teaching students in our large, introductory, calculus-based physics courses to be good problem-solvers is a difficult task. Not only must students be taught to understand and use the physics concepts in a problem, they must become adept at turning the physical quantities into symbolic variables, translating the problem into equations, and "turning the crank" on the mathematics to find both a closed-form solution and a numerical answer. Physics education research has shown that students' poor math skills and instructors' lack of pen-and-paper homework grading resources, two problems we face at our institution, can have a significant impact on problem-solving skill development [2-4]. While Interactive Engagement methods appear to be the preferred mode of instruction [5], for practical reasons we have not been able to widely implement them. In this paper, we describe three Internet-based "teaching-while-quizzing" tools we have developed and how they have been integrated into our traditional lecture course in powerful but easy to incorporate ways [6]. These are designed to remediate students' math deficiencies, automate homework grading, and guide study time toward problem solving. Our intent is for instructors who face similar obstacles to adopt these tools, which are available upon request [7].

  9. INTEGRATED SFM TECHNIQUES USING DATA SET FROM GOOGLE EARTH 3D MODEL AND FROM STREET LEVEL

    Directory of Open Access Journals (Sweden)

    L. Inzerillo

    2017-08-01

    Structure from motion (SfM) is a widespread photogrammetric method that applies photogrammetric rules to build a 3D model from a collection of photographs. Some complex ancient buildings, such as cathedrals, theatres, or castles, require the street-level data set to be complemented with a UAV one in order to reconstruct the roof in 3D. Nevertheless, the use of UAVs is strongly limited by government rules. In recent years, Google Earth (GE) has been enriched with 3D models of sites on Earth. For this reason, it seemed convenient to test the potential offered by GE for extracting a data set that replaces the UAV function and completes the aerial building data set, using screen images of high-resolution 3D models. Users can take unlimited "aerial photos" of a scene while flying around in GE at any viewing angle and altitude. The challenge is to verify the metric reliability of the SfM model carried out with an integrated data set (the one from street level and the one from GE), aimed at replacing the use of UAVs in an urban context. This model is called the integrated GE SfM model (i-GESfM). In this paper a case study is presented: the Cathedral of Palermo.

  10. Value-based integrated (renal) care: setting a development agenda for research and implementation strategies.

    Science.gov (United States)

    Valentijn, Pim P; Biermann, Claus; Bruijnzeels, Marc A

    2016-08-02

    Integrated care services are considered a vital strategy for improving the Triple Aim values for people with chronic kidney disease. However, a solid scholarly explanation of how to develop, implement and evaluate such value-based integrated renal care services is limited. The aim of this study was to develop a framework to identify the strategies and outcomes for the implementation of value-based integrated renal care. First, the theoretical foundations of the Rainbow Model of Integrated Care and the Triple Aim were united into one overarching framework through an iterative process of key-informant consultations. Second, a rapid review was conducted to identify the published research on integrated renal care; the Cochrane Library, Medline, Scopus, and Business Source Premier databases were searched for pertinent articles published between 2000 and 2015. Based on the framework, a coding schema was developed to synthesize the included articles. The overarching framework distinguishes the integrated care domains: 1) type of integration, 2) enablers of integration, and the interrelated outcome domains: 3) experience of care, 4) population health and 5) costs. The literature synthesis indicated that integrated renal care implementation strategies have particularly focused on micro clinical processes and physical outcomes, while little emphasis has been placed on meso organisational as well as macro system integration processes. In addition, evidence regarding patients' perceived outcomes and economic outcomes has been weak. These results underscore that the future challenge for researchers is to explore which integrated care implementation strategies achieve better health and improved experience of care at a lower cost within a specific context. For this purpose, this study's framework and evidence synthesis have set a developmental agenda for both integrated renal care practice and research. Accordingly, we plan further work to develop an implementation

  11. Development of an integrated knowledge-base and its management tool for computerized alarm processing system

    International Nuclear Information System (INIS)

    Heo, Gyun Young; Choi, Seong Soo; Kim, Han Gon; Chang, Soon Heung

    1997-01-01

    For a long time, a number of alarm processing techniques have been researched to reduce the number of actuated alarms that operators must deal with during abnormal as well as normal conditions. However, actual alarm processing systems have adopted the strategy of using only a few well-established techniques as part of the alarm annunciation system, for reasons of effectiveness and reliability. As a consequence, alarm processing systems suffer from difficult knowledge-base maintenance and limited possibilities for expansion or enhancement. To overcome these shortcomings, an integrated knowledge-base that can express the general information related to all the alarm processing techniques is proposed, and its management tool, Knowledge Input Tool for Alarm (KIT-A), which can handle the data of the knowledge-base efficiently, is developed. Since the integrated knowledge-base with KIT-A can manipulate all the alarm information without modifying the alarm processing system itself, it is expected to considerably advance the overall maintainability and extensibility of alarm processing systems.

  12. Systematic Review: Concept and Tool Development with Application in the Integrated Risk Information System (IRIS) Assessment Process

    Science.gov (United States)

    Systematic Review: Concept and tool development with application to the National Toxicology Program (NTP) and the Integrated Risk Information System (IRIS) Assessment Processes. There is growing interest within the environmental health community to incorporate systematic review methods ...

  13. Gaussian process regression for tool wear prediction

    Science.gov (United States)

    Kong, Dongdong; Chen, Yongjie; Li, Ning

    2018-05-01

    To realize and accelerate the pace of intelligent manufacturing, this paper presents a novel tool wear assessment technique based on integrated radial basis function based kernel principal component analysis (KPCA_IRBF) and Gaussian process regression (GPR) for accurate, real-time monitoring of the in-process tool wear parameter (flank wear width). KPCA_IRBF is a new nonlinear dimension-increment technique, proposed here for the first time for feature fusion. The tool wear predictive value and the corresponding confidence interval are both provided by the GPR model. Moreover, GPR performs better than artificial neural networks (ANN) and support vector machines (SVM) in prediction accuracy, since Gaussian noise can be modeled quantitatively in the GPR model. However, the presence of noise seriously affects the stability of the confidence interval. In this work, the proposed KPCA_IRBF technique helps to remove the noise and weaken its negative effects, so that the confidence interval is greatly compressed and smoothed, which is conducive to monitoring the tool wear accurately. Moreover, the kernel parameter in KPCA_IRBF can be selected from a much larger region than in the conventional KPCA_RBF technique, which helps to improve the efficiency of model construction. Ten sets of cutting tests are conducted to validate the effectiveness of the presented tool wear assessment technique. The experimental results show that the in-process flank wear width of tool inserts can be monitored accurately by the presented technique, which is robust under a variety of cutting conditions. This study lays the foundation for tool wear monitoring in real industrial settings.
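
    For orientation, the pipeline the abstract describes (a nonlinear feature mapping followed by a GP regressor that returns both a mean prediction and a confidence interval) can be sketched with off-the-shelf components. The snippet below is illustrative only: it uses scikit-learn's KernelPCA and GaussianProcessRegressor on synthetic placeholder data, not the paper's KPCA_IRBF construction.

        # A minimal sketch, not the paper's KPCA_IRBF method: kernel-PCA features
        # feed a Gaussian process regressor that returns a predictive mean and a
        # ~95% confidence interval for the wear target. All data are synthetic.
        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 12))                 # stand-in force/vibration features
        y = X[:, 0] ** 2 + 0.1 * rng.normal(size=200)  # stand-in flank wear width

        kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1)
        Z = kpca.fit_transform(X)

        # WhiteKernel models Gaussian measurement noise explicitly, which is what
        # allows the GPR to quantify noise instead of folding it into the signal.
        gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
        gpr.fit(Z, y)

        mean, std = gpr.predict(kpca.transform(X[:5]), return_std=True)
        lower, upper = mean - 1.96 * std, mean + 1.96 * std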

  14. Design and control of integrated chromatography column sequences.

    Science.gov (United States)

    Andersson, Niklas; Löfgren, Anton; Olofsson, Marianne; Sellberg, Anton; Nilsson, Bernt; Tiainen, Peter

    2017-07-01

    To increase productivity in biopharmaceutical production, a natural step is to introduce integrated continuous biomanufacturing, which leads to fewer buffer and storage tanks, smaller integrated unit operations, and full automation of the operation. The main contribution of this work is to illustrate a methodology for the design and control of a downstream process based on integrated column sequences. For small-scale production, for example pre-clinical studies, integrated column sequences can be implemented on a single chromatography system, which makes for a very efficient drug development platform. The proposed methodology is composed of four steps and is governed by a set of tools, presented here, that makes the transition from batch separations to a complete integrated separation sequence as easy as possible. The methodology, its associated tools and the physical implementation are presented and illustrated with a case study in which the target protein is separated from impurities through an integrated four-column sequence. This article shows that the design and control of an integrated column sequence were successfully implemented for a tertiary protein separation problem. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:923-930, 2017.

  15. oPOSSUM: integrated tools for analysis of regulatory motif over-representation

    Science.gov (United States)

    Ho Sui, Shannan J.; Fulton, Debra L.; Arenillas, David J.; Kwon, Andrew T.; Wasserman, Wyeth W.

    2007-01-01

    The identification of over-represented transcription factor binding sites from sets of co-expressed genes provides insights into the mechanisms of regulation for diverse biological contexts. oPOSSUM, an internet-based system for such studies of regulation, has been improved and expanded in this new release. New features include a worm-specific version for investigating binding sites conserved between Caenorhabditis elegans and C. briggsae, as well as a yeast-specific version for the analysis of co-expressed sets of Saccharomyces cerevisiae genes. The human and mouse applications feature improvements in ortholog mapping, sequence alignments and the delineation of multiple alternative promoters. oPOSSUM2, introduced for the analysis of over-represented combinations of motifs in human and mouse genes, has been integrated with the original oPOSSUM system. Analysis using user-defined background gene sets is now supported. The transcription factor binding site models have been updated to include new profiles from the JASPAR database. oPOSSUM is available at http://www.cisreg.ca/oPOSSUM/ PMID:17576675
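
    A common statistic behind this kind of over-representation analysis is a one-tailed Fisher exact test comparing how often a transcription factor's predicted binding sites occur in the foreground (co-expressed) gene set versus a background set. The sketch below shows the shape of that calculation with invented counts; it is not oPOSSUM's own code.

        # Hypothetical counts: genes with at least one predicted site for a TF,
        # in a foreground set of co-expressed genes vs. a background set.
        from scipy.stats import fisher_exact

        hits_fg, n_fg = 42, 120
        hits_bg, n_bg = 310, 2400

        table = [[hits_fg, n_fg - hits_fg],
                 [hits_bg, n_bg - hits_bg]]
        odds, p = fisher_exact(table, alternative="greater")  # one-tailed
        print(f"odds ratio {odds:.2f}, p = {p:.2e}")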

  16. Safe food and feed through an integrated toolbox for mycotoxin management: the MyToolBox approach

    NARCIS (Netherlands)

    Krska, R.; Nijs, De M.; Mcnerney, O.; Pichler, M.; Gilbert, J.; Edwards, S.; Suman, M.; Magan, N.; Rossi, V.; Fels, van der Ine; Bagi, F.; Poschmaier, B.; Sulyok, M.; Berthiller, F.; Egmond, Van H.P.

    2016-01-01

    There is a pressing need to mobilise the wealth of knowledge from the international mycotoxin research conducted over the past 25-30 years, and to perform cutting-edge research where knowledge gaps still exist. This knowledge needs to be integrated into affordable and practical tools for farmers and

  17. First records of tool-set use for ant-dipping by Eastern chimpanzees (Pan troglodytes schweinfurthii) in the Kalinzu Forest Reserve, Uganda.

    Science.gov (United States)

    Hashimoto, Chie; Isaji, Mina; Koops, Kathelijne; Furuichi, Takeshi

    2015-10-01

    Chimpanzees at numerous study sites are known to prey on army ants by using a single wand to dip into the ant nest or column. However, in Goualougo (Republic of Congo) in Central Africa, chimpanzees use a different technique, use of a woody sapling to perforate the ant nest, then use of a herb stem as dipping tool to harvest the army ants. Use of a tool set has also been found in Guinea, West Africa: at Seringbara in the Nimba Mountains and at nearby Bossou. There are, however, no reports for chimpanzees in East Africa. We observed use of such a tool set in Kalinzu, Uganda, for the first time by Eastern chimpanzees. This behavior was observed among one group of chimpanzees at Kalinzu (S-group) but not among the adjacent group (M-group) with partly overlapping ranging areas despite the fact that the latter group has been under intensive observation since 1997. In Uganda, ant-dipping has not been observed in the northern three sites (Budongo, Semliki, and Kibale) but has been observed or seems to occur in the southern sites (Kalinzu and Bwindi), which suggests that ant-dipping was invented by and spread from the southern region after the northern and southern forest blocks became separated. Use of a tool-set by only one group at Kalinzu further suggests that this behavior was recently invented and has not yet spread to the other group via migrating females.

  18. Development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool: An Evidence-Based Model for School Garden Integration.

    Science.gov (United States)

    Burt, Kate Gardner; Koch, Pamela; Contento, Isobel

    2017-10-01

    Researchers have established the benefits of school gardens on students' academic achievement, dietary outcomes, physical activity, and psychosocial skills, yet limited research has been conducted about how school gardens become institutionalized and sustained. Our aim was to develop a tool that captures how gardens are effectively established, integrated, and sustained in schools. We conducted a sequential, exploratory, mixed-methods study. Participants were identified with the help of Grow To Learn, the organization coordinating the New York City school garden initiative, and recruited via e-mail. A stratified, purposeful sample of 21 New York City elementary and middle schools participated in this study throughout the 2013/2014 school year. The sample was stratified by garden budget and purposeful in that each school's garden was determined to be well integrated and sustained. The processes and strategies used by school gardeners to establish well-integrated school gardens were assessed via data collected from surveys, interviews, observations, and concept mapping. Descriptive statistics as well as multidimensional scaling and hierarchical cluster analysis were used to examine the survey and concept mapping data. Qualitative data analysis consisted of thematic coding, pattern matching, explanation building and cross-case synthesis. Nineteen components within four domains of school garden integration were found through the mixed-methods concept mapping analysis. When the analyses of the other data were combined, relationships between domains and components emerged. These data resulted in the development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool. When schools with integrated and sustained gardens were studied, patterns emerged about how gardeners achieve institutionalization through different combinations of critical components. These patterns are best described by the GREEN Tool, the first framework to identify how to
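
    The quantitative arm of concept mapping mentioned above, multidimensional scaling followed by hierarchical cluster analysis, can be sketched as follows. The 6x6 statement-similarity matrix is invented and the libraries are our choice; the study's actual software is not specified here.

        # Toy concept-mapping analysis: embed statements by MDS on a
        # precomputed dissimilarity matrix, then cluster the 2-D coordinates.
        import numpy as np
        from sklearn.manifold import MDS
        from scipy.cluster.hierarchy import linkage, fcluster

        sim = np.array([[1.0, 0.8, 0.7, 0.2, 0.1, 0.2],
                        [0.8, 1.0, 0.6, 0.3, 0.2, 0.1],
                        [0.7, 0.6, 1.0, 0.2, 0.3, 0.2],
                        [0.2, 0.3, 0.2, 1.0, 0.7, 0.8],
                        [0.1, 0.2, 0.3, 0.7, 1.0, 0.6],
                        [0.2, 0.1, 0.2, 0.8, 0.6, 1.0]])
        dist = 1.0 - sim                                  # similarity -> distance

        coords = MDS(n_components=2, dissimilarity="precomputed",
                     random_state=0).fit_transform(dist)
        clusters = fcluster(linkage(coords, method="ward"),
                            t=2, criterion="maxclust")    # two thematic clusters
        print(clusters)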

  19. The Plant Genome Integrative Explorer Resource: PlantGenIE.org.

    Science.gov (United States)

    Sundell, David; Mannapperuma, Chanaka; Netotea, Sergiu; Delhomme, Nicolas; Lin, Yao-Cheng; Sjödin, Andreas; Van de Peer, Yves; Jansson, Stefan; Hvidsten, Torgeir R; Street, Nathaniel R

    2015-12-01

    Accessing and exploring large-scale genomics data sets remains a significant challenge to researchers without specialist bioinformatics training. We present the integrated PlantGenIE.org platform for exploration of Populus, conifer and Arabidopsis genomics data, which includes expression networks and associated visualization tools. Standard features of a model organism database are provided, including genome browsers, gene list annotation, Blast homology searches and gene information pages. Community annotation updating is supported via integration of WebApollo. We have produced an RNA-sequencing (RNA-Seq) expression atlas for Populus tremula and have integrated these data within the expression tools. An updated version of the ComPlEx resource for performing comparative plant expression analyses of gene coexpression network conservation between species has also been integrated. The PlantGenIE.org platform provides intuitive access to large-scale and genome-wide genomics data from model forest tree species, facilitating both community contributions to annotation improvement and tools supporting use of the included data resources to inform biological insight. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.

  20. Analytic Tools for Feynman Integrals

    CERN Document Server

    Smirnov, Vladimir A

    2012-01-01

    The goal of this book is to describe the most powerful methods for evaluating multiloop Feynman integrals that are currently used in practice.  This book supersedes the author’s previous Springer book “Evaluating Feynman Integrals” and its textbook version “Feynman Integral Calculus.” Since the publication of these two books, powerful new methods have arisen and conventional methods have been improved on in essential ways. A further qualitative change is the fact that most of the methods and the corresponding algorithms have now been implemented in computer codes which are often public. In comparison to the two previous books, three new chapters have been added:  One is on sector decomposition, while the second describes a new method by Lee. The third new chapter concerns the asymptotic expansions of Feynman integrals in momenta and masses, which were described in detail in another Springer book, “Applied Asymptotic Expansions in Momenta and Masses,” by the author. This chapter describes, on t...

  1. To select the best tool for generating 3D maintenance data and to set the detailed process for obtaining the 3D maintenance data

    Science.gov (United States)

    Prashanth, B. N.; Roy, Kingshuk

    2017-07-01

    Three-dimensional (3D) maintenance data provide a link between design and technical documentation, creating interactive 3D graphical training and maintenance material. It is difficult for an operator to constantly page through huge paper manuals or run back to a computer while maintaining a machine, which makes the maintenance work fatiguing. This being the case, a 3D animation makes maintenance work very simple, since there is no language barrier. The research deals with the generation of 3D maintenance data for any given machine. The best tool for obtaining the 3D maintenance data is selected and analyzed. Using the same tool, a detailed process for extracting the 3D maintenance data for any machine is set out. This project aims at selecting the best tool for obtaining 3D maintenance data and at defining the detailed process for obtaining it. 3D maintenance reduces the use of large volumes of manuals, which cause human errors and make the work of an operator fatiguing. Hence 3D maintenance helps in training and maintenance and increases productivity. 3Dvia, when compared with Cortona 3D and Deep Exploration, proves to be better than both. 3Dvia is good in data translation and has the best renderings of the three 3D maintenance software tools. 3Dvia is very user friendly and has various options for creating 3D animations. Its Interactive Electronic Technical Publication (IETP) integration is also better than that of the other two. Hence 3Dvia proves to be the best software for obtaining 3D maintenance data for any machine.

  2. The process of care in integrative health care settings - a qualitative study of US practices.

    Science.gov (United States)

    Grant, Suzanne J; Bensoussan, Alan

    2014-10-23

    There is a lack of research on the organisational operations of integrative healthcare (IHC) practices. IHC is a therapeutic strategy integrating conventional and complementary medicine in a shared context to administer individualized treatment. To better understand the process of care in IHC - the way in which patients are triaged and treatment plans are constructed - interviews were conducted with integrative health care leaders and practitioners in the US. Semi-structured interviews were conducted with a pragmatic group of fourteen leaders and practitioners from nine different IHC settings. All interviews were conducted face-to-face with the exception of one phone interview. Questions focused on understanding the "process of care" in an integrative healthcare setting. Deductive categories were formed from the aims of the study, focusing on: organisational structure, processes of care (subcategories: patient intake, treatment and charting, use of guidelines or protocols), prevalent diseases or conditions treated, and the role of research in the organisation. The similarities and differences of the IHC entities emerged from this process. On an organisational level, conventional and CM services and therapies were co-located in all nine settings. For patients, this means there is more opportunity for 'seamless care'. Shared information systems enabled easy communication using internal messaging or email systems, and shared patient intake information. But beyond this infrastructure, alignment for integrative health care was less supported. There was no use of protocols or guidelines within any centre, and no patient monitoring mechanism beyond that which occurred within one-on-one appointments. Joint planning for a patient's treatment was typically ad hoc, through informal mechanisms. Additional duties typically come at a direct financial cost to fee-for-service practitioners. In contrast, service delivery and the process of care within hospital inpatient services followed

  3. Design Tools for Reconfigurable Hardware in Orbit (RHinO)

    Science.gov (United States)

    French, Mathew; Graham, Paul; Wirthlin, Michael; Larchev, Gregory; Bellows, Peter; Schott, Brian

    2004-01-01

    The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. These tools leverage an established FPGA design environment and focus primarily on space-effects mitigation and power optimization. The project is creating software to automatically test and evaluate the single-event-upset (SEU) sensitivities of an FPGA design and insert mitigation techniques. Extensions to the tool suite will also allow evolvable algorithm techniques to reconfigure around single-event-latchup (SEL) events. In the power domain, tools are being created for dynamic power visualization and optimization. Thus, this technology seeks to enable the use of Reconfigurable Hardware in Orbit, via an integrated design tool-suite aiming to reduce the risk, cost, and design time of multimission reconfigurable space processors using SRAM-based FPGAs.
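
    The canonical SEU mitigation such a tool can insert is triple modular redundancy (TMR), in which logic is replicated three times and the outputs are majority-voted. The Python fragment below is only a behavioral illustration of the voting idea, not the FPGA netlist transformation itself.

        def tmr_vote(a: int, b: int, c: int) -> int:
            """Bitwise 2-of-3 majority: a single upset replica is out-voted."""
            return (a & b) | (b & c) | (a & c)

        golden = 0b1011_0010
        upset = golden ^ 0b0000_0100       # one replica hit by an SEU (bit flip)
        assert tmr_vote(golden, upset, golden) == golden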

  4. Visual-haptic integration with pliers and tongs: signal ‘weights’ take account of changes in haptic sensitivity caused by different tools

    Directory of Open Access Journals (Sweden)

    Chie eTakahashi

    2014-02-01

    When we hold an object while looking at it, estimates from visual and haptic cues to size are combined in a statistically optimal fashion, whereby the 'weight' given to each signal reflects its relative reliability. This allows object properties to be estimated more precisely than would otherwise be possible. Tools such as pliers and tongs systematically perturb the mapping between object size and the hand opening. This could complicate visual-haptic integration because it may alter the reliability of the haptic signal, thereby disrupting the determination of appropriate signal weights. To investigate this, we first measured the reliability of haptic size estimates made with virtual pliers-like tools (created using a stereoscopic display and force-feedback robots) with different 'gains' between hand opening and object size. Haptic reliability in tool use was straightforwardly determined by a combination of sensitivity to changes in hand opening and the effects of tool geometry. The precise pattern of sensitivity to hand opening, which violated Weber's law, meant that haptic reliability changed with tool gain. We then examined whether the visuo-motor system accounts for these reliability changes. We measured the weight given to visual and haptic stimuli when both were available, again with different tool gains, by measuring the perceived size of stimuli in which visual and haptic sizes were varied independently. The weight given to each sensory cue changed with tool gain in a manner that closely resembled the predictions of optimal sensory integration. The results are consistent with the idea that different tool geometries are modelled by the brain, allowing it to calculate not only the distal properties of objects felt with tools, but also the certainty with which those properties are known. These findings highlight the flexibility of human sensory integration and tool-use, and potentially provide an approach for optimising the
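
    The 'statistically optimal fashion' referred to here is usually formalized as reliability-weighted averaging (the MLE model cited in the companion haptics record below). In the standard formulation, with sigma_V and sigma_H the single-cue standard deviations estimated from discrimination thresholds:

        \hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad
        w_V = \frac{\sigma_H^2}{\sigma_V^2 + \sigma_H^2}, \qquad w_H = 1 - w_V, \qquad
        \sigma_{VH}^2 = \frac{\sigma_V^2 \sigma_H^2}{\sigma_V^2 + \sigma_H^2}

    so the less variable cue receives the larger weight, and the combined estimate is never less precise than either cue alone.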

  5. Software tool for data mining and its applications

    Science.gov (United States)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelope, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough sets, support vector machines) and computational intelligence (neural networks, genetic algorithms, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rules, fuzzy rules, neural networks, genetic algorithms, hyperenvelope, support vector machines and visualization. The principles and knowledge representation of several of these function models are described. The tool is implemented in Visual C++ under Windows 2000. Non-monotonicity in data mining is dealt with by concept hierarchies and layered mining. The tool has been satisfactorily applied to the prediction of regularities in the formation of ternary intermetallic compounds in alloy systems, and to the diagnosis of brain glioma.
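
    Several of the listed function models have direct open-source analogues; the fragment below chains a few of them (PCA, clustering, a decision tree) with scikit-learn purely as an illustration of the workflow, standing in for the tool's own C++ modules.

        from sklearn.datasets import load_iris
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans
        from sklearn.tree import DecisionTreeClassifier

        X, y = load_iris(return_X_y=True)
        Z = PCA(n_components=2).fit_transform(X)                  # pattern recognition
        labels = KMeans(n_clusters=3, n_init=10).fit_predict(Z)   # clustering
        tree = DecisionTreeClassifier(max_depth=3).fit(Z, y)      # decision tree
        print(labels[:10], tree.score(Z, y))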

  6. AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Potts, T. Todd; Hylko, James M.; Douglas, Terence A.

    2003-02-27

    WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. The AHA also recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured in a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. The enhanced configuration control also created a readily available AHA library to research and utilize, along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment

  7. An integrated simulation tool for analyzing the Operation and Interdependency of Natural Gas and Electric Power Systems

    OpenAIRE

    PAMBOUR Kwabena A.; CAKIR BURCIN; BOLADO LAVIN Ricardo; DIJKEMA Gerard

    2016-01-01

    In this paper, we present an integrated simulation tool for analyzing the interdependency of natural gas and electric power systems in terms of security of energy supply. In the first part, we develop mathematical models for the individual systems. In the second part, we identify the interconnections between the two systems and propose a method for coupling them into a combined simulation model. Next, we develop an algorithm for solving the combined system and integrate this algorithm into a simulation software ...
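
    One simple way to couple such models, shown here purely as a hypothetical sketch (the paper's actual coupling algorithm is not reproduced), is fixed-point iteration in which gas-fired generation links power dispatch to gas offtake:

        # Hypothetical stand-ins: pipeline capacity limits deliverable fuel,
        # which in turn limits gas-turbine dispatch; iterate to a fixed point.
        def solve_gas_flow(offtake_m3h: float) -> float:
            """Deliverable gas flow under a (made-up) pipeline capacity."""
            return min(offtake_m3h, 48000.0)

        def solve_power_flow(available_gas_m3h: float) -> float:
            """Gas-turbine output at (made-up) 180 m^3/h of fuel per MW."""
            return available_gas_m3h / 180.0

        demand_mw = 300.0
        mw = demand_mw
        for _ in range(20):
            gas = solve_gas_flow(mw * 180.0)          # fuel needed for dispatch
            mw_new = min(demand_mw, solve_power_flow(gas))
            if abs(mw_new - mw) < 1e-6:
                break
            mw = mw_new
        print(f"converged gas-constrained dispatch: {mw:.1f} MW")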

  8. Visual-Haptic Integration: Cue Weights are Varied Appropriately, to Account for Changes in Haptic Reliability Introduced by Using a Tool

    Directory of Open Access Journals (Sweden)

    Chie Takahashi

    2011-10-01

    Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the 'raw' visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision) of haptic size estimates, however, because they alter the change in hand opening caused by a given change in object size. Here, we examine whether the brain appropriately adjusts the weights given to visual and haptic size signals when tool geometry changes. We first estimated each cue's reliability by measuring size-discrimination thresholds in vision-alone and haptics-alone conditions. We varied haptic reliability using tools with different object-size:hand-opening ratios (1:1, 0.7:1, and 1.4:1). We then measured the weights given to vision and haptics with each tool, using a cue-conflict paradigm. The weight given to haptics varied with tool type in a manner that was well predicted by the single-cue reliabilities (MLE model; Ernst and Banks, 2002). This suggests that the process of visual-haptic integration appropriately accounts for variations in haptic reliability introduced by different tool geometries.

  9. VarB Plus: An Integrated Tool for Visualization of Genome Variation Datasets

    KAUST Repository

    Hidayah, Lailatul

    2012-07-01

    Research on genomic sequences has been improving significantly as more advanced sequencing technology has been developed. This opens enormous opportunities for sequence analysis. Various analytical tools have been built for purposes such as sequence assembly, read alignment, genome browsing, comparative genomics, and visualization. From the visualization perspective, there is an increasing trend towards the use of large-scale computation. However, more than power is required to produce an informative image. This is a challenge that we address by providing several ways of representing biological data in order to advance the inference endeavors of biologists. This thesis focuses on the visualization of variations found in genomic sequences. We develop several visualization functions and embed them in an existing variation visualization tool as extensions. The tool we improved is named VarB; hence the nomenclature for our enhancement is VarB Plus. To the best of our knowledge, besides VarB there is no tool that provides dynamic visualization of genome variation datasets together with statistical analysis. Dynamic visualization allows users to toggle different parameters on and off and see the results on the fly. The statistical analyses include the Fixation Index, Relative Variant Density, and Tajima's D; hence we focused our efforts on this tool. The scope of our work includes plots of per-base genome coverage, Principal Coordinate Analysis (PCoA), integration with a read alignment viewer named LookSeq, and visualization of geo-biological data. In addition to a description of the embedded functionalities, their significance, and limitations, future improvements are discussed. The result is four extensions successfully embedded in the original tool, which is built on the Qt framework in C++ and is hence portable to numerous platforms. Our extensions have shown acceptable execution times in beta testing with various high-volume published datasets, as well as positive
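
    Of the statistics named, the Fixation Index is the simplest to show. The function below computes the basic Wright formulation per site from two populations' allele frequencies; it is an illustration, not VarB Plus's exact estimator.

        import numpy as np

        def fst(p1: np.ndarray, p2: np.ndarray) -> np.ndarray:
            """Per-site F_ST for biallelic variants from alt-allele frequencies."""
            h1 = 2 * p1 * (1 - p1)        # expected heterozygosity, population 1
            h2 = 2 * p2 * (1 - p2)
            hs = (h1 + h2) / 2            # mean within-population heterozygosity
            pbar = (p1 + p2) / 2
            ht = 2 * pbar * (1 - pbar)    # heterozygosity of the pooled population
            with np.errstate(invalid="ignore", divide="ignore"):
                return np.where(ht > 0, (ht - hs) / ht, 0.0)

        print(fst(np.array([0.1, 0.5]), np.array([0.9, 0.5])))  # divergent vs. identical site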

  10. Anaphe - OO Libraries and Tools for Data Analysis

    CERN Document Server

    Couet, O; Molnar, Z; Moscicki, J T; Pfeiffer, A; Sang, M

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  11. Developmental Screening Tools: Feasibility of Use at Primary Healthcare Level in Low- and Middle-income Settings

    OpenAIRE

    Fischer, Vinicius Jobim; Morris, Jodi; Martines, José

    2014-01-01

    An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization to allow action to reduce impairments through its Gap Action Program on mental health. The study identified the feasibility of using developmental screening and monitoring tools for children aged 0-3 year(s) by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducted ...

  12. System and method for integrating hazard-based decision making tools and processes

    Science.gov (United States)

    Hodgin, C Reed [Westminster, CO

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  13. Double path-integral migration velocity analysis: a real data example

    International Nuclear Information System (INIS)

    Costa, Jessé C; Schleicher, Jörg

    2011-01-01

    Path-integral imaging forms an image with no knowledge of the velocity model by summing over the migrated images obtained for a set of migration velocity models. Double path-integral imaging migration extracts the stationary velocities, i.e. those velocities at which common-image gathers align horizontally, as a byproduct. An application of the technique to a real data set demonstrates that quantitative information about the time migration velocity model can be determined by double path-integral migration velocity analysis. Migrated images using interpolations with different regularizations of the extracted velocities prove the high quality of the resulting time-migration velocity information. The so-obtained velocity model can then be used as a starting model for subsequent velocity analysis tools like migration tomography or other tomographic methods
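
    Schematically (in our notation, not necessarily the authors'), path-integral imaging stacks migrated images over trial velocities, and the double path integral recovers the stationary velocity as a ratio of a velocity-weighted stack to the plain one:

        I(\xi) = \int w(\xi, v)\, I_{\mathrm{mig}}(\xi; v)\, dv, \qquad
        v^{*}(\xi) = \frac{\int v\, w(\xi, v)\, I_{\mathrm{mig}}(\xi; v)\, dv}
                          {\int w(\xi, v)\, I_{\mathrm{mig}}(\xi; v)\, dv}

    where I_mig(ξ; v) is the image migrated with trial velocity v and w is a stacking weight that emphasizes the stationary contributions.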

  14. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration-time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and in the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
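
    The deterministic physics checks described (known inputs must reproduce a hand-computed value) have roughly this shape in the unittest framework. The compute_int_time routine below is a stand-in with a textbook Poisson signal-to-noise model, not EXOSIMS's actual API.

        import unittest

        def compute_int_time(signal_rate: float, noise_rate: float, snr: float) -> float:
            """Time to reach a target S/N with Poisson signal and background:
            SNR = s*t / sqrt((s + n) * t)  =>  t = SNR^2 * (s + n) / s^2."""
            return snr ** 2 * (signal_rate + noise_rate) / signal_rate ** 2

        class TestIntegrationTime(unittest.TestCase):
            def test_known_value(self):
                # s = 10 ph/s, n = 5 ph/s, SNR = 5  =>  t = 25 * 15 / 100 = 3.75 s
                self.assertAlmostEqual(compute_int_time(10.0, 5.0, 5.0), 3.75)

        if __name__ == "__main__":
            unittest.main()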

  15. Blended Learning Tools in Geosciences: A New Set of Online Tools to Help Students Master Skills

    Science.gov (United States)

    Cull, S.; Spohrer, J.; Natarajan, S.; Chin, M.

    2013-12-01

    In most geoscience courses, students are expected to develop specific skills. To master these skills, students need to practice them repeatedly. Unfortunately, few geoscience courses have enough class time to allow students sufficient in-class practice, or enough instructor attention and time to provide fast feedback. To address this, we have developed an online tool called an Instant Feedback Practice (IFP). IFPs are low-risk, high-frequency exercises that allow students to practice skills repeatedly throughout a semester, both in class and at home. After class, students log onto a course management system (like Moodle or Blackboard) and click on that day's IFP exercise. The exercise might ask them to visually identify a set of minerals they are practicing. After answering each question, the IFP tells them whether they got it right or wrong. If they got it wrong, they try again until they get it right. There is no penalty - students receive the full score for finishing. The goal is low-stakes practice. By completing dozens of these practices throughout the semester, students have many, many opportunities to practice mineral identification with quick feedback. Students can also complete IFPs during class in groups and teams, with in-lab hand samples or specimens. IFPs can also be used to gauge student skill levels as the semester progresses, as they can be set up to give the instructor feedback on specific skills or students. When IFPs were developed for and implemented in a majors-level mineralogy class, students reported that in-class and online IFPs were by far the most useful technique they used to master mineral hand-sample identification. Final grades in the course were significantly higher than historical norms, supporting students' anecdotal assessment of the impact of IFPs on their learning.

  16. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analyses of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  17. Design Tools for Integrated Asynchronous Electronic Circuits

    National Research Council Canada - National Science Library

    Martin, Alain

    2003-01-01

    ..., simulation, verification, at the logical and physical levels. Situs has developed a business model for the commercialization of the CAD tools, and has designed the prototype of the tool suite based on this business model and the Caltech approach...

  18. Use cases for integrated electrical and thermal energy systems operation and control with a view on simulation tools

    DEFF Research Database (Denmark)

    Gehrke, Oliver; Richert, Thibaut Pierre

    2017-01-01

    There is a general lack of knowledge regarding energy systems coupling (also known as multi-energy systems (MES), multi-domain or integrated energy systems) and few well-defined use cases (UCs) that properly describe their operation. Energy systems coupling increases complexity due to additional ... and discuss why we consider these UCs to be the most representative of such systems. Based on these UCs we derive requirements for simulation tools and the level of detail (e.g. technical and temporal resolution) needed to simulate MES in a holistic way. We relate these requirements to the existing tools for studying ... could have an impact on another. We show that no single tool exists to cover all UCs, and why such a tool may not be desirable after all.

  19. Planning Tools For Estimating Radiation Exposure At The National Ignition Facility

    International Nuclear Information System (INIS)

    Verbeke, J.; Young, M.; Brereton, S.; Dauffy, L.; Hall, J.; Hansen, L.; Khater, H.; Kim, S.; Pohl, B.; Sitaraman, S.

    2010-01-01

    A set of computational tools was developed to help estimate and minimize potential radiation exposure to workers from material activation in the National Ignition Facility (NIF). AAMI (Automated ALARA-MCNP Interface) provides an efficient, automated mechanism to perform the series of calculations required to create dose rate maps for the entire facility with minimal manual user input. NEET (NIF Exposure Estimation Tool) is a web application that combines the information computed by AAMI with a given shot schedule to compute and display the dose rate maps as a function of time. AAMI and NEET are currently used as work planning tools to determine stay-out times for workers following a given shot or set of shots, and to help in estimating integrated doses associated with performing various maintenance activities inside the target bay. Dose rate maps of the target bay were generated following a low-yield 10^16 D-T shot and will be presented in this paper.
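
    The stay-out-time calculation can be pictured as summing decay curves from each shot in the schedule and finding when the total rate falls below a work-planning limit. The nuclide data and limit below are invented placeholders, not NIF values, and the real tools use full MCNP-derived dose maps rather than point values.

        import math

        # (initial dose rate mSv/h, half-life s) for two made-up activation products
        DECAY = [(1.5e-3, 600.0), (4.0e-4, 36000.0)]

        def dose_rate(elapsed_times):
            """Total dose rate given seconds elapsed since each shot."""
            return sum(r0 * math.exp(-math.log(2) * dt / t_half)
                       for dt in elapsed_times
                       for r0, t_half in DECAY)

        def stay_out_time(shot_times, limit=1e-3, step=60.0):
            """First time t (s) at which the summed rate drops below the limit."""
            t = 0.0
            while dose_rate([t - s for s in shot_times]) > limit:
                t += step
            return t

        print(stay_out_time(shot_times=[0.0]))  # one shot fired at t = 0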

  20. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    Science.gov (United States)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to vary rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to

  1. Interpersonal relationships between registered nurses and student nurses in the clinical setting--A systematic integrative review.

    Science.gov (United States)

    Rebeiro, Geraldine; Edward, Karen-leigh; Chapman, Rose; Evans, Alicia

    2015-12-01

    A significant proportion of undergraduate nursing education occurs in the clinical setting in the form of practising skills and competencies, and is a requirement of all nursing curricula for registration to practice. Education in the clinical setting is facilitated by registered nurses, yet this interpersonal relationship has not been examined well. Objective: to investigate the experience of interpersonal relationships between registered nurses and student nurses in the clinical setting from the point of view of the registered nurse. Design: integrative review. Review methods: the databases MEDLINE, CINAHL and OVID were searched. Key words used included: Registered Nurse, Preceptor, Buddy Nurse, Clinical Teacher, Mentor, Student Nurse, Nursing Student, Interpersonal Relationships, Attitudes and Perceptions. An additional review of the literature was undertaken manually through university library textbooks. 632 abstracts were returned after duplicates were removed. Twenty-one articles were identified for full-text reading (quantitative n=2, mixed n=6, qualitative n=14); of these, seven articles addressed the experience of interpersonal relationships between registered nurses and student nurses in the clinical setting from the point of view of the registered nurse, and these were reviewed. Providing education for registered nurses to enable them to lead student education in the clinical setting communicates the organizational value of the role. Registered nurses identified that being supported in having time to teach was important in facilitating the clinical teaching role. The integrative review did not provide evidence on the impact diverse clinical settings can have on the relationships between registered nurses and student nurses, revealing an area for further examination. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  2. Soft sets combined with interval valued intuitionistic fuzzy sets of type-2 and rough sets

    Directory of Open Access Journals (Sweden)

    Anjan Mukherjee

    2015-03-01

    Fuzzy set theory, rough set theory and soft set theory are all mathematical tools for dealing with uncertainty. The concept of type-2 fuzzy sets was introduced by Zadeh in 1975 and was extended to interval-valued intuitionistic fuzzy sets of type-2 by the authors. This paper is devoted to discussions of the combinations of interval-valued intuitionistic fuzzy sets of type-2, soft sets and rough sets. Three different types of new hybrid models, namely interval-valued intuitionistic fuzzy soft sets of type-2, soft rough interval-valued intuitionistic fuzzy sets of type-2 and soft interval-valued intuitionistic fuzzy rough sets of type-2, are proposed and their properties are derived.
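
    For reference, the standard definitions being hybridized are as follows (Molodtsov's soft sets and interval-valued intuitionistic fuzzy sets; the notation is ours). A soft set over a universe U with parameter set E is a pair (F, A) with A ⊆ E and F : A → P(U), so each parameter e ∈ A picks out the subset F(e) of U that it approximately describes. An interval-valued intuitionistic fuzzy set assigns each element interval-valued membership and non-membership degrees:

        A = \{\langle x, M_A(x), N_A(x) \rangle : x \in U\}, \qquad
        M_A(x), N_A(x) \subseteq [0, 1], \qquad
        \sup M_A(x) + \sup N_A(x) \le 1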

  3. Tool or Toy? Virtual Globes in Landscape Planning

    Directory of Open Access Journals (Sweden)

    Stephen R. J. Sheppard

    2011-10-01

    Virtual globes, i.e., geobrowsers that integrate multi-scale and temporal data from various sources and are based on a globe metaphor, have developed into serious tools that practitioners and various stakeholders in landscape and community planning have started using. Although these tools originate from Geographic Information Systems (GIS), they have become a different, potentially interactive and public tool set, with their own specific limitations and new opportunities. Expectations regarding their utility as planning and community engagement tools are high, but are tempered by both technical limitations and ethical issues [1,2]. Two grassroots campaigns and a collaborative visioning process, the Kimberley Climate Adaptation Project case study (British Columbia), illustrate and broaden our understanding of the potential benefits and limitations associated with the use of virtual globes in participatory planning initiatives. Based on observations, questionnaires and in-depth interviews with stakeholders and community members using an interactive 3D model of regional climate change vulnerabilities, potential impacts, and possible adaptation and mitigation scenarios in Kimberley, the benefits and limitations of virtual globes as a tool for participatory landscape planning are discussed. The findings suggest that virtual globes can facilitate access to geospatial information, raise awareness, and provide a more representative virtual landscape than static visualizations. However, the landscape is not equally representative at all scales, and not all types of users seem to benefit equally from the tool. The risks of misinterpretation can be managed by integrating the application and interpretation of virtual globes into face-to-face planning processes.

  4. A web GIS based integrated flood assessment modeling tool for coastal urban watersheds

    Science.gov (United States)

    Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.

    2014-03-01

    Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area and to run the model within this framework. The integrated flood model consists of a mass-balance-based 1-D overland flow model, a 1-D finite-element channel flow model based on the diffusion wave approximation, and a quasi-2-D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, viz. Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and jQuery technologies. Its user interface is developed using OpenLayers, and the attribute data are stored in the MySQL open-source DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment to facilitate data access and visualization of GIS datasets and simulation results.
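
    The diffusion wave approximation mentioned for the channel-flow module is the standard simplification of the Saint-Venant equations in which the inertial terms are dropped; in textbook form (the paper's exact discretization may differ):

        \frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = q, \qquad
        S_f = S_0 - \frac{\partial h}{\partial x}, \qquad
        Q = \frac{1}{n}\, A R^{2/3} S_f^{1/2}

    with A the flow area, Q the discharge, q the lateral inflow, h the flow depth, S_0 the bed slope, S_f the friction slope, n Manning's roughness coefficient and R the hydraulic radius.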

  5. SOCIOCULTURAL INTEGRATION AS A TOOL FOR CONSTRUCTIVE CONFLICT RESOLUTION: THE CASE OF THE NORTH CAUCASUS

    Directory of Open Access Journals (Sweden)

    M. E. Popov

    2017-01-01

    Full Text Available The paper is devoted to research on sociocultural integration as a tool for resolving regional conflicts. The modern theory of conflict resolution focuses on the ability of sociocultural integration to transform destructive identity-based conflicts into conflicts of interest. The author considers the systemic factors of identity-based conflicts and emphasizes the destabilizing role of the politicization of ethnicity. Ethnic mobilization, social inequalities, economic polarization and a crisis of civic identity are structural factors that determine the acuity of ethnic tension and, as a result, the escalation of regional identity conflicts. Contradictions between systemic modernization and social disintegration are the primary source of identity conflicts in the North Caucasus. Regionalization takes a conflict-generating form in this case, i.e. the specifics of regional conflicts are associated with a collision between static (traditionalization) and dynamic (modernization) types of social propagation. Structurally, the escalation of violence in regional conflicts is determined by the intensity and scope of ethnic mobilization and social dissatisfaction as necessary conditions of a collision. Regional conflicts affect existentially meaningful collective values and group identities, which is why the participants become emotionally involved in identification conflicts; due to their emotional charge and irrationality, identity conflicts are no longer a means of overcoming social frustrations but a destructive goal in themselves, i.e. the polarization of ethnicity and negative cultural stereotypes in perceiving “the others” play a key role in initiating such conflicts. The following must be considered when discussing anti-conflict mechanisms of sociocultural integration in the North Caucasus. First, sociocultural integration is a political project whose content is determined to a wide extent by the defense challenges of the polyethnic Russian society. Second, development of the

  6. The Integrated Medical Model: A Risk Assessment and Decision Support Tool for Human Space Flight Missions

    Science.gov (United States)

    Kerstman, Eric L.; Minard, Charles; FreiredeCarvalho, Mary H.; Walton, Marlei E.; Myers, Jerry G., Jr.; Saile, Lynn G.; Lopez, Vilma; Butler, Douglas J.; Johnson-Throop, Kathy A.

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) and its use as a risk assessment and decision support tool for human space flight missions. The IMM is an integrated, quantified, evidence-based decision support tool useful to NASA crew health and mission planners. It is intended to assist in optimizing crew health, safety and mission success within the constraints of the space flight environment for in-flight operations. It uses ISS data to assist in planning for the Exploration Program; it is not intended to assist in post-flight research. The IMM was used to update the Probabilistic Risk Assessment (PRA) forecasts for the conditions requiring evacuation (EVAC) or loss of crew life (LOC) on the ISS. The IMM validation approach includes comparison with actual events and involves both qualitative and quantitative approaches. The results of these comparisons are reviewed. Another use of the IMM is to optimize the medical kits, taking into consideration the specific mission and the crew profile. An example of the use of the IMM to optimize the medical kits is reviewed.
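
    The shape of such a probabilistic forecast can be sketched with a small Monte Carlo roll-up. Everything below is illustrative: the condition names, incidence rates and escalation probabilities are invented placeholders, not IMM evidence.

        import numpy as np

        # Hypothetical per-condition incidence rates (events per person-year)
        # and conditional probabilities that an occurrence forces evacuation.
        CONDITIONS = {
            "renal_stone":    (0.010, 0.30),
            "appendicitis":   (0.002, 0.80),
            "dental_abscess": (0.020, 0.05),
        }

        def p_evac(crew=6, mission_years=0.5, trials=100_000, seed=0):
            """Monte Carlo estimate of P(at least one medical evacuation)."""
            rng = np.random.default_rng(seed)
            exposure = crew * mission_years
            evac = 0
            for _ in range(trials):
                for rate, p_escalate in CONDITIONS.values():
                    n = rng.poisson(rate * exposure)   # occurrences this mission
                    if n and rng.random(n).min() < p_escalate:
                        evac += 1
                        break
            return evac / trials

        print(p_evac())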

  7. Semi-automatic Data Integration using Karma

    Science.gov (United States)

    Garijo, D.; Kejriwal, M.; Pierce, S. A.; Houser, P. I. Q.; Peckham, S. D.; Stanko, Z.; Hardesty Lewis, D.; Gil, Y.; Pennington, D. D.; Knoblock, C.

    2017-12-01

    Data integration applications are ubiquitous in scientific disciplines. A state-of-the-art data integration system accepts both a set of data sources and a target ontology as input, and semi-automatically maps the data sources in terms of concepts and relationships in the target ontology. Mappings can be both complex and highly domain-specific. Once such a semantic model, expressing the mapping using community-wide standard, is acquired, the source data can be stored in a single repository or database using the semantics of the target ontology. However, acquiring the mapping is a labor-prone process, and state-of-the-art artificial intelligence systems are unable to fully automate the process using heuristics and algorithms alone. Instead, a more realistic goal is to develop adaptive tools that minimize user feedback (e.g., by offering good mapping recommendations), while at the same time making it intuitive and easy for the user to both correct errors and to define complex mappings. We present Karma, a data integration system that has been developed over multiple years in the information integration group at the Information Sciences Institute, a research institute at the University of Southern California's Viterbi School of Engineering. Karma is a state-of-the-art data integration tool that supports an interactive graphical user interface, and has been featured in multiple domains over the last five years, including geospatial, biological, humanities and bibliographic applications. Karma allows a user to import their own ontology and datasets using widely used formats such as RDF, XML, CSV and JSON, can be set up either locally or on a server, supports a native backend database for prototyping queries, and can even be seamlessly integrated into external computational pipelines, including those ingesting data via streaming data sources, Web APIs and SQL databases. We illustrate a Karma workflow at a conceptual level, along with a live demo, and show use cases of
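
    The end product of such a mapping is a set of triples expressed in the target ontology. A minimal sketch of that idea using the rdflib library is shown below; the ontology terms and the two-column source are hypothetical, and Karma itself infers such mappings semi-automatically rather than requiring them to be hand-coded:

        import csv, io
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF

        EX = Namespace("http://example.org/hydro#")   # hypothetical target ontology
        source = csv.DictReader(io.StringIO("station,discharge\nS1,42.0\nS2,17.5\n"))

        # The "semantic model": column -> ontology term assignments.
        g = Graph()
        for row in source:
            s = EX[row["station"]]
            g.add((s, RDF.type, EX.GaugingStation))
            g.add((s, EX.discharge_m3s, Literal(float(row["discharge"]))))

        print(g.serialize(format="turtle"))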

  8. Examination of the low frequency limit for helicopter noise data in the Federal Aviation Administration's Aviation Environmental Design Tool and Integrated Noise Model

    Science.gov (United States)

    2010-04-19

    The Federal Aviation Administration (FAA) aircraft noise modeling tools Aviation Environmental Design Tool (AEDT) and Integrated Noise Model (INM) do not currently consider noise below 50 Hz in their computations. This paper describes a preliminary ...

  9. EasyCloneMulti: A Set of Vectors for Simultaneous and Multiple Genomic Integrations in Saccharomyces cerevisiae

    DEFF Research Database (Denmark)

    Maury, Jerome; Germann, Susanne Manuela; Jacobsen, Simo Abdessamad

    2016-01-01

    Saccharomyces cerevisiae is widely used in the biotechnology industry for production of ethanol, recombinant proteins, food ingredients and other chemicals. In order to generate highly producing and stable strains, genome integration of genes encoding metabolic pathway enzymes is the preferred...... of integrative vectors, EasyCloneMulti, that enables multiple and simultaneous integration of genes in S. cerevisiae. By creating vector backbones that combine consensus sequences that aim at targeting subsets of Ty sequences and a quickly degrading selective marker, integrations at multiple genomic loci...... and a range of expression levels were obtained, as assessed with the green fluorescent protein (GFP) reporter system. The EasyCloneMulti vector set was applied to balance the expression of the rate-controlling step in the β-alanine pathway for biosynthesis of 3-hydroxypropionic acid (3HP). The best 3HP...
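
    The computational side of targeting a Ty consensus can be pictured as a simple approximate-match scan. The sketch below is a toy illustration (the motif and mismatch tolerance are made up; real target selection works from full Ty element annotations):

        def find_sites(genome: str, consensus: str, max_mismatch: int = 2):
            """Yield (position, window) pairs within Hamming distance of the consensus."""
            k = len(consensus)
            for i in range(len(genome) - k + 1):
                window = genome[i:i + k]
                if sum(a != b for a, b in zip(window, consensus)) <= max_mismatch:
                    yield i, window

        sites = list(find_sites("ACGTTGACGTTTGACGAAGTCGT", "GACGT", max_mismatch=1))
        print(sites)   # candidate multi-copy integration targets in the toy sequence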

  10. Integrating best evidence into patient care: a process facilitated by a seamless integration with informatics tools.

    Science.gov (United States)

    Giuse, Nunzia B; Williams, Annette M; Giuse, Dario A

    2010-07-01

    The Vanderbilt University paper discusses how the Eskind Biomedical Library at Vanderbilt University Medical Center transitioned from a simplistic approach that linked resources to the institutional electronic medical record system, StarPanel, to a value-added service that is designed to deliver highly relevant information. Clinical teams formulate complex patient-specific questions via an evidence-based medicine literature request basket linked to individual patient records. The paper transitions into discussing how the StarPanel approach acted as a springboard for two additional projects that use highly trained knowledge management librarians with informatics expertise to integrate evidence into both order sets and a patient portal, MyHealth@Vanderbilt.

  11. Adapting a Technology-Based Implementation Support Tool for Community Mental Health: Challenges and Lessons Learned.

    Science.gov (United States)

    Livet, Melanie; Fixsen, Amanda

    2018-01-01

    With mental health services shifting to community-based settings, community mental health (CMH) organizations are under increasing pressure to deliver effective services. Despite availability of evidence-based interventions, there is a gap between effective mental health practices and the care that is routinely delivered. Bridging this gap requires availability of easily tailorable implementation support tools to assist providers in implementing evidence-based intervention with quality, thereby increasing the likelihood of achieving the desired client outcomes. This study documents the process and lessons learned from exploring the feasibility of adapting such a technology-based tool, Centervention, as the example innovation, for use in CMH settings. Mixed-methods data on core features, innovation-provider fit, and organizational capacity were collected from 44 CMH providers. Lessons learned included the need to augment delivery through technology with more personal interactions, the importance of customizing and integrating the tool with existing technologies, and the need to incorporate a number of strategies to assist with adoption and use of Centervention-like tools in CMH contexts. This study adds to the current body of literature on the adaptation process for technology-based tools and provides information that can guide additional innovations for CMH settings.

  12. Tablets in K-12 Education: Integrated Experiences and Implications

    Science.gov (United States)

    An, Heejung, Ed.; Alon, Sandra, Ed.; Fuentes, David, Ed.

    2015-01-01

    The inclusion of new and emerging technologies in the education sector has been a topic of interest to researchers, educators, and software developers alike in recent years. Utilizing the proper tools in a classroom setting is a critical factor in student success. "Tablets in K-12 Education: Integrated Experiences and Implications"…

  13. A Novel Computational Tool for Mining Real-Life Data: Application in the Metastatic Colorectal Cancer Care Setting.

    Science.gov (United States)

    Siegelmann-Danieli, Nava; Farkash, Ariel; Katzir, Itzhak; Vesterman Landes, Janet; Rotem Rabinovich, Hadas; Lomnicky, Yossef; Carmeli, Boaz; Parush-Shear-Yashuv, Naama

    2016-01-01

    Randomized clinical trials constitute the gold standard for evaluating new anti-cancer therapies; however, real-life data are key in complementing clinically useful information. We developed a computational tool for real-life data analysis and applied it to the metastatic colorectal cancer (mCRC) setting. This tool addressed the impact of oncology/non-oncology parameters on treatment patterns and clinical outcomes. The developed tool enables extraction of any computerized information, including comorbidities and use of drugs (oncological/non-oncological), per individual HMO member. The study in which we evaluated this tool was a retrospective cohort study that included Maccabi Healthcare Services members with mCRC receiving bevacizumab with fluoropyrimidines (FP), FP plus oxaliplatin (FP-O), or FP plus irinotecan (FP-I) in the first line between 9/2006 and 12/2013. The analysis included 753 patients, of whom 15.4% underwent subsequent metastasectomy (the Surgery group). For the entire cohort, median overall survival (OS) was 20.5 months; in the Surgery group, median duration of bevacizumab-containing therapy (DOT) pre-surgery was 6.1 months; median OS was not reached. In the Non-surgery group, median OS and DOT were 18.7 and 11.4 months, respectively; no significant OS differences were noted between FP-O and FP-I, whereas FP use was associated with shorter OS (12.3 months). Analyses controlling for age and gender identified several non-oncology parameters associated with poorer clinical outcomes, including concurrent use of diuretics and proton-pump inhibitors. Our tool provided insights that confirmed/complemented information gained from randomized clinical trials. Prospective tool implementation is warranted.
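
    The survival figures reported above come from standard time-to-event analysis. A sketch of the corresponding computation with the lifelines library, on synthetic data (not the Maccabi cohort), might look like this:

        import numpy as np
        from lifelines import KaplanMeierFitter

        rng = np.random.default_rng(0)
        durations = rng.exponential(20.5, size=200)   # synthetic OS in months
        observed = rng.random(200) < 0.8              # ~20% censored follow-up

        kmf = KaplanMeierFitter()
        kmf.fit(durations, event_observed=observed, label="first-line, toy cohort")
        print(kmf.median_survival_time_)              # analogue of the reported median OS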

  14. Piloting a method to evaluate the implementation of integrated water ...

    African Journals Online (AJOL)

    ... A methodology with a set of principles, change areas and measures was developed as a performance assessment tool. ... Keywords: Integrated water resource management, Inkomati River Basin, South Africa, Swaziland ...

  15. Thinking Critically about Critical Thinking: Integrating Online Tools to Promote Critical Thinking

    Directory of Open Access Journals (Sweden)

    B. Jean Mandernach

    2006-01-01

    Full Text Available The value and importance of critical thinking is clearly established; the challenge for instructors lies in successfully promoting students’ critical thinking skills within the confines of a traditional classroom experience. Since instructors are faced with limited student contact time to meet their instructional objectives and facilitate learning, they are often forced to make instructional decisions between content coverage, depth of understanding, and critical analysis of course material. To address this dilemma, it is essential to integrate instructional strategies and techniques that can efficiently and effectively maximize student learning and critical thinking. Modern advances in educational technology have produced a range of online tools to assist instructors in meeting this instructional goal. This review will examine the theoretical foundations of critical thinking in higher education, discuss empirically-based strategies for integrating online instructional supplements to enhance critical thinking, offer techniques for expanding instructional opportunities outside the limitations of traditional class time, and provide practical suggestions for the innovative use of critical thinking strategies via online resources.

  16. Using Docker Compose for the Simple Deployment of an Integrated Drug Target Screening Platform

    Directory of Open Access Journals (Sweden)

    List Markus

    2017-06-01

    Full Text Available Docker virtualization allows for software tools to be executed in an isolated and controlled environment referred to as a container. In Docker containers, dependencies are provided exactly as intended by the developer and, consequently, they simplify the distribution of scientific software and foster reproducible research. The Docker paradigm is that each container encapsulates one particular software tool. However, to analyze complex biomedical data sets, it is often necessary to combine several software tools into elaborate workflows. To address this challenge, several Docker containers need to be instantiated and properly integrated, which complicates the software deployment process unnecessarily. Here, we demonstrate how an extension to Docker, Docker compose, can be used to mitigate these problems by providing a unified setup routine that deploys several tools in an integrated fashion. We demonstrate the power of this approach by example of a Docker compose setup for a drug target screening platform consisting of five integrated web applications and shared infrastructure, deployable in just two lines of code.
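
    A minimal compose file conveys the idea; the two services below are hypothetical stand-ins for the five web applications and shared infrastructure of the actual platform:

        # docker-compose.yml (illustrative; image names are placeholders)
        version: "3"
        services:
          webapp:
            image: example/drug-screening-webapp:latest
            ports:
              - "8080:8080"
            depends_on:
              - db
          db:
            image: postgres:13
            environment:
              POSTGRES_PASSWORD: example

    The promised two lines of code then amount, roughly, to cloning the repository and running docker-compose up -d, after which Docker Compose instantiates and wires all containers.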

  17. Using Docker Compose for the Simple Deployment of an Integrated Drug Target Screening Platform.

    Science.gov (United States)

    List, Markus

    2017-06-10

    Docker virtualization allows for software tools to be executed in an isolated and controlled environment referred to as a container. In Docker containers, dependencies are provided exactly as intended by the developer and, consequently, they simplify the distribution of scientific software and foster reproducible research. The Docker paradigm is that each container encapsulates one particular software tool. However, to analyze complex biomedical data sets, it is often necessary to combine several software tools into elaborate workflows. To address this challenge, several Docker containers need to be instantiated and properly integrated, which complicates the software deployment process unnecessarily. Here, we demonstrate how an extension to Docker, Docker compose, can be used to mitigate these problems by providing a unified setup routine that deploys several tools in an integrated fashion. We demonstrate the power of this approach by example of a Docker compose setup for a drug target screening platform consisting of five integrated web applications and shared infrastructure, deployable in just two lines of code.

  18. PREP: Production and Reprocessing management tool for CMS

    International Nuclear Information System (INIS)

    Cossutti, F; Lenzi, P; Naziridis, N; Samyn, D; Stöckli, F

    2012-01-01

    The production of simulated samples for physics analysis at the LHC represents a considerable organizational challenge, because it requires the management of several thousand different workflows. The submission of a workflow to the grid-based computing infrastructure spans everything from the definition of the general characteristics of a given set of coherent samples (called a 'campaign') to the definition of the physics settings to be used for each sample corresponding to a specific process to be simulated, both at the hard event generation and at the detector simulation level. In order to have organized control of the definition of the large number of MC samples needed by CMS, a dedicated management tool, called PREP, has been built. Its basic component is a database storing all the relevant information about the sample and the actions implied by workflow definition, approval and production. A web-based interface allows the database to be used by experts involved in production to trigger all the different actions needed, as well as by physicists involved in analyses to retrieve the relevant information. The tool is integrated through a set of dedicated APIs with the production agent and information storage utilities of CMS.

  19. Setting the right path and pace for integration.

    Science.gov (United States)

    Cwiek, Katherine A; Inniger, Meredith C; Zismer, Daniel K

    2014-04-01

    Far from being a monolithic trend, integration in health care today is progressing in various forms, and at different rates in different markets within and across the range of healthcare organizations. Each organization should develop a tailored strategy that delineates the level and type of integration it will pursue and at what pace to pursue it. This effort will require evaluation of external market conditions with respect to integration and competition and a candid assessment of intraorganizational integration. The compared results of the two analyses will provide the basis for formulating strategy.

  20. Multi-particle phase space integration with arbitrary set of singularities in CompHEP

    International Nuclear Information System (INIS)

    Kovalenko, D.N.; Pukhov, A.E.

    1997-01-01

    We describe an algorithm of multi-particle phase space integration for collision and decay processes realized in the CompHEP package version 3.2. In the framework of this algorithm it is possible to regularize an arbitrary set of singularities caused by virtual particle propagators. The algorithm is based on the method of the recursive representation of kinematics and on the multichannel Monte Carlo approach. The CompHEP package is available via WWW: http://theory.npi.msu.su/pukhov/comphep.html (orig.)
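
    The core trick of regularizing a propagator singularity can be shown in a few lines. The sketch below (generic, not CompHEP code) integrates a Breit-Wigner-peaked function by substituting s = m^2 + m*Gamma*tan(t), so the Jacobian cancels the peak exactly; multichannel Monte Carlo combines several such mappings, one per singular propagator, with adaptive channel weights:

        import numpy as np

        m, G = 1.0, 0.01                                 # mass and width (toy values)
        f = lambda s: 1.0 / ((s - m**2)**2 + (m*G)**2)   # propagator-squared peak

        a, b = 0.0, 4.0                                  # integration range in s
        ta, tb = np.arctan((a - m**2)/(m*G)), np.arctan((b - m**2)/(m*G))
        t = np.random.uniform(ta, tb, 100_000)
        s = m**2 + m*G*np.tan(t)
        # ds = ((s - m^2)^2 + (m*G)^2)/(m*G) dt, so the weight is flat in t:
        w = f(s) * ((s - m**2)**2 + (m*G)**2) / (m*G)
        print((tb - ta) * w.mean(), (tb - ta)/(m*G))     # MC estimate vs. exact value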

  1. Nexusing Charcoal in South Mozambique: A Proposal To Integrate the Nexus Charcoal-Food-Water Analysis With a Participatory Analytical and Systemic Tool

    Directory of Open Access Journals (Sweden)

    Ricardo Martins

    2018-06-01

    Full Text Available Nexus analysis identifies and explores the synergies and trade-offs between energy, food and water systems, considered as interdependent systems interacting with contextual drivers (e.g., climate change, poverty). The nexus is, thus, a valuable analytical and policy design supporting tool to address the widely discussed links between bioenergy, food and water. In fact, the Nexus provides a more integrative and broad approach in relation to the single isolated system approach that characterizes many bioenergy analyses and policies of the last decades. In particular, for the South of Mozambique, charcoal production, food insecurity and water scarcity have been related in separate studies and, thus, it would be expected that Nexus analysis has the potential to provide the basis for integrated policies and strategies focused on charcoal as a development factor. However, to date there is no Nexus analysis focused on charcoal in Mozambique, nor is there an assessment of the comprehensiveness and relevance of Nexus analysis when applied to charcoal energy systems. To address these gaps, this work applies the Nexus to the charcoal-food-water system in Mozambique, integrating national, regional and international studies analysing the isolated, or pairs of, systems. This integration results in a novel Nexus analysis graphic for the charcoal-food-water relationship. Then, to assess the comprehensiveness and depth of analysis, this Nexus analysis is critically compared with the 2MBio-A, a systems analytical and design framework based on a design tool specifically developed for Bioenergy (the 2MBio). The results reveal that Nexus analysis is “blind” to specific fundamental social, ecological and socio-historical dynamics of charcoal energy systems. The critical comparison also suggests the need to integrate the high-level systems analysis of Nexus with non-deterministic, non-prescriptive participatory analysis tools, like the 2MBio-A, as a means to

  2. Lebesgue Sets Immeasurable Existence

    Directory of Open Access Journals (Sweden)

    Diana Marginean Petrovai

    2012-12-01

    Full Text Available It is well known that the notions of measure and integral emerged early, in close connection with practical problems of measuring geometric figures. The notion of measure was shaped in the early 20th century through the research of H. Lebesgue, founder of the modern theory of measure and integral. A technique for the integration of functions was developed concurrently. Gradually a specific area was formed, today called measure and integration theory. Essential contributions to building this theory were made by a large number of mathematicians: C. Carathéodory, J. Radon, O. Nikodym, S. Bochner, J. Pettis, P. Halmos and many others. In the following we present several abstract sets and classes of sets. There exist sets which are not Lebesgue measurable, and sets which are Lebesgue measurable but not Borel measurable. Hence B ⊂ L ⊂ P(X).
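
    The classical witness for the first claim is Vitali's construction; a standard sketch in the abstract's notation (L the Lebesgue-measurable sets, B the Borel sets) runs as follows:

        % Pick one representative from each class of  x ~ y  <=>  x - y in Q,
        % obtaining V subset [0,1]. Then
        \[
          [0,1] \subseteq \bigcup_{q \in \mathbb{Q} \cap [-1,1]} (V+q) \subseteq [-1,2],
        \]
        % so if V were measurable with \lambda(V) = c, countable additivity and
        % translation invariance would force 1 \le \sum_q c \le 3, impossible
        % both for c = 0 and for c > 0; hence V \notin L.
        % For the other strict inclusion a cardinality argument suffices:
        % |B| = 2^{\aleph_0}, while every subset of a measure-zero Cantor set is
        % Lebesgue measurable, so |L| = 2^{2^{\aleph_0}} and
        % B \subsetneq L \subsetneq \mathcal{P}(X).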

  3. How Philadelphia is Integrating Climate Science and Policy: Changing Capital Planning Processes and Developing Flood-Depth Tools

    Science.gov (United States)

    Bhat, C.; Dix, B.; Choate, A.; Wong, A.; Asam, S.; Schultz, P. A.

    2016-12-01

    Policy makers can implement more effective climate change adaptation programs if they are provided with two tools: accessible information on the impacts that they need to prepare for, and clear guidance on how to integrate climate change considerations into their work. This presentation will highlight recent and ongoing efforts at the City of Philadelphia to integrate climate science into their decision-making. These efforts include developing a climate change information visualization tool, climate change risk assessments across the city, and processes to integrate climate change into routine planning and budgeting practices. The goal of these efforts is to make climate change science highly targeted to decision maker needs, non-political, easily accessible, and actionable. While sea level rise inundation maps have been available to communities for years, the maps do not effectively communicate how the design of a building or a piece of infrastructure would need to be modified to protect it. The Philadelphia Flood Risk Viewer is an interactive planning tool that allows Philadelphia to identify projected depths of flooding for any location within the City, for a variety of sea level rise and storm surge scenarios. Users can also determine whether a location is located in a FEMA floodplain. By having access to information on the projected depth of flooding at a given location, the City can determine what flood protection measures may be effective, or even inform the long-term viability of developing a particular area. With an understanding of climate vulnerabilities, cities have the opportunity to make smart, climate-resilient investments with their capital budgets that will yield multiple benefits for years to come. Few, however, have established protocols for doing so. Philadelphia, with support from ICF, developed a guidance document that identifies recommendations for integrating climate change considerations throughout the Capital Program and capital budgeting

  4. Set up of a method for the adjustment of resonance parameters on integral experiments

    International Nuclear Information System (INIS)

    Blaise, P.

    1996-01-01

    Resonance parameters for actinides play a significant role in the neutronic characteristics of all reactor types. All the major integral parameters strongly depend on the nuclear data of the isotopes in the resonance-energy regions. The author sets up a method for the adjustment of resonance parameters, taking into account the self-shielding effects and restricting the cross section deconvolution problem to a limited energy region. (N.T.)
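
    A standard formulation of such an adjustment is the generalized least-squares update (the abstract does not give the author's exact scheme, so the notation below is assumed):

        \[
          \mathbf{p}' = \mathbf{p}_0 + \mathbf{M}_p \mathbf{S}^{\mathsf{T}}
          \left( \mathbf{S} \mathbf{M}_p \mathbf{S}^{\mathsf{T}} + \mathbf{M}_E \right)^{-1}
          \left( \mathbf{E} - \mathbf{C}(\mathbf{p}_0) \right),
          \qquad
          S_{ij} = \frac{\partial C_i}{\partial p_j},
        \]

    where p0 are the prior resonance parameters with covariance M_p, E the measured integral values with covariance M_E, and C(p0) the calculated values. The self-shielding effects enter through the sensitivities S, and restricting the problem to a limited energy region amounts to restricting which parameters p_j are left free.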

  5. Rough Sets as a Knowledge Discovery and Classification Tool for the Diagnosis of Students with Learning Disabilities

    Directory of Open Access Journals (Sweden)

    Yu-Chi Lin

    2011-02-01

    Full Text Available Due to the implicit characteristics of learning disabilities (LDs), the diagnosis of students with learning disabilities has long been a difficult issue. Artificial intelligence techniques like artificial neural networks (ANN) and support vector machines (SVM) have been applied to the LD diagnosis problem with satisfactory outcomes. However, special education teachers or professionals tend to be skeptical of these kinds of black-box predictors. In this study, we adopt rough set theory (RST), which can not only perform as a classifier but may also produce meaningful explanations or rules, for the LD diagnosis application. Our experiments indicate that the RST approach is competitive as a tool for feature selection, and it performs better in terms of prediction accuracy than other rule-based algorithms such as decision trees and RIPPER. We also propose to mix samples collected from sources with different LD diagnosis procedures and criteria. By pre-processing these mixed samples with simple and readily available clustering algorithms, we are able to improve the quality and support of rules generated by the RST. Overall, our study shows that the rough set approach, as a classification and knowledge discovery tool, may have great potential in playing an essential role in LD diagnosis.
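
    The two defining operations of rough set theory are quickly stated in code. Below is a minimal sketch (attribute names and data invented): objects are grouped into indiscernibility classes by their attribute values, and a target set X is bracketed by a lower (certainly in X) and an upper (possibly in X) approximation; the gap between them is exactly the boundary of diagnostic uncertainty.

        from collections import defaultdict

        def approximations(objects, attrs, X):
            """objects: {id: {attr: value}}; X: set of ids (e.g., diagnosed LD)."""
            classes = defaultdict(set)
            for oid, values in objects.items():
                classes[tuple(values[a] for a in attrs)].add(oid)
            lower, upper = set(), set()
            for c in classes.values():
                if c <= X:          # whole class inside X: certain members
                    lower |= c
                if c & X:           # class touches X: possible members
                    upper |= c
            return lower, upper

        students = {1: {"read": "low", "iq": "avg"}, 2: {"read": "low", "iq": "avg"},
                    3: {"read": "high", "iq": "avg"}}
        lower, upper = approximations(students, ("read", "iq"), X={1, 3})
        print(lower, upper - lower)   # certain cases vs. boundary (uncertain) cases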

  6. An Integrative Review of In-Class Activities That Enable Active Learning in College Science Classroom Settings

    Science.gov (United States)

    Arthurs, Leilani A.; Kreager, Bailey Zo

    2017-01-01

    Engaging students in active learning is linked to positive learning outcomes. This study aims to synthesise the peer-reviewed literature about "active learning" in college science classroom settings. Using the methodology of an integrative literature review, 337 articles archived in the Educational Resources Information Center (ERIC) are…

  7. Final Technical Report for Contract No. DE-EE0006332, "Integrated Simulation Development and Decision Support Tool-Set for Utility Market and Distributed Solar Power Generation"

    Energy Technology Data Exchange (ETDEWEB)

    Cormier, Dallas [San Diego Gas & Electric, CA (United States); Edra, Sherwin [San Diego Gas & Electric, CA (United States); Espinoza, Michael [San Diego Gas & Electric, CA (United States); Daye, Tony [Green Power Labs, San Diego, CA (United States); Kostylev, Vladimir [Green Power Labs, San Diego, CA (United States); Pavlovski, Alexandre [Green Power Labs, San Diego, CA (United States); Jelen, Deborah [Electricore, Inc., Valencia, CA (United States)

    2014-12-29

    This project will enable utilities to develop long-term strategic plans that integrate high levels of renewable energy generation, and to better plan power system operations under high renewable penetration. The program developed forecast data streams for decision support and effective integration of centralized and distributed solar power generation in utility operations. This toolset focused on real-time simulation of distributed power generation within utility grids, with the emphasis on potential applications in day-ahead (market) and real-time (reliability) utility operations. The project team developed and demonstrated methodologies for quantifying the impact of distributed solar generation on core utility operations, identified protocols for internal data communication requirements, and worked with utility personnel to adapt the new distributed generation (DG) forecasts seamlessly within existing Load and Generation procedures through a sophisticated DMS. This project supported the objectives of the SunShot Initiative and SUNRISE by enabling core utility operations to enhance their simulation capability to analyze and prepare for the impacts of high penetrations of solar on the power grid. The impact of high-penetration solar PV on utility operations is not limited to control centers but extends across many core operations. Benefits of an enhanced DMS using state-of-the-art solar forecast data were demonstrated within this project and yielded immediate, direct operational cost savings for Energy Marketing for Day Ahead generation commitments, Real Time Operations, Load Forecasting (at an aggregate system level for Day Ahead), Demand Response, Long term Planning (asset management), Distribution Operations, and core ancillary services as required for balancing and reliability. This provided power system operators with the necessary tools and processes to operate the grid in a reliable manner under high renewable penetration.

  8. Geometrical setting of solid mechanics

    International Nuclear Information System (INIS)

    Fiala, Zdenek

    2011-01-01

    Highlights: → Solid mechanics within the Riemannian symmetric manifold GL(3, R)/O(3, R). → Generalized logarithmic strain. → Consistent linearization. → Incremental principle of virtual power. → Time-discrete approximation. - Abstract: The starting point in the geometrical setting of solid mechanics is to represent the deformation process of a solid body as a trajectory in a convenient space with Riemannian geometry, and then to use the corresponding tools for its analysis. Based on the virtual power of internal stresses, we show that such a configuration space is the (globally) symmetric space of symmetric positive-definite real matrices. From this unifying point of view, we shall analyse the logarithmic strain, the stress rate, as well as linearization and intrinsic integration of the corresponding evolution equation.
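
    The generalized logarithmic strain mentioned in the highlights has a compact rendering in this geometry (notation assumed, since the abstract gives no formulas): for a deformation gradient F with right Cauchy-Green tensor C and right stretch tensor U,

        \[
          \mathbf{C} = \mathbf{F}^{\mathsf{T}}\mathbf{F}, \qquad
          \mathbf{E}_{\ln} = \tfrac{1}{2}\,\ln \mathbf{C} = \ln \mathbf{U},
        \]

    and the geodesic distance of C from the identity in the symmetric space of symmetric positive-definite matrices is the norm of ln U, which is what makes the logarithmic strain the natural strain measure of this setting.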

  9. Simulation of Logging-while-drilling Tool Response Using Integral Equation Fast Fourier Transform

    Directory of Open Access Journals (Sweden)

    Sun Xiang-Yang

    2017-01-01

    Full Text Available We rely on the volume integral equation (VIE) method for the simulation of the logging-while-drilling (LWD) tool response, using the integral equation fast Fourier transform (IE-FFT) algorithm to accelerate the computation of the matrix-vector product in the iterative solver. Owing to the Toeplitz structure of the interpolation of the Green's function on uniform Cartesian grids, the method uses the FFT to calculate the matrix-vector multiplication, which reduces both the memory requirement and the CPU time. In this paper, the IE-FFT method is used for the first time in the simulation of LWD tools. Numerical results are presented to demonstrate the accuracy and efficiency of the method. Compared with the Method of Moments (MoM) and other fast algorithms, IE-FFT has distinct advantages in terms of memory requirement and CPU time. In addition, the paper studies the truncation, the mesh elements, the size of the IE-FFT interpolation grids and dipping formations, and draws some conclusions of wide applicability.
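
    The trick that makes IE-FFT fast is that a Toeplitz matrix embeds in a circulant one, whose eigenvalues are the FFT of its first column, so the matrix-vector product costs O(n log n). A self-contained 1-D sketch of that kernel (generic, not the authors' code):

        import numpy as np

        def toeplitz_matvec(c, r, x):
            """T @ x for the Toeplitz matrix with first column c and first row r."""
            n = len(x)
            col = np.concatenate([c, [0.0], r[:0:-1]])  # circulant embedding, length 2n
            eig = np.fft.fft(col)                       # circulant eigenvalues
            y = np.fft.ifft(eig * np.fft.fft(x, 2 * n))
            return y[:n].real

        n = 256
        c = np.random.rand(n)
        r = np.concatenate([[c[0]], np.random.rand(n - 1)])
        T = np.array([[c[i - j] if i >= j else r[j - i] for j in range(n)]
                      for i in range(n)])
        x = np.random.rand(n)
        assert np.allclose(T @ x, toeplitz_matvec(c, r, x))   # matches the dense product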

  10. The Integrated Waste Tracking Systems (IWTS) - A Comprehensive Waste Management Tool

    International Nuclear Information System (INIS)

    Robert S. Anderson

    2005-01-01

    The US Department of Energy (DOE) Idaho National Laboratory (INL) site located near Idaho Falls, ID USA, has developed a comprehensive waste management and tracking tool that integrates multiple operational activities with characterization data from waste declaration through final waste disposition. The Integrated Waste Tracking System (IWTS) provides information necessary to help facility personnel properly manage their waste and demonstrate a wide range of legal and regulatory compliance. As a client–server database system, the IWTS is a proven tracking, characterization, compliance, and reporting tool that meets the needs of both operations and management while providing a high level of flexibility. This paper describes some of the history involved with the development and current use of IWTS as a comprehensive waste management tool as well as a discussion of IWTS deployments performed by the INL for outside clients. Waste management spans a wide range of activities including: work group interactions, regulatory compliance management, reporting, procedure management, and similar activities. The IWTS documents these activities and performs tasks in a computer-automated environment. Waste characterization data, container characterization data, shipments, waste processing, disposals, reporting, and limit compliance checks are just a few of the items that IWTS documents and performs to help waste management personnel perform their jobs. Throughout most hazardous and radioactive waste generating, storage and disposal sites, waste management is performed by many different groups of people in many facilities. Several organizations administer their areas of waste management using their own procedures and documentation independent of other organizations. Files are kept, some of which are treated as quality records, others less stringently. Quality records maintain a history of: changes performed after approval, the reason for the change(s), and a record of whom and when

  11. Construct-driven development of a video-based situational judgment test for integrity : A study in a multi-ethnic police setting

    NARCIS (Netherlands)

    L.A.L. de Meijer (Lonneke); M.Ph. Born (Marise); J. van Zielst (Jaap); H.T. van der Molen (Henk)

    2010-01-01

    In a field study conducted in a multi-ethnic selection setting at the Dutch police, we examined the construct validity of a video-based situational judgment test (SJT) aimed to measure the construct of integrity. Integrity is of central importance to productive work performance of police

  12. ncRNA-class Web Tool: Non-coding RNA feature extraction and pre-miRNA classification web tool

    KAUST Repository

    Kleftogiannis, Dimitrios A.; Theofilatos, Konstantinos A.; Papadimitriou, Stergios; Tsakalidis, Athanasios K.; Likothanassis, Spiridon D.; Mavroudi, Seferina P.

    2012-01-01

    Until recently, it was commonly accepted that most genetic information is transacted by proteins. Recent evidence suggests that the majority of the genomes of mammals and other complex organisms are in fact transcribed into non-coding RNAs (ncRNAs), many of which are alternatively spliced and/or processed into smaller products. The analysis of non-coding RNA genes requires the calculation of several sequential, thermodynamical and structural features. Many independent tools have already been developed for the efficient calculation of such features but, to the best of our knowledge, no integrative approach for this task exists. The most significant amount of existing work is related to the miRNA class of non-coding RNAs. MicroRNAs (miRNAs) are small non-coding RNAs that play a significant role in gene regulation, and their prediction is a challenging bioinformatics problem. The non-coding RNA feature extraction and pre-miRNA classification Web Tool (ncRNA-class Web Tool) is a publicly available web tool (http://150.140.142.24:82/Default.aspx) which provides a user-friendly and efficient environment for the effective calculation of a set of 58 sequential, thermodynamical and structural features of non-coding RNAs, plus a tool for the accurate prediction of miRNAs. © 2012 IFIP International Federation for Information Processing.
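
    A few of the sequential features such a tool computes are one-liners; thermodynamic and structural features would normally come from a folding library (e.g., ViennaRNA). A toy sketch:

        from collections import Counter
        from math import log2

        def sequence_features(seq: str) -> dict:
            """GC content, composition entropy and dominant dinucleotide of an RNA string."""
            n = len(seq)
            counts = Counter(seq)
            gc = (counts["G"] + counts["C"]) / n
            entropy = -sum((c / n) * log2(c / n) for c in counts.values())
            dinucs = Counter(seq[i:i + 2] for i in range(n - 1))
            return {"length": n, "gc_content": gc, "entropy": entropy,
                    "top_dinucleotide": dinucs.most_common(1)[0][0]}

        print(sequence_features("GGCUAGCUAGGCAUCGGCAU"))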

  13. Integration of relational and textual biomedical sources. A pilot experiment using a semi-automated method for logical schema acquisition.

    Science.gov (United States)

    García-Remesal, M; Maojo, V; Billhardt, H; Crespo, J

    2010-01-01

    Bringing together structured and text-based sources is an exciting challenge for biomedical informaticians, since most relevant biomedical sources belong to one of these categories. In this paper we evaluate the feasibility of integrating relational and text-based biomedical sources using: i) an original logical schema acquisition method for textual databases developed by the authors, and ii) OntoFusion, a system originally designed by the authors for the integration of relational sources. We conducted an integration experiment involving a test set of seven differently structured sources covering the domain of genetic diseases. We used our logical schema acquisition method to generate schemas for all textual sources. The sources were integrated using the methods and tools provided by OntoFusion. The integration was validated using a test set of 500 queries. A panel of experts answered a questionnaire to evaluate i) the quality of the extracted schemas, ii) the query processing performance of the integrated set of sources, and iii) the relevance of the retrieved results. The results of the survey show that our method extracts coherent and representative logical schemas. Experts' feedback on the performance of the integrated system and the relevance of the retrieved results was also positive. Regarding the validation of the integration, the system successfully provided correct results for all queries in the test set. The results of the experiment suggest that text-based sources including a logical schema can be regarded as equivalent to structured databases. Using our method, previous research and existing tools designed for the integration of structured databases can be reused - possibly subject to minor modifications - to integrate differently structured sources.

  14. Virtual Worlds and Gamification to Increase Integration of International Students in Higher Education: An Inclusive Design Approach

    Science.gov (United States)

    Zhang, Bo; Robb, Nigel; Eyerman, Joe; Goodman, Lizbeth

    2017-01-01

    In response to the growing trend of internationalisation in education, it is important to consider approaches to help international students integrate in their new settings. One possible approach involves the use of e-Learning tools, such as virtual worlds and gamification. To maximise the potential effectiveness of such tools, it may be…

  15. GenoSets: visual analytic methods for comparative genomics.

    Directory of Open Access Journals (Sweden)

    Aurora A Cain

    Full Text Available Many important questions in biology are, fundamentally, comparative, and this extends to our analysis of a growing number of sequenced genomes. Existing genomic analysis tools are often organized around literal views of genomes as linear strings. Even when information is highly condensed, these views grow cumbersome as larger numbers of genomes are added. Data aggregation and summarization methods from the field of visual analytics can provide abstracted comparative views, suitable for sifting large multi-genome datasets to identify critical similarities and differences. We introduce a software system for visual analysis of comparative genomics data. The system automates the process of data integration, and provides the analysis platform to identify and explore features of interest within these large datasets. GenoSets borrows techniques from business intelligence and visual analytics to provide a rich interface of interactive visualizations supported by a multi-dimensional data warehouse. In GenoSets, visual analytic approaches are used to enable querying based on orthology, functional assignment, and taxonomic or user-defined groupings of genomes. GenoSets links this information together with coordinated, interactive visualizations for both detailed and high-level categorical analysis of summarized data. GenoSets has been designed to simplify the exploration of multiple genome datasets and to facilitate reasoning about genomic comparisons. Case examples are included showing the use of this system in the analysis of 12 Brucella genomes. GenoSets software and the case study dataset are freely available at http://genosets.uncc.edu. We demonstrate that the integration of genomic data using a coordinated multiple view approach can simplify the exploration of large comparative genomic data sets, and facilitate reasoning about comparisons and features of interest.
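
    The kind of set-based summarization such a warehouse supports can be pictured with a toy ortholog table (data invented, and GenoSets itself works against a multi-dimensional warehouse rather than a data frame):

        import pandas as pd

        genes = pd.DataFrame({
            "genome":  ["B1", "B1", "B2", "B2", "B3"],
            "cluster": ["og1", "og2", "og1", "og3", "og1"],
        })

        # Presence/absence of each ortholog cluster per genome: the summarized
        # view behind "core genome" vs. "genome-specific" comparative queries.
        presence = pd.crosstab(genes["cluster"], genes["genome"]).astype(bool)
        core = presence.index[presence.all(axis=1)]
        print(core.tolist())   # clusters found in every genome -> ['og1']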

  16. Tool life and surface integrity aspects when drilling nickel alloy

    Science.gov (United States)

    Kannan, S.; Pervaiz, S.; Vincent, S.; Karthikeyan, R.

    2018-04-01

    … Overall, the results indicate that the effect of drilling and milling parameters is most marked in terms of surface quality in the circumferential direction. Material removal rates and tool flank wear must be maintained within the control limits to preserve hole integrity.

  17. An exploratory examination of the relationship between motivational factors and the degree to which the higher education faculty integrate computer-mediated communication (CMC) tools into their courses

    Science.gov (United States)

    Murage, Francis Ndwiga

    The stated research problem of this study was to examine the relationship between motivational factors and the degree to which the higher education faculty integrate CMC tools into their courses. The study population and sample involved higher education faculty teaching in science departments at one public university and three public colleges in the state of West Virginia (N = 153). A Likert-type rating scale survey was used to collect data based on the research questions. Two parts of the survey were adopted from previous studies while the other two were self-constructed. Research questions and hypotheses were analyzed using both descriptive and inferential analyses. The study results established a positive relationship between motivational factors and the degree to which the higher education faculty integrate CMC tools in their courses. The results in addition established that faculty are highly motivated to integrate CMC tools by intrinsic factors, moderately motivated by environmental factors and least motivated by extrinsic factors. The results also established that the most integrated CMC tools were those that support asynchronous methods of communication while the least integrated were those that support synchronous methods of communication. A major conclusion made was that members of higher education faculty are more likely to be motivated to integrate CMC tools into their courses by intrinsic factors rather than extrinsic or environmental factors. It was further concluded that intrinsic factors that supported and enhanced student learning as well as those that were altruistic in nature significantly influenced the degree of CMC integration. The study finally concluded that, to a large extent, there is a relationship between motivational factors and the degree to which the higher education faculty integrate CMC tools in their courses. A major implication of this study was that institutions that wish to promote integration of CMC technologies should provide as much

  18. Developing a tool for mapping adult mental health care provision in Europe: the REMAST research protocol and its contribution to better integrated care

    Directory of Open Access Journals (Sweden)

    Luis Salvador-Carulla

    2015-12-01

    Full Text Available Introduction: Mental health care is a critical area in which to better understand integrated care and to pilot the different components of the integrated care model. However, there is an urgent need for better tools to compare and understand the context of integrated mental health care in Europe. Method: The REMAST tool (REFINEMENT MApping Services Tool) combines a series of standardised health service research instruments and geographical information systems (GIS) to develop local atlases of mental health care from the perspective of horizontal and vertical integrated care. It contains five main sections: (a) Population Data; (b) the Verona Socio-economic Status (SES) Index; (c) the Mental Health System Checklist; (d) the Mental Health Services Inventory using the DESDE-LTC instrument; and (e) Geographical Data. Expected results: The REMAST tool facilitates context analysis in mental health by providing the comparative rates of mental health service provision according to the availability of main types of care; care placement capacity; workforce capacity; and geographical accessibility to services in the local areas in eight study areas in Austria, England, Finland, France, Italy, Norway, Romania and Spain. Discussion: The outcomes of this project will facilitate cooperative work and knowledge transfer on mental health care to the different agencies involved in mental health planning and provision. This project would improve the information available to users and society on the resources for mental health care and support system thinking at the local level by the different stakeholders. The techniques used in this project and the knowledge generated could eventually be transferred to the mapping of other fields of integrated care.

  19. Program Development Tools and Infrastructures

    International Nuclear Information System (INIS)

    Schulz, M.

    2012-01-01

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

  20. Program Development Tools and Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, M

    2012-03-12

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

  1. Terminology tools: state of the art and practical lessons.

    Science.gov (United States)

    Cimino, J J

    2001-01-01

    As controlled medical terminologies evolve from simple code-name-hierarchy arrangements into rich, knowledge-based ontologies of medical concepts, increased demands are placed on both the developers and users of the terminologies. In response, researchers have begun developing tools to address their needs. The aims of this article are to review previous work done to develop these tools and then to describe work done at Columbia University and New York Presbyterian Hospital (NYPH). Researchers working with the Systematized Nomenclature of Medicine (SNOMED), the Unified Medical Language System (UMLS), and NYPH's Medical Entities Dictionary (MED) have created a wide variety of terminology browsers, editors and servers to facilitate creation, maintenance and use of these terminologies. Although much work has been done, no generally available tools have yet emerged. Consensus on requirements for tool functions, especially terminology servers, is emerging. Tools at NYPH have been used successfully to support the integration of clinical applications and the merger of health care institutions. Significant advancement has occurred over the past fifteen years in the development of sophisticated controlled terminologies and the tools to support them. The tool set at NYPH provides a case study to demonstrate one feasible architecture.

  2. Integrated hydraulic booster/tool string technology for unfreezing of stuck downhole strings in horizontal wells

    Science.gov (United States)

    Tian, Q. Z.

    2017-12-01

    It is common to use a jarring tool to unfreeze a stuck downhole string. However, in a horizontal well the jarring effect is poor, owing to the friction caused by the deviated section; on the other hand, a hydraulic booster can locate the forcing point in the horizontal section and reduce the friction, but a large-tonnage, constant pull force is time-consuming and can easily break the downhole string. A hydraulic booster–jar tool string has therefore been developed for unfreezing operations in horizontal wells. The technical solution involves three elements: a two-stage parallel spring-cylinder structure to increase the energy storage capacity of the spring accelerators; multiple groups of spring accelerators connected in series to increase the working stroke; and a hydraulic booster to intensify the jarring force. The integrated unfreezing tool string based on these three elements can effectively overcome the friction caused by a deviated borehole and thus unfreeze a stuck string through the interaction of the hydraulic booster and the mechanical jar, which together form an alternating dynamic load. Experimental results show that the jarring performance parameters of the hydraulic booster–jar unfreezing tool string for horizontal wells are in accordance with the original design requirements. Field technical parameters were then developed based on numerical simulation and experimental data. Field application shows that the hydraulic booster–jar unfreezing tool string is effective in freeing stuck downhole tools in a horizontal well; it reduces the hook load by 80% and lessens the requirements on workover equipment. This provides a new technology for unfreezing a stuck downhole string in a horizontal well.

  3. Formalization of Technological Knowledge in the Field of Metallurgy using Document Classification Tools Supported with Semantic Techniques

    Directory of Open Access Journals (Sweden)

    Regulski K.

    2017-06-01

    Full Text Available The process of knowledge formalization is an essential part of decision support systems development. Creating a technological knowledge base in the field of metallurgy encountered problems in acquiring and codifying reusable computer artifacts based on text documents. The aim of the work was to adapt algorithms for the classification of documents and to develop a method for the semantic integration of the created repository. The author used artificial intelligence tools: latent semantic indexing, rough sets, association rule learning, and ontologies as a tool for integration. The developed methodology allowed for the creation of a semantic knowledge base on the basis of documents in natural language in the field of metallurgy.
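
    The latent semantic indexing step has a compact scikit-learn idiom; the sketch below shows only that step, on toy documents (the pipeline described above also uses rough sets, association rule learning and an ontology layer):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD

        docs = [
            "continuous casting of steel slabs",
            "casting defects in steel",
            "heat treatment of aluminium alloys",
        ]
        X = TfidfVectorizer().fit_transform(docs)     # term-document TF-IDF matrix
        lsi = TruncatedSVD(n_components=2, random_state=0)
        Z = lsi.fit_transform(X)                      # documents in latent "topic" space
        print(Z.round(2))        # the two casting documents land close together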

  4. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large-scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change in the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...
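
    The CI hook itself is simple in outline: on each commit, re-run every stored verification case and fail the build on any violation. The sketch below is purely illustrative; in particular the "plcverif-cli" command, its arguments and the case-file layout are placeholders, not PLCverif's real invocation:

        import pathlib
        import subprocess
        import sys

        # Hypothetical job body executed by Jenkins on every commit.
        cases = sorted(pathlib.Path("verification_cases").glob("*.vc"))
        failed = [c.name for c in cases
                  if subprocess.run(["plcverif-cli", "--case", str(c)]).returncode != 0]

        if failed:
            print("requirement violations:", ", ".join(failed))
            sys.exit(1)   # non-zero exit marks the Jenkins build as failed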

  5. Integrative Genomics Viewer (IGV): high-performance genomics data visualization and exploration.

    Science.gov (United States)

    Thorvaldsdóttir, Helga; Robinson, James T; Mesirov, Jill P

    2013-03-01

    Data visualization is an essential component of genomic data analysis. However, the size and diversity of the data sets produced by today's sequencing and array-based profiling methods present major challenges to visualization tools. The Integrative Genomics Viewer (IGV) is a high-performance viewer that efficiently handles large heterogeneous data sets, while providing a smooth and intuitive user experience at all levels of genome resolution. A key characteristic of IGV is its focus on the integrative nature of genomic studies, with support for both array-based and next-generation sequencing data, and the integration of clinical and phenotypic data. Although IGV is often used to view genomic data from public sources, its primary emphasis is to support researchers who wish to visualize and explore their own data sets or those from colleagues. To that end, IGV supports flexible loading of local and remote data sets, and is optimized to provide high-performance data visualization and exploration on standard desktop systems. IGV is freely available for download from http://www.broadinstitute.org/igv, under a GNU LGPL open-source license.

  6. Using bio.tools to generate and annotate workbench tool descriptions [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Kenzo-Hugo Hillion

    2017-11-01

    Workbench and workflow systems such as Galaxy, Taverna, Chipster, or Common Workflow Language (CWL)-based frameworks facilitate access to bioinformatics tools in a user-friendly, scalable and reproducible way. Still, the integration of tools in such environments remains a cumbersome, time-consuming and error-prone process. A major consequence is the incomplete or outdated description of tools, which are often missing important information, including parameters and metadata such as publications or links to documentation. ToolDog (Tool DescriptiOn Generator) facilitates the integration of tools - which have been registered in the ELIXIR tools registry (https://bio.tools) - into workbench environments by generating tool description templates. ToolDog includes two modules. The first module analyses the source code of the bioinformatics software with language-specific plugins and generates a skeleton for a Galaxy XML or CWL tool description. The second module is dedicated to the enrichment of the generated tool description, using metadata provided by bio.tools. This last module can also be used on its own to complete or correct existing tool descriptions with missing metadata.
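
    The enrichment step relies on the public bio.tools registry API. A minimal sketch of such a lookup is shown below; the endpoint layout and the example tool identifier are assumptions based on the public REST API, not code from ToolDog.

      # Fetch registry metadata for one tool from bio.tools (illustrative).
      import requests

      resp = requests.get("https://bio.tools/api/tool/signalp",
                          params={"format": "json"})
      resp.raise_for_status()
      entry = resp.json()
      print(entry.get("name"), "-", entry.get("description", "")[:80])
      for pub in entry.get("publication", []):   # metadata a generator could merge in
          print("doi:", pub.get("doi"))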

  7. Knowledge modelling and reliability processing: presentation of the Figaro language and associated tools

    International Nuclear Information System (INIS)

    Bouissou, M.; Villatte, N.; Bouhadana, H.; Bannelier, M.

    1991-12-01

    EDF has been developing for several years an integrated set of knowledge-based and algorithmic tools for the automation of reliability assessment of complex (especially sequential) systems. In this environment, the reliability expert has at his disposal powerful software tools for qualitative and quantitative processing; in addition, he has various means to generate the inputs for these tools automatically, through the acquisition of graphical data. The development of these tools has been based on FIGARO, a specific language built to obtain homogeneous system modelling. Various compilers and interpreters translate a FIGARO model into conventional models such as fault trees, Markov chains and Petri nets. In this report, we introduce the basics of the FIGARO language, illustrating them with examples.
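
    As a worked example of the kind of quantitative processing such tools automate, the sketch below evaluates the steady-state availability of a single repairable component modelled as a two-state Markov chain; the rates are illustrative and the computation is generic, not FIGARO-specific.

      # Two-state Markov model: state 0 = up, state 1 = down (rates in 1/h).
      import numpy as np
      from scipy.linalg import expm

      lam, mu = 1e-4, 1e-1            # failure rate, repair rate (assumed)
      Q = np.array([[-lam,  lam],     # generator matrix of the chain
                    [  mu,  -mu]])
      p0 = np.array([1.0, 0.0])       # start in the "up" state
      p = p0 @ expm(Q * 1e3)          # distribution after 1000 h (~stationary)
      print("availability ~", p[0], "| analytic:", mu / (lam + mu))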

  8. Reliability and validity of a novel tool to comprehensively assess food and beverage marketing in recreational sport settings.

    Science.gov (United States)

    Prowse, Rachel J L; Naylor, Patti-Jean; Olstad, Dana Lee; Carson, Valerie; Mâsse, Louise C; Storey, Kate; Kirk, Sara F L; Raine, Kim D

    2018-05-31

    Current methods for evaluating food marketing to children often study a single marketing channel or approach. As the World Health Organization urges the removal of unhealthy food marketing in children's settings, methods that comprehensively explore the exposure and power of food marketing within a setting from multiple marketing channels and approaches are needed. The purpose of this study was to test the inter-rater reliability and the validity of a novel settings-based food marketing audit tool. The Food and beverage Marketing Assessment Tool for Settings (FoodMATS) was developed and its psychometric properties evaluated in five public recreation and sport facilities (sites), and it was subsequently used in 51 sites across Canada for a cross-sectional analysis of food marketing. Raters recorded the count of food marketing occasions, the presence of child-targeted and sports-related marketing techniques, and the physical size of marketing occasions. Marketing occasions were classified by healthfulness. Inter-rater reliability was tested using Cohen's kappa (κ) and intra-class correlations (ICC). FoodMATS scores for each site were calculated using an algorithm that represented the theoretical impact of the marketing environment on food preferences, purchases, and consumption. Higher FoodMATS scores represented sites with higher exposure to, and more powerful (unhealthy, child-targeted, sports-related, large), food marketing. Validity of the scoring algorithm was tested through (1) Pearson's correlations between FoodMATS scores and facility sponsorship dollars, and (2) sequential multiple regression for predicting "Least Healthy" food sales from FoodMATS scores. Inter-rater reliability was very good to excellent (κ = 0.88-1.00). As a comprehensive audit of food marketing in recreation facilities, the FoodMATS provides a novel means to track changes in food marketing environments that can assist in developing and monitoring the impact of policies and interventions.
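
    For readers unfamiliar with the agreement statistic reported here, the sketch below computes Cohen's kappa for two raters on paired audit codes; the ratings are fabricated for illustration and scikit-learn is an assumed dependency.

      # Cohen's kappa for two raters over the same marketing occasions.
      from sklearn.metrics import cohen_kappa_score

      rater_a = [1, 1, 0, 2, 1, 0, 2, 2, 1, 0]   # hypothetical audit codes
      rater_b = [1, 1, 0, 2, 0, 0, 2, 2, 1, 0]
      print("kappa =", round(cohen_kappa_score(rater_a, rater_b), 2))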

  9. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as to swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique...

  10. The youth sports club as a health-promoting setting: An integrative review of research

    Science.gov (United States)

    Quennerstedt, Mikael; Eriksson, Charli

    2013-01-01

    Aims: The aims of this review are to compile and identify key issues in international research about youth sports clubs as health-promoting settings, and then to discuss the results of the review in terms of a framework for the youth sports club as a health-promoting setting. Methods: The framework guiding this review of research is the health-promoting settings approach introduced by the World Health Organization (WHO). The method used is the integrative review. Inclusion criteria were, first, that the studies concerned sports clubs for young people, not professional clubs; second, that there be voluntary participation in some sort of ongoing organized athletics outside of the regular school curricula; third, that the studies consider issues about youth sports clubs in terms of health-promoting settings as described by the WHO. The final sample for the review consists of 44 publications. Results: The review shows that youth sports clubs have plentiful opportunities to be or become health-promoting settings; however, this is not something that happens automatically. To do so, a club needs to include an emphasis on certain important elements in its strategies and daily practices. The youth sports club needs to be a supportive and healthy environment with activities designed for, and adapted to, the specific age group or stage of development of the youth. Conclusions: To become a health-promoting setting, a youth sports club needs to take a comprehensive approach to its activities, aims, and purposes. PMID:23349167

  11. Mixed methods evaluation of a quality improvement and audit tool for nurse-to-nurse bedside clinical handover in ward settings.

    Science.gov (United States)

    Redley, Bernice; Waugh, Rachael

    2018-04-01

    Nurse bedside handover quality is influenced by complex interactions related to the content, processes used and the work environment. Audit tools are seldom tested in 'real' settings. Examine the reliability, validity and usability of a quality improvement tool for audit of nurse bedside handover. Naturalistic, descriptive, mixed-methods. Six inpatient wards at a single large not-for-profit private health service in Victoria, Australia. Five nurse experts and 104 nurses involved in 199 change-of-shift bedside handovers. A focus group with experts and a pilot test were used to examine content and face validity, and usability of the handover audit tool. The tool was examined for inter-rater reliability and usability using observation audits of handovers across six wards. Data were collected in 2013-2014. Two independent observers for 72 audits demonstrated acceptable inter-observer agreement for 27 (77%) items. Reliability was weak for items examining the handover environment. Seventeen items were not observed, reflecting gaps in practices. Across 199 observation audits, gaps in nurse bedside handover practice most often related to process and environment items, rather than content items. Usability was impacted by high observer burden, familiarity and non-specific illustrative behaviours. The reliability and validity of most items to audit handover content was acceptable. Gaps in practices for process and environment items were identified. Context-specific exemplars and reducing the items used at each handover audit can enhance usability. Further research is needed to develop context-specific exemplars and undertake additional reliability testing using a wide range of handover settings. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. A Tool for Interactive Data Visualization: Application to Over 10,000 Brain Imaging and Phantom MRI Data Sets

    Directory of Open Access Journals (Sweden)

    Sandeep R Panta

    2016-03-01

    In this paper we propose a web-based approach for quick visualization of big data from brain magnetic resonance imaging (MRI) scans using a combination of an automated image capture and processing system, nonlinear embedding, and interactive data visualization tools. We draw upon thousands of MRI scans captured via the COllaborative Imaging and Neuroinformatics Suite (COINS). We then interface the output of several analysis pipelines based on structural and functional data to a t-distributed stochastic neighbor embedding (t-SNE) algorithm which reduces the number of dimensions for each scan in the input data set to two dimensions while preserving the local structure of data sets. Finally, we interactively display the output of this approach via a web page based on the data-driven documents (D3) JavaScript library. Two distinct approaches were used to visualize the data. In the first approach, we computed multiple quality control (QC) values from pre-processed data, which were used as inputs to the t-SNE algorithm. This approach helps in assessing the quality of each data set relative to others. In the second case, computed variables of interest (e.g. brain volume or voxel values from segmented gray matter images) were used as inputs to the t-SNE algorithm. This approach helps in identifying interesting patterns in the data sets. We demonstrate these approaches using multiple examples, including (1) quality control measures calculated from phantom data over time, (2) quality control data from human functional MRI data across various studies, scanners and sites, and (3) volumetric and density measures from human structural MRI data across various studies, scanners and sites. Results from (1) and (2) show the potential of our approach to combine t-SNE data reduction with interactive color coding of variables of interest to quickly identify visually unique clusters of data (i.e. data sets with poor QC) and clustering of data by site. Results from (3) demonstrate...
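
    A minimal sketch of the embedding step is shown below: per-scan QC feature vectors reduced to two dimensions with t-SNE. The feature matrix is synthetic, and scikit-learn is an assumed stand-in for whatever implementation the pipeline actually uses.

      # Reduce 200 scans x 12 QC measures to 2-D for interactive plotting.
      import numpy as np
      from sklearn.manifold import TSNE

      rng = np.random.default_rng(0)
      qc_features = rng.normal(size=(200, 12))      # synthetic QC matrix
      xy = TSNE(n_components=2, perplexity=30,
                random_state=0).fit_transform(qc_features)
      print(xy.shape)   # (200, 2) -> ready to hand to a D3 scatter plot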

  13. Opportunities for Integrated Landscape Planning – the Broker, the Arena, the Tool

    Directory of Open Access Journals (Sweden)

    Julia Carlsson

    2017-12-01

    As an integrated social and ecological system, the forest landscape includes multiple values. The need for a landscape approach in land use planning is being increasingly advocated in research, policy and practice. This paper explores how institutional conditions in the forest policy and management sector can be developed to meet demands for a multifunctional landscape perspective. Departing from obstacles recognised in the collaborative planning literature, we build an analytical framework which is operationalised in a Swedish context at the municipal level. Our case illustrating this is Vilhelmina Model Forest, where actual barriers and opportunities for a multiple-value landscape approach are identified through 32 semi-structured interviews displaying stakeholders' views on forest values, ownership rights and willingness to consider multiple values, forest policy and management premises, and collaboration. As an opportunity to overcome the barriers, we suggest and discuss three key components by which an integrated landscape planning approach could be realized in forest management planning: the need for a landscape coordinator (broker), the need for a collaborative forum (arena), and the development of the existing forest management plan into an advanced multifunctional landscape plan (tool).

  14. GenomeCAT: a versatile tool for the analysis and integrative visualization of DNA copy number variants.

    Science.gov (United States)

    Tebel, Katrin; Boldt, Vivien; Steininger, Anne; Port, Matthias; Ebert, Grit; Ullmann, Reinhard

    2017-01-06

    The analysis of DNA copy number variants (CNV) has increasing impact in the field of genetic diagnostics and research. However, the interpretation of CNV data derived from high-resolution array CGH or NGS platforms is complicated by the considerable variability of the human genome. Therefore, tools for multidimensional data analysis and comparison of patient cohorts are needed to assist in the discrimination of clinically relevant CNVs from others. We developed GenomeCAT, a standalone Java application for the analysis and integrative visualization of CNVs. GenomeCAT is composed of three modules dedicated to the inspection of single cases, comparative analysis of multidimensional data, and group comparisons aiming at the identification of recurrent aberrations in patients sharing the same phenotype, respectively. Its flexible import options ease the comparative analysis of one's own results derived from microarray or NGS platforms with data from the literature or public repositories. Multidimensional data obtained from different experiment types can be merged into a common data matrix to enable common visualization and analysis. All results are stored in the integrated MySQL database, but can also be exported as tab-delimited files for further statistical calculations in external programs. GenomeCAT offers a broad spectrum of visualization and analysis tools that assist in the evaluation of CNVs in the context of other experiment data and annotations. The use of GenomeCAT does not require any specialized computer skills. The various R packages implemented for data analysis are fully integrated into GenomeCAT's graphical user interface, and the installation process is supported by a wizard. The flexibility in terms of data import and export, in combination with the ability to create a common data matrix, makes the program also well suited as an interface between genomic data from heterogeneous sources and external software tools. Due to the modular architecture, the functionality of...

  15. INTEGRATION OF COST MODELS AND PROCESS SIMULATION TOOLS FOR OPTIMUM COMPOSITE MANUFACTURING PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    Pack, Seongchan [General Motors; Wilson, Daniel [General Motors; Aitharaju, Venkat [General Motors; Kia, Hamid [General Motors; Yu, Hang [ESI, Group.; Doroudian, Mark [ESI Group

    2017-09-05

    Manufacturing cost of resin transfer molded composite parts is significantly influenced by the cycle time, which is strongly related to the time for both filling and curing of the resin in the mold. The time for filling can be optimized by various injection strategies, and by suitably reducing the length of the resin flow distance during the injection. The curing time can be reduced by the usage of faster-curing resins, but these require high-pressure injection equipment, which is capital-intensive. Recently developed predictive manufacturing simulation tools for composite materials can provide various scenarios of processing conditions virtually, well in advance of manufacturing the parts. In the present study, we integrate cost models with process simulation tools to study the influence of various parameters, such as injection strategies, injection pressure, compression control to minimize high-pressure injection, resin curing rate, and demold time, on the manufacturing cost as affected by the annual part volume. A representative automotive component was selected for the study, and the results are presented in this paper.
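
    The cost trade-off described here can be made concrete with a toy model: cost per part as material plus machine time plus amortized capital. Every number below is an illustrative assumption, not data from the study.

      # Toy cost model: faster-curing resin shortens the cycle but needs a
      # costlier high-pressure line; amortization depends on annual volume.
      def cost_per_part(fill_min, cure_min, annual_volume,
                        machine_rate=120.0,   # $/h, assumed
                        material=35.0,        # $/part, assumed
                        capital=2.5e6):       # $ equipment, assumed
          cycle_h = (fill_min + cure_min) / 60.0
          return material + machine_rate * cycle_h + capital / annual_volume

      for vol in (10_000, 50_000, 150_000):
          slow = cost_per_part(8, 12, vol)                  # standard resin
          fast = cost_per_part(8, 3, vol, capital=4.0e6)    # fast resin
          print(vol, round(slow, 2), round(fast, 2))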

  16. Wheat Rust Information Resources - Integrated tools and data for improved decision making

    DEFF Research Database (Denmark)

    Hodson, David; Hansen, Jens Grønbech; Lassen, Poul

    ...giving access to an unprecedented set of data for rust surveys, alternate hosts (barberry), rust pathotypes, trap nurseries and resistant cultivars. Standardized protocols for data collection have permitted the development of a comprehensive data management system, named the Wheat Rust Toolbox. Integration of the CIMMYT Wheat Atlas and the Genetic Resources Information System (GRIS) databases provides a rich resource on wheat cultivars and their resistance to important rust races. Data access is facilitated via dedicated web portals such as Rust Tracker (www.rusttracker.org) and the Global Rust...

  17. An integrated knowledge-based and optimization tool for the sustainable selection of wastewater treatment process concepts

    DEFF Research Database (Denmark)

    Castillo, A.; Cheali, Peam; Gómez, V.

    2016-01-01

    The increasing demand on wastewater treatment plants (WWTPs) has generated interest in improving the process for selecting among alternative treatments. In this study, an integrated framework including an intelligent knowledge-based system and superstructure-based optimization has been developed and applied to a real case study. A multi-criteria analysis together with mathematical models is applied to generate a ranked short-list of feasible treatments for three different scenarios. Finally, the uncertainty analysis performed allows for increasing the quality and robustness of the decisions. A benefit and synergy is achieved when both tools are integrated, because expert knowledge and expertise are considered together with mathematical models to select the most appropriate treatment alternative.

  18. From data to the decision: A software architecture to integrate predictive modelling in clinical settings.

    Science.gov (United States)

    Martinez-Millana, A; Fernandez-Llatas, C; Sacchi, L; Segagni, D; Guillen, S; Bellazzi, R; Traver, V

    2015-08-01

    The application of statistics and mathematics over large amounts of data is providing healthcare systems with new tools for screening and managing multiple diseases. Nonetheless, these tools have many technical and clinical limitations, as they are based on datasets with specific characteristics. This proposition paper describes a novel architecture focused on providing a validation framework for discrimination and prediction models in the screening of Type 2 diabetes. To that end, the architecture has been designed to gather different data sources under a common data structure and, furthermore, to be controlled by a centralized component (the Orchestrator) in charge of directing the interaction flows among data sources, models and graphical user interfaces. This innovative approach aims to overcome the data dependency of the models by providing a validation framework for the models as they are used within clinical settings.
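
    A toy sketch of the orchestrator pattern described above is given below: one component gathers records from data sources, routes them to a model, and hands the result to a user interface. All class names, stub values and the flow itself are illustrative assumptions, not the paper's actual design.

      # Minimal orchestrator wiring data sources -> model -> UI (illustrative).
      class EHRSource:
          def fetch(self, pid):
              return {"age": 54, "bmi": 31.2}            # stub patient data

      class RuleModel:
          def predict(self, record):
              return 0.8 if record["bmi"] > 30 else 0.2  # stub risk score

      class ConsoleUI:
          def display(self, pid, risk):
              print(pid, "T2D risk:", risk)

      class Orchestrator:
          def __init__(self, sources, models, ui):
              self.sources, self.models, self.ui = sources, models, ui

          def screen(self, pid, model_name):
              record = {}
              for source in self.sources:                # common data structure
                  record.update(source.fetch(pid))
              risk = self.models[model_name].predict(record)
              self.ui.display(pid, risk)
              return risk

      Orchestrator([EHRSource()], {"t2d": RuleModel()}, ConsoleUI()).screen("p-001", "t2d")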

  19. Desiderata for healthcare integrated data repositories based on architectural comparison of three public repositories.

    Science.gov (United States)

    Huser, Vojtech; Cimino, James J

    2013-01-01

    Integrated data repositories (IDRs) are indispensable tools for numerous biomedical research studies. We compare three large IDRs (Informatics for Integrating Biology and the Bedside (i2b2), HMO Research Network's Virtual Data Warehouse (VDW) and Observational Medical Outcomes Partnership (OMOP) repository) in order to identify common architectural features that enable efficient storage and organization of large amounts of clinical data. We define three high-level classes of underlying data storage models and we analyze each repository using this classification. We look at how a set of sample facts is represented in each repository and conclude with a list of desiderata for IDRs that deal with the information storage model, terminology model, data integration and value-sets management.
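
    One widely used storage model in such repositories is a narrow fact table in entity-attribute-value form. The sketch below shows the general shape in SQLite; the schema and codes are illustrative, not the exact i2b2, VDW or OMOP layouts.

      # Narrow EAV-style fact table, queried per patient (illustrative).
      import sqlite3

      db = sqlite3.connect(":memory:")
      db.execute("CREATE TABLE fact (patient_id TEXT, concept TEXT,"
                 " value TEXT, unit TEXT)")
      facts = [("p1", "LOINC:2345-7", "5.4", "mmol/L"),   # glucose result
               ("p1", "ICD9:250.00", "1", None)]          # diabetes diagnosis
      db.executemany("INSERT INTO fact VALUES (?, ?, ?, ?)", facts)
      for row in db.execute("SELECT * FROM fact WHERE patient_id = 'p1'"):
          print(row)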

  20. Setting priorities for ambient air quality objectives

    International Nuclear Information System (INIS)

    2004-10-01

    Alberta has ambient air quality objectives in place for several pollutants, toxic substances and other air quality parameters. A process is in place to determine whether additional air quality objectives are required or whether existing objectives should be changed. In order to identify the highest-priority substances that may require an ambient air quality objective to protect ecosystems and public health, a rigorous, transparent and cost-effective priority-setting methodology is required. This study reviewed, analyzed and assessed successful priority-setting techniques used by other jurisdictions. It proposed an approach for setting ambient air quality objective priorities that integrates the concerns of stakeholders with Alberta Environment requirements. A literature and expert review were used to examine existing priority-setting techniques used by other jurisdictions. An analysis process was developed to identify the strengths and weaknesses of the various techniques and their ability to take into account the complete pathway between chemical emissions and damage to human health or the environment. The key strengths and weaknesses of each technique were identified. Based on the analysis, the most promising technique was the Tool for the Reduction and Assessment of Chemical and other environmental Impacts (TRACI). Several considerations for using TRACI to help set priorities for ambient air quality objectives were also presented. 26 refs., 8 tabs., 4 appendices

  1. The Distraction in Action Tool©: Feasibility and Usability in Clinical Settings.

    Science.gov (United States)

    Hanrahan, Kirsten; Kleiber, Charmaine; Miller, Ben J; Davis, Heather; McCarthy, Ann Marie

    2017-11-10

    Distraction is a relatively simple, evidence-based intervention to minimize child distress during medical procedures. Timely on-site interventions that instruct parents on distraction coaching are needed. The purpose of this study was to test the feasibility and usability of the Distraction in Action Tool© (DAT©), which 1) predicts child risk for distress with a needle stick and 2) provides individualized instructions for parents on how to be a distraction coach for their child in clinical settings. A mixed-methods descriptive design was used to test the feasibility and usability of DAT in the Emergency Department and a Phlebotomy Lab at a large Midwest Academic Medical Center. Twenty parents of children ages 4-10 years requiring venipuncture, and clinicians performing 13 of those procedures, participated. Participants completed an evaluation and took part in a brief interview. The average age of the children was 6.8 years, and 80% of parent participants were mothers. Most parents reported that the DAT was not difficult to use (84.2%) and understandable (100%), and that they had a positive experience (89.5%). Clinicians thought DAT was helpful (100%) and did not cause a meaningful delay in workflow (92%). DAT can be used by parents and clinicians to assess children's risk for procedure-related distress and learn distraction techniques to help children during needle stick procedures. DAT for parents is being disseminated via social media and an open-access website. Further research is needed to disseminate and implement DAT in community healthcare settings. Copyright © 2017. Published by Elsevier Inc.

  2. Use of an Integrated Pest Management Assessment Administered through TurningPoint as an Educational, Needs Assessment, and Evaluation Tool

    Science.gov (United States)

    Stahl, Lizabeth A. B.; Behnken, Lisa M.; Breitenbach, Fritz R.; Miller, Ryan P.; Nicolai, David; Gunsolus, Jeffrey L.

    2016-01-01

    University of Minnesota educators use an integrated pest management (IPM) survey conducted during private pesticide applicator training as an educational, needs assessment, and evaluation tool. By incorporating the IPM Assessment, as the survey is called, into a widely attended program and using TurningPoint audience response devices, Extension…

  3. Developing Sterile Insect Technique (SIT) as a tool Mosquito Control Districts can use for integrated Aedes aegypti control

    Science.gov (United States)

    New tools are clearly needed for integrated mosquito management of Ae. aegypti. We describe the sterile insect technique (SIT) that we are developing as a method to control Ae. aegypti by partnering with two prominent Florida mosquito control districts (MCD) and the FAO/IAEA Insect Pest Control Sub...

  4. Development and initial feasibility of an organizational measure of behavioral health integration in medical care settings.

    Science.gov (United States)

    McGovern, Mark P; Urada, Darren; Lambert-Harris, Chantal; Sullivan, Steven T; Mazade, Noel A

    2012-12-01

    With the advent of health care reform, models are sought to integrate behavioral health and routine medical care services. Historically, the behavioral health specialty has not itself been integrated, but instead bifurcated by substance use and mental health across treatment systems, care providers and even research. With the present opportunity to transform the health care delivery system, it is incumbent upon policymakers, researchers and clinicians to avoid repeating this historical error, and to provide integrated behavioral health services in medical contexts. An organizational measure designed to assess this capacity is described: the Dual Diagnosis Capability in Health Care Settings (DDCHCS). The DDCHCS was used to assess a sample of federally-qualified health centers (N=13) on the degree of behavioral health integration. The measure was found to be feasible and sensitive to detecting variation in integrated behavioral health services capacity. Three of the 13 agencies were dual diagnosis capable, with significant variation in DDCHCS dimensions measuring staffing, treatment practices and program milieu. In general, mental health services were more integrated than substance use services. Future research should consider a revised version of the measure, a larger and more representative sample, and linking organizational capacity with patient outcomes. Copyright © 2012. Published by Elsevier Inc.

  5. Towards the integration, annotation and association of historical microarray experiments with RNA-seq.

    Science.gov (United States)

    Chavan, Shweta S; Bauer, Michael A; Peterson, Erich A; Heuck, Christoph J; Johann, Donald J

    2013-01-01

    Transcriptome analysis by microarrays has produced important advances in biomedicine. For instance, in multiple myeloma (MM), microarray approaches led to the development of an effective disease subtyping via cluster assignment, and a 70-gene risk score. Both enabled an improved molecular understanding of MM and have provided prognostic information for the purposes of clinical management. Many researchers are now transitioning to Next Generation Sequencing (NGS) approaches, and RNA-seq in particular, due to its discovery-based nature, improved sensitivity, and dynamic range. Additionally, RNA-seq allows for the analysis of gene isoforms, splice variants, and novel gene fusions. Given the voluminous amounts of historical microarray data, there is now a need to associate and integrate microarray and RNA-seq data via advanced bioinformatic approaches. Custom software was developed following a model-view-controller (MVC) approach to integrate Affymetrix probe set IDs and gene annotation information from a variety of sources. The tool/approach employs an assortment of strategies to integrate, cross-reference, and associate microarray and RNA-seq datasets. Output from a variety of transcriptome reconstruction and quantitation tools (e.g., Cufflinks) can be directly integrated and/or associated with Affymetrix probe set data, as well as necessary gene identifiers and/or symbols from a diversity of sources. Strategies are employed to maximize the annotation and cross-referencing process. Custom gene sets (e.g., the MM 70 risk score (GEP-70)) can be specified, and the tool can be directly assimilated into an RNA-seq pipeline. This novel bioinformatic approach facilitates both the annotation and the association of historical microarray data with richer RNA-seq data, and is now assisting with the study of MM cancer biology.
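
    The core association step amounts to joining probe-level identifiers to gene-level RNA-seq output through a shared gene symbol. A minimal pandas sketch is shown below; the probe-to-symbol pairs and expression values are illustrative, not a validated annotation.

      # Join microarray probe set IDs to RNA-seq gene quantification.
      import pandas as pd

      probes = pd.DataFrame({"probeset_id": ["202431_s_at", "201292_at"],
                             "symbol": ["MYC", "TOP2A"]})
      rnaseq = pd.DataFrame({"symbol": ["MYC", "TOP2A"],
                             "fpkm": [87.2, 15.4]})     # e.g. Cufflinks output
      print(probes.merge(rnaseq, on="symbol", how="left"))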

  6. Integrating neuroinformatics tools in TheVirtualBrain

    Directory of Open Access Journals (Sweden)

    M Marmaduke Woodman

    2014-04-01

    TheVirtualBrain (TVB) is a neuroinformatics Python package representing the convergence of clinical, systems, and theoretical neuroscience in the analysis, visualization and modeling of neural and neuroimaging dynamics. TVB is composed of a flexible simulator for neural dynamics measured across scales from local populations to large-scale dynamics measured by electroencephalography (EEG), magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI), and core analytic and visualization functions, all accessible through a web browser user interface. A datatype system modeling neuroscientific data ties together these pieces with persistent data storage, based on a combination of SQL & HDF5. These datatypes combine with adapters allowing TVB to integrate other algorithms or computational systems. TVB provides infrastructure for multiple projects and multiple users, possibly participating under multiple roles. For example, a clinician might import patient data to identify several potential lesion points in the patient's connectome. A modeler, working on the same project, tests these points for viability through whole brain simulation, based on the patient's connectome, and subsequent analysis of dynamical features. TVB also drives research forward: the simulator itself represents the culmination of several simulation frameworks in the modeling literature. The availability of the numerical methods, set of neural mass models and forward solutions allows for the construction of a wide range of brain-scale simulation scenarios. This paper briefly outlines the history and motivation for TVB, describing the framework and simulator, giving usage examples in the web UI and Python scripting.

  7. Using Advanced Data Mining And Integration In Environmental Prediction Scenarios

    Directory of Open Access Journals (Sweden)

    Habala Ondrej

    2012-01-01

    We present one of the meteorological and hydrological experiments performed in the FP7 project ADMIRE. It serves as an experimental platform for hydrologists, and we have also used it as a testing platform for a suite of advanced data integration and data mining (DMI) tools developed within ADMIRE. The idea of ADMIRE is to develop an advanced DMI platform accessible even to users who are not familiar with data mining techniques. To this end, we have designed a novel DMI architecture, supported by a set of software tools, managed by DMI process descriptions written in a specialized high-level DMI language called DISPEL, and controlled via several different user interfaces, each performing a different set of tasks and targeting a different user group.

  8. APMS: An Integrated Suite of Tools for Measuring Performance and Safety

    Science.gov (United States)

    Statler, Irving C.; Lynch, Robert E.; Connors, Mary M. (Technical Monitor)

    1997-01-01

    ...statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.

  9. Catalyst synthesis and evaluation using an integrated atomic layer deposition synthesis–catalysis testing tool

    International Nuclear Information System (INIS)

    Camacho-Bunquin, Jeffrey; Shou, Heng; Marshall, Christopher L.; Aich, Payoli; Beaulieu, David R.; Klotzsch, Helmut; Bachman, Stephen; Hock, Adam; Stair, Peter

    2015-01-01

    An integrated atomic layer deposition synthesis-catalysis (I-ALD-CAT) tool was developed. It combines an ALD manifold in-line with a plug-flow reactor system for the synthesis of supported catalytic materials by ALD and immediate evaluation of catalyst reactivity using gas-phase probe reactions. The I-ALD-CAT delivery system consists of 12 different metal ALD precursor channels, 4 oxidizing or reducing agents, and 4 catalytic reaction feeds to either of the two plug-flow reactors. The system can employ reactor pressures and temperatures in the range of 10⁻³ to 1 bar and 300–1000 K, respectively. The instrument is also equipped with a gas chromatograph and a mass spectrometer unit for the detection and quantification of volatile species from ALD and catalytic reactions. In this report, we demonstrate the use of the I-ALD-CAT tool for the synthesis of platinum active sites and Al₂O₃ overcoats, and evaluation of catalyst propylene hydrogenation activity.

  10. Catalyst synthesis and evaluation using an integrated atomic layer deposition synthesis–catalysis testing tool

    Energy Technology Data Exchange (ETDEWEB)

    Camacho-Bunquin, Jeffrey; Shou, Heng; Marshall, Christopher L. [Chemical Sciences and Engineering Division, Argonne National Laboratory, Lemont, Illinois 60439 (United States); Aich, Payoli [Chemical Sciences and Engineering Division, Argonne National Laboratory, Lemont, Illinois 60439 (United States); Department of Chemical Engineering, University of Illinois at Chicago, Chicago, Illinois 60607 (United States); Beaulieu, David R.; Klotzsch, Helmut; Bachman, Stephen [Arradiance Inc., Sudbury, Massachusetts 01776 (United States); Hock, Adam [Chemical Sciences and Engineering Division, Argonne National Laboratory, Lemont, Illinois 60439 (United States); Department of Chemistry, Illinois Institute of Technology, Chicago, Illinois 60616 (United States); Stair, Peter [Chemical Sciences and Engineering Division, Argonne National Laboratory, Lemont, Illinois 60439 (United States); Department of Chemistry, Northwestern University, Evanston, Illinois 60208 (United States)

    2015-08-15

    An integrated atomic layer deposition synthesis-catalysis (I-ALD-CAT) tool was developed. It combines an ALD manifold in-line with a plug-flow reactor system for the synthesis of supported catalytic materials by ALD and immediate evaluation of catalyst reactivity using gas-phase probe reactions. The I-ALD-CAT delivery system consists of 12 different metal ALD precursor channels, 4 oxidizing or reducing agents, and 4 catalytic reaction feeds to either of the two plug-flow reactors. The system can employ reactor pressures and temperatures in the range of 10⁻³ to 1 bar and 300–1000 K, respectively. The instrument is also equipped with a gas chromatograph and a mass spectrometer unit for the detection and quantification of volatile species from ALD and catalytic reactions. In this report, we demonstrate the use of the I-ALD-CAT tool for the synthesis of platinum active sites and Al₂O₃ overcoats, and evaluation of catalyst propylene hydrogenation activity.

  11. FFI: A software tool for ecological monitoring

    Science.gov (United States)

    Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman

    2009-01-01

    A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...

  12. PULSim: User-Based Adaptable Simulation Tool for Railway Planning and Operations

    Directory of Open Access Journals (Sweden)

    Yong Cui

    2018-01-01

    Simulation methods are widely used in the field of railway planning and operations. Currently, several commercial software tools are available that not only provide functionality for railway simulation but also enable further evaluation and optimisation of the network for scheduling, dispatching, and capacity research. However, the various tools are all lacking with respect to the standards they utilise as well as their published interfaces. For an end user, the basic mechanisms and assumptions built into a simulation tool are unknown, which means that the true potential of these software tools is limited. One of the most critical issues is that users are unable to define a sophisticated workflow that integrates several rounds of simulation with adjustable parameters and settings. This paper develops and describes a user-based, customisable platform. As preconditions of the platform, the design aspects for modelling the components of a railway system and building the workflow of railway simulation are elaborated in detail. Based on the model and the workflow, an integrated simulation platform with open interfaces is developed. Users and researchers gain the ability to rapidly develop their own algorithms, supported by the tailored simulation process in a flexible manner. The productivity of using simulation tools for further evaluation and optimisation will be significantly improved through the user-adaptable open interfaces.

  13. The Integrated Waste Tracking Systems (IWTS) - A Comprehensive Waste Management Tool

    Energy Technology Data Exchange (ETDEWEB)

    Robert S. Anderson

    2005-09-01

    The US Department of Energy (DOE) Idaho National Laboratory (INL) site, located near Idaho Falls, ID, USA, has developed a comprehensive waste management and tracking tool that integrates multiple operational activities with characterization data from waste declaration through final waste disposition. The Integrated Waste Tracking System (IWTS) provides the information necessary to help facility personnel properly manage their waste and demonstrate a wide range of legal and regulatory compliance. As a client–server database system, the IWTS is a proven tracking, characterization, compliance, and reporting tool that meets the needs of both operations and management while providing a high level of flexibility. This paper describes some of the history involved with the development and current use of IWTS as a comprehensive waste management tool, as well as IWTS deployments performed by the INL for outside clients. Waste management spans a wide range of activities, including work group interactions, regulatory compliance management, reporting, procedure management, and similar activities. The IWTS documents these activities and performs tasks in a computer-automated environment. Waste characterization data, container characterization data, shipments, waste processing, disposals, reporting, and limit compliance checks are just a few of the items that IWTS documents and performs to help waste management personnel do their jobs. Throughout most hazardous and radioactive waste generating, storage and disposal sites, waste management is performed by many different groups of people in many facilities. Several organizations administer their areas of waste management using their own procedures and documentation, independent of other organizations. Files are kept, some of which are treated as quality records, others less stringently. Quality records maintain a history of changes performed after approval, the reasons for the changes, and a record of whom and when...

  14. Organizational contextual features that influence the implementation of evidence-based practices across healthcare settings: a systematic integrative review.

    Science.gov (United States)

    Li, Shelly-Anne; Jeffs, Lianne; Barwick, Melanie; Stevens, Bonnie

    2018-05-05

    Organizational contextual features have been recognized as important determinants of implementing evidence-based practices across healthcare settings for over a decade. However, implementation scientists have not reached consensus on which features are most important for implementing evidence-based practices. The aims of this review were to identify the most commonly reported organizational contextual features that influence the implementation of evidence-based practices across healthcare settings, and to describe how these features affect implementation. An integrative review was undertaken following literature searches in CINAHL, MEDLINE, PsycINFO, EMBASE, Web of Science, and Cochrane databases from January 2005 to June 2017. English-language, peer-reviewed empirical studies exploring organizational context in at least one implementation initiative within a healthcare setting were included. Quality appraisal of the included studies was performed using the Mixed Methods Appraisal Tool. Inductive content analysis informed data extraction and reduction. The search generated 5152 citations. After removing duplicates and applying eligibility criteria, 36 journal articles were included. The majority (n = 20) of the study designs were qualitative, 11 were quantitative, and 5 used a mixed-methods approach. Six main organizational contextual features (organizational culture; leadership; networks and communication; resources; evaluation, monitoring and feedback; and champions) were most commonly reported to influence implementation outcomes in the selected studies across a wide range of healthcare settings. We identified six organizational contextual features that appear to be interrelated and to work synergistically to influence the implementation of evidence-based practices within an organization. Organizational contextual features did not influence implementation efforts independently of other features. Rather, features were interrelated and often influenced each other.

  15. Using the Nine Common Themes of Good Practice checklist as a tool for evaluating the research priority setting process of a provincial research and program evaluation program.

    Science.gov (United States)

    Mador, Rebecca L; Kornas, Kathy; Simard, Anne; Haroun, Vinita

    2016-03-23

    Given the context-specific nature of health research prioritization and the obligation to effectively allocate resources to initiatives that will achieve the greatest impact, evaluation of priority setting processes can refine and strengthen such exercises and their outcomes. However, guidance is needed on evaluation tools that can be applied to research priority setting. This paper describes the adaption and application of a conceptual framework to evaluate a research priority setting exercise operating within the public health sector in Ontario, Canada. The Nine Common Themes of Good Practice checklist, described by Viergever et al. (Health Res Policy Syst 8:36, 2010) was used as the conceptual framework to evaluate the research priority setting process developed for the Locally Driven Collaborative Projects (LDCP) program in Ontario, Canada. Multiple data sources were used to inform the evaluation, including a review of selected priority setting approaches, surveys with priority setting participants, document review, and consultation with the program advisory committee. The evaluation assisted in identifying improvements to six elements of the LDCP priority setting process. The modifications were aimed at improving inclusiveness, information gathering practices, planning for project implementation, and evaluation. In addition, the findings identified that the timing of priority setting activities and level of control over the process were key factors that influenced the ability to effectively implement changes. The findings demonstrate the novel adaptation and application of the 'Nine Common Themes of Good Practice checklist' as a tool for evaluating a research priority setting exercise. The tool can guide the development of evaluation questions and enables the assessment of key constructs related to the design and delivery of a research priority setting process.

  16. Evaluating the Auto-MODS Assay, a Novel Tool for Tuberculosis Diagnosis for Use in Resource-Limited Settings

    Science.gov (United States)

    Wang, Linwei; Mohammad, Sohaib H.; Li, Qiaozhi; Rienthong, Somsak; Rienthong, Dhanida; Nedsuwan, Supalert; Mahasirimongkol, Surakameth; Yasui, Yutaka

    2014-01-01

    There is an urgent need for simple, rapid, and affordable diagnostic tests for tuberculosis (TB) to combat the great burden of the disease in developing countries. The microscopic observation drug susceptibility assay (MODS) is a promising tool to fill this need, but it is not widely used due to concerns regarding its biosafety and efficiency. This study evaluated the automated MODS (Auto-MODS), which operates on principles similar to those of MODS but with several key modifications, making it an appealing alternative to MODS in resource-limited settings. In the operational setting of Chiang Rai, Thailand, we compared the performance of Auto-MODS with the gold standard liquid culture method in Thailand, mycobacterial growth indicator tube (MGIT) 960 plus the SD Bioline TB Ag MPT64 test, in terms of accuracy and efficiency in differentiating TB and non-TB samples as well as distinguishing TB and multidrug-resistant (MDR) TB samples. Sputum samples from clinically diagnosed TB and non-TB subjects across 17 hospitals in Chiang Rai were consecutively collected from May 2011 to September 2012. A total of 360 samples were available for evaluation, of which 221 (61.4%) were positive and 139 (38.6%) were negative for mycobacterial cultures according to MGIT 960. Of the 221 true-positive samples, Auto-MODS identified 212 as positive and 9 as negative (sensitivity, 95.9%; 95% confidence interval [CI], 92.4% to 98.1%). Of the 139 true-negative samples, Auto-MODS identified 135 as negative and 4 as positive (specificity, 97.1%; 95% CI, 92.8% to 99.2%). The median time to culture positivity was 10 days, with an interquartile range of 8 to 13 days for Auto-MODS. Auto-MODS is an effective and cost-sensitive alternative diagnostic tool for TB diagnosis in resource-limited settings. PMID:25378569
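
    The headline accuracy figures can be reproduced from the counts given in the abstract. The sketch below recomputes sensitivity and specificity with exact Clopper-Pearson 95% intervals; the interval method is an assumption, since the abstract does not state how its CIs were derived.

      # Sensitivity 212/221 and specificity 135/139 with exact 95% CIs.
      from scipy.stats import beta

      def clopper_pearson(k, n, alpha=0.05):
          lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
          hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
          return lo, hi

      for label, k, n in [("sensitivity", 212, 221), ("specificity", 135, 139)]:
          lo, hi = clopper_pearson(k, n)
          print(f"{label}: {k/n:.1%} (95% CI {lo:.1%}-{hi:.1%})")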

  17. Outdoor environmental assessment of attention promoting settings for preschool children.

    Science.gov (United States)

    Mårtensson, F; Boldemann, C; Söderström, M; Blennow, M; Englund, J-E; Grahn, P

    2009-12-01

    The restorative potential of green outdoor environments for children in preschool settings was investigated by measuring the attention of children playing in settings with different environmental features. Eleven preschools with outdoor environments typical for the Stockholm area were assessed using the outdoor play environment categories (OPEC) and the fraction of visible sky from play structures (sky view factor), and 198 children, aged 4.5-6.5 years, were rated by the staff for inattentive, hyperactive and impulsive behaviors with the ECADDES tool. Children playing in large and integrated outdoor areas containing large areas of trees, shrubbery and hilly terrain showed behaviors of inattention less often. OPEC can be useful when locating and developing health-promoting land adjacent to preschools.

  18. IDMT an integrated system to manage decommissioning activities

    International Nuclear Information System (INIS)

    Marsiletti, M.; Mini, G.; Orlandi, S.

    2003-01-01

    In the frame of decommissioning activities, Ansaldo has developed a set of Integrated Decommissioning Management Tools (IDMT) addressed both to dismantling work and to the management of the wastes. The tools MIRAD and DECOM arise from the project of dismantling Italian NPPs (e.g. Caorso), as described in this paper. MIRAD is an integration between a 3-D CAD model of the NPP in as-built configuration and a computerized database (presently an MS Access application) which stores the radiological measurements detected through in-field monitoring, associated with any item present in the plant. DECOM is an integration between a 3-D CAD model of the NPP (as a minimum, for the controlled zone) in as-built configuration and a computerized database (presently an MS Access application) which stores the information associated with primary and secondary wastes produced during operation, dismantling or treatment activities. The IDMT system is currently used in the following NPPs in Italy: Caorso NPP (Mark II GE Containment BWR), Garigliano NPP (Dual Cycle GE BWR) and Trino NPP (Westinghouse PWR plant). (authors)

  19. Rehabilitation-specific challenges and advantages in the integration of migrant physicians in Germany: a multiperspective qualitative interview study in rehabilitative settings.

    Science.gov (United States)

    Jansen, E; Hänel, P; Klingler, C

    2018-07-01

    In Germany, rehabilitative healthcare institutions increasingly rely on migrant physicians to meet their staffing needs. Yet until now, research on the integration of migrant physicians has focussed entirely on the acute care setting. This study is the first to address the specific advantages and challenges to integration in the field of rehabilitative medicine where a high number of migrant physicians work. From the experiences of migrant physicians and their colleagues, we provide actionable suggestions to counteract potential sources of conflict and thereby improve the integration of migrant physicians in the German workforce. We conducted a qualitative interview study. We conducted 23 interviews with a total of 26 participants occupying a variety of roles in two different rehabilitation centres (maximum variation sampling). Interviews were recorded, transcribed verbatim and parsed through thematic analysis. Our research revealed advantages and challenges to integration in three distinct areas: rehabilitative care institutions, competencies of migrant professionals and interpersonal relations. The first set of issues hinges on the work processes within rehabilitative hospitals, professional prospects there and the location of the institutions themselves. Second, migrant physicians may encounter difficulties because of limited linguistic skills and country-specific knowledge. And finally, aspects of their interactions with care teams and patients may constitute barriers to integration. Some of the factors influencing the integration of migrant physicians are the same in both rehabilitative and acute medicine, but the rehabilitative setting presents distinct advantages and challenges that are worthy of study in their own right. We outline several measures which could help overcome challenges to the integration of migrant physicians, including those associated with professional relationships. Further research is needed to develop concrete support programmes

  20. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
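
    A minimal sketch of the download/extract step using the Allen SDK's CellTypesCache is shown below; the calls follow the allensdk documentation as we recall it, and exact names or arguments may differ between SDK versions.

      # Pull cell metadata and precomputed ephys features from the ABI
      # Cell Types Database, caching locally (paths are illustrative).
      from allensdk.core.cell_types_cache import CellTypesCache

      ctc = CellTypesCache(manifest_file="cell_types/manifest.json")
      cells = ctc.get_cells(species=["Mus musculus"])   # downloads and caches
      features = ctc.get_ephys_features()               # per-cell sweep/spike features
      print(len(cells), "cells;", len(features), "feature records")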

  1. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time-consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using an object-oriented programming language, C#, and application programming interfaces, e.g. the Visualization Toolkit (VTK). The C# language served as the tool's design basis. The Accord.NET scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models from critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used for system self-updating of geometry variations of normal structures, based on physician-approved RT contours as a training dataset. The in-house design of a supervised PCA-based contour recognition method was used for automatically evaluating contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Forms Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding...
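
    The normality check described here can be approximated with ordinary PCA: fit on approved contours, then flag new contours whose reconstruction error is unusually high. The sketch below uses synthetic shape features and an assumed percentile threshold; it illustrates the idea, not the authors' exact method.

      # PCA reconstruction error as a contour abnormality score (synthetic).
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      approved = rng.normal(size=(80, 30))      # 80 approved contours x 30 features
      pca = PCA(n_components=10).fit(approved)

      def recon_error(x):
          return np.linalg.norm(x - pca.inverse_transform(pca.transform(x)), axis=1)

      threshold = np.percentile(recon_error(approved), 95)
      new_contours = rng.normal(size=(5, 30))
      new_contours[0] += 3.0                    # simulate one abnormal contour
      print(recon_error(new_contours) > threshold)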

  2. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    Science.gov (United States)

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  3. Signal Integrity Analysis of High-Speed Interconnects

    CERN Document Server

    Oltean Karlsson, A

    2007-01-01

    LHC detectors and future experiments will produce very large amounts of data that will be transferred at multi-gigabit speeds. At such data rates, signal-integrity effects become important, and traditional rules of thumb are no longer enough for the design and layout of the traces. Simulation of signal-integrity effects at board level provides a way to study and validate several scenarios before arriving at a set of optimized design rules prior to building the actual printed circuit board (PCB). This article describes some of the tools available at CERN. Two case studies are used to highlight the capabilities of these programs.

  4. Analysis Tool Web Services from the EMBL-EBI.

    Science.gov (United States)

    McWilliam, Hamish; Li, Weizhong; Uludag, Mahmut; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Cowley, Andrew Peter; Lopez, Rodrigo

    2013-07-01

    Since 2004 the European Bioinformatics Institute (EMBL-EBI) has provided access to a wide range of databases and analysis tools via Web Services interfaces. This comprises services to search across the databases available from the EMBL-EBI and to explore the network of cross-references present in the data (e.g. EB-eye), services to retrieve entry data in various data formats and to access the data in specific fields (e.g. dbfetch), and analysis tool services, for example, sequence similarity search (e.g. FASTA and NCBI BLAST), multiple sequence alignment (e.g. Clustal Omega and MUSCLE), pairwise sequence alignment and protein functional analysis (e.g. InterProScan and Phobius). The REST/SOAP Web Services (http://www.ebi.ac.uk/Tools/webservices/) interfaces to these databases and tools allow their integration into other tools, applications, web sites, pipeline processes and analytical workflows. To get users started using the Web Services, sample clients are provided covering a range of programming languages and popular Web Service tool kits, and a brief guide to Web Services technologies, including a set of tutorials, is available for those wishing to learn more and develop their own clients. Users of the Web Services are informed of improvements and updates via a range of methods.
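    As a concrete illustration of calling one of these services, the sketch below fetches a UniProtKB entry through the dbfetch REST interface using Python's requests library; the accession P12345 is only an example, and the parameter names follow the public dbfetch documentation.

    ```python
    import requests

    # Fetch a UniProtKB entry in FASTA format via the EMBL-EBI dbfetch
    # REST service; db/id/format/style follow the documented URL pattern.
    url = "https://www.ebi.ac.uk/Tools/dbfetch/dbfetch"
    params = {"db": "uniprotkb", "id": "P12345", "format": "fasta", "style": "raw"}

    resp = requests.get(url, params=params, timeout=30)
    resp.raise_for_status()
    print(resp.text)   # the FASTA record returned by the service
    ```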

  5. Structure and software tools of AIDA.

    Science.gov (United States)

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow for fast development and easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike with an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA is terminal-independent and even, to a great extent, multilingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible code from these table definitions. By separating the AIDA software into a source and a run-time version, one is able to write

  6. Integration of modern statistical tools for the analysis of climate extremes into the web-GIS “CLIMATE”

    Science.gov (United States)

    Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The frequency of occurrence and magnitude of extreme precipitation and temperature events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrence, and mitigate their effects. For this purpose, we augmented the web-GIS “CLIMATE” to include a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing, and visualization of distributed archives of spatial datasets. It is based on the combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new powerful methods of time-dependent statistics of extremes, quantile regression, and the copula approach for the detailed analysis of various climate extreme events. Specifically, the very promising copula approach allows one to obtain the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
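    To give a flavor of one of the methods mentioned, the sketch below fits a 95th-percentile quantile regression to synthetic annual precipitation maxima; it uses Python's statsmodels rather than the R package actually embedded in “CLIMATE”, and all data and parameters are illustrative.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic annual precipitation maxima with a weak upward trend plus
    # Gumbel-distributed noise (illustrative only).
    rng = np.random.default_rng(0)
    years = np.arange(1950, 2021)
    precip = 40 + 0.1 * (years - 1950) + rng.gumbel(0, 8, size=years.size)
    df = pd.DataFrame({"year": years, "precip": precip})

    model = smf.quantreg("precip ~ year", df)
    fit = model.fit(q=0.95)      # regression at the 95th percentile
    print(fit.params)            # the slope estimates the trend in extremes
    ```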

  7. An Integrated Tool for Low Thrust Optimal Control Orbit Transfers in Interplanetary Trajectories

    Science.gov (United States)

    Dargent, T.; Martinot, V.

    In recent years significant progress has been made in optimal-control orbit transfers using low-thrust electrical propulsion for interplanetary missions. The system objective is always the same: decrease the transfer duration and increase the useful satellite mass. The optimum control strategy for minimum time to orbit or minimum fuel consumption requires sophisticated mathematical tools, most of the time dedicated to a specific mission and therefore hardly reusable. To improve this situation and enable Alcatel Space to perform quick trajectory design as requested by mission analysis, we have developed a software tool, T-3D, dedicated to optimal-control orbit transfers, which integrates various initial and terminal rendezvous conditions - e.g. fixed arrival time for planet encounter - and engine thrust profiles - e.g. thrust law variation with respect to the distance to the Sun. This single and quite versatile tool allows analyses such as minimum consumption for orbit insertion around a planet from a hyperbolic trajectory, interplanetary orbit transfers, and low-thrust minimum-time multiple-revolution orbit transfers. From a mathematical point of view, the software relies on the minimum-principle formulation to find the necessary conditions of optimality. The satellite dynamics is a two-body model and relies on an equinoctial formulation of the Gauss equations. This choice has been made for numerical reasons and to solve the two-point boundary value problem more quickly. To handle the classical problem of co-state variable initialization, problems simpler than the actual one can be solved straightforwardly by the tool, and the values of the co-state variables are kept as a first guess for a more complex problem. Finally, a synthesis of test cases is presented to illustrate the capabilities of the tool, mixing examples of interplanetary missions, orbit insertions, and multiple-revolution orbit transfers.
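    The indirect (minimum-principle) approach reduces to a two-point boundary value problem, which the toy sketch below illustrates on a minimum-energy double integrator using SciPy's solve_bvp; this is a stand-in for, not a reproduction of, the T-3D formulation, and the zero co-state first guess mirrors the continuation idea described above.

    ```python
    import numpy as np
    from scipy.integrate import solve_bvp

    # Toy indirect-method TPBVP: minimum-energy double integrator with
    # control u = -lam_v from the minimum principle. State y = [x, v, lx, lv].
    def odes(t, y):
        x, v, lx, lv = y
        return np.vstack([v, -lv, np.zeros_like(lx), -lx])

    def bc(ya, yb):
        # Boundary conditions: x(0)=0, v(0)=0, x(1)=1, v(1)=0.
        return np.array([ya[0], ya[1], yb[0] - 1.0, yb[1]])

    t = np.linspace(0.0, 1.0, 50)
    y0 = np.zeros((4, t.size))          # crude zero co-state first guess
    sol = solve_bvp(odes, bc, t, y0)
    print(sol.status, sol.y[0, -1])     # status 0 = converged; x(1) ~ 1
    ```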

  8. The Virtual Physiological Human ToolKit.

    Science.gov (United States)

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  9. Integrated Reliability and Risk Analysis System (IRRAS)

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational database facility for managing the data, improved functionality, and improved algorithm performance.
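    The cut set generation step that IRRAS automates can be illustrated by a toy top-down (MOCUS-style) expansion of a small AND/OR fault tree; the tree, gate names, and minimization rule below are illustrative, not IRRAS's actual algorithm.

    ```python
    from itertools import product

    # Toy AND/OR fault tree: gates map to (type, children); anything not in
    # the dict is a basic event. Names are illustrative.
    tree = {
        "TOP": ("AND", ["G1", "G2"]),
        "G1":  ("OR",  ["E1", "E2"]),
        "G2":  ("OR",  ["E2", "E3"]),
    }

    def cut_sets(node):
        if node not in tree:                      # basic event
            return [{node}]
        kind, children = tree[node]
        child_sets = [cut_sets(c) for c in children]
        if kind == "OR":                          # union of child cut sets
            return [s for sets in child_sets for s in sets]
        # AND: cross-product union of the children's cut sets
        return [set().union(*combo) for combo in product(*child_sets)]

    def minimize(sets):
        """Drop any cut set that strictly contains another (minimality)."""
        return [s for s in sets if not any(o < s for o in sets)]

    print(minimize(cut_sets("TOP")))   # e.g. [{'E1', 'E3'}, {'E2'}]
    ```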

  10. Communication Tools for End-of-Life Decision-Making in Ambulatory Care Settings: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Oczkowski, Simon J; Chung, Han-Oh; Hanvey, Louise; Mbuagbaw, Lawrence; You, John J

    2016-01-01

    Patients with serious illness, and their families, state that better communication and decision-making with healthcare providers is a high priority to improve the quality of end-of-life care. Numerous communication tools to assist patients, family members, and clinicians in end-of-life decision-making have been published, but their effectiveness remains unclear. To determine, amongst adults in ambulatory care settings, the effect of structured communication tools for end-of-life decision-making on completion of advance care planning. We searched for relevant randomized controlled trials (RCTs) or non-randomized intervention studies in MEDLINE, EMBASE, CINAHL, ERIC, and the Cochrane Database of Randomized Controlled Trials from database inception until July 2014. Two reviewers independently screened articles for eligibility, extracted data, and assessed risk of bias. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was used to evaluate the quality of evidence for each of the primary and secondary outcomes. Sixty-seven studies, including 46 RCTs, were found. The majority evaluated communication tools in older patients (age >50) with no specific medical condition, but many specifically evaluated populations with cancer, lung, heart, neurologic, or renal disease. Most studies compared the use of communication tools against usual care, but several compared the tools to less-intensive advance care planning tools. The use of structured communication tools increased: the frequency of advance care planning discussions/discussions about advance directives (RR 2.31, 95% CI 1.25-4.26, p = 0.007, low quality evidence), the completion of advance directives (ADs) (RR 1.92, 95% CI 1.43-2.59, p < 0.001, low quality evidence), and concordance between care desired and care received by patients (RR 1.17, 95% CI 1.05-1.30, p = 0.004, low quality evidence, 2 RCTs). The use of structured communication tools may increase the frequency of discussions about and completion of advance directives, and concordance between care desired and care received.

  11. Algal Functional Annotation Tool: a web-based analysis suite to functionally interpret large gene lists using integrated annotation and expression data

    Directory of Open Access Journals (Sweden)

    Merchant Sabeeha S

    2011-07-01

    Background: Progress in genome sequencing is proceeding at an exponential pace, and several new algal genomes are becoming available every year. One of the challenges facing the community is the association of protein sequences encoded in the genomes with biological function. While most genome assembly projects generate annotations for predicted protein sequences, they are usually limited and integrate functional terms from a limited number of databases. Another challenge is the use of annotations to interpret large lists of 'interesting' genes generated by genome-scale datasets. Previously, these gene lists had to be analyzed across several independent biological databases, often on a gene-by-gene basis. In contrast, several annotation databases, such as DAVID, integrate data from multiple functional databases and reveal underlying biological themes of large gene lists. While several such databases have been constructed for animals, none is currently available for the study of algae. Due to renewed interest in algae as potential sources of biofuels and the emergence of multiple algal genome sequences, a significant need has arisen for such a database to process the growing compendiums of algal genomic data. Description: The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of

  12. Visual-Haptic Integration: Cue Weights are Varied Appropriately, to Account for Changes in Haptic Reliability Introduced by Using a Tool

    OpenAIRE

    Chie Takahashi; Simon J Watt

    2011-01-01

    Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the ‘raw’ visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision) of haptic size estimates, however, because they alter the change ...

  13. Modification site localization scoring integrated into a search engine.

    Science.gov (United States)

    Baker, Peter R; Trinidad, Jonathan C; Chalkley, Robert J

    2011-07-01

    Large proteomic data sets identifying hundreds or thousands of modified peptides are becoming increasingly common in the literature. Several methods for assessing the reliability of peptide identifications, at both the individual-peptide and data-set levels, have become established. However, tools for measuring the confidence of modification site assignments are sparse and are not often employed. A few tools for estimating phosphorylation site assignment reliabilities have been developed, but these are not integral to a search engine, so they require a particular search engine output for a second step of processing. They may also require use of a particular fragmentation method and are mostly only applicable to phosphorylation analysis, rather than post-translational modification analysis in general. In this study, we present the performance of site assignment scoring that is directly integrated into the search engine Protein Prospector, which allows site assignment reliability to be automatically reported for all modifications present in an identified peptide. It clearly indicates when a site assignment is ambiguous (and if so, between which residues), and reports an assignment score that can be translated into a reliability measure for individual site assignments.
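    The general idea behind such site-localization scores, reporting the margin between the best and second-best candidate site arrangements, can be sketched as follows; the scores and arrangement labels are hypothetical, and this is not Protein Prospector's actual scoring formula.

    ```python
    # Toy illustration: score each candidate site arrangement against the
    # spectrum, then report the margin between the top two arrangements as
    # the localization confidence (small margin => ambiguous assignment).
    def localization_margin(site_scores):
        """site_scores: {site_arrangement: search-engine score}."""
        ranked = sorted(site_scores.values(), reverse=True)
        if len(ranked) == 1:
            return float("inf")        # only one possible site: unambiguous
        return ranked[0] - ranked[1]

    scores = {"pS3": 45.2, "pT7": 44.9, "pS12": 20.1}   # hypothetical values
    print(localization_margin(scores))  # 0.3 => ambiguous between S3 and T7
    ```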

  14. The use of an integrated variable fuzzy sets in water resources management

    Science.gov (United States)

    Qiu, Qingtai; Liu, Jia; Li, Chuanzhe; Yu, Xinzhe; Wang, Yang

    2018-06-01

    Based on evaluation of the present situation of water resources and the development of water conservancy projects and the social economy, optimal allocation of regional water resources is an increasing need in water resources management, and it is also the most effective way to promote a harmonious relationship between humans and water. In view of the limitations of traditional evaluations, which always choose a single-index model for the optimal allocation of regional water resources, an integrated variable fuzzy sets model (IVFS), based on the theory of variable fuzzy sets (VFS) and system dynamics (SD), is proposed in this paper to address dynamically complex problems in regional water resources management. The model is applied to evaluate the level of optimal allocation of regional water resources in Zoucheng, China. Results show that the levels of the water resources allocation schemes range from 2.5 to 3.5, generally showing a trend toward lower levels. To achieve optimal regional management of water resources, this model provides a means of assessing water resources management, markedly improving the reliability of the assessment through the use of the eigenvector of level H.

  15. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    Energy Technology Data Exchange (ETDEWEB)

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analyses to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  16. Application of the GEM Inventory Data Capture Tools for Dynamic Vulnerability Assessment and Recovery Modelling

    Science.gov (United States)

    Verrucci, Enrica; Bevington, John; Vicini, Alessandro

    2014-05-01

    A set of open-source tools to create building exposure datasets for seismic risk assessment was developed from 2010-13 by the Inventory Data Capture Tools (IDCT) Risk Global Component of the Global Earthquake Model (GEM). The tools were designed to integrate data derived from remotely-sensed imagery and statistically-sampled in-situ field data of buildings to generate per-building and regional exposure data. A number of software tools were created to aid the development of these data, including mobile data capture tools for in-field structural assessment, and the Spatial Inventory Data Developer (SIDD) for creating "mapping schemes" - statistically-inferred distributions of building stock applied to areas of homogeneous urban land use. These tools were made publicly available in January 2014. Exemplar implementations in Europe and Central Asia during the IDCT project highlighted several potential application areas beyond the original scope of the project. These are investigated here. We describe and demonstrate how the GEM-IDCT suite can be used extensively within the framework proposed by the EC-FP7 project SENSUM (Framework to integrate Space-based and in-situ sENSing for dynamic vUlnerability and recovery Monitoring). Specifically, applications in the areas of 1) dynamic vulnerability assessment (pre-event), and 2) recovery monitoring and evaluation (post-event) are discussed. Strategies for using the IDC Tools for these purposes are discussed. The results demonstrate the benefits of using advanced technology tools for data capture, especially in a systematic fashion using the taxonomic standards set by GEM. Originally designed for seismic risk assessment, it is clear the IDCT tools have relevance for multi-hazard risk assessment. When combined with a suitable sampling framework and applied to multi-temporal recovery monitoring, data generated from the tools can reveal spatio-temporal patterns in the quality of recovery activities and resilience trends can be

  17. Case Studies in Environment Integration

    Science.gov (United States)

    1991-12-01

    such as CADRE Teamwork and Frame Technology FrameMaker, are integrated. Future plans include integrating additional software development tools into... Pictures, Sabre C, and Interleaf or FrameMaker. Cadre Technologies has announced integration agreements with Saber C and Pansophic, as well as offering access to the Interleaf and FrameMaker documentation tools. While some of the current agreements between vendors to create tool coalitions are

  18. Next-generation phage display: integrating and comparing available molecular tools to enable cost-effective high-throughput analysis.

    Directory of Open Access Journals (Sweden)

    Emmanuel Dias-Neto

    2009-12-01

    Combinatorial phage display has been used over the last 20 years in the identification of protein ligands and protein-protein interactions, uncovering relevant molecular recognition events. The rate-limiting steps of combinatorial phage display library selection are (i) the counting of transducing units and (ii) the sequencing of the encoded displayed ligands. Here, we adapted emerging genomic technologies to minimize such challenges. We gained efficiency by applying real-time PCR in tandem for rapid quantification to enable bacteria-free phage display library screening, and added next-generation sequencing of phage DNA for large-scale ligand analysis, reporting a fully integrated set of high-throughput quantitative and analytical tools. The approach is far less labor-intensive and allows rigorous quantification; for medical applications, including selections in patients, it also represents an advance for quantitative distribution analysis and ligand identification of hundreds of thousands of targeted particles from patient-derived biopsy or autopsy over a longer timeframe post library administration. Additional advantages over current methods include increased sensitivity, less variability, enhanced linearity, scalability, and accuracy at much lower cost. Sequences obtained by qPhage plus pyrosequencing were similar to a dataset produced from conventional Sanger-sequenced transducing units (TU), with no biases due to GC content, codon usage, or amino acid or peptide frequency. These tools allow phage display selection and ligand analysis at a >1,000-fold faster rate and reduce costs approximately 250-fold for generating 10^6 ligand sequences. Our analyses demonstrate that, whereas this approach correlates with traditional colony counting, it is also capable of a much larger sampling, allowing a faster, less expensive, more accurate and consistent analysis of phage enrichment. Overall, qPhage plus pyrosequencing is superior to TU counting plus Sanger sequencing.

  19. Ergonomics action research II: a framework for integrating HF into work system design.

    Science.gov (United States)

    Neumann, W P; Village, J

    2012-01-01

    This paper presents a conceptual framework that can support efforts to integrate human factors (HF) into the work system design process, where improved and cost-effective application of HF is possible. The framework advocates strategies of broad stakeholder participation, linking of performance and health goals, and process focussed change tools that can help practitioners engage in improvements to embed HF into a firm's work system design process. Recommended tools include business process mapping of the design process, implementing design criteria, using cognitive mapping to connect to managers' strategic goals, tactical use of training and adopting virtual HF (VHF) tools to support the integration effort. Consistent with organisational change research, the framework provides guidance but does not suggest a strict set of steps. This allows more adaptability for the practitioner who must navigate within a particular organisational context to secure support for embedding HF into the design process for improved operator wellbeing and system performance. There has been little scientific literature about how a practitioner might integrate HF into a company's work system design process. This paper proposes a framework for this effort by presenting a coherent conceptual framework, process tools, design tools and procedural advice that can be adapted for a target organisation.

  20. Development of Reliable and Validated Tools to Evaluate Technical Resuscitation Skills in a Pediatric Simulation Setting: Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics.

    Science.gov (United States)

    Faudeux, Camille; Tran, Antoine; Dupont, Audrey; Desmontils, Jonathan; Montaudié, Isabelle; Bréaud, Jean; Braun, Marc; Fournier, Jean-Paul; Bérard, Etienne; Berlengi, Noémie; Schweitzer, Cyril; Haas, Hervé; Caci, Hervé; Gatin, Amélie; Giovannini-Chami, Lisa

    2017-09-01

    To develop a reliable and validated tool to evaluate technical resuscitation skills in a pediatric simulation setting. Four Resuscitation and Emergency Simulation Checklist for Assessment in Pediatrics (RESCAPE) evaluation tools were created, following international guidelines: intraosseous needle insertion, bag-mask ventilation, endotracheal intubation, and cardiac massage. We applied a modified Delphi methodology to evaluate the binary rating items. Reliability was assessed by comparing the ratings of 2 observers (1 in real time and 1 after a video-recorded review). The tools were assessed for content, construct, and criterion validity, and for sensitivity to change. Inter-rater reliability, evaluated with Cohen kappa coefficients, was perfect or near-perfect (>0.8) for 92.5% of items, and each Cronbach alpha coefficient was ≥0.91. Principal component analyses showed that all 4 tools were unidimensional. Significant increases in median scores with increasing levels of medical expertise were demonstrated for RESCAPE-intraosseous needle insertion (P = .0002), RESCAPE-bag-mask ventilation (P = .0002), RESCAPE-endotracheal intubation (P = .0001), and RESCAPE-cardiac massage (P = .0037). Significantly increased median scores over time were also demonstrated during a simulation-based educational program. RESCAPE tools are reliable and validated tools for the evaluation of technical resuscitation skills in pediatric settings during simulation-based educational programs. They might also be used for medical practice performance evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. An approach for obtaining integrable Hamiltonians from Poisson-commuting polynomial families

    Science.gov (United States)

    Leyvraz, F.

    2017-07-01

    We discuss a general approach permitting the identification of a broad class of sets of Poisson-commuting Hamiltonians, which are integrable in the sense of Liouville. It is shown that all such Hamiltonians can be solved explicitly by a separation of variables ansatz. The method leads in particular to a proof that the so-called "goldfish" Hamiltonian is maximally superintegrable and leads to an elementary identification of a full set of integrals of motion. The Hamiltonians in involution with the "goldfish" Hamiltonian are also explicitly integrated. New integrable Hamiltonians are identified, among which some have the property of being isochronous, that is, all their orbits have the same period. Finally, a peculiar structure is identified in the Poisson brackets between the elementary symmetric functions and the set of Hamiltonians commuting with the "goldfish" Hamiltonian: these can be expressed as products between elementary symmetric functions and Hamiltonians. The structure displays an invariance property with respect to one element and has both a symmetry and a closure property. The meaning of this structure is not altogether clear to the author, but it turns out to be a powerful tool.

  2. Evaluating patient care communication in integrated care settings: application of a mixed method approach in cerebral palsy programs

    NARCIS (Netherlands)

    Gulmans, J.; Gulmans, J.; Vollenbroek-Hutten, Miriam Marie Rosé; van Gemert-Pijnen, Julia E.W.C.; van Harten, Willem H.

    2009-01-01

    Objective. In this study, we evaluated patient care communication in the integrated care setting of children with cerebral palsy in three Dutch regions in order to identify relevant communication gaps experienced by both parents and involved professionals. - Design. A three-step mixed method

  3. Structural Integrity Analysis considered Load Combination for the Conceptual Design of Korean HCCR TBM-set

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won; Jin, Hyung Gon; Lee, Eo Hwak; Yoon, Jae Sung; Kim, Suk Kwon [KAERI, Daejeon (Korea, Republic of); Shin, Kyu In [Gentec Tech, Daejeon (Korea, Republic of); Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    The HCCR TBM (Test Blanket Module) set consists of four TBM sub-modules, one blanket manifold (BM), a shield, and four keys, which connect the BM and the shield. It shall be installed in equatorial port No. 18 of ITER, inside the vacuum vessel directly facing the plasma, and shall be cooled by a high-temperature helium coolant. In addition, the HCCR TBM-set safety classification follows the ITER (International Thermonuclear Experimental Reactor) safety importance class (SIC) criteria and satisfies the design requirements of RCC-MRx. In this study, load combination (LC) analyses for the structural integrity of the TBM-set were carried out based on the reference, and the LC results showed that the design requirements were satisfied. The material properties of the TBM-set were taken from the reference, and RCC-MRx was used for the stress analysis. Some load combination cases gave maximum stress values higher than the design requirement; in these cases a stress breakdown analysis according to RCC-MRx was performed, and the results satisfied the design requirement.

  4. Integrating neuroinformatics tools in TheVirtualBrain.

    Science.gov (United States)

    Woodman, M Marmaduke; Pezard, Laurent; Domide, Lia; Knock, Stuart A; Sanz-Leon, Paula; Mersmann, Jochen; McIntosh, Anthony R; Jirsa, Viktor

    2014-01-01

    TheVirtualBrain (TVB) is a neuroinformatics Python package representing the convergence of clinical, systems, and theoretical neuroscience in the analysis, visualization and modeling of neural and neuroimaging dynamics. TVB is composed of a flexible simulator for neural dynamics measured across scales from local populations to large-scale dynamics measured by electroencephalography (EEG), magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI), and core analytic and visualization functions, all accessible through a web browser user interface. A datatype system modeling neuroscientific data ties together these pieces with persistent data storage, based on a combination of SQL and HDF5. These datatypes combine with adapters allowing TVB to integrate other algorithms or computational systems. TVB provides infrastructure for multiple projects and multiple users, possibly participating under multiple roles. For example, a clinician might import patient data to identify several potential lesion points in the patient's connectome. A modeler, working on the same project, tests these points for viability through whole brain simulation, based on the patient's connectome, and subsequent analysis of dynamical features. TVB also drives research forward: the simulator itself represents the culmination of several simulation frameworks in the modeling literature. The availability of the numerical methods, set of neural mass models and forward solutions allows for the construction of a wide range of brain-scale simulation scenarios. This paper briefly outlines the history and motivation for TVB, describing the framework and simulator, giving usage examples in the web UI and Python scripting.
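    A minimal scripting-interface session of the kind described above might look like the sketch below; the class names follow the public TVB Python API (tvb-library), though exact signatures may differ between versions, and the default connectome and parameters are illustrative.

    ```python
    # Minimal TVB scripting sketch (API as of TVB 2.x; treat exact names
    # as assumptions if your version differs).
    from tvb.simulator.lab import (connectivity, coupling, integrators,
                                   models, monitors, simulator)

    conn = connectivity.Connectivity.from_file()     # bundled demo connectome

    sim = simulator.Simulator(
        model=models.Generic2dOscillator(),          # neural mass model
        connectivity=conn,
        coupling=coupling.Linear(),
        integrator=integrators.HeunDeterministic(dt=0.1),
        monitors=(monitors.TemporalAverage(period=1.0),),
    )
    sim.configure()

    (time, data), = sim.run(simulation_length=250.0)  # milliseconds
    print(data.shape)   # (time, state variables, nodes, modes)
    ```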

  5. Risk assessment tools to identify women with increased risk of osteoporotic fracture. Complexity or simplicity?

    DEFF Research Database (Denmark)

    Rubin, Katrine Hass; Friis-Holmberg, Teresa; Hermann, Anne Pernille

    2013-01-01

    A huge number of risk assessment tools have been developed. Far from all have been validated in external studies, many of them lack methodological and transparent evidence, and few are integrated in national guidelines. Therefore, we performed a systematic review to provide an overview of existing valid and reliable risk assessment tools for the prediction of osteoporotic fractures. Additionally, we aimed to determine whether the performance of each tool was sufficient for practical use, and lastly to examine whether the complexity of the tools influenced their discriminative power. We searched PubMed, Embase and Cochrane databases for papers and evaluated these with respect to methodological quality using the QUADAS checklist. A total of 48 tools were identified, 20 had been externally validated, however only 6 tools had been tested more than once in a population-based setting with acceptable

  6. Anaphe - OO libraries and tools for data analysis

    International Nuclear Information System (INIS)

    Couet, O.; Ferrero-Merlino, B.; Molnar, Z.; Moscicki, J.T.; Pfeiffer, A.; Sang, M.

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, the authors have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consistent use of (AIDA compliant) Abstract Interfaces for each component, in combination with the use of shared libraries for their implementation, provides an easy integration of existing libraries into modern scripting languages, thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create 'shadow classes' for the Python language, which map the definitions of the Abstract Interfaces almost at a one-to-one level. The authors give an overview of the architecture and design choices and present the current status and future developments of the project.

  7. A knowledge transfer scheme to bridge the gap between science and practice: an integration of existing research frameworks into a tool for practice.

    Science.gov (United States)

    Verhagen, Evert; Voogt, Nelly; Bruinsma, Anja; Finch, Caroline F

    2014-04-01

    Evidence of effectiveness does not equal successful implementation. To progress the field, practical tools are needed to bridge the gap between research and practice and to truly unite effectiveness and implementation evidence. This paper describes the Knowledge Transfer Scheme integrating existing implementation research frameworks into a tool which has been developed specifically to bridge the gap between knowledge derived from research on the one side and evidence-based usable information and tools for practice on the other.

  8. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Integrated Reliability and Risk Analysis System (IRRAS) reference manual. Volume 2

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification to report generation. Version 1.0 of the IRRAS program was released in February of 1987. Since then, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 5.0 and is the subject of this Reference Manual. Version 5.0 of IRRAS provides the same capabilities as earlier versions and adds the ability to perform location transformations and seismic analysis, and provides enhancements to the user interface as well as improved algorithm performance. Additionally, version 5.0 contains new alphanumeric fault tree and event tree editors used for event tree rules, recovery rules, and end state partitioning.

  9. Toward a VPH/Physiome ToolKit.

    Science.gov (United States)

    Garny, Alan; Cooper, Jonathan; Hunter, Peter J

    2010-01-01

    The Physiome Project was officially launched in 1997 and has since brought together teams from around the world to work on the development of a computational framework for the modeling of the human body. At the European level, this effort is focused around patient-specific solutions and is known as the Virtual Physiological Human (VPH) Initiative.Such modeling is both multiscale (in space and time) and multiphysics. This, therefore, requires careful interaction and collaboration between the teams involved in the VPH/Physiome effort, if we are to produce computer models that are not only quantitative, but also integrative and predictive.In that context, several technologies and solutions are already available, developed both by groups involved in the VPH/Physiome effort, and by others. They address areas such as data handling/fusion, markup languages, model repositories, ontologies, tools (for simulation, imaging, data fitting, etc.), as well as grid, middleware, and workflow.Here, we provide an overview of resources that should be considered for inclusion in the VPH/Physiome ToolKit (i.e., the set of tools that addresses the needs and requirements of the Physiome Project and VPH Initiative) and discuss some of the challenges that we are still facing.

  10. Quality assurance tool for organ at risk delineation in radiation therapy using a parametric statistical approach.

    Science.gov (United States)

    Hui, Cheukkai B; Nourzadeh, Hamidreza; Watkins, William T; Trifiletti, Daniel M; Alonso, Clayton E; Dutta, Sunil W; Siebers, Jeffrey V

    2018-02-26

    To develop a quality assurance (QA) tool that identifies inaccurate organ at risk (OAR) delineations. The QA tool computed volumetric features from prior OAR delineation data from 73 thoracic patients to construct a reference database. All volumetric features of an OAR delineation are computed in three-dimensional space. Volumetric features of a new OAR are compared with those in the reference database to discern delineation outliers. A multicriteria outlier detection system warns users of specific delineation outliers based on combinations of deviant features. Fifteen independent experimental sets, including automatic, propagated, and clinically approved manual delineation sets, were used for verification. The verification OARs included manipulations to mimic common errors. Three experts reviewed the experimental sets to identify and classify errors, first without the QA tool, and then, one week later, with it. In the cohort of manual delineations with manual manipulations, the QA tool detected 94% of the mimicked errors. Overall, it detected 37% of the minor and 85% of the major errors. The QA tool improved reviewer error detection sensitivity from 61% to 68% for minor errors (P = 0.17), and from 78% to 87% for major errors (P = 0.02). The QA tool assists users in detecting potential delineation errors. QA tool integration into clinical procedures may reduce the frequency of inaccurate OAR delineation, and potentially improve the safety and quality of radiation treatment planning. © 2018 American Association of Physicists in Medicine.
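    One way to sketch the parametric, multicriteria outlier idea is with robust z-scores over a reference feature database, warning only when several features deviate together; the features, cutoffs, and synthetic reference values below are assumptions, not the authors' published feature set.

    ```python
    import numpy as np

    # Robust z-score of a new value against a reference feature column,
    # using median/MAD so a few bad priors don't skew the scale.
    def robust_z(feature_col, value):
        med = np.median(feature_col)
        mad = np.median(np.abs(feature_col - med)) or 1e-9
        return 0.6745 * (value - med) / mad

    def flag_delineation(reference, new_features, z_cut=3.0, min_hits=2):
        """reference: {name: array of prior values}; new_features: {name: value}.
        Warn only when at least min_hits features deviate together."""
        deviant = [k for k, v in new_features.items()
                   if abs(robust_z(reference[k], v)) > z_cut]
        return deviant if len(deviant) >= min_hits else []

    ref = {"volume_cc": np.random.normal(15, 2, 73),      # synthetic priors
           "sphericity": np.random.normal(0.7, 0.05, 73)}
    print(flag_delineation(ref, {"volume_cc": 40.0, "sphericity": 0.2}))
    ```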

  11. [Comparison of the "Trigger" tool with the minimum basic data set for detecting adverse events in general surgery].

    Science.gov (United States)

    Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P

    Surgery carries a high risk for the occurrence of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool with the Hospital National Health System registry of discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. An observational, descriptive, retrospective study of patients admitted to General Surgery of a tertiary hospital and undergoing surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS recorded for the same patients. Once the AE were identified, they were classified according to damage and to the extent to which they could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools. The Hanley and McNeil test was used to compare the two tools. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS. These differences were statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Publicado por Elsevier España, S.L.U. All rights reserved.

  12. Integration of oncology and palliative care: setting a benchmark.

    Science.gov (United States)

    Vayne-Bossert, P; Richard, E; Good, P; Sullivan, K; Hardy, J R

    2017-10-01

    Integration of oncology and palliative care (PC) should be the standard model of care for patients with advanced cancer. An expert panel developed criteria that constitute integration. This study determined whether the PC service within this Health Service, which is considered to be fully "integrated", could be benchmarked against these criteria. A survey was undertaken to determine the perceived level of integration of oncology and palliative care among all health care professionals (HCPs) within our cancer centre. An objective determination of integration was obtained from chart reviews of deceased patients. Integration was defined as >70% of all respondents answering "agree" or "strongly agree" to each indicator, and >70% of patient charts supporting each criterion. Thirty-four HCPs participated in the survey (response rate 69%). Over 90% were aware of the outpatient PC clinic, the interdisciplinary and consultation team, PC senior leadership, and the acceptance of concurrent anticancer therapy. None of the other criteria met the 70% agreement mark, but many respondents lacked the necessary knowledge to respond. The chart review included 67 patients, 92% of whom were seen by the PC team prior to death. The median time from referral to death was 103 days (range 0-1347). The level of agreement across all criteria was below our predefined definition of integration. The integration criteria relating to service delivery are medically focused and do not lend themselves to interdisciplinary review. The objective criteria can be audited and serve both as a benchmark and as a basis for improvement activities.

  13. Set-Pi: Set Membership pi-Calculus

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Mödersheim, Sebastian Alexander; Nielson, Flemming

    2015-01-01

    Communication protocols often rely on stateful mechanisms to ensure certain security properties. For example, counters and timestamps can be used to ensure authentication, or the security of communication can depend on whether a particular key is registered to a server or has been revoked. ProVerif, like other state-of-the-art tools for protocol analysis, achieves good performance by converting a formal protocol specification into a set of Horn clauses that represent a monotonically growing set of facts that a Dolev-Yao attacker can derive from the system. Since this set of facts is not state... We demonstrate our method with three examples: a simple authentication protocol based on counters, a key registration protocol, and a model of the Yubikey security device.

  14. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of fault-tolerant system architectures; it is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  15. Approaches, tools and methods used for setting priorities in health research in the 21(st) century.

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-06-01

    Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), the James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (...). A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure - such as the CHNRI method, the James Lind Alliance method and the Combined Approach Matrix - it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more

  16. Approaches, tools and methods used for setting priorities in health research in the 21st century

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-01-01

    Background: Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. Methods: To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. Results: A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), the James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (...). A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. Conclusion: The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure - such as the CHNRI method, the James Lind Alliance method and the Combined Approach Matrix - it is likely that the Delphi method and non-replicable consultation processes will gradually be

  17. Setting value optimization method in integration for relay protection based on improved quantum particle swarm optimization algorithm

    Science.gov (United States)

    Yang, Guo Sheng; Wang, Xiao Yang; Li, Xue Dong

    2018-03-01

    With the establishment of the integrated model of relay protection and the expanding scale of the power system, the global setting and optimization of relay protection is an extremely difficult task. This paper presents an application of an improved quantum particle swarm optimization algorithm to the global optimization of relay protection settings, taking inverse-time overcurrent protection as an example. Reliability, selectivity, speed of action, and flexibility of the relay protection are selected as the four requirements for establishing the optimization targets, and the protection setting values of the whole system are optimized. Finally, for an actual power system, the optimized setting values produced by the proposed method are compared with those of the standard particle swarm algorithm. The results show that the improved quantum particle swarm optimization algorithm has strong search ability and good robustness, and is suitable for optimizing setting values in the relay protection of the whole power system.
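    A minimal quantum-behaved PSO (QPSO) loop of the general kind referenced here is sketched below on a stand-in objective; in the paper's setting the objective would score a vector of relay setting values against the four requirements above, and all parameters here are illustrative.

    ```python
    import numpy as np

    # Minimal canonical QPSO sketch; the quadratic objective stands in for a
    # relay-setting cost combining reliability, selectivity, speed, flexibility.
    rng = np.random.default_rng(1)

    def objective(x):                       # placeholder cost function
        return np.sum((x - 0.5) ** 2)

    n, dim, iters, beta = 30, 8, 200, 0.75
    x = rng.random((n, dim))
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in pbest])

    for _ in range(iters):
        gbest = pbest[np.argmin(pbest_f)]
        mbest = pbest.mean(axis=0)                   # mean best position
        phi = rng.random((n, dim))
        p = phi * pbest + (1 - phi) * gbest          # local attractor
        u = rng.uniform(1e-12, 1.0, (n, dim))        # avoid log(1/0)
        sign = np.where(rng.random((n, dim)) < 0.5, -1.0, 1.0)
        x = p + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
        f = np.array([objective(xi) for xi in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]

    print(pbest_f.min())   # approaches 0 for this toy objective
    ```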

  18. Laccase-13 Regulates Seed Setting Rate by Affecting Hydrogen Peroxide Dynamics and Mitochondrial Integrity in Rice

    Directory of Open Access Journals (Sweden)

    Yang Yu

    2017-07-01

    Full Text Available Seed setting rate is one of the most important components of rice grain yield. To date, only a few genes regulating setting rate have been identified in plants. In this study, we showed that laccase-13 (OsLAC13), a member of the laccase gene family, which is known for its role in modulating the phenylpropanoid pathway and secondary lignification of the cell wall, exerts a regulatory function on rice seed setting rate. OsLAC13 is expressed in anthers and promotes hydrogen peroxide production both in vitro and in the filaments and anther connectives. Knock-out of OsLAC13 significantly increased the seed setting rate, while overexpression of this gene induced mitochondrial damage and suppressed sugar transport in anthers, which in turn reduced the seed setting rate. OsLAC13 also induced H2O2 production and mitochondrial damage in root tip cells, causing a lethal phenotype. We also showed that high abundance of OsmiR397, the suppressor of OsLAC13 mRNA, increased the seed setting rate of rice plants and restrained H2O2 accumulation in roots during oxidative stress. Our results suggest a novel regulatory role of the OsLAC13 gene in regulating seed setting rate by affecting H2O2 dynamics and mitochondrial integrity in rice.

  19. Tools and measures for stimulating efficient energy consumption. Integrated resource planning in Romania

    International Nuclear Information System (INIS)

    Scripcariu, Daniela; Scripcariu, Mircea; Leca, Aureliu

    1996-01-01

    Integrated resource planning is based on analyses of energy generation and energy consumption as a whole. From this perspective, increasing energy efficiency appears to be the cheapest, most available and most cost-effective energy resource. In order to stimulate increases in the efficiency of energy consumption, additional tools and measures are necessary besides economic efficiency criteria for selecting technical solutions. The paper presents the main tools and measures needed to foster efficient energy consumption, and proposes actions meant to stimulate DSM (Demand-Side Management) implementation in Romania. The paper contains 5 sections. In the introduction, the main aspects of DSM are considered, namely where the programs are implemented, who is responsible, what the objectives are and, finally, how the DSM programs are implemented. The following tools in the management of energy use are examined: energy prices, regulation in the field of energy efficiency, standards and norms, energy labelling of products and energy education. Among the measures for managing energy use, the paper takes into consideration the institutions responsible for DSM, for instance the Romanian Agency for Energy Conservation (ARCE), the decentralization of decision making, program approaches and the financing of actions aimed at improving energy efficiency. Finally, the paper analyses the criteria for choosing adequate solutions to improve energy efficiency.

  20. Human factors and fuzzy set theory for safety analysis

    International Nuclear Information System (INIS)

    Nishiwaki, Y.

    1987-01-01

    Human reliability and performance are affected by many factors: medical, physiological, psychological, etc. The uncertainty involved in human factors may not necessarily be probabilistic, but fuzzy. Therefore, it is important to develop a theory by which both the non-probabilistic uncertainties, or fuzziness, of human factors and the probabilistic properties of machines can be treated consistently. In reality, randomness and fuzziness are sometimes mixed. From the mathematical point of view, probabilistic measures may be considered a special case of fuzzy measures. Therefore, fuzzy set theory seems to be an effective tool for analysing man-machine systems. The concept of 'failure possibility' based on fuzzy sets is suggested as an approach to safety analysis and fault diagnosis of a large complex system. Fuzzy measures and fuzzy integrals are introduced and their possible applications are also discussed. (author)
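
    Since fuzzy measures and fuzzy integrals are only named in the abstract, a worked example may help. The sketch below computes a discrete Sugeno integral, one common fuzzy integral, over three human factors; the fuzzy measure g, the factor grades f and their reading as an overall "failure possibility" are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a discrete Sugeno (fuzzy) integral over a finite set.
# The fuzzy measure g is a hypothetical monotone set function with
# g(empty) = 0 and g(full set) = 1; real safety analyses would elicit
# these grades from expert judgement.
factors = ["medical", "physiological", "psychological"]

g = {
    frozenset(): 0.0,
    frozenset({"medical"}): 0.3,
    frozenset({"physiological"}): 0.4,
    frozenset({"psychological"}): 0.5,
    frozenset({"medical", "physiological"}): 0.6,
    frozenset({"medical", "psychological"}): 0.7,
    frozenset({"physiological", "psychological"}): 0.8,
    frozenset(factors): 1.0,
}

# Hypothetical degradation grades f(x) in [0, 1] for each factor.
f = {"medical": 0.2, "physiological": 0.9, "psychological": 0.6}

def sugeno_integral(f, g):
    # Sort elements by decreasing grade; the integral equals
    # max_i min( f(x_(i)), g({x_(1), ..., x_(i)}) ).
    order = sorted(f, key=f.get, reverse=True)
    best, level = 0.0, set()
    for x in order:
        level.add(x)
        best = max(best, min(f[x], g[frozenset(level)]))
    return best

print(sugeno_integral(f, g))  # -> 0.6, an overall "failure possibility" grade
```

    Unlike an expectation, the Sugeno integral aggregates with min/max rather than products and sums, which is why it can combine possibilistic human-factor grades with a fuzzy measure that need not be additive.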