WorldWideScience

Sample records for program analysis framework

  1. A framework for telehealth program evaluation.

    Science.gov (United States)

    Nepal, Surya; Li, Jane; Jang-Jaccard, Julian; Alem, Leila

    2014-04-01

Evaluating telehealth programs is a challenging task, yet it is the most sensible first step when embarking on a telehealth study. How can we frame and report on telehealth studies? What are the health services elements to select based on the application needs? What are the appropriate terms to use to refer to such elements? Various frameworks have been proposed in the literature to answer these questions, and each framework is defined by a set of properties covering different aspects of telehealth systems. The most common properties include application, technology, and functionality. With the proliferation of telehealth, it is important not only to understand these properties, but also to define new properties to account for a wider range of contexts of use and evaluation outcomes. This article presents a comprehensive framework for delivery design, implementation, and evaluation of telehealth services. We first survey existing frameworks proposed in the literature and then present our proposed comprehensive multidimensional framework for telehealth. The six key dimensions of the proposed framework are health domains, health services, delivery technologies, communication infrastructure, environment setting, and socioeconomic analysis. We define a set of example properties for each dimension. We then demonstrate how we have used our framework to evaluate telehealth programs in rural and remote Australia. A few major international studies have also been mapped to demonstrate the feasibility of the framework. The key characteristics of the framework are as follows: (a) loosely coupled and hence easy to use, (b) provides a basis for describing a wide range of telehealth programs, and (c) extensible to future developments and needs.

  2. Programming Entity Framework

    CERN Document Server

    Lerman, Julia

    2010-01-01

Get a thorough introduction to ADO.NET Entity Framework 4 -- Microsoft's core framework for modeling and interacting with data in .NET applications. The second edition of this acclaimed guide provides a hands-on tour of the framework's latest version in Visual Studio 2010 and .NET Framework 4. Not only will you learn how to use EF4 in a variety of applications, you'll also gain a deep understanding of its architecture and APIs. Written by Julia Lerman, the leading independent authority on the framework, Programming Entity Framework covers it all -- from the Entity Data Model and Object Service

  3. Programming Entity Framework

    CERN Document Server

    Lerman, Julia

    2009-01-01

    Programming Entity Framework is a thorough introduction to Microsoft's new core framework for modeling and interacting with data in .NET applications. This highly-acclaimed book not only gives experienced developers a hands-on tour of the Entity Framework and explains its use in a variety of applications, it also provides a deep understanding of its architecture and APIs -- knowledge that will be extremely valuable as you shift to the Entity Framework version in .NET Framework 4.0 and Visual Studio 2010. From the Entity Data Model (EDM) and Object Services to EntityClient and the Metadata Work

  4. Evolution of a multilevel framework for health program evaluation.

    Science.gov (United States)

    Masso, Malcolm; Quinsey, Karen; Fildes, Dave

    2017-07-01

    A well-conceived evaluation framework increases understanding of a program's goals and objectives, facilitates the identification of outcomes and can be used as a planning tool during program development. Herein we describe the origins and development of an evaluation framework that recognises that implementation is influenced by the setting in which it takes place, the individuals involved and the processes by which implementation is accomplished. The framework includes an evaluation hierarchy that focuses on outcomes for consumers, providers and the care delivery system, and is structured according to six domains: program delivery, impact, sustainability, capacity building, generalisability and dissemination. These components of the evaluation framework fit into a matrix structure, and cells within the matrix are supported by relevant evaluation tools. The development of the framework has been influenced by feedback from various stakeholders, existing knowledge of the evaluators and the literature on health promotion and implementation science. Over the years, the framework has matured and is generic enough to be useful in a wide variety of circumstances, yet specific enough to focus data collection, data analysis and the presentation of findings.

  5. An evaluation framework and comparative analysis of the widely used first programming languages.

    Directory of Open Access Journals (Sweden)

    Muhammad Shoaib Farooq

Full Text Available Computer programming is the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such languages are referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue in the presence of many choices. Many efforts have been made for designing a good FPL; however, there is no adequate way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate the existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have evaluated the conformance of the widely used FPLs to the proposed framework and have computed their suitability scores.
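The customizable scoring function described in this abstract can be sketched as a weighted sum over evaluation criteria. The criteria, weights, and ratings below are hypothetical illustrations, not the rubric actually proposed in the paper:

```python
# Hypothetical sketch of a customizable FPL suitability scoring function.
# Criteria names, weights, and ratings are illustrative assumptions only,
# not the evaluation rubric from the paper.

def suitability_score(ratings, weights):
    """Weighted average of per-criterion ratings (each rating on a 0-10 scale)."""
    if set(ratings) != set(weights):
        raise ValueError("ratings and weights must cover the same criteria")
    total_weight = sum(weights.values())
    return sum(ratings[c] * weights[c] for c in ratings) / total_weight

# An instructor customizes the weights to reflect local teaching goals.
weights = {"readability": 3, "simple_syntax": 3, "tool_support": 2, "error_messages": 2}
python_ratings = {"readability": 9, "simple_syntax": 8, "tool_support": 8, "error_messages": 6}
cpp_ratings = {"readability": 5, "simple_syntax": 4, "tool_support": 9, "error_messages": 4}

print(suitability_score(python_ratings, weights))  # 7.9
print(suitability_score(cpp_ratings, weights))     # 5.3
```

A higher score indicates closer conformance to the chosen criteria; changing the weights re-ranks the candidate languages, which is the sense in which the function is "customizable".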

  6. An evaluation framework and comparative analysis of the widely used first programming languages.

    Science.gov (United States)

    Farooq, Muhammad Shoaib; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed; Abid, Adnan

    2014-01-01

Computer programming is the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such languages are referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue in the presence of many choices. Many efforts have been made for designing a good FPL; however, there is no adequate way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate the existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have evaluated the conformance of the widely used FPLs to the proposed framework and have computed their suitability scores.

  7. Framework for an Effective Assessment and Accountability Program: The Philadelphia Example

    Science.gov (United States)

    Porter, Andrew C.; Chester, Mitchell D.; Schlesinger, Michael D.

    2004-01-01

    The purpose of this article is to put in the hands of researchers, practitioners, and policy makers a powerful framework for building and studying the effects of high-quality assessment and accountability programs. The framework is illustrated through a description and analysis of the assessment and accountability program in the School District of…

  8. Conceptual framework for a Danish human biomonitoring program

    DEFF Research Database (Denmark)

    Thomsen, Marianne; Knudsen, Lisbeth E.; Vorkamp, Katrin

    2008-01-01

The aim of this paper is to present the conceptual framework for a Danish human biomonitoring (HBM) program. The EU and national science-policy interface, that is fundamental for a realization of the national and European environment and human health strategies, is discussed, including the need ... for the monitoring program, ii. Collection of human samples, iii. Analysis and data management and iv. Dissemination of results produced within the program. This paper presents the overall framework for data requirements and information flow in the integrated environment and health surveillance program. The added ... of pollution in oceans, lakes and soil as well as ground and drinking water. Human biomonitoring has only taken place in research programs and few incidences of e.g. lead contamination. However an arctic program for HBM has been in force for decades and from the preparations of the EU-pilot project on HBM ...

  9. Conceptual framework for a Danish human biomonitoring program

    Directory of Open Access Journals (Sweden)

    Fauser Patrik

    2008-01-01

    Full Text Available Abstract The aim of this paper is to present the conceptual framework for a Danish human biomonitoring (HBM program. The EU and national science-policy interface, that is fundamental for a realization of the national and European environment and human health strategies, is discussed, including the need for a structured and integrated environmental and human health surveillance program at national level. In Denmark, the initiative to implement such activities has been taken. The proposed framework of the Danish monitoring program constitutes four scientific expert groups, i.e. i. Prioritization of the strategy for the monitoring program, ii. Collection of human samples, iii. Analysis and data management and iv. Dissemination of results produced within the program. This paper presents the overall framework for data requirements and information flow in the integrated environment and health surveillance program. The added value of an HBM program, and in this respect the objectives of national and European HBM programs supporting environmental health integrated policy-decisions and human health targeted policies, are discussed. In Denmark environmental monitoring has been prioritized by extensive surveillance systems of pollution in oceans, lakes and soil as well as ground and drinking water. Human biomonitoring has only taken place in research programs and few incidences of e.g. lead contamination. However an arctic program for HBM has been in force for decades and from the preparations of the EU-pilot project on HBM increasing political interest in a Danish program has developed.

  10. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends.

    Science.gov (United States)

    Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher

    2014-01-01

The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU) computing), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by
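The Map and Reduce tasks this abstract refers to can be illustrated with a minimal in-memory word-count sketch. A real Hadoop job would distribute these same phases across cluster nodes via its Java or streaming APIs; this only shows the dataflow:

```python
# Minimal in-memory sketch of the MapReduce pattern: a word count.
# Real MapReduce runtimes (e.g. Hadoop) execute the map, shuffle, and
# reduce phases in parallel across cluster nodes with fault-tolerant
# storage; here the three phases run sequentially for illustration.
from collections import defaultdict
from functools import reduce

def map_phase(record):
    """Map: emit (key, 1) pairs for each word in one input record."""
    return [(word.lower(), 1) for word in record.split()]

def shuffle(pairs):
    """Shuffle: group intermediate values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reduce: combine all values for one key into a final result."""
    return key, reduce(lambda a, b: a + b, values)

records = ["big data", "big clinical data"]
intermediate = [pair for r in records for pair in map_phase(r)]
result = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
print(result)  # {'big': 2, 'data': 2, 'clinical': 1}
```

Because each map call touches only one record and each reduce call touches only one key's values, the runtime is free to replicate and redistribute these tasks across nodes, which is the source of the fault tolerance and throughput advantages described above.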

  11. Theoretical numerical analysis a functional analysis framework

    CERN Document Server

    Atkinson, Kendall

    2005-01-01

    This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving to them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu
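One of the topics listed in this description, iteration methods for nonlinear equations, can be shown with a short Newton-iteration sketch; the example equation is our own choice, not one taken from the book:

```python
# Illustrative sketch of one topic from the text: Newton's iteration
# x_{k+1} = x_k - f(x_k)/f'(x_k) for a nonlinear equation f(x) = 0.
# The example function below is an assumption for illustration.

def newton(f, df, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:  # stop when the update is negligible
            return x
    raise RuntimeError("Newton iteration did not converge")

# Solve x^2 - 2 = 0, i.e. approximate sqrt(2), starting from x0 = 1.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(root)  # approximately 1.41421356
```

The functional-analysis framing of the text treats such iterations as fixed-point maps, where convergence follows from contraction arguments near the root.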

  12. A Framework for Analysis of Case Studies of Reading Lessons

    Science.gov (United States)

    Carlisle, Joanne F.; Kelcey, Ben; Rosaen, Cheryl; Phelps, Geoffrey; Vereb, Anita

    2013-01-01

    This paper focuses on the development and study of a framework to provide direction and guidance for practicing teachers in using a web-based case studies program for professional development in early reading; the program is called Case Studies Reading Lessons (CSRL). The framework directs and guides teachers' analysis of reading instruction by…

  13. Framework for Developing a Multimodal Programming Interface Used on Industrial Robots

    Directory of Open Access Journals (Sweden)

    Bogdan Mocan

    2014-12-01

Full Text Available The proposed approach within this paper shifts the focus from the coordinate-based programming of an industrial robot, which currently dominates the field, to an object-based programming scheme. The general framework proposed in this paper is designed to perform natural language understanding, gesture integration and semantic analysis, which together facilitate the development of a multimodal robot programming interface that enables intuitive programming.

  14. DEFENSE PROGRAMS RISK MANAGEMENT FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Constantin PREDA

    2012-01-01

Full Text Available For the past several years, defense programs have faced delays in delivering defense capabilities and budget overruns. Stakeholders are looking for ways to improve program management and the decision-making process given the very fluid and uncertain economic and political environment. Consequently, they have increasingly resorted to risk management as the main management tool for achieving defense program objectives and for delivering, on time and within limited defense budgets, the defense capabilities strongly needed by the soldiers on the ground. Following a risk-management-based decision-making approach, the stakeholders are expected not only to protect program objectives against a wide range of risks but, at the same time, to take advantage of the opportunities to increase the likelihood of program success. The prerequisite for making risk management the main tool for achieving defense program objectives is the design and implementation of a strong risk management framework as a foundation providing an efficient and effective application of the best risk management practices. The aim of this paper is to examine the risk management framework for defense programs based on the ISO 31000:2009 standard, best risk management practices and the needs and particularities of defense programs. For the purposes of this article, the term 'defense programs' refers to joint defense programs.

  15. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

AFRL-AFOSR-JP-TR-2017-0018. Improvement of Binary Analysis Components in Automated Malware Analysis Framework. Keiji Takeda, Keio University. Final report; dates covered: 26 May 2015 to 25 Nov 2016. ... analyze malicious software (malware) with minimum human interaction. The system autonomously analyzes malware samples by analyzing the malware binary program

  16. Initial Multidisciplinary Design and Analysis Framework

    Science.gov (United States)

    Ozoroski, L. P.; Geiselhart, K. A.; Padula, S. L.; Li, W.; Olson, E. D.; Campbell, R. L.; Shields, E. W.; Berton, J. J.; Gray, J. S.; Jones, S. M.; hide

    2010-01-01

    Within the Supersonics (SUP) Project of the Fundamental Aeronautics Program (FAP), an initial multidisciplinary design & analysis framework has been developed. A set of low- and intermediate-fidelity discipline design and analysis codes were integrated within a multidisciplinary design and analysis framework and demonstrated on two challenging test cases. The first test case demonstrates an initial capability to design for low boom and performance. The second test case demonstrates rapid assessment of a well-characterized design. The current system has been shown to greatly increase the design and analysis speed and capability, and many future areas for development were identified. This work has established a state-of-the-art capability for immediate use by supersonic concept designers and systems analysts at NASA, while also providing a strong base to build upon for future releases as more multifidelity capabilities are developed and integrated.

  17. Static Analysis of Mobile Programs

    Science.gov (United States)

    2017-02-01

and not allowed, to do. The second issue was that a fully static analysis was never a realistic possibility, because Java, the programming language ... scale to large programs it had to handle essentially all of the features of Java and could also be used as a general-purpose analysis engine. The ... static analysis of imperative languages. • A framework for adding specifications about the behavior of methods, including methods that were

  18. XACC - eXtreme-scale Accelerator Programming Framework

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

Hybrid programming models for beyond-CMOS technologies will prove critical for integrating new computing technologies alongside our existing infrastructure. Unfortunately, the software infrastructure required to enable this is lacking or unavailable. XACC is a programming framework for extreme-scale, post-exascale accelerator architectures that integrates alongside existing conventional applications. It is a pluggable framework for programming languages developed for next-gen computing hardware architectures like quantum and neuromorphic computing. It lets computational scientists efficiently off-load classically intractable work to attached accelerators through user-friendly Kernel definitions. XACC makes post-exascale hybrid programming approachable for domain computational scientists.

  19. Public health program capacity for sustainability: a new framework.

    Science.gov (United States)

    Schell, Sarah F; Luke, Douglas A; Schooley, Michael W; Elliott, Michael B; Herbers, Stephanie H; Mueller, Nancy B; Bunger, Alicia C

    2013-02-01

Public health programs can only deliver benefits if they are able to sustain activities over time. There is a broad literature on program sustainability in public health, but it is fragmented and there is a lack of consensus on core constructs. The purpose of this paper is to present a new conceptual framework for program sustainability in public health. This developmental study uses a comprehensive literature review, input from an expert panel, and the results of concept mapping to identify the core domains of a conceptual framework for public health program capacity for sustainability. The concept-mapping process included three types of participants (scientists, funders, and practitioners) from several public health areas (e.g., tobacco control, heart disease and stroke, physical activity and nutrition, and injury prevention). The literature review identified 85 relevant studies focusing on program sustainability in public health. Most of the papers described empirical studies of prevention-oriented programs aimed at the community level. The concept-mapping process identified nine core domains that affect a program's capacity for sustainability: Political Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, Public Health Impacts, and Strategic Planning. Concept-mapping participants further identified 93 items across these domains that have strong face validity: 89% of the individual items composing the framework had specific support in the sustainability literature. The sustainability framework presented here suggests that a number of selected factors may be related to a program's ability to sustain its activities and benefits over time. These factors have been discussed in the literature, but this framework synthesizes and combines the factors and suggests how they may be interrelated with one another. The framework presents domains for public health decision makers to consider when developing

  20. A Framework for Collaborative Networked Learning in Higher Education: Design & Analysis

    Directory of Open Access Journals (Sweden)

    Ghassan F. Issa

    2014-06-01

Full Text Available This paper presents a comprehensive framework for building collaborative learning networks within higher educational institutions. This framework focuses on systems design and implementation issues in addition to a complete set of evaluation and analysis tools. The objective of this project is to improve the standards of higher education in Jordan through the implementation of transparent, collaborative, innovative, and modern quality educational programs. The framework highlights the major steps required to plan, design, and implement collaborative learning systems. Several issues are discussed, such as unification of courses and programs of study, using an appropriate learning management system, software design and development using Agile methodology, infrastructure design, access issues, proprietary data storage, and social network analysis (SNA) techniques.

  1. 76 FR 38602 - Bovine Tuberculosis and Brucellosis; Program Framework

    Science.gov (United States)

    2011-07-01

    ...] Bovine Tuberculosis and Brucellosis; Program Framework AGENCY: Animal and Plant Health Inspection Service... framework being developed for the bovine tuberculosis and brucellosis programs in the United States. This... proposed revisions to its programs regarding bovine tuberculosis (TB) and bovine brucellosis in the United...

  2. Utilizing the Theoretical Framework of Collective Identity to Understand Processes in Youth Programs

    Science.gov (United States)

    Futch, Valerie A.

    2016-01-01

    This article explores collective identity as a useful theoretical framework for understanding social and developmental processes that occur in youth programs. Through narrative analysis of past participant interviews (n = 21) from an after-school theater program, known as "The SOURCE", it was found that participants very clearly describe…

  3. Static Program Analysis for Reliable, Trusted Apps

    Science.gov (United States)

    2017-02-01

and prevent errors in their Java programs. The Checker Framework includes compiler plug-ins ("checkers") that find bugs or verify their absence. It ... versions of the Java language. ... The dataflow framework enables more accurate analysis of source code. (Despite their similar names, the dataflow framework is independent of the (Information) Flow Checker of chapter 2.) In Java code, a given operation may be permitted or

  4. X-framework: Space system failure analysis framework

    Science.gov (United States)

    Newman, John Steven

Space program and space systems failures result in financial losses in the multi-hundred million dollar range every year. In addition to financial loss, space system failures may also represent the loss of opportunity, loss of critical scientific, commercial and/or national defense capabilities, as well as loss of public confidence. The need exists to improve learning and expand the scope of lessons documented and offered to the space industry project team. One of the barriers to incorporating lessons learned includes the way in which space system failures are documented. Multiple classes of space system failure information are identified, ranging from "sound bite" summaries in space insurance compendia, to articles in journals, lengthy data-oriented (what happened) reports, and in some rare cases, reports that treat not only the what, but also the why. In addition there are periodically published "corporate crisis" reports, typically issued after multiple or highly visible failures, that explore management roles in the failure, often within a politically oriented context. Given the general lack of consistency, it is clear that a good multi-level space system/program failure framework with analytical and predictive capability is needed. This research effort set out to develop such a model. The X-Framework (x-fw) is proposed as an innovative forensic failure analysis approach, providing a multi-level understanding of the space system failure event beginning with the proximate cause, extending to the directly related work or operational processes and upward through successive management layers. The x-fw focus is on capability and control at the process level and examines: (1) management accountability and control, (2) resource and requirement allocation, and (3) planning, analysis, and risk management at each level of management. The x-fw model provides an innovative failure analysis approach for acquiring a multi-level perspective, direct and indirect causation of

  5. VisRseq: R-based visual framework for analysis of sequencing data.

    Science.gov (United States)

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2015-01-01

Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise to use, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.

  6. TomoPy: a framework for the analysis of synchrotron tomographic data

    International Nuclear Information System (INIS)

    Gürsoy, Doǧa; De Carlo, Francesco; Xiao, Xianghui; Jacobsen, Chris

    2014-01-01

    A collaborative framework for the analysis of synchrotron tomographic data which has the potential to unify the effort of different facilities and beamlines performing similar tasks is described. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports functional programming that many researchers prefer. Analysis of tomographic datasets at synchrotron light sources (including X-ray transmission tomography, X-ray fluorescence microscopy and X-ray diffraction tomography) is becoming progressively more challenging due to the increasing data acquisition rates that new technologies in X-ray sources and detectors enable. The next generation of synchrotron facilities that are currently under design or construction throughout the world will provide diffraction-limited X-ray sources and are expected to boost the current data rates by several orders of magnitude, stressing the need for the development and integration of efficient analysis tools. Here an attempt to provide a collaborative framework for the analysis of synchrotron tomographic data that has the potential to unify the effort of different facilities and beamlines performing similar tasks is described in detail. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports procedural programming that many researchers prefer. This collaborative platform could affect all major synchrotron facilities where new effort is now dedicated to developing new tools that can be deployed at the facility for real-time processing, as well as distributed to users for off-site data processing

  7. TomoPy: a framework for the analysis of synchrotron tomographic data

    Energy Technology Data Exchange (ETDEWEB)

    Gürsoy, Doǧa, E-mail: dgursoy@aps.anl.gov; De Carlo, Francesco; Xiao, Xianghui; Jacobsen, Chris [Advanced Photon Source, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439-4837 (United States)

    2014-08-01

    A collaborative framework for the analysis of synchrotron tomographic data which has the potential to unify the effort of different facilities and beamlines performing similar tasks is described. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports functional programming that many researchers prefer. Analysis of tomographic datasets at synchrotron light sources (including X-ray transmission tomography, X-ray fluorescence microscopy and X-ray diffraction tomography) is becoming progressively more challenging due to the increasing data acquisition rates that new technologies in X-ray sources and detectors enable. The next generation of synchrotron facilities that are currently under design or construction throughout the world will provide diffraction-limited X-ray sources and are expected to boost the current data rates by several orders of magnitude, stressing the need for the development and integration of efficient analysis tools. Here an attempt to provide a collaborative framework for the analysis of synchrotron tomographic data that has the potential to unify the effort of different facilities and beamlines performing similar tasks is described in detail. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports procedural programming that many researchers prefer. This collaborative platform could affect all major synchrotron facilities where new effort is now dedicated to developing new tools that can be deployed at the facility for real-time processing, as well as distributed to users for off-site data processing.

  8. A Hybrid Programming Framework for Modeling and Solving Constraint Satisfaction and Optimization Problems

    OpenAIRE

    Sitek, Paweł; Wikarek, Jarosław

    2016-01-01

This paper proposes a hybrid programming framework for modeling and solving of constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with the original method of problem transformation, used in the framework as a presolving method. The transformation substantially reduces the feasible solution space. The framework automatically generates CSP and COP models based on current values of data instances, questions asked by a user, and the set of predicates and facts of the problem being modeled, which altogether constitute a knowledge database for the given problem.

  9. Planning for Program Design and Assessment Using Value Creation Frameworks

    Science.gov (United States)

    Whisler, Laurel; Anderson, Rachel; Brown, Jenai

    2017-01-01

    This article explains a program design and planning process using the Value Creation Framework (VCF) developed by Wenger, Trayner, and de Laat (2011). The framework involves identifying types of value or benefit for those involved in the program, conditions and activities that support creation of that value, data that measure whether the value was…

  10. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    International Nuclear Information System (INIS)

    Hartwig, Zachary S.

    2016-01-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms. - Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  11. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    Energy Technology Data Exchange (ETDEWEB)

    Hartwig, Zachary S., E-mail: hartwig@mit.edu

    2016-04-11

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms. - Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  12. The US federal framework for research on endocrine disrupters and an analysis of research programs supported during fiscal year 1996

    Science.gov (United States)

    Reiter, L.W.; DeRosa, C.; Kavlock, R.J.; Lucier, G.; Mac, M.J.; Melillo, J.; Melnick, R.L.; Sinks, T.; Walton, B.T.

    1998-01-01

The potential health and ecological effects of endocrine disrupting chemicals have become a high-visibility environmental issue. The 1990s have witnessed a growing concern, both on the part of the scientific community and the public, that environmental chemicals may be causing widespread effects in humans and in a variety of fish and wildlife species. This growing concern led the Committee on the Environment and Natural Resources (CENR) of the National Science and Technology Council to identify the endocrine disrupter issue as a major research initiative in early 1995 and subsequently establish an ad hoc Working Group on Endocrine Disrupters. The objectives of the working group are to 1) develop a planning framework for federal research related to human and ecological health effects of endocrine disrupting chemicals; 2) conduct an inventory of ongoing federal research programs; and 3) identify research gaps and develop a coordinated interagency plan to address priority research needs. This communication summarizes the activities of the federal government in defining a common framework for planning an endocrine disrupter research program and in assessing the status of the current effort. After developing the research framework and compiling an inventory of active research projects supported by the federal government in fiscal year 1996, the CENR working group evaluated the current federal effort by comparing the ongoing activities with the research needs identified in the framework. The analysis showed that the federal government supports considerable research on human health effects, ecological effects, and exposure assessment, with a predominance of activity occurring under human health effects. The analysis also indicates that studies on reproductive development and carcinogenesis are more prevalent than studies on neurotoxicity and immunotoxicity, that mammals (mostly laboratory animals) are the main species under study, and that chlorinated dibenzodioxins and

  13. VisRseq: R-based visual framework for analysis of sequencing data

    OpenAIRE

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven JM

    2015-01-01

Background Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise to use, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. Results We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for

  14. A framework for monitoring social process and outcomes in environmental programs.

    Science.gov (United States)

    Chapman, Sarah

    2014-12-01

    When environmental programs frame their activities as being in the service of human wellbeing, social variables need to be integrated into monitoring and evaluation (M&E) frameworks. This article draws upon ecosystem services theory to develop a framework to guide the M&E of collaborative environmental programs with anticipated social benefits. The framework has six components: program need, program activities, pathway process variables, moderating process variables, outcomes, and program value. Needs are defined in terms of ecosystem services, as well as other human needs that must be addressed to achieve outcomes. The pathway variable relates to the development of natural resource governance capacity in the target community. Moderating processes can be externalities such as the inherent capacity of the natural system to service ecosystem needs, local demand for natural resources, policy or socio-economic drivers. Internal program-specific processes relate to program service delivery, targeting and participant responsiveness. Ecological outcomes are expressed in terms of changes in landscape structure and function, which in turn influence ecosystem service provision. Social benefits derived from the program are expressed in terms of the value of the eco-social service to user-specified goals. The article provides suggestions from the literature for identifying indicators and measures for components and component variables, and concludes with an example of how the framework was used to inform the M&E of an adaptive co-management program in western Kenya. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Systems theory as a framework for examining a college campus-based support program for former foster youth.

    Science.gov (United States)

    Schelbe, Lisa; Randolph, Karen A; Yelick, Anna; Cheatham, Leah P; Groton, Danielle B

    2018-01-01

Increased attention to former foster youth pursuing post-secondary education has resulted in the creation of college campus-based support programs to address their needs. However, limited empirical evidence and theoretical knowledge exist about these programs. This study seeks to describe the application of systems theory as a framework for examining a college campus-based support program for former foster youth. In-depth semi-structured interviews were conducted with 32 program stakeholders, including students, mentors, collaborative members, and independent living program staff. Using qualitative data analysis software, holistic coding techniques were employed to analyze interview transcripts. Data were then analyzed by applying principles of the extended case method through the lens of systems theory. Findings suggest systems theory serves as a framework for understanding the functioning of a college campus-based support program. The theory's concepts help delineate program components and roles of stakeholders; outline boundaries between and interactions among stakeholders; and identify program strengths and weaknesses. Systems theory plays an important role in identifying intervention components and providing a structure through which to identify and understand program elements as part of the planning process. This study highlights the utility of systems theory as a framework for program planning and evaluation.

  16. Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, Douglas B.; Anderson, Dave M.; Belzer, David B.; Cort, Katherine A.; Dirks, James A.; Hostick, Donna J.

    2004-06-18

    The requirements of the Government Performance and Results Act (GPRA) of 1993 mandate the reporting of outcomes expected to result from programs of the Federal government. The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official metrics for its 11 major programs using its Office of Planning, Budget Formulation, and Analysis (OPBFA). OPBFA conducts an annual integrated modeling analysis to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. Two of EERE’s major programs include the Building Technologies Program (BT) and Office of Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports the OPBFA effort by developing the program characterizations and other market information affecting these programs that is necessary to provide input to the EERE integrated modeling analysis. Throughout the report we refer to these programs as “buildings-related” programs, because the approach is not limited in application to BT or WIP. To adequately support OPBFA in the development of official GPRA metrics, PNNL communicates with the various activities and projects in BT and WIP to determine how best to characterize their activities planned for the upcoming budget request. PNNL then analyzes these projects to determine what the results of the characterizations would imply for energy markets, technology markets, and consumer behavior. This is accomplished by developing nonintegrated estimates of energy, environmental, and financial benefits (i.e., outcomes) of the technologies and practices expected to result from the budget request. These characterizations and nonintegrated modeling results are provided to OPBFA as inputs to the official benefits estimates developed for the Federal Budget. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits

  17. A Typology Framework of Loyalty Reward Programs

    Science.gov (United States)

    Cao, Yuheng; Nsakanda, Aaron Luntala; Mann, Inder Jit Singh

Loyalty reward programs (LRPs), initially developed as marketing programs to enhance customer retention, have now become an important part of customer-focused business strategy. With the proliferation and increasing economic impact of these programs, their management complexity has also increased. However, despite the widespread adoption of LRPs in business, academic research in the field seems to lag behind practical application. Even fundamental questions such as what LRPs are and how to classify them have not yet been fully addressed. In this paper, a comprehensive framework for LRP classification is proposed, which provides a foundation for further study of LRP design and planning issues.

  18. The 7th Framework Program of the EU

    International Nuclear Information System (INIS)

    Gonzalez, E. M.; Serrano, J. A.

    2007-01-01

The Framework Program is the principal community initiative for fostering and supporting R and D in the European Union. Its main goal is to improve competitiveness, fundamentally by financing research, technological development, demonstration and innovation activities through transnational collaboration between research institutes and firms belonging both to European Union countries and to states affiliated as third countries. In addition, it provides financial support for the enhancement and coordination of European research infrastructures, the promotion and training of research personnel, basic research and, particularly as of the current 7th Framework Program, the coordination of national R and D programs and the implementation of European technology platforms (PTEs), which have been conceived to promote strategic research agendas in key sectors with the cooperation of all the involved players. In the wake of the PTEs, different platforms have been implemented at the national level which are very active in different sectors. (Authors)

  19. A framework for analysis of sentinel events in medical student education.

    Science.gov (United States)

    Cohen, Daniel M; Clinchot, Daniel M; Werman, Howard A

    2013-11-01

    Although previous studies have addressed student factors contributing to dismissal or withdrawal from medical school for academic reasons, little information is available regarding institutional factors that may hinder student progress. The authors describe the development and application of a framework for sentinel event (SE) root cause analysis to evaluate cases in which students are dismissed or withdraw because of failure to progress in the medical school curriculum. The SE in medical student education (MSE) framework was piloted at the Ohio State University College of Medicine (OSUCOM) during 2010-2012. Faculty presented cases using the framework during academic oversight committee discussions. Nine SEs in MSE were presented using the framework. Major institution-level findings included the need for improved communication, documentation of cognitive and noncognitive (e.g., mental health) issues, clarification of requirements for remediation and fitness for duty, and additional psychological services. Challenges related to alternative and combined programs were identified as well. The OSUCOM undertook system changes based on the action plans developed through the discussions of these SEs. An SE analysis process appears to be a useful method for making system changes in response to institutional issues identified in evaluation of cases in which students fail to progress in the medical school curriculum. The authors plan to continue to refine the SE in MSE framework and analysis process. Next steps include assessing whether analysis using this framework yields improved student outcomes with universal applications for other institutions.

  20. A flexible framework for secure and efficient program obfuscation.

    Energy Technology Data Exchange (ETDEWEB)

    Solis, John Hector

    2013-03-01

In this paper, we present a modular framework for constructing a secure and efficient program obfuscation scheme. Our approach, inspired by the obfuscation-with-respect-to-oracle-machines model of [4], retains an interactive online protocol with an oracle, but relaxes the original computational and storage restrictions. We argue this is reasonable given the computational resources of modern personal devices. Furthermore, we relax the information-theoretic security requirement to computational security in order to utilize established cryptographic primitives. With this additional flexibility we are free to explore different cryptographic building blocks. Our approach combines authenticated encryption with private information retrieval to construct a secure program obfuscation framework. We give a formal specification of our framework, based on desired functionality and security properties, and provide an example instantiation. In particular, we implement AES in Galois/Counter Mode for authenticated encryption and the Gentry-Ramzan [13] constant-communication-rate private information retrieval scheme. We present our implementation results and show that non-trivially sized programs can be realized, but scalability is quickly limited by computational overhead. Finally, we include a discussion of security considerations when instantiating specific modules.

  1. A quality framework for addiction treatment programs

    NARCIS (Netherlands)

    Nabitz, Udo; van den Brink, Wim; Walburg, Jan

    2005-01-01

    AIM: To identify and specify the structure and the elements of a quality framework for addiction treatment programs. METHOD: Concept mapping strategy was applied. In brainstorm sessions, 70 statements were generated and rated by 90 representatives of three stakeholder groups. Using multivariate

  2. Event Reconstruction and Analysis in the R3BRoot Framework

    International Nuclear Information System (INIS)

    Kresan, Dmytro; Al-Turany, Mohammad; Bertini, Denis; Karabowicz, Radoslaw; Manafov, Anar; Rybalchenko, Alexey; Uhlig, Florian

    2014-01-01

The R3B experiment (Reaction studies with Relativistic Radioactive Beams) will be built within the future FAIR/GSI (Facility for Antiproton and Ion Research) in Darmstadt, Germany. The international R3B collaboration has a scientific program devoted to the physics of stable and radioactive beams at energies between 150 MeV and 1.5 GeV per nucleon. In preparation for the experiment, the R3BRoot software framework is under development; it delivers detector simulation, reconstruction and data analysis. The basic functionalities of the framework are handled by the FairRoot framework, which is also used by the other FAIR experiments (CBM, PANDA, ASYEOS, etc.), while the R3B detector specifics and reconstruction code are implemented inside R3BRoot. In this contribution, first results of data analysis from the detector prototype test in November 2012 are reported; moreover, a comparison of the tracker performance against experimental data is presented.

  3. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by a local network form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters, connected together in a multi-level hierarchy, and then coordinated over the internet. The software framework supporting this multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to perform the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing the simulation of multi-scale structural analysis.
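
The two-level decomposition described above can be sketched as follows. This is a hypothetical illustration (component names and loads are invented), with a thread pool standing in for the cluster-to-cluster grid.

```python
# Sketch of the paper's decomposition: a simplified global model produces
# loads, which are farmed out to detailed component analyses in parallel.
from concurrent.futures import ThreadPoolExecutor

def analyze_component(name, load):
    # Hypothetical detailed component analysis: return peak response for a load
    return name, load * 1.5

# Loads per component, as a global model might produce them (invented values)
global_loads = {"beam": 10.0, "column": 20.0, "joint": 5.0}

with ThreadPoolExecutor(max_workers=3) as pool:
    results = dict(pool.map(lambda kv: analyze_component(*kv),
                            global_loads.items()))
```

In the actual framework each worker would be a cluster running a full component simulation; the coordination pattern is the same.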

  4. A Decision Support Framework for Feasibility Analysis of International Space Station (ISS) Research Capability Enhancing Options

    Science.gov (United States)

Ortiz, James N.; Scott, Kelly; Smith, Harold

    2004-01-01

The assembly and operation of the ISS has generated significant challenges that have ultimately impacted resources available to the program's primary mission: research. To address this, program personnel routinely perform trade-off studies on alternative options to enhance research. The approach, content, level of analysis, and resulting outputs of these studies vary due to many factors, however, complicating the Program Manager's job of selecting the best option. To address this, the program requested that a framework be developed to evaluate multiple research-enhancing options in a thorough, disciplined and repeatable manner, and to identify the best option on the basis of cost, benefit and risk. The resulting framework consists of a systematic methodology and a decision-support toolset. It provides a quantifiable and repeatable means for ranking research-enhancing options in the complex, multiple-constraint domain of the space research laboratory. This paper describes the development, verification and validation of this framework and provides observations on its operational use.

  5. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

This paper shows how a systematic approach to software testing using the static code analysis method can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics that are based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe it in terms of C++, as the Qt platform is currently being used. One of the most important metrics is the so-called software complexity. Applying the software complexity calculation, using both the McCabe and Halstead methods, to the BCI framework, which covers two important types of BCI, SSVEP and P300, we found that there are two classes in the framework which are very complex and prone to violating the cohesion principle in OOP. The other metrics fit the criteria of the proposed framework aspects, such as: MPC is less than 20; average complexity is around 5; and the maximum depth is below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
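
As a rough illustration of the McCabe metric the study applies, the sketch below counts decision points in Python source using the standard-library ast module; the thresholds and the C++/Qt specifics of the BCI framework are not reproduced here.

```python
# Simplified McCabe cyclomatic complexity: M = 1 + number of decision points.
import ast

DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler,
             ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    # Each branch-inducing node adds one independent path through the code
    return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(tree))

sample = """
def classify(x):
    if x > 0:
        return "pos"
    elif x < 0:
        return "neg"
    return "zero"
"""
```

Here the `if`/`elif` pair contributes two decision points, so the function scores 3; straight-line code scores 1.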

  6. Mississippi Curriculum Framework for Welding and Cutting Programs (Program CIP: 48.0508--Welder/Welding Technologist). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the welding and cutting programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and…

  7. A Comparative Analysis of Competency Frameworks for Youth Workers in the Out-of-School Time Field

    OpenAIRE

    Vance, Femi

    2010-01-01

    Research suggests that the quality of out-of-school time (OST) programs is related to positive youth outcomes and skilled staff are a critical component of high quality programming. This descriptive case study of competency frameworks for youth workers in the OST field demonstrates how experts and practitioners characterize a skilled youth worker. A comparative analysis of 11 competency frameworks is conducted to identify a set of common core competencies. A set of 12 competency areas that ar...

  8. Imperial College near infrared spectroscopy neuroimaging analysis framework.

    Science.gov (United States)

    Orihuela-Espina, Felipe; Leff, Daniel R; James, David R C; Darzi, Ara W; Yang, Guang-Zhong

    2018-01-01

    This paper describes the Imperial College near infrared spectroscopy neuroimaging analysis (ICNNA) software tool for functional near infrared spectroscopy neuroimaging data. ICNNA is a MATLAB-based object-oriented framework encompassing an application programming interface and a graphical user interface. ICNNA incorporates reconstruction based on the modified Beer-Lambert law and basic processing and data validation capabilities. Emphasis is placed on the full experiment rather than individual neuroimages as the central element of analysis. The software offers three types of analyses including classical statistical methods based on comparison of changes in relative concentrations of hemoglobin between the task and baseline periods, graph theory-based metrics of connectivity and, distinctively, an analysis approach based on manifold embedding. This paper presents the different capabilities of ICNNA in its current version.
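
The modified Beer-Lambert law step that ICNNA's reconstruction is based on can be illustrated with a two-wavelength, two-chromophore sketch; the extinction coefficients, source-detector distance, and differential pathlength factor below are placeholder values, not calibrated constants from the tool.

```python
# Modified Beer-Lambert law: attenuation change dA at wavelength i satisfies
#   dA[i] = (eps[i][0]*dHbO + eps[i][1]*dHbR) * d * DPF
# so concentration changes follow by solving a 2x2 linear system.
def mbll(dA, eps, d=3.0, dpf=6.0):
    (a, b), (c, e) = eps          # extinction coefficients per wavelength
    L = d * dpf                   # effective optical pathlength
    det = a * e - b * c
    dHbO = (e * dA[0] - b * dA[1]) / (det * L)
    dHbR = (a * dA[1] - c * dA[0]) / (det * L)
    return dHbO, dHbR

# Placeholder coefficients and attenuation changes, chosen for illustration
dHbO, dHbR = mbll((0.36, 0.63), ((1.0, 2.0), (3.0, 1.0)))
```

With these invented numbers the solver recovers dHbO = 0.01 and dHbR = 0.005, the values used to construct the attenuation inputs.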

  9. A Framework for Sentiment Analysis Implementation of Indonesian Language Tweet on Twitter

    Science.gov (United States)

    Asniar; Aditya, B. R.

    2017-01-01

Sentiment analysis is the process of understanding, extracting, and processing textual data automatically to obtain information. It can be used to gauge opinion on an issue and to identify responses to an event or policy. Millions of digital records remain unexploited as a source of useful information, especially for government. In government, sentiment analysis can be used to monitor work programs, such as those of the Government of Bandung City, through social media data. The analysis can be used quickly as a tool to see the public response to the work programs, so that the next strategic steps can be taken. This paper adopts the Support Vector Machine as a supervised algorithm for sentiment analysis. It presents a framework for sentiment analysis of Indonesian-language tweets on Twitter applied to the work programs of the Government of Bandung City. The results can serve as a reference for decision making in local government.
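
The pipeline the abstract describes (features extracted from tweet text feeding a supervised classifier) can be sketched as follows. The paper uses a Support Vector Machine; a simple perceptron over bag-of-words counts stands in here so the example needs only the standard library, and the tiny training set is invented for illustration.

```python
# Toy supervised sentiment pipeline: bag-of-words features + linear classifier.
from collections import defaultdict

def featurize(text):
    feats = defaultdict(int)
    for token in text.lower().split():
        feats[token] += 1          # raw token counts as features
    return feats

def train_perceptron(data, epochs=10):
    w = defaultdict(float)
    for _ in range(epochs):
        for text, label in data:   # label: +1 positive, -1 negative
            feats = featurize(text)
            score = sum(w[t] * c for t, c in feats.items())
            if score * label <= 0: # misclassified: nudge weights toward label
                for t, c in feats.items():
                    w[t] += label * c
    return w

def predict(w, text):
    score = sum(w[t] * c for t, c in featurize(text).items())
    return 1 if score > 0 else -1

# Invented Indonesian-language training snippets (bagus = good, buruk = bad)
train = [("program bagus sekali", 1), ("pelayanan buruk", -1),
         ("kerja bagus", 1), ("jalan buruk macet", -1)]
w = train_perceptron(train)
```

An SVM would replace the perceptron update with a margin-maximizing objective; the featurization and prediction interfaces stay the same.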

  10. A Hybrid Programming Framework for Modeling and Solving Constraint Satisfaction and Optimization Problems

    Directory of Open Access Journals (Sweden)

    Paweł Sitek

    2016-01-01

This paper proposes a hybrid programming framework for modeling and solving of constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with the original method of problem transformation, used in the framework as a presolving method. The transformation substantially reduces the feasible solution space. The framework automatically generates CSP and COP models based on current values of data instances, questions asked by a user, and set of predicates and facts of the problem being modeled, which altogether constitute a knowledge database for the given problem. This dynamic generation of dedicated models, based on the knowledge base, together with the parameters changing externally, for example, the user’s questions, is the implementation of the autonomous search concept. The models are solved using the internal or external solvers integrated with the framework. The architecture of the framework as well as its implementation outline is also included in the paper. The effectiveness of the framework regarding the modeling and solution search is assessed through the illustrative examples relating to scheduling problems with additional constrained resources.
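
The presolve idea (shrinking the feasible space before search) can be illustrated on a toy CSP; the constraints and domains below are invented for the example and are unrelated to the paper's scheduling problems.

```python
# Toy CSP presolve: prune variable domains by constraint propagation, then
# enumerate the (much smaller) remaining space.
from itertools import product

domains = {"x": set(range(10)), "y": set(range(10))}
# Constraints: x + y == 8 and x < y

def presolve(domains):
    # Keep only values that can participate in some satisfying assignment
    dx = {v for v in domains["x"]
          if any(v + w == 8 and v < w for w in domains["y"])}
    dy = {w for w in domains["y"]
          if any(v + w == 8 and v < w for v in dx)}
    return {"x": dx, "y": dy}

reduced = presolve(domains)
solutions = [(x, y)
             for x, y in product(sorted(reduced["x"]), sorted(reduced["y"]))
             if x + y == 8 and x < y]
```

Presolving cuts the search space from 100 candidate pairs to 16 while preserving all four solutions, which is the effect the paper's transformation achieves at scale.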

  11. A Simulation Modeling Framework to Optimize Programs Using Financial Incentives to Motivate Health Behavior Change.

    Science.gov (United States)

    Basu, Sanjay; Kiernan, Michaela

    2016-01-01

    While increasingly popular among mid- to large-size employers, using financial incentives to induce health behavior change among employees has been controversial, in part due to poor quality and generalizability of studies to date. Thus, fundamental questions have been left unanswered: To generate positive economic returns on investment, what level of incentive should be offered for any given type of incentive program and among which employees? We constructed a novel modeling framework that systematically identifies how to optimize marginal return on investment from programs incentivizing behavior change by integrating commonly collected data on health behaviors and associated costs. We integrated "demand curves" capturing individual differences in response to any given incentive with employee demographic and risk factor data. We also estimated the degree of self-selection that could be tolerated: that is, the maximum percentage of already-healthy employees who could enroll in a wellness program while still maintaining positive absolute return on investment. In a demonstration analysis, the modeling framework was applied to data from 3000 worksite physical activity programs across the nation. For physical activity programs, the incentive levels that would optimize marginal return on investment ($367/employee/year) were higher than average incentive levels currently offered ($143/employee/year). Yet a high degree of self-selection could undermine the economic benefits of the program; if more than 17% of participants came from the top 10% of the physical activity distribution, the cost of the program would be expected to always be greater than its benefits. Our generalizable framework integrates individual differences in behavior and risk to systematically estimate the incentive level that optimizes marginal return on investment. © The Author(s) 2015.
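
The core calculation (combining a demand curve with per-participant benefit to locate the incentive that maximizes net return) can be sketched as follows; every number here is illustrative, not an estimate from the study.

```python
# Hedged sketch of the modeling idea: participation responds to incentive size
# with diminishing returns, and the employer nets benefit minus incentive cost.
def participation(incentive):
    # Hypothetical demand curve: fraction of employees who respond
    return incentive / (incentive + 200.0)

def net_return(incentive, employees=1000, benefit_per_participant=600.0):
    p = participation(incentive)
    benefit = employees * p * benefit_per_participant
    cost = employees * p * incentive   # incentive paid only to participants
    return benefit - cost

# Grid search over candidate incentive levels ($0 to $600 in $10 steps)
best = max(range(0, 601, 10), key=net_return)
```

With this invented demand curve the optimum lands at a $200 incentive; the study's point is that such an optimum is typically not the incentive level programs actually offer.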

  12. Learner Analysis Framework for Globalized E-Learning: A Case Study

    Directory of Open Access Journals (Sweden)

    Mamta Saxena

    2011-06-01

    Full Text Available The shift to technology-mediated modes of instructional delivery and increased global connectivity has led to a rise in globalized e-learning programs. Educational institutions face multiple challenges as they seek to design effective, engaging, and culturally competent instruction for an increasingly diverse learner population. The purpose of this study was to explore strategies for expanding learner analysis within the instructional design process to better address cultural influences on learning. A case study approach leveraged the experience of practicing instructional designers to build a framework for culturally competent learner analysis. The study discussed the related challenges and recommended strategies to improve the effectiveness of cross-cultural learner analysis. Based on the findings, a framework for guiding the cultural analysis of diverse learners was proposed. The study identified the most critical factors in improving cross-cultural learner analysis as the judicious use of existing research on cross-cultural theories and joint deliberation by all participants, from management to the learners. Several strategies for guiding and improving the cultural inquiry process were summarized, and barriers and possible solutions were also discussed.


  13. Translating policies into practice: a framework to prevent childhood obesity in afterschool programs.

    Science.gov (United States)

    Beets, Michael W; Webster, Collin; Saunders, Ruth; Huberty, Jennifer L

    2013-03-01

    Afterschool programs (3-6 p.m.) are positioned to play a critical role in combating childhood obesity. To this end, state and national organizations have developed policies related to promoting physical activity and guiding the nutritional quality of snacks served in afterschool programs. No conceptual frameworks, however, are available that describe the process of how afterschool programs will translate such policies into daily practice to reach eventual outcomes. Drawing from complex systems theory, this article describes the development of a framework that identifies critical modifiable levers within afterschool programs that can be altered and/or strengthened to reach policy goals. These include the policy environment at the national, state, and local levels; individual site, afterschool program leader, staff, and child characteristics; and existing outside organizational partnerships. Use of this framework and recognition of its constituent elements have the potential to lead to the successful and sustainable adoption and implementation of physical activity and nutrition policies in afterschool programs nationwide.

  14. COMP Superscalar, an interoperable programming framework

    Science.gov (United States)

    Badia, Rosa M.; Conejero, Javier; Diaz, Carlos; Ejarque, Jorge; Lezzi, Daniele; Lordan, Francesc; Ramon-Cortes, Cristian; Sirvent, Raul

    2015-12-01

    COMPSs is a programming framework that aims to facilitate the parallelization of existing applications written in Java, C/C++ and Python scripts. For that purpose, it offers a simple programming model based on sequential development in which the user is mainly responsible for (i) identifying the functions to be executed as asynchronous parallel tasks and (ii) annotating them with Java annotations or standard Python decorators. A runtime system is in charge of exploiting the inherent concurrency of the code, automatically detecting and enforcing the data dependencies between tasks and spawning these tasks to the available resources, which can be nodes in a cluster, clouds or grids. In cloud environments, COMPSs provides scalability and elasticity features allowing the dynamic provision of resources.
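COMPSs' own annotations are interpreted by its runtime, but the programming style, sequential-looking code whose annotated functions become asynchronous tasks, can be approximated in plain Python. The `@task` decorator and thread pool below are illustrative stand-ins, not the COMPSs API:

```python
from concurrent.futures import ThreadPoolExecutor, Future

_pool = ThreadPoolExecutor(max_workers=4)

def task(fn):
    """Illustrative stand-in for a COMPSs-style task annotation:
    each call returns a future that resolves asynchronously."""
    def submit(*args, **kwargs):
        # Resolve any future arguments first (a crude stand-in for the
        # data-dependency tracking a real task runtime performs).
        resolved = [a.result() if isinstance(a, Future) else a for a in args]
        return _pool.submit(fn, *resolved, **kwargs)
    return submit

@task
def square(x):
    return x * x

@task
def add(a, b):
    return a + b

# Sequential-looking code; the "runtime" executes the tasks
# asynchronously and chains them through their data dependencies.
total = add(square(3), square(4))
print(total.result())  # 25
```

A real runtime would additionally schedule tasks onto remote resources and avoid blocking while resolving dependencies; the sketch only conveys the programming model.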

  15. COMP Superscalar, an interoperable programming framework

    Directory of Open Access Journals (Sweden)

    Rosa M. Badia

    2015-12-01

    Full Text Available COMPSs is a programming framework that aims to facilitate the parallelization of existing applications written in Java, C/C++ and Python scripts. For that purpose, it offers a simple programming model based on sequential development in which the user is mainly responsible for (i identifying the functions to be executed as asynchronous parallel tasks and (ii annotating them with annotations or standard Python decorators. A runtime system is in charge of exploiting the inherent concurrency of the code, automatically detecting and enforcing the data dependencies between tasks and spawning these tasks to the available resources, which can be nodes in a cluster, clouds or grids. In cloud environments, COMPSs provides scalability and elasticity features allowing the dynamic provision of resources.

  16. Multi-Year Program under Budget Constraints Using Multi-Criteria Analysis

    Directory of Open Access Journals (Sweden)

    Surya Adiguna

    2017-09-01

    Full Text Available Road investment appraisal requires joint consideration of multiple criteria related to engineering, economic, social and environmental impacts. Investment decisions could be based on economic analysis alone; however, some factors, such as environmental, social, and political impacts, are difficult to quantify in monetary terms. Multi-criteria analysis is an alternative tool that caters to these requirements. The research, based on 102 class D and class E paved road sections in Kenya, aims to optimize road network investment under budget constraints by applying a multi-criteria analysis (MCA) method and comparing it with conventional economic analysis. The MCA is developed from a hierarchy structure that serves as the analytical framework, based on selected criteria and weights assigned from Kenyan road policy. The HDM-4 software is applied as a decision-making tool to obtain the best investment alternatives and road work programs from both the MCA and the economic analysis, so that the resulting programs can be compared. The MCA results show that 51 road sections need periodic work (overlay or resealing), while 51 others need rehabilitation or reconstruction. The five-year road work program based on economic analysis costs almost 130 billion Kenyan Shillings (KES) to maintain the class D and E paved roads in Kenya, whereas the MCA requires only KES 59.5 billion for the same period. These results show a large margin between the two analyses, with the MCA producing a more efficient work program than the economic analysis.
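The weighted aggregation at the core of a multi-criteria analysis can be sketched as a weighted sum over criterion scores. The criteria, weights, and section scores below are hypothetical, not taken from the Kenyan data:

```python
# Hypothetical criterion weights (summing to 1.0) and per-section
# scores on a 0-10 scale; a real MCA would derive both from policy.
weights = {"economic": 0.4, "social": 0.3, "environmental": 0.2, "political": 0.1}

sections = {
    "D101": {"economic": 8, "social": 5, "environmental": 6, "political": 4},
    "E205": {"economic": 4, "social": 9, "environmental": 7, "political": 7},
}

def mca_score(scores):
    """Weighted-sum aggregation of criterion scores for one section."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank sections by aggregate score to prioritize the work program.
ranked = sorted(sections, key=lambda s: mca_score(sections[s]), reverse=True)
print(ranked)
```

Note how a section that scores poorly on the purely economic criterion ("E205") can still rank first once social and environmental criteria carry weight, which is the point of using MCA alongside economic analysis.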

  17. A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications

    Science.gov (United States)

    Llinas, James

    This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all descriptions, including artificial intelligence processes and techniques, could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.

  18. The Data-to-Action Framework: A Rapid Program Improvement Process

    Science.gov (United States)

    Zakocs, Ronda; Hill, Jessica A.; Brown, Pamela; Wheaton, Jocelyn; Freire, Kimberley E.

    2015-01-01

    Although health education programs may benefit from quality improvement methods, scant resources exist to help practitioners apply these methods for program improvement. The purpose of this article is to describe the Data-to-Action framework, a process that guides practitioners through rapid-feedback cycles in order to generate actionable data to…

  19. Using Framework Analysis in nursing research: a worked example.

    Science.gov (United States)

    Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica

    2013-11-01

    To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.

  20. Mississippi Curriculum Framework for Banking & Finance Technology (Program CIP: 52.0803--Banking and Related Financial Programs, Other). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the banking and finance technology program. Presented in the introduction are a program description and suggested course sequence. Section I is a curriculum guide consisting of outlines for…

  1. Choosing your IoT programming framework : architectural aspects

    NARCIS (Netherlands)

    Rahman, L.F.; Ozcelebi, T.; Lukkien, J.J.

    2016-01-01

    The Internet of Things (IoT) is turning into practice. To drive innovation, it is crucial that programmers have the means to develop IoT applications in the form of IoT programming frameworks. These are toolkits for developing applications according to a certain style or method and that let developers

  2. A PROOF Analysis Framework

    International Nuclear Information System (INIS)

    González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E

    2012-01-01

    The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed, which can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare off new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.
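The mechanism PROOF automates, partitioning an event loop across workers and merging the partial results, can be sketched generically. This is an illustrative analogue with toy data and an invented selection cut, not the PROOF or PAF API:

```python
from concurrent.futures import ThreadPoolExecutor

def select_and_count(events):
    """Worker: run the per-event selection on one chunk and return a
    partial result (here, the count of events passing a pT cut)."""
    return sum(1 for e in events if e["pt"] > 20.0)

# Toy dataset split into chunks, much as PROOF partitions the entries
# of a ROOT tree across workers.
events = [{"pt": float(i % 50)} for i in range(1000)]
chunks = [events[i::4] for i in range(4)]

with ThreadPoolExecutor(max_workers=4) as pool:
    partial = list(pool.map(select_and_count, chunks))

merged = sum(partial)  # merge partial results, as the PROOF master does
print(merged)
```

Real PROOF distributes the chunks across processes or cluster nodes; a thread pool simply keeps the sketch self-contained.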

  3. On the non-proliferation framework of Japan's peaceful nuclear utilization program

    International Nuclear Information System (INIS)

    Kano, Takashi

    1996-01-01

    The Conference of the States Party to the Treaty on the Non-proliferation of Nuclear Weapons (hereinafter referred to as the NPT) convened in New York from April 17 to May 12, 1995 and decided that the NPT shall continue in force indefinitely, after reviewing the operation and affirming some aspects of the NPT, while emphasizing the "Decision on Strengthening the Review Process" for the NPT and the "Decision on Principles and Objectives for Nuclear Non-proliferation and Disarmament," also adopted by the Conference. In parallel, Japan made its basic non-proliferation policy clear in the "Long-Term Program for Research, Development and Utilization of Nuclear Energy," which was decided by the Atomic Energy Commission (chaired by Mikio Oomi, then Minister of the Science and Technology Agency of Japan) in June 1994. The Long-Term Program discusses various problems facing post-Cold-War international society and describes Japan's policy for establishing international confidence concerning non-proliferation. This paper summarizes Japan's non-proliferation policy as articulated in the Long-Term Program, and describes some results of an analysis comparing the Long-Term Program with the resolutions on the international non-proliferation frameworks adopted by the NPT conference.

  4. A framework for evaluating and designing citizen science programs for natural resources monitoring.

    Science.gov (United States)

    Chase, Sarah K; Levine, Arielle

    2016-06-01

    We present a framework of resource characteristics critical to the design and assessment of citizen science programs that monitor natural resources. To develop the framework we reviewed 52 citizen science programs that monitored a wide range of resources and provided insights into what resource characteristics are most conducive to developing citizen science programs and how resource characteristics may constrain the use or growth of these programs. We focused on 4 types of resource characteristics: biophysical and geographical, management and monitoring, public awareness and knowledge, and social and cultural characteristics. We applied the framework to 2 programs, the Tucson (U.S.A.) Bird Count and the Maui (U.S.A.) Great Whale Count. We found that resource characteristics such as accessibility, diverse institutional involvement in resource management, and social or cultural importance of the resource affected program endurance and success. However, the relative influence of each characteristic was in turn affected by goals of the citizen science programs. Although the goals of public engagement and education sometimes complemented the goal of collecting reliable data, in many cases trade-offs must be made between these 2 goals. Program goals and priorities ultimately dictate the design of citizen science programs, but for a program to endure and successfully meet its goals, program managers must consider the diverse ways that the nature of the resource being monitored influences public participation in monitoring. © 2016 Society for Conservation Biology.

  5. Mississippi Curriculum Framework for Veterinary Technology (Program CIP: 51.0808--Veterinarian Asst./Animal Health). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the veterinary technology program. Presented in the introductory section are a description of the program and a suggested course sequence. Section I lists baseline competencies, and section II consists of…

  6. XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework

    Science.gov (United States)

    Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò

    2017-08-01

    We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems, but also for developing and testing end-to-end analysis chains.
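The kind of fast observation-simulation XIMPOL performs can be shown in miniature by folding a source spectrum with an effective-area response to obtain an expected count rate. The power-law spectrum is a standard choice, but the energy grid and Gaussian-shaped effective area below are invented; a real run would read these from OGIP response files:

```python
import numpy as np

# Energy grid (keV) and an invented Gaussian-shaped effective area (cm^2).
energies = np.linspace(2.0, 8.0, 61)
eff_area = 400.0 * np.exp(-(((energies - 4.0) / 3.0) ** 2))

def power_law(E, norm=1.0, index=2.0):
    """Source photon spectrum dN/dE in photons / (cm^2 s keV)."""
    return norm * E ** (-index)

# Expected detector count rate: trapezoidal integral of flux x area.
integrand = power_law(energies) * eff_area
rate = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(energies)))
print(round(rate, 2), "counts/s")
```

A full simulator would go further and draw individual photons from this folded distribution, then smear them with the energy dispersion, point-spread function and modulation factor.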

  7. Replicating MISTERS: an epidemiological criminology framework analysis of a program for criminal justice-involved minority males in the community.

    Science.gov (United States)

    Potter, Roberto Hugh; Akers, Timothy A; Bowman, Daniel Richard

    2013-01-01

    The Men in STD Training and Empowerment Research Study (MISTERS) program and epidemiological criminology began their development in Atlanta at about the same time. MISTERS focuses on men recently released from jail to reduce both HIV/STD and crime-related risk factors through a brief educational intervention. This article examines ways in which MISTERS and epidemiological criminology have been used to inform one another in the replication of the MISTERS program in Orange County, Florida. Data from 110 MISTERS participants during the first 10 months of operation are analyzed to examine the overlapping occurrence of health and criminal risk behaviors in the men's lives. This provides a test of core hypotheses from the epidemiological criminology framework. This article also examines application of the epidemiological criminology framework to develop interventions to address health and crime risk factors simultaneously in Criminal Justice-Involved populations in the community.

  8. The SBIRT program matrix: a conceptual framework for program implementation and evaluation.

    Science.gov (United States)

    Del Boca, Frances K; McRee, Bonnie; Vendetti, Janice; Damon, Donna

    2017-02-01

    Screening, Brief Intervention and Referral to Treatment (SBIRT) is a comprehensive, integrated, public health approach to the delivery of services to those at risk for the adverse consequences of alcohol and other drug use, and for those with probable substance use disorders. Research on successful SBIRT implementation has lagged behind studies of efficacy and effectiveness. This paper (1) outlines a conceptual framework, the SBIRT Program Matrix, to guide implementation research and program evaluation and (2) specifies potential implementation outcomes. Overview and narrative description of the SBIRT Program Matrix. The SBIRT Program Matrix has five components, each of which includes multiple elements: SBIRT services; performance sites; provider attributes; patient/client populations; and management structure and activities. Implementation outcomes include program adoption, acceptability, appropriateness, feasibility, fidelity, costs, penetration, sustainability, service provision and grant compliance. The Screening, Brief Intervention and Referral to Treatment Program Matrix provides a template for identifying, classifying and organizing the naturally occurring commonalities and variations within and across SBIRT programs, and for investigating which variables are associated with implementation success and, ultimately, with treatment outcomes and other impacts. © 2017 Society for the Study of Addiction.

  9. CLARA: CLAS12 Reconstruction and Analysis Framework

    Energy Technology Data Exchange (ETDEWEB)

    Gyurjyan, Vardan [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)]; Matta, Sebastian Mancilla [Santa Maria U., Valparaiso, Chile]; Oyarzun, Ricardo [Santa Maria U., Valparaiso, Chile]

    2016-11-01

    In this paper we present the SOA-based CLAS12 event Reconstruction and Analysis (CLARA) framework. CLARA's design focuses on two main traits: real-time data stream processing, and service-oriented architecture (SOA) in a flow-based programming (FBP) paradigm. The data-driven and data-centric architecture of CLARA provides an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework presents solutions capable of processing large volumes of data interactively and substantially faster than batch systems.

  10. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include chord length and height for the blade, and inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using software packages under development, such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program, in conjunction with modules from the probabilistic analysis program NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
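The response surface method mentioned above can be sketched in miniature: fit a polynomial surrogate to a handful of expensive "FEA" evaluations, then Monte Carlo sample the cheap surrogate. The closed-form stress model, input distributions, and stress limit below are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_stress(length, load):
    """Stand-in for a deterministic FEA run (invented closed form)."""
    return 2.0 * length ** 2 + 0.5 * load + 3.0

# Step 1: evaluate the "FEA model" at a small design of experiments.
L = rng.uniform(0.9, 1.1, 30)
F = rng.uniform(90.0, 110.0, 30)
y = expensive_stress(L, F)

# Step 2: fit a quadratic response surface by least squares.
X = np.column_stack([np.ones_like(L), L, F, L ** 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Step 3: Monte Carlo on the cheap surrogate to estimate the
# probability of exceeding an (invented) stress limit.
Ls = rng.normal(1.0, 0.02, 100_000)
Fs = rng.normal(100.0, 3.0, 100_000)
stress = np.column_stack([np.ones_like(Ls), Ls, Fs, Ls ** 2]) @ coef
print("P(stress > 56.5) =", round(float(np.mean(stress > 56.5)), 3))
```

The 100,000 surrogate evaluations cost almost nothing compared with 100,000 FEA runs, which is the economic argument for the response surface method.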

  11. Luiza: Analysis Framework for GLORIA

    Directory of Open Access Journals (Sweden)

    Aleksander Filip Żarnecki

    2013-01-01

    Full Text Available The Luiza analysis framework for GLORIA is based on the Marlin package, which was originally developed for data analysis in a new High Energy Physics (HEP) project, the International Linear Collider (ILC). HEP experiments have to deal with enormous amounts of data, and distributed data analysis is therefore essential. The Marlin framework concept seems to be well suited for the needs of GLORIA. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure, and its additional output is also added to that collection. The advantage of this modular approach is that it keeps things as simple as possible. Each step of the full analysis chain, e.g. from raw images to light curves, can be processed step by step, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
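The processor-chain pattern described above can be sketched as follows; the processor names and data keys are hypothetical:

```python
class Processor:
    """One step of the analysis chain; reads from and appends to a
    shared event collection, as in the Marlin/Luiza design."""
    def process(self, event):
        raise NotImplementedError

class Calibrate(Processor):
    def process(self, event):
        # Adds its output to the collection instead of replacing input.
        event["calibrated"] = [2.0 * x for x in event["raw"]]

class LightCurve(Processor):
    def process(self, event):
        event["flux"] = sum(event["calibrated"])

def run_chain(processors, event):
    # Each processor's output is self-consistent input for the next.
    for p in processors:
        p.process(event)
    return event

event = run_chain([Calibrate(), LightCurve()], {"raw": [1.0, 2.0, 3.0]})
print(event["flux"])  # 12.0
```

Because every step only appends to the shared collection, any prefix of the chain can be run on its own and inspected, which is the modularity advantage the abstract describes.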

  12. Analysis of higher education policy frameworks for open and distance education in Pakistan.

    Science.gov (United States)

    Ellahi, Abida; Zaka, Bilal

    2015-04-01

    The constant rise in demand for higher education has become the biggest challenge for educational planners. This high demand has paved a way for distance education across the globe. This article innovatively analyzes the policy documentation of a major distance education initiative in Pakistan for validity that will identify the utility of policy linkages. The study adopted a qualitative research design that consisted of two steps. In the first step, a content analysis of distance learning policy framework was made. For this purpose, two documents were accessed titled "Framework for Launching Distance Learning Programs in HEIs of Pakistan" and "Guideline on Quality of Distance Education for External Students at the HEIs of Pakistan." In the second step, the policy guidelines mentioned in these two documents were evaluated at two levels. At the first level, the overall policy documents were assessed against a criterion proposed by Cheung, Mirzaei, and Leeder. At the second level, the proposed program of distance learning was assessed against a criterion set by Gellman-Danley and Fetzner and Berge. The distance education program initiative in Pakistan is of promising nature which needs to be assessed regularly. This study has made an initial attempt to assess the policy document against a criterion identified from literature. The analysis shows that the current policy documents do offer some strengths at this initial level, however, they cannot be considered a comprehensive policy guide. The inclusion or correction of missing or vague areas identified in this study would make this policy guideline document a treasured tool for Higher Education Commission (HEC). For distance education policy makers, this distance education policy framework model recognizes several fundamental areas with which they should be concerned. The findings of this study in the light of two different policy framework measures highlight certain opportunities that can help strengthening the

  13. Aura: A Multi-Featured Programming Framework in Python

    Directory of Open Access Journals (Sweden)

    2010-09-01

    Full Text Available This paper puts forward the design, programming and application of innovative educational software, 'Aura', made using Python and the PyQt Python bindings. The paper presents a new concept of using a single tool to relate the syntaxes of various programming languages and algorithms. This radically increases students' understanding and retention, since they can correlate between many programming languages. The software is a totally unorthodox attempt at helping students who are having their first tryst with programming languages. The application is designed to help students understand how algorithms work and thus help them learn multiple programming languages on a single platform using an interactive graphical user interface. This paper elucidates how, using Python and PyQt bindings, a comprehensive feature-rich application, implementing an interactive algorithm-building technique, a web browser, a multiple programming language framework, a code generator and a real-time code-sharing hub, can be embedded into a single interface. It also explains that using Python as the building tool requires much less coding than conventional feature-rich applications written in other programming languages, while not compromising the stability, interoperability and robustness of the application.

  14. User's manual for the Composite HTGR Analysis Program (CHAP-1)

    International Nuclear Information System (INIS)

    Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.; Wecksung, M.J.; Willcutt, G.J.E. Jr.

    1977-03-01

    CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.

  15. Constraint Solver Techniques for Implementing Precise and Scalable Static Program Analysis

    DEFF Research Database (Denmark)

    Zhang, Ye

    Static program analysis helps developers to build reliable software systems more quickly and with fewer bugs or security defects. While designing and implementing a program analysis remains hard work, making it both scalable and precise is even more challenging. In this dissertation, we show that with a general inclusion constraint solver using unification we could make a program analysis easier to design and implement, much more scalable, and still as precise as expected. We present an inclusion constraint language with explicit equality constructs for specifying program analysis problems, and a parameterized framework… For data flow analyses for the C language, we demonstrate that a large number of equivalences can be detected by off-line analyses; these can then be used by a constraint solver to significantly improve the scalability of an analysis without sacrificing any precision.
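The equality constraints mentioned in the abstract are typically solved by unification over a union-find structure, which collapses equivalent analysis variables into one class and so shrinks the constraint graph; a minimal sketch with hypothetical variable names:

```python
class UnionFind:
    """Union-find with path compression: the core data structure of a
    unification-based solver for equality constraints between
    analysis variables."""
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])  # path compression
        return self.parent[x]

    def union(self, x, y):
        self.parent[self.find(x)] = self.find(y)

# Equality constraints a == b and b == c collapse three analysis
# variables into one equivalence class.
uf = UnionFind()
uf.union("a", "b")
uf.union("b", "c")
print(uf.find("a") == uf.find("c"))  # True
```

An inclusion constraint solver would then propagate set constraints over the equivalence-class representatives only, which is where the scalability gain comes from.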

  16. Detecting spatial patterns of rivermouth processes using a geostatistical framework for near-real-time analysis

    Science.gov (United States)

    Xu, Wenzhao; Collingsworth, Paris D.; Bailey, Barbara; Carlson Mazur, Martha L.; Schaeffer, Jeff; Minsker, Barbara

    2017-01-01

    This paper proposes a geospatial analysis framework and software to interpret water-quality sampling data from towed undulating vehicles in near-real time. The framework includes data quality assurance and quality control processes, automated kriging interpolation along undulating paths, and local hotspot and cluster analyses. These methods are implemented in an interactive Web application developed using the Shiny package in the R programming environment to support near-real-time analysis along with 2-D and 3-D visualizations. The approach is demonstrated using historical sampling data from an undulating vehicle deployed at three rivermouth sites in Lake Michigan during 2011. The normalized root-mean-square error (NRMSE) of the interpolation averages approximately 10% in 3-fold cross-validation. The results show that the framework can be used to track river plume dynamics and provide insights on mixing, which could be related to wind and seiche events.
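The cross-validated NRMSE reported above can be illustrated with a simple 1-D interpolator along a sampling track; inverse-distance weighting stands in for the kriging step, and the synthetic track data and parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

def idw(x_known, y_known, x_query, power=2.0):
    """Inverse-distance-weighted interpolation along a 1-D track, a
    simple stand-in for kriging along an undulating path."""
    d = np.abs(x_query[:, None] - x_known[None, :])
    w = 1.0 / np.maximum(d, 1e-12) ** power
    return (w @ y_known) / w.sum(axis=1)

# Synthetic water-quality signal sampled along a 10 km track.
x = np.sort(rng.uniform(0.0, 10.0, 90))
y = np.sin(x) + rng.normal(0.0, 0.05, x.size)

# 3-fold cross-validation: hold out every third sample in turn.
errs = []
for k in range(3):
    held_out = np.arange(x.size) % 3 == k
    pred = idw(x[~held_out], y[~held_out], x[held_out])
    errs.append(np.sqrt(np.mean((pred - y[held_out]) ** 2)))

nrmse = float(np.mean(errs) / (y.max() - y.min()))
print(f"3-fold NRMSE: {nrmse:.1%}")
```

Kriging would additionally model the spatial covariance (variogram) and report prediction variances, which is why the paper prefers it over simple distance weighting.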

  17. Pembuatan Kakas Pendeteksi Unused Method pada Kode Program PHP dengan Framework CodeIgniter Menggunakan Call Graph

    Directory of Open Access Journals (Sweden)

    Divi Galih Prasetyo Putri

    2014-03-01

    Full Text Available The evolution and maintenance of a system is a very important process in software engineering, and web applications are no exception. During this process, most developers no longer refer back to the system design, which leads to the appearance of unused methods: parts of the program that are no longer used but remain in the system. This condition increases complexity and reduces the understandability of the system. Detecting unused methods in a program requires a code analysis technique. The static analysis technique used here exploits a call graph, built from the program code, to identify unused methods. The call graph is constructed from the calls between methods. The tool detects unused methods in PHP code built with the CodeIgniter framework. The input program code is parsed into an Abstract Syntax Tree (AST), which is then used to analyze the code; this analysis produces a call graph. From the resulting call graph, methods that cannot be reached are classified as unused methods. The tool was tested on 5 PHP applications, with an average system precision of 0.749 and a recall of 1.
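Once the call graph is extracted from the AST, detecting unused methods reduces to a reachability search from the entry points; a minimal sketch with hypothetical method names:

```python
from collections import deque

# Call graph extracted from the source: caller -> called methods.
call_graph = {
    "index": ["get_user", "render"],
    "get_user": ["query_db"],
    "render": [],
    "query_db": [],
    "legacy_export": ["query_db"],  # no longer called by anyone
}

def unused_methods(graph, entry_points):
    """Methods not reachable from any entry point are unused."""
    seen, queue = set(entry_points), deque(entry_points)
    while queue:
        for callee in graph.get(queue.popleft(), []):
            if callee not in seen:
                seen.add(callee)
                queue.append(callee)
    return sorted(set(graph) - seen)

print(unused_methods(call_graph, ["index"]))  # ['legacy_export']
```

In a CodeIgniter application the entry points would be the framework-invoked controller actions; dynamic dispatch is what makes perfect precision hard, which is consistent with the tool's precision of 0.749 at recall 1.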

  18. Mississippi Curriculum Framework for Drafting and Design Technology (Program CIP: 48.0102--Architectural Drafting Technology) (Program CIP: 48.0101--General Drafting). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the two course sequences of the state's postsecondary-level drafting and design technology program: architectural drafting technology and drafting and design technology. Presented first are a program description and…

  19. The chronic care model versus disease management programs: a transaction cost analysis approach.

    Science.gov (United States)

    Leeman, Jennifer; Mark, Barbara

    2006-01-01

    The present article applies transaction cost analysis as a framework for better understanding health plans' decisions to improve chronic illness management by using disease management programs versus redesigning care within physician practices.

  20. Streamlining Policy Creation in Policy Frameworks

    NARCIS (Netherlands)

    M.A. Hills (Mark); N. Martí-Oliet; M. Palomino

    2012-01-01

    Policy frameworks provide a technique for improving reuse in program analysis: the same language frontend, and a core analysis semantics, can be shared among multiple analysis policies for the same language, while analysis domains (such as units of measurement) can be shared among…

  1. Design and Analysis of Web Application Frameworks

    DEFF Research Database (Denmark)

    Schwarz, Mathias Romme

    Numerous web application frameworks have been developed in recent years. These frameworks enable programmers to reuse common components and to avoid typical pitfalls in web application development. Although such frameworks help the programmer to avoid many common errors, we find … -state manipulation vulnerabilities. The hypothesis of this dissertation is that we can design frameworks and static analyses that aid the programmer to avoid such errors. First, we present the JWIG web application framework for writing secure and maintainable web applications. We discuss how this framework solves some of the common errors through an API that is designed to be safe by default. Second, we present a novel technique for checking HTML validity for output that is generated by web applications. Through string analysis, we approximate the output of web applications as context-free grammars. We model…

  2. Building Campus Communities Inclusive of International Students: A Framework for Program Development

    Science.gov (United States)

    Jameson, Helen Park; Goshit, Sunday

    2017-01-01

    This chapter provides readers with a practical, how-to approach and framework for developing inclusive, intercultural training programs for student affairs professionals on college campuses in the United States.

  3. MOOC Success Factors: Proposal of an Analysis Framework

    Directory of Open Access Journals (Sweden)

    Margarida M. Marques

    2017-10-01

    Full Text Available Aim/Purpose: From an idea of lifelong-learning-for-all to a phenomenon affecting higher education, Massive Open Online Courses (MOOCs) can be the next step to a truly universal education. Indeed, MOOC enrolment rates can be astoundingly high; still, their completion rates are frequently disappointingly low. Nevertheless, as courses, the participants’ enrolment and learning within the MOOCs must be considered when assessing their success. In this paper, the authors’ aim is to reflect on what makes a MOOC successful and to propose an analysis framework of MOOC success factors. Background: A literature review was conducted to identify reported MOOC success factors and to propose an analysis framework. Methodology: This literature-based framework was tested against data of a specific MOOC and refined, within a qualitative interpretivist methodology. The data were collected from the ‘As alterações climáticas nos média escolares - Clima@EduMedia’ course, which was developed by the project Clima@EduMedia and was submitted to content analysis. This MOOC aimed to support science and school media teachers in the use of media to teach climate change. Contribution: By proposing a MOOC success factors framework the authors attempt to help fill a literature gap concerning the criteria for considering a specific MOOC successful. Findings: This work's major finding is a literature-based and empirically-refined MOOC success factors analysis framework. Recommendations for Practitioners: The proposed framework is also a set of best practices relevant to MOOC developers, particularly when targeting teachers as potential participants. Recommendation for Researchers: This work's relevance is also based on its contribution to increasing empirical research on MOOCs. Impact on Society: By providing a proposal of a framework on factors to make a MOOC successful, the authors hope to contribute to the quality of MOOCs. Future Research: Future…

  4. Mississippi Curriculum Framework for Fashion Marketing Technology (Program CIP: 08.0101--Apparel and Accessories Mkt. Op., Gen.). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the fashion marketing technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies,…

  5. Analysis framework for GLORIA

    Science.gov (United States)

    Żarnecki, Aleksander F.; Piotrowski, Lech W.; Mankiewicz, Lech; Małek, Sebastian

    2012-05-01

    GLORIA stands for “GLObal Robotic-telescopes Intelligent Array”. GLORIA will be the first free and open-access network of robotic telescopes in the world. It will be a Web 2.0 environment where users can do research in astronomy by observing with robotic telescopes, and/or by analyzing data that other users have acquired with GLORIA or taken from other free-access databases, such as the European Virtual Observatory. The GLORIA project will define free standards, protocols and methodology for controlling robotic telescopes and related instrumentation, for conducting so-called on-line experiments by scheduling observations in the telescope network, and for conducting so-called off-line experiments based on the analysis of astronomical meta-data produced by GLORIA or other databases. The Luiza analysis framework for GLORIA is based on the Marlin package developed for data analysis at the International Linear Collider (ILC). HEP experiments have to deal with enormous amounts of data, and distributed data analysis is a must, so the Marlin framework concept seemed well suited to GLORIA's needs. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure; any additional output it creates is added to that collection. The advantage of such a modular approach is that it keeps things as simple as possible: every single step of the full analysis chain that goes, e.g., from raw images to light curves can be processed separately, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
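The modular processor chain that Luiza borrows from Marlin can be illustrated with a minimal Python sketch; the processor names and event fields below are invented for illustration, not taken from the actual framework:

```python
class Processor:
    """One step of the analysis chain; reads and extends a shared event collection."""
    def process(self, event):
        raise NotImplementedError

class DarkFrameSubtraction(Processor):  # hypothetical processor
    def process(self, event):
        dark = event["dark_level"]
        event["calibrated"] = [v - dark for v in event["raw_image"]]

class LightCurveExtraction(Processor):  # hypothetical processor
    def process(self, event):
        event["light_curve"] = sum(event["calibrated"]) / len(event["calibrated"])

def run_chain(processors, event):
    """Feed the same event collection through each processor in order."""
    for p in processors:
        p.process(event)
    return event

event = {"raw_image": [10, 12, 14], "dark_level": 2}
result = run_chain([DarkFrameSubtraction(), LightCurveExtraction()], event)
# result["light_curve"] == 10.0
```

Because every processor only appends to the shared collection, each intermediate result stays self-consistent and any prefix of the chain can be run and inspected on its own.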

  6. Developing an evaluation framework for clinical redesign programs: lessons learnt.

    Science.gov (United States)

    Samaranayake, Premaratne; Dadich, Ann; Fitzgerald, Anneke; Zeitz, Kathryn

    2016-09-19

    Purpose: The purpose of this paper is to present lessons learnt through the development of an evaluation framework for a clinical redesign programme, the aim of which was to improve the patient journey through improved discharge practices within an Australian public hospital. Design/methodology/approach: The development of the evaluation framework involved three stages, namely: the analysis of secondary data relating to the discharge planning pathway; the analysis of primary data including field-notes and interview transcripts on hospital processes; and the triangulation of these data sets to devise the framework. The evaluation framework ensured that resource use, process management, patient satisfaction, and staff well-being and productivity were each connected with measures, targets, and the aim of the clinical redesign programme. Findings: The application of business process management and a balanced scorecard enabled a different way of framing the evaluation, ensuring measurable outcomes were connected to inputs and outputs. Lessons learnt include: first, the importance of mixed-methods research to devise the framework and evaluate the redesigned processes; second, the need for appropriate tools and resources to adequately capture change across the different domains of the redesign programme; and third, the value of developing and applying an evaluative framework progressively. Research limitations/implications: The evaluation framework is limited by its retrospective application to a clinical process redesign programme. Originality/value: This research supports benchmarking with national and international practices in relation to best-practice healthcare redesign processes. Additionally, it provides a theoretical contribution on evaluating health services improvement and redesign initiatives.

  7. The Use of the Data-to-Action Framework in the Evaluation of CDC's DELTA FOCUS Program.

    Science.gov (United States)

    Armstead, Theresa L; Kearns, Megan; Rambo, Kirsten; Estefan, Lianne Fuino; Dills, Jenny; Rivera, Moira S; El-Beshti, Rasha

    The Centers for Disease Control and Prevention's (CDC's) Domestic Violence Prevention Enhancements and Leadership Through Alliances, Focusing on Outcomes for Communities United with States (DELTA FOCUS) program is a 5-year cooperative agreement (2013-2018) funding 10 state domestic violence coalitions and local coordinated community response teams to engage in primary prevention of intimate partner violence. Grantees' prevention strategies were often developmental and emergent; therefore, CDC's approach to program oversight, administration, and support to grantees required flexibility. CDC staff adopted a Data-to-Action Framework for the DELTA FOCUS program evaluation that supported a culture of learning to meet dynamic and unexpected information needs. Briefly, a Data-to-Action Framework involves the collection and use of information in real time for program improvement. Utilizing this framework, the DELTA FOCUS data-to-action process yielded important insights into CDC's ongoing technical assistance, improved program accountability by providing useful materials and information for internal agency leadership, and helped build a learning community among grantees. CDC and other funders, as decision makers, can promote data-informed program improvements by incorporating internal processes supportive of ongoing data collection and review.

  8. Introduction of blended learning in a master program: Developing an integrative mixed method evaluation framework.

    Science.gov (United States)

    Chmiel, Aviva S; Shaha, Maya; Schneider, Daniel K

    2017-01-01

    The aim of this research is to develop a comprehensive evaluation framework involving all actors in a higher education blended learning (BL) program. BL evaluation usually either focuses on students, faculty, technological or institutional aspects. Currently, no validated comprehensive monitoring tool exists that can support introduction and further implementation of BL in a higher education context. Starting from established evaluation principles and standards, concepts that were to be evaluated were firstly identified and grouped. In a second step, related BL evaluation tools referring to students, faculty and institutional level were selected. This allowed setting up and implementing an evaluation framework to monitor the introduction of BL during two succeeding recurrences of the program. The results of the evaluation allowed documenting strengths and weaknesses of the BL format in a comprehensive way, involving all actors. It has led to improvements at program, faculty and course level. The evaluation process and the reporting of the results proved to be demanding in time and personal resources. The evaluation framework allows measuring the most significant dimensions influencing the success of a BL implementation at program level. However, this comprehensive evaluation is resource intensive. Further steps will be to refine the framework towards a sustainable and transferable BL monitoring tool that finds a balance between comprehensiveness and efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Can programming frameworks bring smartphones into the mainstream of psychological science?

    Directory of Open Access Journals (Sweden)

    Lukasz Piwek

    2016-08-01

    Full Text Available Smartphones continue to provide huge potential for psychological science, and the advent of novel research frameworks brings new opportunities for researchers who have previously struggled to develop smartphone applications. However, despite this renewed promise, smartphones have failed to become a standard item within psychological research. Here we consider the key barriers that continue to limit smartphone adoption within psychological science and how these barriers might be diminishing in light of ResearchKit and other recent methodological developments. We conclude that while these programming frameworks are certainly a step in the right direction, it remains challenging to create usable research-orientated applications with current frameworks. Smartphones may only become an asset for psychology and social science as a whole when development software that is both easy to use and secure becomes freely available.

  10. Can Programming Frameworks Bring Smartphones into the Mainstream of Psychological Science?

    Science.gov (United States)

    Piwek, Lukasz; Ellis, David A

    2016-01-01

    Smartphones continue to provide huge potential for psychological science and the advent of novel research frameworks brings new opportunities for researchers who have previously struggled to develop smartphone applications. However, despite this renewed promise, smartphones have failed to become a standard item within psychological research. Here we consider the key issues that continue to limit smartphone adoption within psychological science and how these barriers might be diminishing in light of ResearchKit and other recent methodological developments. We conclude that while these programming frameworks are certainly a step in the right direction it remains challenging to create usable research-orientated applications with current frameworks. Smartphones may only become an asset for psychology and social science as a whole when development software that is both easy to use and secure becomes freely available.

  11. Debugging Nondeterministic Failures in Linux Programs through Replay Analysis

    Directory of Open Access Journals (Sweden)

    Shakaiba Majeed

    2018-01-01

    Full Text Available Reproducing a failure is the first and most important step in debugging because it enables us to understand the failure and track down its source. However, many programs are susceptible to nondeterministic failures that are hard to reproduce, which makes debugging extremely difficult. We first address the reproducibility problem by proposing an OS-level replay system for a uniprocessor environment that can capture and replay the nondeterministic events needed to reproduce a failure in Linux interactive and event-based programs. We then present an analysis method, called replay analysis, based on the proposed record and replay system to diagnose concurrency bugs in such programs. The replay analysis method uses a combination of static analysis, dynamic tracing during replay, and delta debugging to identify failure-inducing memory access patterns that lead to concurrency failure. The experimental results show that the presented record and replay system has low recording overhead and hence can be safely used in production systems to catch rarely occurring bugs. We also present a few concurrency bug case studies from real-world applications to prove the effectiveness of the proposed bug diagnosis framework.
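The delta-debugging step used to isolate failure-inducing patterns can be illustrated with a simplified ddmin-style reduction in Python; the event trace and failure predicate are hypothetical, and a real replay analysis re-executes the recorded program rather than evaluating a predicate:

```python
def ddmin(events, fails):
    """Shrink a failure-inducing event trace to a smaller one that still fails
    (simplified ddmin: repeatedly try dropping chunks of the trace)."""
    n = 2
    while len(events) >= 2:
        chunk = max(1, len(events) // n)
        subsets = [events[i:i + chunk] for i in range(0, len(events), chunk)]
        reduced = False
        for subset in subsets:
            complement = [e for e in events if e not in subset]
            if complement and fails(complement):
                events = complement      # keep the smaller trace that still fails
                n = max(n - 1, 2)
                reduced = True
                break
        if not reduced:
            if n >= len(events):
                break                    # cannot split any finer
            n = min(len(events), n * 2)
    return events

# Hypothetical failure: the crash needs both the "write" and "signal" events.
trace = ["open", "read", "write", "signal", "close"]
fails = lambda t: "write" in t and "signal" in t
minimal = ddmin(trace, fails)
# minimal == ["write", "signal"]
```

The reduced trace is what the replay analysis inspects to name the failure-inducing memory access pattern.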

  12. Establishing a framework for comparative analysis of genome sequences

    Energy Technology Data Exchange (ETDEWEB)

    Bansal, A.K.

    1995-06-01

    This paper describes a framework and a high-level language toolkit for comparative analysis of genome sequence alignments. The framework integrates the information derived from multiple sequence alignment and a phylogenetic tree (hypothetical tree of evolution) to derive new properties about sequences. Multiple sequence alignments are treated as an abstract data type. Abstract operations have been described to manipulate a multiple sequence alignment and to derive mutation-related information from a phylogenetic tree by superimposing parsimony analysis. The framework has been applied to protein alignments to derive constrained columns (in a multiple sequence alignment) that exhibit evolutionary pressure to preserve a common property in a column despite mutation. A Prolog toolkit based on the framework has been implemented and demonstrated on alignments containing 3000 sequences and 3904 columns.
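The notion of constrained (conserved) columns can be sketched in Python rather than the paper's Prolog; the toy alignment and the identity-based conservation test are illustrative only, and the actual framework also weighs mutations against the phylogenetic tree:

```python
def conserved_columns(alignment, threshold=1.0):
    """Return indices of columns whose residues are identical (gaps ignored)
    in at least `threshold` fraction of sequences."""
    length = len(alignment[0])
    conserved = []
    for col in range(length):
        residues = [seq[col] for seq in alignment if seq[col] != "-"]
        if residues:
            top = max(residues.count(r) for r in set(residues))
            if top / len(residues) >= threshold:
                conserved.append(col)
    return conserved

# Hypothetical toy alignment of four protein fragments.
alignment = [
    "MKT-A",
    "MKTLA",
    "MRTLA",
    "MKT-A",
]
cols = conserved_columns(alignment)
# cols == [0, 2, 3, 4]  (column 3 counts as conserved because gaps are ignored)
```

Lowering `threshold` below 1.0 would admit columns that tolerate a minority of mutations, which is closer to the "evolutionary pressure" notion in the abstract.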

  13. MetaJC++: A flexible and automatic program transformation technique using meta framework

    Science.gov (United States)

    Beevi, Nadera S.; Reghu, M.; Chitraprasad, D.; Vinodchandra, S. S.

    2014-09-01

    A compiler is a tool that translates abstract code containing natural-language terms into machine code. Meta compilers are available that can compile more than one language. We have developed a meta framework that combines two dissimilar programming languages, namely C++ and Java, to provide a flexible object-oriented programming platform for the user. Suitable constructs from both languages have been combined, thereby forming a new and stronger meta-language. The framework is developed using the compiler-writing tools Flex and Yacc to design the front end of the compiler. The lexer and parser have been developed to accommodate the complete keyword set and syntax set of both languages. Two intermediate representations are used between the source program and the machine code. An Abstract Syntax Tree is used as a high-level intermediate representation that preserves the hierarchical properties of the source program. A new machine-independent stack-based byte-code has also been devised to act as a low-level intermediate representation. The byte-code is organised into an output class file that can be used to produce an interpreted output. The results, especially in providing C++ concepts in Java, give insight into the potential strengths of the resultant meta-language.
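The two-level intermediate representation described above (an AST as high-level IR lowered to a stack-based byte-code) can be illustrated for a tiny expression language in Python; this is a sketch of the general technique, not the MetaJC++ implementation:

```python
import ast

def compile_expr(source):
    """Lower an arithmetic expression to a tiny stack-based byte-code,
    mirroring the two-IR design: parse to an AST, then emit stack code."""
    tree = ast.parse(source, mode="eval")
    code = []
    ops = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL"}

    def emit(node):
        if isinstance(node, ast.BinOp):
            emit(node.left)                 # post-order walk preserves evaluation order
            emit(node.right)
            code.append(ops[type(node.op)])
        elif isinstance(node, ast.Constant):
            code.append(("PUSH", node.value))
        else:
            raise ValueError("unsupported node")

    emit(tree.body)
    return code

def run(code):
    """Interpret the byte-code on an operand stack."""
    stack = []
    for instr in code:
        if isinstance(instr, tuple):        # ("PUSH", value)
            stack.append(instr[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append({"ADD": a + b, "SUB": a - b, "MUL": a * b}[instr])
    return stack.pop()

bytecode = compile_expr("(2 + 3) * 4")
result = run(bytecode)
# result == 20
```

The AST keeps the program's hierarchy intact, while the flat byte-code is trivially machine-independent and interpretable, which is exactly why the abstract uses one of each.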

  14. Overview of the NRC/EPRI common cause analysis framework

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Worledge, D.H.; Mosleh, A.; Fleming, K.; Parry, G.W.; Paula, H.

    1988-01-01

    This paper presents an overview of a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures whose causes are not explicitly included in the logic model as basic events. The emphasis here is on providing guidelines for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework comprises four major stages: (1) Logic Model Development, (2) Identification of Common Cause Component Groups, (3) Common Cause Modeling and Data Analysis, and (4) Quantification and Interpretation of Results. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. 25 references

  15. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple, and reliable parallel processing of different types of tasks and big-data processing on any compute cluster, a lightweight messaging-based distributed application processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistent messaging and integration-patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized. Only three Python programs and a simple library, used to unify and simplify the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start, and stop any machine and/or its tasks when necessary. For every machine, a single dedicated zookeeper program starts the different functions or tasks (stompShell instances) needed for executing the user's workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple, and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems; JSON is also used for configuration and for communication between machines and programs. The framework is platform independent, and although it is built in Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO).
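The unified topic naming and JSON job messages described above might look roughly like the following stdlib-only Python sketch; the topic pattern and message fields are invented, since the abstract does not publish the exact schema, and delivery over STOMP/ActiveMQ is omitted:

```python
import json

def make_topic(site, machine, task):
    """Illustrative unified topic-naming pattern: one hierarchy per site/machine/task."""
    return f"/topic/{site}.{machine}.{task}"

def make_job_message(job_id, command, args, depends_on=()):
    """A self-describing JSON job description, as the abstract sketches."""
    return json.dumps({
        "job_id": job_id,
        "command": command,
        "args": list(args),
        "depends_on": list(depends_on),  # lets messages encode workflow ordering
    })

topic = make_topic("ndc", "node07", "stompShell")
message = make_job_message("wf-42", "filter_waveforms", ["--band", "0.5-4Hz"])
decoded = json.loads(message)
```

A stompShell instance subscribed to `topic` would decode each message and launch the named command, with `depends_on` letting the sender chain jobs into a workflow.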

  16. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Logical Framework Analysis (LFA): An Essential Tool for Designing Agricultural Projects … overview of the process and the structure of the Logical Framework Matrix, or Logframe, derivable from it … a systems approach to managing the project.

  17. Along the way to developing a theory of the program: a re-examination of the conceptual framework as an organizing strategy.

    Science.gov (United States)

    Helitzer, Deborah L; Sussman, Andrew L; Hoffman, Richard M; Getrich, Christina M; Warner, Teddy D; Rhyne, Robert L

    2014-08-01

    Conceptual frameworks (CF) have historically been used to develop program theory. We re-examine the literature about the role of CF in this context, specifically how they can be used to create descriptive and prescriptive theories, as building blocks for a program theory. Using a case example of colorectal cancer screening intervention development, we describe the process of developing our initial CF, the methods used to explore the constructs in the framework and revise the framework for intervention development. We present seven steps that guided the development of our CF: (1) assemble the "right" research team, (2) incorporate existing literature into the emerging CF, (3) construct the conceptual framework, (4) diagram the framework, (5) operationalize the framework: develop the research design and measures, (6) conduct the research, and (7) revise the framework. A revised conceptual framework depicted more complicated inter-relationships of the different predisposing, enabling, reinforcing, and system-based factors. The updated framework led us to generate program theory and serves as the basis for designing future intervention studies and outcome evaluations. A CF can build a foundation for program theory. We provide a set of concrete steps and lessons learned to assist practitioners in developing a CF. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. A framework for intelligent reliability centered maintenance analysis

    International Nuclear Information System (INIS)

    Cheng Zhonghua; Jia Xisheng; Gao Ping; Wu Su; Wang Jianzhao

    2008-01-01

    To improve the efficiency of reliability-centered maintenance (RCM) analysis, case-based reasoning (CBR), a kind of artificial intelligence (AI) technology, was introduced into the RCM analysis process, and a framework for intelligent RCM analysis (IRCMA) was studied. The idea behind IRCMA is that historical records of RCM analyses on similar items can be referenced and reused for the RCM analysis of a new item. Because many common or similar items may exist in the analyzed equipment, the repeated tasks of RCM analysis can be considerably simplified or avoided by revising similar cases during the analysis. Based on the earlier theoretical studies, an intelligent RCM analysis system (IRCMAS) prototype was developed. This research focuses on the definition, basic principles, and framework of IRCMA, and discusses its critical techniques. Finally, the IRCMAS prototype is presented through a case study.
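The retrieve step of CBR, matching a new item against historical RCM analyses, can be sketched as a nearest-case lookup in Python; the equipment attributes, case base, and similarity cutoff below are all hypothetical:

```python
def similarity(item_a, item_b):
    """Fraction of matching attributes between two equipment descriptions."""
    keys = set(item_a) | set(item_b)
    return sum(item_a.get(k) == item_b.get(k) for k in keys) / len(keys)

def retrieve_case(case_base, new_item, cutoff=0.5):
    """Return the most similar historical RCM analysis, if close enough to reuse."""
    best = max(case_base, key=lambda case: similarity(case["item"], new_item))
    return best if similarity(best["item"], new_item) >= cutoff else None

# Hypothetical case base of past RCM analyses.
case_base = [
    {"item": {"type": "pump", "medium": "oil", "duty": "continuous"},
     "tasks": ["vibration monitoring", "seal replacement"]},
    {"item": {"type": "valve", "medium": "steam", "duty": "standby"},
     "tasks": ["functional test"]},
]
new_item = {"type": "pump", "medium": "water", "duty": "continuous"}
case = retrieve_case(case_base, new_item)
# case["tasks"] == ["vibration monitoring", "seal replacement"]
```

In full CBR the retrieved task list would then be revised for the new item's differing attributes (here, the medium) before being retained as a new case.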

  19. A conceptual framework for formulating a focused and cost-effective fire protection program based on analyses of risk and the dynamics of fire effects

    International Nuclear Information System (INIS)

    Dey, M.K.

    1999-01-01

    This paper proposes a conceptual framework for developing a fire protection program at nuclear power plants based on probabilistic risk analysis (PRA) of fire hazards and on modeling the dynamics of fire effects. The process for categorizing nuclear power plant fire areas based on risk is described, followed by a discussion of fire safety design methods that can be used for different areas of the plant, depending on the degree of threat to plant safety from the fire hazard. This alternative framework has the potential to make programs more cost-effective and comprehensive, since it allows a more systematic and broader examination of fire risk and provides a means to distinguish between high- and low-risk fire contributors. (orig.)

  20. Development of comprehensive and versatile framework for reactor analysis, MARBLE

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hazama, Taira; Numata, Kazuyuki; Jin, Tomoyuki

    2014-01-01

    Highlights: • We have developed a neutronics code system for reactor analysis. • The new code system covers all five phases of the core design procedures. • All the functionalities are integrated and validated in the same framework. • The framework supports continuous improvement and extension. • We report results of validation and practical applications. - Abstract: A comprehensive and versatile reactor analysis code system, MARBLE, has been developed. MARBLE is designed as a software development framework for reactor analysis, which offers reusable and extendible functions and data models based on physical concepts, rather than a reactor analysis code system. From a viewpoint of the code system, it provides a set of functionalities utilized in a detailed reactor analysis scheme for fast criticality assemblies and power reactors, and nuclear data related uncertainty quantification such as cross-section adjustment. MARBLE includes five sub-systems named ECRIPSE, BIBLO, SCHEME, UNCERTAINTY and ORPHEUS, which are constructed of the shared functions and data models in the framework. By using these sub-systems, MARBLE covers all phases required in fast reactor core design prediction and improvement procedures, i.e. integral experiment database management, nuclear data processing, fast criticality assembly analysis, uncertainty quantification, and power reactor analysis. In the present paper, these functionalities are summarized and system validation results are described

  1. The Event Coordination Notation: Execution Engine and Programming Framework

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2012-01-01

    ECNO (Event Coordination Notation) is a notation for modelling the behaviour of a software system on top of some object-oriented data model. ECNO has two main objectives: on the one hand, ECNO should allow modelling the behaviour of a system on the domain level; on the other hand, it should be possible to completely generate code from ECNO and the underlying object-oriented domain models. Today, there are several approaches that would allow this, but most of them require that the data models and the behaviour models use the same technology and that the code is generated together … that was written manually. In this paper, we rephrase the main concepts of ECNO. The focus of this paper, however, is on the architecture of the ECNO execution engine and its programming framework. We will show how this framework allows us to integrate ECNO with object-oriented models, and how it works without any…

  2. Mississippi Curriculum Framework for Computer Information Systems Technology. Computer Information Systems Technology (Program CIP: 52.1201--Management Information Systems & Business Data). Computer Programming (Program CIP: 52.1201). Network Support (Program CIP: 52.1290--Computer Network Support Technology). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…

  3. The social impacts of dams: A new framework for scholarly analysis

    International Nuclear Information System (INIS)

    Kirchherr, Julian; Charles, Katrina J.

    2016-01-01

    No commonly used framework exists in the scholarly study of the social impacts of dams. This hinders comparisons of analyses and thus the accumulation of knowledge. The aim of this paper is to unify scholarly understanding of dams' social impacts via the analysis and aggregation of the various frameworks currently used in the scholarly literature. For this purpose, we have systematically analyzed and aggregated 27 frameworks employed by academics analyzing dams' social impacts (found in a set of 217 articles). A key finding of the analysis is that currently used frameworks are often not specific to dams and thus omit key impacts associated with them. The result of our analysis and aggregation is a new framework for scholarly analysis (which we call ‘matrix framework’) specifically on dams' social impacts, with space, time and value as its key dimensions as well as infrastructure, community and livelihood as its key components. Building on the scholarly understanding of this topic enables us to conceptualize the inherently complex and multidimensional issues of dams' social impacts in a holistic manner. If commonly employed in academia (and possibly in practice), this framework would enable more transparent assessment and comparison of projects.

  4. Supporting Intrapersonal Development in Substance Use Disorder Programs: A Conceptual Framework for Client Assessment.

    Science.gov (United States)

    Turpin, Aaron; Shier, Micheal L

    2017-01-01

    Improvements to the intrapersonal development of clients in substance use disorder treatment programs have been widely recognized as contributing to the intended goal of reducing substance misuse behaviors. This study sought to identify a broad framework of primary outcomes related to the intrapersonal development of clients in treatment for substance misuse. Using qualitative research methods, individual interviews were conducted with program participants (n = 41) at three treatment programs to identify the ways in which respondents experienced intrapersonal development through participation in treatment. The findings support the development of a conceptual model that captures the importance and manifestation of achieving improvements in the following outcomes: self-awareness, coping ability, self-worth, outlook, and self-determination. The findings provide a conceptual framework for client assessment that captures a broad range of the important intrapersonal development factors used as indicators of client development and recovery, which should be measured in tandem during assessment.

  5. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    Science.gov (United States)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements, and the corresponding solutions for the reference MDO frameworks (a general one and an aircraft-oriented one), were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
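
    The AHP step described above can be sketched in a few lines: given a reciprocal matrix of pairwise importance judgments among framework criteria, the geometric-mean method approximates the priority weights. The matrix values below are illustrative, not taken from the paper.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the geometric-mean method.

    pairwise[i][j] holds the judged importance of criterion i over j
    (reciprocal matrix: pairwise[j][i] == 1 / pairwise[i][j])."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical example: three framework criteria compared pairwise
# on Saaty's 1-9 scale.
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_weights(matrix)
print([round(w, 3) for w in weights])
```

    The geometric-mean method is a common approximation to the principal-eigenvector weights and keeps the sketch dependency-free.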

  6. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    Science.gov (United States)

    Harman, Elena; Azzam, Tarek

    2018-02-01

    This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants is asked to read an interview transcript, identify whether program theory components (Activities and Outcomes) are discussed, and highlight the most relevant passage about each component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach. Copyright © 2017 Elsevier Ltd. All rights reserved.
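
    The aggregation step this abstract describes can be illustrated with a minimal sketch: each crowdsourced worker votes on whether a program theory component appears in the transcript, and a majority vote decides whether the component counts as supported. The component names and vote patterns below are hypothetical.

```python
# Hypothetical crowd judgments: for each program-theory component,
# each worker records whether the transcript supports it (True/False).
judgments = {
    "Activity: weekly mentoring":   [True, True, True, False, True],
    "Outcome: improved attendance": [False, False, True, False, False],
}

def majority_support(votes, threshold=0.5):
    """A component counts as 'supported' when the share of workers
    who found it in the transcript exceeds the threshold."""
    return sum(votes) / len(votes) > threshold

supported = {c: majority_support(v) for c, v in judgments.items()}
print(supported)
```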

  7. Object-oriented data analysis framework for neutron scattering experiments

    International Nuclear Information System (INIS)

    Suzuki, Jiro; Nakatani, Takeshi; Ohhara, Takashi; Inamura, Yasuhiro; Yonemura, Masao; Morishima, Takahiro; Aoyagi, Tetsuo; Manabe, Atsushi; Otomo, Toshiya

    2009-01-01

    The Materials and Life Science Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC) is one of the facilities that provide the highest-intensity pulsed neutron and muon beams. The MLF computing environment design group organizes the computing environments of MLF and its instruments. It is important that the computing environment is provided by the facility side, because meta-data formats, analysis functions, and the overall data analysis strategy should be shared among the many instruments in MLF. The C++ class library named Manyo-lib is a framework for developing data reduction and analysis software. The framework is composed of a class library of data reduction and analysis operators, network-distributed data processing modules, and data containers. The class library is wrapped by a Python interface created with SWIG. All classes of the framework can be called from Python, and Manyo-lib cooperates with the data acquisition and data visualization components through the MLF-platform, a user interface unified in MLF, which runs on Python. Raw data in event format obtained by the data acquisition systems is converted with high performance into histogram-format data by Manyo-lib, and data reduction and analysis are performed with user application software developed on top of Manyo-lib. We enforce standardization of data containers with Manyo-lib, and many additional fundamental data containers in Manyo-lib have been designed and developed. Experimental and analysis data in the data containers can be converted into NeXus files. Manyo-lib is the standard framework for developing analysis software in MLF, and prototypes of data analysis software for each instrument are being developed by the instrument teams.
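
    Manyo-lib's actual API is not shown in the abstract; as an illustration of the event-to-histogram conversion it describes, the sketch below bins raw event-mode time-of-flight values into histogram counts. The event values and bin edges are invented for the example.

```python
def histogram(events, edges):
    """Bin event-mode time-of-flight values into a histogram.

    events : iterable of float measurements (e.g. microseconds)
    edges  : ascending bin edges; returns one count per bin."""
    counts = [0] * (len(edges) - 1)
    for t in events:
        # a linear scan is fine for a sketch; real reduction code
        # would use a vectorized or binary-search lookup
        for i in range(len(edges) - 1):
            if edges[i] <= t < edges[i + 1]:
                counts[i] += 1
                break
    return counts

tof_events = [1.2, 3.4, 3.9, 7.5, 8.1, 9.9]
print(histogram(tof_events, edges=[0.0, 2.5, 5.0, 7.5, 10.0]))
```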

  8. Mississippi Curriculum Framework for Medical Radiologic Technology (Radiography) (CIP: 51.0907--Medical Radiologic Technology). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the radiologic technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies for the program,…

  9. Post-Foundational Discourse Analysis: A Suggestion for a Research Program

    Directory of Open Access Journals (Sweden)

    Tomas Marttila

    2015-08-01

    Full Text Available Post-foundational discourse analysis, also labeled the Essex School in Discourse Analysis, has been observed to suffer from a considerable methodological deficit that limits its applicability in empirical research. The principal aim of this article is to overcome this methodological deficit by constructing a research program for post-foundational discourse analysis that facilitates its operationalization in empirical research. In accordance with Imre LAKATOS (1970) and David HOWARTH (2004a), a research program refers to an internally consistent and openly scrutinizable system of theoretical, methodological and phenomenal concepts that opens up the possibility to distinguish between the "negative" and the "positive" heuristics of empirical research. The first three sections develop the positive heuristics of post-foundational discourse analysis by elucidating its theoretical foundations, methodological position and phenomenal framework. The concluding fourth section draws on the presented positive heuristics to outline the analytical stages and strategies of post-foundational discourse analysis and discusses suitable methods for sampling and interpreting empirical data. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs150319

  10. Effects of donor proliferation in development aid for health on health program performance: A conceptual framework.

    Science.gov (United States)

    Pallas, Sarah Wood; Ruger, Jennifer Prah

    2017-02-01

    Development aid for health increased dramatically during the past two decades, raising concerns about inefficiency and lack of coherence among the growing number of global health donors. However, we lack a framework for how donor proliferation affects health program performance to inform theory-based evaluation of aid effectiveness policies. A review of academic and gray literature was conducted. Data were extracted from the literature sample on study design and evidence for hypothesized effects of donor proliferation on health program performance, which were iteratively grouped into categories and mapped into a new conceptual framework. In the framework, increases in the number of donors are hypothesized to increase inter-donor competition, transaction costs, donor poaching of recipient staff, recipient control over aid, and donor fragmentation, and to decrease donors' sense of accountability for overall development outcomes. There is mixed evidence on whether donor proliferation increases or decreases aid volume. These primary effects in turn affect donor innovation, information hoarding, and aid disbursement volatility, as well as recipient country health budget levels, human resource capacity, and corruption, and the determinants of health program performance. The net effect of donor proliferation on health will vary depending on the magnitude of the framework's competing effects in specific country settings. The conceptual framework provides a foundation for improving design of aid effectiveness practices to mitigate negative effects from donor proliferation while preserving its potential benefits. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. The image of psychology programs: the value of the instrumental-symbolic framework.

    Science.gov (United States)

    Van Hoye, Greet; Lievens, Filip; De Soete, Britt; Libbrecht, Nele; Schollaert, Eveline; Baligant, Dimphna

    2014-01-01

    As competition for funding and students intensifies, it becomes increasingly important for psychology programs to have an image that is attractive and makes them stand out from other programs. The current study uses the instrumental-symbolic framework from the marketing domain to determine the image of different master's programs in psychology and examines how these image dimensions relate to student attraction and competitor differentiation. The samples consist of both potential students (N = 114) and current students (N = 68) of three psychology programs at a Belgian university: industrial and organizational psychology, clinical psychology, and experimental psychology. The results demonstrate that both instrumental attributes (e.g., interpersonal activities) and symbolic trait inferences (e.g., sincerity) are key components of the image of psychology programs and predict attractiveness as well as differentiation. In addition, symbolic image dimensions seem more important for current students of psychology programs than for potential students.

  12. Using program logic model analysis to evaluate and better deliver what works

    International Nuclear Information System (INIS)

    Megdal, Lori; Engle, Victoria; Pakenas, Larry; Albert, Scott; Peters, Jane; Jordan, Gretchen

    2005-01-01

    There is a rich history of using program theories and logic models (PT/LM) for evaluation, monitoring, and program refinement in a variety of fields, such as health care, social, and education programs. The use of these tools to evaluate and improve energy efficiency programs has been growing over the last 5-7 years. This paper provides an overview of state-of-the-art methods of logic model development, with analysis that significantly contributed to: (a) assessing the logic behind how the program expects to meet its ultimate goals, including the 'who', the 'how', and through what mechanism, so that gaps and questions that still need to be addressed can be identified; (b) identifying and prioritizing the indicators that should be measured to evaluate the program and program theory; (c) determining the key researchable questions that evaluation/research needs to answer in order to assess whether the mechanism assumed to cause the changes in actions, attitudes, behaviours, and business practices is workable and efficient, whether the program logic is valid, and whether the program is likely to accomplish its ultimate goals; and (d) incorporating analysis of prior similar programs and social science theories in a framework to identify opportunities for potential program refinements. The paper provides an overview of the tools, techniques and references, using as an example the energy efficiency program analysis conducted for the New York State Energy Research and Development Authority's (NYSERDA) New York ENERGY $MART SM programs

  13. The social impacts of dams: A new framework for scholarly analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kirchherr, Julian, E-mail: julian.kirchherr@sant.ox.ac.uk; Charles, Katrina J., E-mail: katrina.charles@ouce.ox.ac.uk

    2016-09-15

    No commonly used framework exists in the scholarly study of the social impacts of dams. This hinders comparisons of analyses and thus the accumulation of knowledge. The aim of this paper is to unify scholarly understanding of dams' social impacts via the analysis and aggregation of the various frameworks currently used in the scholarly literature. For this purpose, we have systematically analyzed and aggregated 27 frameworks employed by academics analyzing dams' social impacts (found in a set of 217 articles). A key finding of the analysis is that currently used frameworks are often not specific to dams and thus omit key impacts associated with them. The result of our analysis and aggregation is a new framework for scholarly analysis (which we call ‘matrix framework’) specifically on dams' social impacts, with space, time and value as its key dimensions as well as infrastructure, community and livelihood as its key components. Building on the scholarly understanding of this topic enables us to conceptualize the inherently complex and multidimensional issues of dams' social impacts in a holistic manner. If commonly employed in academia (and possibly in practice), this framework would enable more transparent assessment and comparison of projects.

  14. Mississippi Curriculum Framework for Dental Hygiene Technology (Program CIP: 51.0602--Dental Hygienist). Postsecondary Education.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the dental hygiene technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies. Section II…

  15. Second generation CO2 FEP analysis: CASSIF (Carbon Sequestration Scenario Identification Framework)

    NARCIS (Netherlands)

    Yavuz, F.T.; Tilburg, T. van; Pagnier, H.

    2008-01-01

    A novel scenario analysis framework has been created, called the Carbon Sequestration Scenario Identification Framework (CASSIF). This framework addresses containment performance defined by three major categories: well, fault and seal integrity. The relevant factors that influence the integrity are…

  16. Ecosystem Analysis Program

    International Nuclear Information System (INIS)

    Burgess, R.L.

    1978-01-01

    Progress is reported on the following research programs: analysis and modeling of ecosystems; EDFB/IBP data center; biome analysis studies; land/water interaction studies; and computer programs for development of models

  17. Hybrid Information Flow Analysis for Programs with Arrays

    Directory of Open Access Journals (Sweden)

    Gergö Barany

    2016-07-01

    Full Text Available Information flow analysis checks whether certain pieces of (confidential) data may affect the results of computations in unwanted ways and thus leak information. Dynamic information flow analysis adds instrumentation code to the target software to track flows at run time and raise alarms if a flow policy is violated; hybrid analyses combine this with preliminary static analysis. Using a subset of C as the target language, we extend previous work on hybrid information flow analysis that handled pointers to scalars. Our extended formulation handles arrays, pointers to array elements, and pointer arithmetic. Information flow through arrays of pointers is tracked precisely, while arrays of non-pointer types are summarized efficiently. A prototype of our approach is implemented using the Frama-C program analysis and transformation framework. Work on a full machine-checked proof of the correctness of our approach using Isabelle/HOL is well underway; we present the existing parts and sketch the rest of the correctness argument.
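
    As a rough illustration of dynamic information flow tracking over arrays (not Frama-C's actual instrumentation), the sketch below attaches a taint bit to each array element and propagates it through reads and writes, so a flow from confidential input into public output can be detected at run time.

```python
class TaintedArray:
    """Minimal dynamic taint-tracking sketch: each element carries a
    taint bit, and reads return the taint alongside the value."""

    def __init__(self, values, tainted=False):
        self.values = list(values)
        self.taint = [tainted] * len(values)

    def get(self, i):
        return self.values[i], self.taint[i]

    def set(self, i, value, taint):
        self.values[i] = value
        self.taint[i] = taint

secret = TaintedArray([42], tainted=True)   # confidential input
public = TaintedArray([0, 0, 0])            # observable output

v, t = secret.get(0)
public.set(1, v + 1, t)                     # explicit flow: taint follows data

leaks = any(public.taint)                   # policy check: tainted data escaped
print(leaks)
```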

  18. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  19. Analysis of System-Wide Investment in the National Airspace System: A Portfolio Analytical Framework and an Example

    Science.gov (United States)

    Bhadra, Dipasis; Morser, Frederick R.

    2006-01-01

    In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework for undertaking projects that may address some of the noted deficiencies. By drawing upon well-developed theories from corporate finance, an analytical framework is offered that can be used for choosing the FAA's investments while taking into account risk, expected returns, and inherent dependencies across NAS programs. The framework can be expanded to take in multiple assets and realistic parameter values in drawing an efficient risk-return frontier for the FAA's entire investment program.
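
    The corporate finance machinery referenced here can be illustrated with the classic two-asset case: sweeping the portfolio weight between two programs traces out the risk-return frontier. The return, risk, and correlation figures below are hypothetical, not FAA data.

```python
import math

def portfolio_point(w, r1, r2, s1, s2, rho):
    """Expected return and risk (std. dev.) of a two-program portfolio
    with weight w on program 1 and (1 - w) on program 2."""
    ret = w * r1 + (1 - w) * r2
    var = (w * s1) ** 2 + ((1 - w) * s2) ** 2 \
        + 2 * w * (1 - w) * s1 * s2 * rho
    return ret, math.sqrt(var)

# Hypothetical programs: (return 8%, risk 5%) vs (return 12%, risk 15%),
# correlation 0.2; sweep the weight to trace the frontier.
frontier = [portfolio_point(w / 10, 0.08, 0.12, 0.05, 0.15, 0.2)
            for w in range(11)]
for ret, risk in frontier:
    print(f"return={ret:.3f}  risk={risk:.3f}")
```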

  20. A Conceptual Framework over Contextual Analysis of Concept Learning within Human-Machine Interplays

    DEFF Research Database (Denmark)

    Badie, Farshad

    2016-01-01

    This research provides a contextual description concerning existential and structural analysis of ‘Relations’ between human beings and machines. Subsequently, it will focus on conceptual and epistemological analysis of (i) my own semantics-based framework [for human meaning construction] and of (ii) a well-structured machine concept learning framework. Accordingly, I will, semantically and epistemologically, focus on linking those two frameworks for logical analysis of concept learning in the context of human-machine interrelationships. It will be demonstrated that the proposed framework provides...

  1. Abstract Interpretation of PIC programs through Logic Programming

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    , are applied to the logic based model of the machine. A small PIC microcontroller is used as a case study. An emulator for this microcontroller is written in Prolog, and standard programming transformations and analysis techniques are used to specialise this emulator with respect to a given PIC program....... The specialised emulator can now be further analysed to gain insight into the given program for the PIC microcontroller. The method describes a general framework for applying abstractions, illustrated here by linear constraints and convex hull analysis, to logic programs. Using these techniques on the specialised...

  2. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Directory of Open Access Journals (Sweden)

    Ahmad Karim

    Full Text Available The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), dissemination of other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except the target audience is particular to smartphone users. Therefore, it is important to uncover this security issue before it becomes widespread. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing artificial neural networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become a benchmark for future studies.
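
    A minimal stand-in for the "simple logistic regression" classifier the study found most accurate: plain gradient-descent logistic regression over two hypothetical behavioral features. The feature values and labels below are invented for illustration, not drawn from the SMARTbot dataset.

```python
import math

def train_logreg(X, y, lr=0.5, epochs=2000):
    """Plain logistic regression trained by stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))        # sigmoid
            err = p - yi                          # gradient of log-loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) > 0.5 else 0

# Hypothetical per-app features: [network calls/min, SMS sends/min]
X = [[0.1, 0.0], [0.2, 0.1], [5.0, 3.0], [6.5, 4.0]]
y = [0, 0, 1, 1]          # 1 = botnet-like behavior
w, b = train_logreg(X, y)
print([predict(w, b, xi) for xi in X])
```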

  3. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), dissemination of other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except the target audience is particular to smartphone users. Therefore, it is important to uncover this security issue before it becomes widespread. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing artificial neural networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become a benchmark for future studies.

  4. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), dissemination of other malware and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except the target audience is particular to smartphone users. Therefore, it is important to uncover this security issue before it becomes widespread. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing artificial neural networks’ back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e., 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised which will become a benchmark for future studies. PMID:26978523

  5. Using the RE-AIM framework to evaluate physical activity public health programs in México.

    Science.gov (United States)

    Jauregui, Edtna; Pacheco, Ann M; Soltero, Erica G; O'Connor, Teresia M; Castro, Cynthia M; Estabrooks, Paul A; McNeill, Lorna H; Lee, Rebecca E

    2015-02-19

    Physical activity (PA) public health programming has been widely used in Mexico; however, few studies have documented individual and organizational factors that might be used to evaluate their public health impact. The RE-AIM framework is an evaluation tool that examines individual and organizational factors of public health programs. The purpose of this study was to use the RE-AIM framework to determine the degree to which PA programs in Mexico reported individual and organizational factors and to investigate whether reporting differed by a program's funding source. Public health programs promoting PA were systematically identified during 2008-2013; to be included, a program had to have an active website. Initial searches produced 23 possible programs, with 12 meeting the inclusion criteria. A coding sheet was developed to capture behavioral, outcome and RE-AIM indicators from program websites. In addition to targeting PA, five (42%) programs also targeted dietary habits, and the most commonly reported outcome was change in body composition (58%). Programs reported an average of 11.1 (±3.9) RE-AIM indicator items (out of 27 total). On average, 45% reported reach indicators, 34% reported efficacy/effectiveness indicators, 60% reported adoption indicators, 40% reported implementation indicators, and 35% reported maintenance indicators. The proportion of RE-AIM indicators reported did not differ significantly between programs that were government supported (M = 10, SD = 3.1) and programs that were partially or wholly privately or corporately supported (M = 12.0, SD = 4.4). While reach and adoption of these programs were most commonly reported, there is a need for stronger evaluation of behavioral and health outcomes before the public health impact of these programs can be established.

  6. Runtime Detection Framework for Android Malware

    Directory of Open Access Journals (Sweden)

    TaeGuen Kim

    2018-01-01

    Full Text Available As the number of Android malware samples has increased rapidly over the years, various malware detection methods have been proposed. Existing methods can be classified into two categories: static analysis-based methods and dynamic analysis-based methods. Both approaches have limitations: static analysis-based methods are relatively easy to evade through transformation techniques such as junk instruction insertion, code reordering, and so on, while dynamic analysis-based methods have relatively high analysis overheads and might require kernel modification to extract dynamic features. In this paper, we propose a dynamic analysis framework for Android malware detection that overcomes the aforementioned shortcomings. The framework uses a suffix tree that contains API (Application Programming Interface) subtraces and their probabilistic confidence values, generated using HMMs (Hidden Markov Models), to reduce the malware detection overhead, and we designed the framework with a client-server architecture since the suffix tree is infeasible to deploy on mobile devices. In addition, an application rewriting technique is used to trace API invocations without any modification of the Android kernel. In our experiments, we measured the detection accuracy and the computational overheads to evaluate the effectiveness and efficiency of the proposed framework.
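
    A much-simplified sketch of the detection idea: stored API subtraces carry confidence values (derived from HMMs and held in a suffix tree in the paper; a plain dict of n-grams stands in for the tree here), and a run is flagged when a high-confidence subtrace occurs contiguously in the observed call sequence. All API names and confidence values below are hypothetical.

```python
# Hypothetical database of API subtraces with probabilistic
# confidence values (a dict of n-grams stands in for the suffix tree).
malicious_subtraces = {
    ("getDeviceId", "openConnection", "sendTextMessage"): 0.93,
    ("getInstalledPackages", "exec"): 0.88,
}

def scan(trace, db, threshold=0.8):
    """Flag a run when any stored subtrace appears contiguously in the
    observed API call sequence with confidence above the threshold."""
    for sub, conf in db.items():
        if conf < threshold:
            continue
        n = len(sub)
        for i in range(len(trace) - n + 1):
            if tuple(trace[i:i + n]) == sub:
                return True
    return False

observed = ["onCreate", "getDeviceId", "openConnection", "sendTextMessage"]
print(scan(observed, malicious_subtraces))
```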

  7. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    Deregulated electricity markets provide opportunities for Battery Systems (BS) to participate in energy arbitrage and ancillary services (regulation, operating reserves, contingency reserves, voltage regulation, power quality etc.). To evaluate the economic viability of BS with different business opportunities, a generic modelling framework is proposed to handle this task. This framework outlines a set of building blocks which are necessary for carrying out the economic analysis of various BS applications. Further, special focus is given to describing how to use the rainflow cycle counting algorithm for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system providing 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so…
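
    The rainflow cycle counting algorithm mentioned above can be sketched with the simplified three-point method: reduce the load (or state-of-charge) history to its turning points, then extract full cycles with a stack. Residual half cycles left on the stack are ignored in this sketch.

```python
def rainflow_ranges(series):
    """Simplified three-point rainflow counting: return the ranges of
    the full cycles extracted from a load/charge history."""
    # 1. reduce the series to its turning points (local extrema)
    pts = []
    for x in series:
        if pts and x == pts[-1]:
            continue                       # drop repeated values
        if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) > 0:
            pts[-1] = x                    # same direction: extend the run
        else:
            pts.append(x)
    # 2. stack-based extraction: with the top three points A, B, C,
    #    a cycle of range |B - A| closes whenever |C - B| >= |B - A|
    stack, ranges = [], []
    for p in pts:
        stack.append(p)
        while (len(stack) >= 3
               and abs(stack[-1] - stack[-2]) >= abs(stack[-2] - stack[-3])):
            ranges.append(abs(stack[-2] - stack[-3]))
            del stack[-3:-1]               # drop the closed cycle's two points
    return ranges

print(rainflow_ranges([-2, 1, -3, 5, -1, 3, -4, 4, -2]))
```

    In a battery economics model, each extracted range would be mapped through a cycle-life curve to estimate the degradation cost of a dispatch schedule.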

  8. Solving stochastic programs with integer recourse by enumeration : a framework using Gröbner basis reductions

    NARCIS (Netherlands)

    Schultz, R.; Stougie, L.; Vlerk, van der M.H.

    1998-01-01

    In this paper we present a framework for solving stochastic programs with complete integer recourse and discretely distributed right-hand side vector, using Gröbner basis methods from computational algebra to solve the numerous second-stage integer programs. Using structural properties of the

  9. SIX SIGMA FRAMEWORKS: AN ANALYSIS BASED ON ROGERS’ DIFFUSION OF INNOVATION THEORY

    Directory of Open Access Journals (Sweden)

    Kifayah Amar

    2012-06-01

    Full Text Available This paper attempts to analyze frameworks related to Six Sigma and Lean Six Sigma. The basis for analyzing the frameworks is the diffusion of innovation theory. Several criteria were used to analyze the frameworks, e.g. relative advantage, compatibility, complexity, trialability, observability, communication channels, nature of the social system/culture and extent of change agent. Based on the framework analysis, only one framework fits Rogers’ theory on diffusion of innovation. That framework is a Lean Six Sigma framework which consists of elements such as owner/manager commitment and involvement, employee involvement, training, culture change and external support. Even though these elements are similar to those of other Six Sigma frameworks, they put more attention on culture change and external support. Generally speaking, culture change and external support are the most important elements for the implementation of Six Sigma or other soft approaches, particularly for small organizations.


  11. Does the knowledge-to-action (KTA) framework facilitate physical demands analysis development for firefighter injury management and return-to-work planning?

    Science.gov (United States)

    Sinden, Kathryn; MacDermid, Joy C

    2014-03-01

    Employers are tasked with developing injury management and return-to-work (RTW) programs in response to occupational health and safety policies. Physical demands analyses (PDAs) are the cornerstone of injury management and RTW development. Synthesizing and contextualizing policy knowledge for use in occupational program development, including PDAs, is challenging due to multiple stakeholder involvement. Few studies have used a knowledge translation theoretical framework to facilitate policy-based interventions in occupational contexts. The primary aim of this case study was to identify how constructs of the knowledge-to-action (KTA) framework were reflected in employer stakeholder-researcher collaborations during development of a firefighter PDA. Four stakeholder meetings were conducted with employee participants who had experience using PDAs in their occupational role. Directed content analysis informed analyses of meeting minutes, stakeholder views, and personal reflections recorded throughout the case. Existing knowledge sources, including local data, stakeholder experiences, policies, and priorities, were synthesized and tailored to develop a PDA in response to the barriers and facilitators identified by the firefighters. The flexibility of the KTA framework and the synthesis of multiple knowledge sources were identified as strengths. The KTA Action cycle was useful in directing the overall process but insufficient for directing the specific aspects of PDA development. Integration of specific PDA guidelines into the process provided explicit direction on best practices in tailoring the PDA and synthesizing knowledge. Although the themes of the KTA framework were confirmed in our analysis, modification of the order of the KTA components was required. Despite a complex context with divergent perspectives, successful implementation of a draft PDA was achieved. The KTA framework facilitated knowledge synthesis and PDA development, but specific standards and modifications to the KTA

  12. Strategy analysis frameworks for strategy orientation and focus

    OpenAIRE

    Isoherranen, V. (Ville)

    2012-01-01

    Abstract The primary research target of this dissertation is to develop new strategy analysis frameworks, focusing on analysing changes in strategic position as a function of variations in life cycle s-curve/time/typology/market share/orientation. Research is constructive and qualitative by nature, with case study methodology being the adopted approach. The research work is carried out as a compilation dissertation containing four (4) journal articles. The theoretical framework of thi...

  13. Polyglot programming in applications used for genetic data analysis.

    Science.gov (United States)

    Nowak, Robert M

    2014-01-01

    Applications used for the analysis of genetic data process large volumes of data with complex algorithms. These solutions require high performance, flexibility, and a user interface accessible through a web browser, which can be achieved by combining multiple programming languages. In this study, I developed a freely available framework for building software to analyze genetic data, which uses C++, Python, JavaScript, and several libraries. This system was used to build a number of genetic data processing applications, and it reduced the time and costs of development.
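    As a toy illustration of the polyglot idea (not the author's framework, which binds C++ through dedicated wrappers), Python's standard-library `ctypes` can call a compiled C library directly; here the C math library stands in for a native analysis kernel. The library lookup is platform-dependent, and the fallback name assumes a typical Linux system.

```python
import ctypes
import ctypes.util

# Locate the C math library; "libm.so.6" is a Linux-specific fallback.
_libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")
_libm.cbrt.restype = ctypes.c_double
_libm.cbrt.argtypes = [ctypes.c_double]

def c_cbrt(x: float) -> float:
    """Cube root computed by the native C library rather than in Python."""
    return _libm.cbrt(x)
```

    The same mechanism lets a Python orchestration layer hand the heavy numeric work to compiled C/C++ while a JavaScript front end talks to it over HTTP.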

  14. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  15. Self-insurance and worksite alcohol programs: an econometric analysis.

    Science.gov (United States)

    Kenkel, D S

    1997-03-01

    The worksite is an important point of access for alcohol treatment and prevention, but not all firms are likely to find offering alcohol programs profitable. This study attempts to identify, at a conceptual and empirical level, factors that are important determinants of the profitability of worksite alcohol programs. A central question considered in the empirical analysis is whether firms' decisions about worksite alcohol programs are related to how employee group health insurance is provided. The data used are from the 1992 National Survey of Worksite Health Promotion Activities (N = 1,389-1,412). The econometric analysis focuses on measures of whether the surveyed firms offer Employee Assistance Programs (EAPs), individual counseling, group classes and resource materials regarding alcohol and other substance abuse. Holding other factors constant, the probability that a self-insured firm offers an EAP is estimated to be 59%, compared to 51% for a firm that purchases market group health insurance for its employees. Unionized worksites and larger worksites are also found to be more likely to offer worksite alcohol programs, compared to nonunionized smaller worksites. Worksites with younger workforces are less likely than those with older employees to offer alcohol programs. The empirical results are consistent with the conceptual framework from labor economics, since self-insurance is expected to increase firms' demand for worksite alcohol programs, while large worksite size is expected to reduce the average program cost. The role of union status and workforce age suggests it is important to consider workers' preferences for the programs as fringe benefits. The results also suggest that the national trend towards self-insurance may be leading to more prevention and treatment of worker alcohol-related problems.
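    The reported 59% vs. 51% figures are marginal predicted probabilities of the kind a logit or probit model produces: hold the other covariates fixed and toggle the self-insurance indicator. A sketch of the mechanics, with made-up coefficients (the abstract does not report them):

```python
import math

def logit_prob(features, coefficients):
    """Predicted probability from a logistic model: P = 1 / (1 + exp(-x.b))."""
    score = sum(b * x for b, x in zip(coefficients, features))
    return 1.0 / (1.0 + math.exp(-score))

# Hypothetical coefficients: intercept, self-insured, unionized, log(worksite size).
beta = [-1.2, 0.33, 0.5, 0.25]

# The same firm with and without self-insurance, other covariates held constant.
p_self_insured = logit_prob([1, 1, 1, 5.0], beta)
p_market_plan  = logit_prob([1, 0, 1, 5.0], beta)
```

    The difference between the two predicted probabilities is the marginal effect of self-insurance reported in the abstract.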

  16. Informing the NCA: EPA's Climate Change Impact and Risk Analysis Framework

    Science.gov (United States)

    Sarofim, M. C.; Martinich, J.; Kolian, M.; Crimmins, A. R.

    2017-12-01

    The Climate Change Impact and Risk Analysis (CIRA) framework is designed to quantify the physical impacts and economic damages in the United States under future climate change scenarios. To date, the framework has been applied to 25 sectors, using scenarios and projections developed for the Fourth National Climate Assessment. The strength of this framework has been in the use of consistent climatic, socioeconomic, and technological assumptions and inputs across the impact sectors to maximize the ease of cross-sector comparison. The results of the underlying CIRA sectoral analyses are informing the sustained assessment process by helping to address key gaps related to economic valuation and risk. Advances in capacity and in the scientific literature in this area have created opportunities to consider future applications and a strengthening of the framework. This presentation will describe the CIRA framework, present results for various sectors such as heat mortality, air & water quality, winter recreation, and sea level rise, and introduce potential enhancements that can improve the utility of the framework for decision analysis.

  17. OpenMDAO: Framework for Flexible Multidisciplinary Design, Analysis and Optimization Methods

    Science.gov (United States)

    Heath, Christopher M.; Gray, Justin S.

    2012-01-01

    The OpenMDAO project is underway at NASA to develop a framework which simplifies the implementation of state-of-the-art tools and methods for multidisciplinary design, analysis and optimization. Foremost, OpenMDAO has been designed to handle variable problem formulations, encourage reconfigurability, and promote model reuse. This work demonstrates the concept of iteration hierarchies in OpenMDAO to achieve a flexible environment for supporting advanced optimization methods which include adaptive sampling and surrogate modeling techniques. In this effort, two efficient global optimization methods were applied to solve a constrained, single-objective and constrained, multiobjective version of a joint aircraft/engine sizing problem. The aircraft model, NASA's next-generation advanced single-aisle civil transport, is being studied as part of the Subsonic Fixed Wing project to help meet simultaneous program goals for reduced fuel burn, emissions, and noise. This analysis serves as a realistic test problem to demonstrate the flexibility and reconfigurability offered by OpenMDAO.

  18. Subseabed-disposal program: systems-analysis program plan

    International Nuclear Information System (INIS)

    Klett, R.D.

    1981-03-01

    This report contains an overview of the Subseabed Nuclear Waste Disposal Program systems analysis program plan, and includes sensitivity, safety, optimization, and cost/benefit analyses. Details of the primary barrier sensitivity analysis and the data acquisition and modeling cost/benefit studies are given, as well as the schedule through the technical, environmental, and engineering feasibility phases of the program

  19. Mississippi Curriculum Framework for Diesel Equipment Technology (CIP: 47.0605--Diesel Engine Mechanic & Repairer). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the diesel equipment technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies,…

  20. Needs Analysis and Course Design; A Framework for Designing Exam Courses

    Directory of Open Access Journals (Sweden)

    Reza Eshtehardi

    2017-09-01

    Full Text Available This paper introduces a framework for designing exam courses and highlights the importance of needs analysis in designing exam courses. The main objectives of this paper are to highlight the key role of needs analysis in designing exam courses, to offer a framework for designing exam courses, to show the language needs of different students for IELTS (International English Language Testing System exam, to offer an analysis of those needs and to explain how they will be taken into account for the design of the course. First, I will concentrate on some distinguishing features in exam classes, which make them different from general English classes. Secondly, I will introduce a framework for needs analysis and diagnostic testing and highlight the importance of needs analysis for the design of syllabus and language courses. Thirdly, I will describe significant features of syllabus design, course assessment, and evaluation procedures.

  1. A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.

    Science.gov (United States)

    Morag, Ido; Luria, Gil

    2013-01-01

    Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis; and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis, that all job holders should participate in analysing their own workplace, and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders, and simplified reporting methods.

  2. Building an Evaluation Framework for a Competency-Based Graduate Program at the University Of Southern Mississippi

    Science.gov (United States)

    Gaudet, Cyndi H.; Annulis, Heather M.; Kmiec, John J., Jr.

    2008-01-01

    This article describes an ongoing project to build a comprehensive evaluation framework for the competency-based Master of Science in Workforce Training and Development (MSWTD) program at The University of Southern Mississippi (USM). First, it discusses some trends and issues in evaluating the performance of higher education programs in the United…

  3. WWW-based remote analysis framework for UniSampo and Shaman analysis software

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Ala-Heikkilae, J.J.; Routti, J.T.; Nikkinen, M.T.

    2005-01-01

    UniSampo and Shaman are well-established analytical tools for gamma-ray spectrum analysis and the subsequent radionuclide identification. These tools are normally run locally on a Unix or Linux workstation in interactive mode. However, it is also possible to run them in batch/non-interactive mode by starting them with the correct parameters. This is how they are used in the standard analysis pipeline operation. This functionality also makes it possible to use them for remote operation over the network. A framework for running UniSampo and Shaman analyses over the standard WWW protocol has been developed. A WWW server receives requests from the client WWW browser and runs the analysis software via a set of CGI scripts. Authentication, input data transfer, and output and display of the final analysis results are all carried out using standard WWW mechanisms. This WWW framework can be utilized, for example, by organizations that have radioactivity surveillance stations in a wide area. A computer with a standard internet/intranet connection suffices for on-site analyses. (author)
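    The pattern described, a CGI script translating a WWW request into a batch-mode invocation, can be sketched with the standard library alone. The binary name and flags below are placeholders, not UniSampo's or Shaman's actual command line.

```python
from urllib.parse import parse_qs

ANALYZER = "./analyze_spectrum"   # placeholder for the batch-mode analysis binary

def build_batch_command(query_string):
    """Map a request's query string onto a non-interactive command line."""
    params = parse_qs(query_string)
    cmd = [ANALYZER, "--batch"]
    if "spectrum" in params:
        cmd += ["--input", params["spectrum"][0]]
    if "library" in params:
        cmd += ["--nuclide-library", params["library"][0]]
    return cmd

# A CGI script would run this via subprocess.run(cmd, capture_output=True)
# after authenticating the request, then echo stdout back to the browser.
```

    The point of the design is that the analysis software itself needs no changes: its existing batch mode is simply driven by a thin web layer.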

  4. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic, and industrial use, as well as use for public and municipal authorities and street lighting) and we examine their relation with variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of prefecture. We visualize the results of the analysis and perform cluster and outlier analysis using the Anselin local Moran's I statistic, as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight and better understanding of the regional development model in Greece and justifies the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
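    The Anselin local Moran's I statistic used for the cluster and outlier analysis is simple to compute. A minimal pure-Python version for a binary adjacency matrix (a sketch, not the authors' GIS toolchain) looks like this:

```python
def local_morans_i(values, weights):
    """Anselin local Moran's I: I_i = (z_i / m2) * sum_j w_ij * z_j."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]          # deviations from the mean
    m2 = sum(d * d for d in z) / n          # second moment
    return [z[i] / m2 * sum(weights[i][j] * z[j] for j in range(n))
            for i in range(n)]

# Four regions on a line, binary adjacency weights (hypothetical demand values).
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
local_i = local_morans_i([1.0, 2.0, 3.0, 4.0], w)
```

    Positive local values flag locations whose neighbours are similar (clusters), negative values flag spatial outliers; summing the local statistics recovers the global Moran's I up to the weight total S0, the usual consistency check.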

  5. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSALOF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Full Text Available Content analysis (CA is a popular approach among researchers from different areas, but incipient in project management (PM. However, the volume of usage apparently does not translate into application quality. The method receives constant criticism about the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigate the use of CA in PM research. To accomplish this goal, literature systematic review is conjugated with CA related to 23 articles from EBSCO base in the last 20 years (1996 – 2016. The findings showed that the proposed framework can help researchers better apply the CA and suggests that the use of the method in terms of quantity and quality in PM research should be expanded. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM in the last 20 years.

  6. High Speed Simulation Framework for Reliable Logic Programs

    International Nuclear Information System (INIS)

    Lee, Wan-Bok; Kim, Seog-Ju

    2006-01-01

    This paper presents a case study on the design of a PLC logic simulator developed to simulate and verify PLC control programs for nuclear plant systems. A nuclear control system is subject to stricter restrictions than a normal process control system, since it works with nuclear power plants requiring high reliability under severe environments. One restriction is the safety of the control programs, which can be assured by rigorous testing. Another restriction is the simulation speed of the control programs, which should be fast enough to control multiple devices concurrently in real time. To cope with these restrictions, we devised a logic compiler which generates C-code programs from given PLC logic programs. Once a logic program is translated into C code, the program can be analyzed by conventional software analysis tools and can be used to construct a fast logic simulator after cross-compiling; in effect, this is a kind of compiled-code simulation.
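    The compiled-code simulation idea, translating ladder logic into C and compiling it for speed, can be sketched with a tiny expression translator. The nested-tuple rung representation below is an invented toy format, not the plant's actual PLC language.

```python
def to_c(expr):
    """Translate a nested ('AND'|'OR'|'NOT', ...) boolean expression into C."""
    if isinstance(expr, str):               # a named input/output signal
        return expr
    op, *args = expr
    if op == "NOT":
        return f"(!{to_c(args[0])})"
    joiner = " && " if op == "AND" else " || "
    return "(" + joiner.join(to_c(a) for a in args) + ")"

def compile_rung(output, expr):
    """Emit one C assignment statement for a PLC rung: output := expr."""
    return f"{output} = {to_c(expr)};"
```

    Emitting one such assignment per rung into a scan-loop function, then compiling with a C compiler, yields a simulator that runs at native speed and is open to ordinary C analysis tools, which is the approach the paper describes.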

  7. Integrated framework for dynamic safety analysis

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Karanki, Durga R.

    2012-01-01

    In conventional PSA (Probabilistic Safety Assessment), detailed plant simulations by independent thermal hydraulic (TH) codes are used in the development of accident sequence models. Typical accidents in an NPP involve complex interactions among process, safety systems, and operator actions. As independent TH codes do not include models of operator actions and the full set of safety systems, they cannot literally simulate the integrated and dynamic interactions of process, safety systems, and operator responses. Offline simulation with pre-decided states and time delays may not model the accident sequences properly. Moreover, when stochastic variability in the responses of accident models is considered, defining all the combinations for simulation becomes a cumbersome task. To overcome some of these limitations of the conventional safety analysis approach, TH models are coupled with stochastic models in the dynamic event tree (DET) framework, which provides the flexibility to model the integrated response, since all the accident elements are in the same model. The advantages of this framework also include realistic modeling of dynamic scenarios, comprehensive results, an integrated approach (both deterministic and probabilistic models), and support for HRA (Human Reliability Analysis)
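    The branching logic of a dynamic event tree can be sketched independently of any TH code: at each branch point a stochastic element (a safety system or operator action) succeeds or fails with some probability, and every path accumulates a sequence probability. The branch points and probabilities below are illustrative, not plant data; in a real DET the branching is triggered by the simulated plant state and timing rather than a fixed list.

```python
def expand_event_tree(branch_points):
    """Enumerate all accident sequences and their probabilities.

    branch_points: list of (name, p_success) tuples, in order of demand.
    Returns a list of (sequence, probability) pairs.
    """
    sequences = [([], 1.0)]
    for name, p_success in branch_points:
        nxt = []
        for path, prob in sequences:
            nxt.append((path + [(name, "success")], prob * p_success))
            nxt.append((path + [(name, "failure")], prob * (1.0 - p_success)))
        sequences = nxt
    return sequences

# Illustrative branch points only.
tree = expand_event_tree([("scram", 0.999),
                          ("aux_feedwater", 0.95),
                          ("operator_depressurizes", 0.9)])
```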

  8. Global/local methods research using a common structural analysis framework

    Science.gov (United States)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described, including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  9. The radiation protection research within the fourth Framework Program of the European Union (1994-1998)

    International Nuclear Information System (INIS)

    Siunaeve, J.; Mingot, F.; Arranz, L.; Cancio, D.

    1995-01-01

    The next research program on Radiation Protection within the Fourth Framework Program of the European Union was approved by the Council last December (O.J. No L 361, 12/31/94). The program includes important changes in its structure as well as in the way it is implemented in Europe. The most important change is that the main activities concerning Nuclear Safety, Waste Management and Radiation Protection have been included in a single program called Nuclear Fission Safety. The program also includes specific work with CIS countries on the management of the Chernobyl consequences, as well as of other significant contamination in other areas of the former Soviet Union. (Author)

  10. Environmental Education Organizations and Programs in Texas: Identifying Patterns through a Database and Survey Approach for Establishing Frameworks for Assessment and Progress

    Science.gov (United States)

    Lloyd-Strovas, Jenny D.; Arsuffi, Thomas L.

    2016-01-01

    We examined the diversity of environmental education (EE) in Texas, USA, by developing a framework to assess EE organizations and programs at a large scale: the Environmental Education Database of Organizations and Programs (EEDOP). This framework consisted of the following characteristics: organization/visitor demographics, pedagogy/curriculum,…

  11. Sustainability of ARV provision in developing countries: challenging a framework based on program history

    Directory of Open Access Journals (Sweden)

    Thiago Botelho Azeredo

    Full Text Available Abstract The provision of ARVs is central to HIV/AIDS programs because of its impact on the course of the disease and on quality of life. Although first-line treatment costs have declined, treatment-associated expenses grow steeper each year. Sustainability is therefore an important variable for the success of treatment programs. A conceptual framework on the sustainability of ARV provision was developed, followed by data collection instruments. The pilot study was undertaken in Brazil. Bolivia, Peru, and Mozambique were visited. Key informants were identified and interviewed. Investigation of sustainability related to ARV provision focused on the implementation and routinization of provision schemes. Evidence of greater sustainability potential was observed in Peru, where provision is implemented and routinized by the National HIV/AIDS program and expenditures are met by the government. In Mozambique, provision is dependent on donations and external aid, but the country displays a great effort to incorporate ARV provision and care into routine healthcare activities. Bolivia, in addition to external dependence on financing and management of the drug supply, presents problems regarding implementation and routinization. The conceptual framework was useful in recognizing events that influence sustainable ARV provision in these countries.

  12. A 3-month jump-landing training program: a feasibility study using the RE-AIM framework.

    Science.gov (United States)

    Aerts, Inne; Cumps, Elke; Verhagen, Evert; Mathieu, Niels; Van Schuerbeeck, Sander; Meeusen, Romain

    2013-01-01

    Evaluating the translatability and feasibility of an intervention program has become as important as determining the effectiveness of the intervention. To evaluate the applicability of a 3-month jump-landing training program in basketball players, using the RE-AIM (reach, effectiveness, adoption, implementation, and maintenance) framework. Randomized controlled trial. National and regional basketball teams. Twenty-four teams of the second highest national division and regional basketball divisions in Flanders, Belgium, were randomly assigned (1:1) to a control group and intervention group. A total of 243 athletes (control group = 129, intervention group = 114), ages 15 to 41 years, volunteered. All exercises in the intervention program followed a progressive development, emphasizing lower extremity alignment during jump-landing activities. The results of the process evaluation of the intervention program were based on the 5 dimensions of the RE-AIM framework. The injury incidence density, hazard ratios, and 95% confidence intervals were determined. The participation rate of the total sample was 100% (reach). The hazard ratio was different between the intervention group and the control group (0.40 [95% confidence interval = 0.16, 0.99]; effectiveness). Of the 12 teams in the intervention group, 8 teams (66.7%) agreed to participate in the study (adoption). Eight of the participating coaches (66.7%) felt positively about the intervention program and stated that they had implemented the training sessions of the program as intended (implementation). All coaches except 1 (87.5%) intended to continue the intervention program the next season (maintenance). Compliance of the coaches in this coach-supervised jump-landing training program was high. In addition, the program was effective in preventing lower extremity injuries.
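    The effectiveness figure, a hazard ratio with a 95% confidence interval, follows the standard log-normal approximation for a ratio of incidence densities. The event counts and exposure times below are hypothetical, chosen only to show the arithmetic, not the study's data.

```python
import math

def hazard_ratio_ci(events_a, time_a, events_b, time_b, z=1.96):
    """Rate ratio of group A vs. group B with a log-normal 95% CI."""
    hr = (events_a / time_a) / (events_b / time_b)
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)   # SE of log(HR)
    lo = math.exp(math.log(hr) - z * se_log)
    hi = math.exp(math.log(hr) + z * se_log)
    return hr, lo, hi

# Hypothetical: 5 injuries in 1000 athlete-hours vs. 10 injuries in 800.
hr, lo, hi = hazard_ratio_ci(5, 1000.0, 10, 800.0)
```

    An interval that excludes 1.0, as in the study's 0.40 [0.16, 0.99], indicates a statistically significant reduction in injury rate.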

  13. LULU analysis program

    International Nuclear Information System (INIS)

    Crawford, H.J.; Lindstrom, P.J.

    1983-06-01

    Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday

  14. UNC-Utah NA-MIC framework for DTI fiber tract analysis.

    Science.gov (United States)

    Verde, Audrey R; Budin, Francois; Berger, Jean-Baptiste; Gupta, Aditya; Farzinfar, Mahshid; Kaiser, Adrien; Ahn, Mihye; Johnson, Hans; Matsui, Joy; Hazlett, Heather C; Sharma, Anuja; Goodlett, Casey; Shi, Yundi; Gouttard, Sylvain; Vachet, Clement; Piven, Joseph; Zhu, Hongtu; Gerig, Guido; Styner, Martin

    2014-01-01

    Diffusion tensor imaging has become an important modality in the field of neuroimaging to capture changes in micro-organization and to assess white matter integrity or development. While there exist a number of tractography toolsets, they usually lack tools for preprocessing or for analyzing diffusion properties along the fiber tracts. Currently, the field is in critical need of a coherent end-to-end toolset for performing along-fiber-tract analysis that is accessible to non-technical neuroimaging researchers. The UNC-Utah NA-MIC DTI framework represents a coherent, open-source, end-to-end toolset for atlas-based DTI fiber tract analysis encompassing DICOM data conversion, quality control, atlas building, fiber tractography, fiber parameterization, and statistical analysis of diffusion properties. Most steps utilize graphical user interfaces (GUIs) to simplify interaction and provide an extensive DTI analysis framework for non-technical researchers/investigators. We illustrate the use of our framework on a small-sample, cross-sectional neuroimaging study of eight healthy 1-year-old children from the Infant Brain Imaging Study (IBIS) Network. In this limited test study, we illustrate the power of our method by quantifying the diffusion properties at 1 year of age on the genu and splenium fiber tracts.

  15. Big data analysis framework for healthcare and social sectors in Korea.

    Science.gov (United States)

    Song, Tae-Min; Ryu, Seewon

    2015-01-01

    We reviewed applications of big data analysis of healthcare and social services in developed countries, and subsequently devised a framework for such an analysis in Korea. We reviewed the status of implementing big data analysis of health care and social services in developed countries, and strategies used by the Ministry of Health and Welfare of Korea (Government 3.0). We formulated a conceptual framework of big data in the healthcare and social service sectors at the national level. As a specific case, we designed a process and method of social big data analysis on suicide buzz. Developed countries (e.g., the United States, the UK, Singapore, Australia, and even OECD and EU) are emphasizing the potential of big data, and using it as a tool to solve their long-standing problems. Big data strategies for the healthcare and social service sectors were formulated based on an ICT-based policy of current government and the strategic goals of the Ministry of Health and Welfare. We suggest a framework of big data analysis in the healthcare and welfare service sectors separately and assigned them tentative names: 'health risk analysis center' and 'integrated social welfare service network'. A framework of social big data analysis is presented by applying it to the prevention and proactive detection of suicide in Korea. There are some concerns with the utilization of big data in the healthcare and social welfare sectors. Thus, research on these issues must be conducted so that sophisticated and practical solutions can be reached.

  16. THE STAR OFFLINE FRAMEWORK

    International Nuclear Information System (INIS)

    FINE, V.; FISYAK, Y.; PEREVOZTCHIKOV, V.; WENAUS, T.

    2000-01-01

    The Solenoidal Tracker At RHIC (STAR) is a large-acceptance collider detector, commissioned at Brookhaven National Laboratory in 1999. STAR has developed a software framework supporting simulation, reconstruction and analysis in offline production, interactive physics analysis and online monitoring environments that is well matched both to STAR's present state of transition between Fortran- and C++-based software and to STAR's evolution to a fully OO software base. This paper presents the results of a two-year effort developing a modular C++ framework based on the ROOT package that encompasses both wrapped Fortran components (legacy simulation and reconstruction code) served by IDL-defined data structures, and fully OO components (all physics analysis code) served by a recently developed object model for event data. The framework supports chained components, which can themselves be composite subchains, with components ("makers") managing the "data sets" they have created and are responsible for. An St_DataSet class from which data sets and makers inherit allows the construction of hierarchical organizations of components and data, and centralizes almost all system tasks such as data set navigation, I/O, database access, and inter-component communication. This paper presents an overview of this system, now deployed and well exercised in production environments with real and simulated data, and in an active physics analysis development program
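    The maker/dataset organization described, with components and their data arranged in one navigable hierarchy and common tasks centralized in a shared base class, is essentially the composite pattern. A minimal Python sketch of the idea (the real St_DataSet is C++ inside ROOT, and these class names are illustrative):

```python
class DataSet:
    """Node in a hierarchy of datasets and makers (composite pattern)."""
    def __init__(self, name, parent=None):
        self.name, self.parent, self.children = name, parent, []
        if parent is not None:
            parent.children.append(self)

    def find(self, name):
        """Navigate the tree depth-first for a named node."""
        if self.name == name:
            return self
        for child in self.children:
            hit = child.find(name)
            if hit is not None:
                return hit
        return None

    def path(self):
        """Slash-separated path from the root, for logging and I/O."""
        parts, node = [], self
        while node is not None:
            parts.append(node.name)
            node = node.parent
        return "/".join(reversed(parts))

class Maker(DataSet):
    """A processing component that owns the datasets it creates."""
    def make(self, event):
        raise NotImplementedError
```

    Because makers inherit from the dataset node, a chain of makers, the subchains they contain, and the data they produce all live in one tree that system code can traverse uniformly.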

  17. Risk and train control : a framework for analysis

    Science.gov (United States)

    2001-01-01

    This report develops and demonstrates a framework for examining the effects of various train control strategies on some of the major risks of railroad operations. Analysis of hypothetical 1200-mile corridor identified the main factors that increase r...

  18. The international framework for safeguarding peaceful nuclear energy programs

    International Nuclear Information System (INIS)

    Mazer, B.M.

    1980-01-01

    International law, in response to the need for safeguard assurances, has provided a framework which can be utilized by supplier and recipient states. Multilateral treaties have created the International Atomic Energy Agency, which can serve a vital role in the establishment and supervision of safeguard agreements for nuclear energy programs. The Non-Proliferation Treaty has created definite obligations on nuclear-weapon and non-nuclear-weapon states to alleviate some possibilities of proliferation and has rejuvenated the function of the IAEA in providing safeguards, especially to non-nuclear-weapon states which are parties to the Non-Proliferation Treaty. States which are not parties to the Non-Proliferation Treaty may receive nuclear energy co-operation subject to IAEA safeguards. States, like Canada, have insisted through bilateral nuclear energy co-operation agreements that either individual or joint agreement be reached with the IAEA for the application of safeguards. Trilateral treaties among Canada, the recipient state and the IAEA have been employed and can provide the necessary assurances against the diversion of peaceful nuclear energy programs to military or non-peaceful uses. The advent of the Nuclear Suppliers Group and its guidelines has definitely advanced the cause of ensuring peaceful uses of nuclear energy. The ultimate objective should be the creation of an international structure incorporating the application of the most comprehensive safeguards which will be applied universally to all nuclear energy programs.

  19. Transactional Analysis: Conceptualizing a Framework for Illuminating Human Experience

    Directory of Open Access Journals (Sweden)

    Trevor Thomas Stewart PhD

    2011-09-01

    Full Text Available Myriad methods exist for analyzing qualitative data. It is, however, imperative for qualitative researchers to employ data analysis tools that are congruent with the theoretical frameworks underpinning their inquiries. In this paper, I have constructed a framework for analyzing data that could be useful for researchers interested in focusing on the transactional nature of language as they engage in Social Science research. Transactional Analysis (TA) is an inductive approach to data analysis that transcends constant comparative methods of exploring data. Drawing on elements of narrative and thematic analysis, TA uses the theories of Bakhtin and Rosenblatt to attend to the dynamic processes researchers identify as they generate themes in their data and seek to understand how their participants' worldviews are being shaped. This paper highlights the processes researchers can utilize to study the mutual shaping that occurs as participants read and enter into dialogue with the world around them.

  20. A hybrid Constraint Programming/Mixed Integer Programming framework for the preventive signaling maintenance crew scheduling problem

    DEFF Research Database (Denmark)

    Pour, Shahrzad M.; Drake, John H.; Ejlertsen, Lena Secher

    2017-01-01

    A railway signaling system is a complex and interdependent system which should ensure the safe operation of trains. We introduce and address a mixed integer optimisation model for the preventive signal maintenance crew scheduling problem in the Danish railway system. The problem contains many...... to feed as ‘warm start’ solutions to a Mixed Integer Programming (MIP) solver for further optimisation. We apply the CP/MIP framework to a section of the Danish rail network and benchmark our results against both direct application of a MIP solver and modelling the problem as a Constraint Optimisation...
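The hybrid scheme described in the abstract (construct feasible schedules quickly, then feed them to a MIP solver as warm starts for further optimisation) can be illustrated in miniature. In the sketch below a greedy constructor stands in for the CP stage and a pairwise-swap improvement pass stands in for the MIP stage; all crews, tasks, travel times and capacities are hypothetical:

```python
# Toy construct-then-improve sketch of the warm-start idea: a greedy
# heuristic builds a feasible crew-to-task assignment (standing in for
# the CP stage), which then seeds an improvement pass (standing in for
# handing the solution to a MIP solver as a warm start).
# All crews, tasks, travel times and capacities are hypothetical.

travel_time = {  # minutes from each crew depot to each signalling task
    ("crew1", "taskA"): 30, ("crew1", "taskB"): 80, ("crew1", "taskC"): 20,
    ("crew2", "taskA"): 60, ("crew2", "taskB"): 25, ("crew2", "taskC"): 70,
}
capacity = {"crew1": 1, "crew2": 2}  # maximum number of tasks per crew

def total_cost(assignment):
    return sum(travel_time[(crew, task)] for task, crew in assignment.items())

def greedy_assignment(tasks):
    """Assign each task, in order, to the nearest crew with spare capacity."""
    load = {crew: 0 for crew in capacity}
    assignment = {}
    for task in tasks:
        crew = min((c for c in capacity if load[c] < capacity[c]),
                   key=lambda c: travel_time[(c, task)])
        assignment[task] = crew
        load[crew] += 1
    return assignment

def improve_by_swaps(assignment):
    """Swap crews between task pairs while any swap lowers total cost.

    Swaps preserve each crew's task count, so capacity stays satisfied.
    """
    assignment = dict(assignment)
    improved = True
    while improved:
        improved = False
        tasks = list(assignment)
        for i in range(len(tasks)):
            for j in range(i + 1, len(tasks)):
                trial = dict(assignment)
                trial[tasks[i]], trial[tasks[j]] = \
                    assignment[tasks[j]], assignment[tasks[i]]
                if total_cost(trial) < total_cost(assignment):
                    assignment, improved = trial, True
    return assignment

start = greedy_assignment(["taskA", "taskB", "taskC"])
best = improve_by_swaps(start)
print(total_cost(start), total_cost(best))  # greedy cost, improved cost
```

Here the greedy constructor's capacity-respecting choices leave a suboptimal assignment (cost 125) that the swap pass repairs (cost 105); in the paper's framework the improvement stage is a full MIP solver rather than a local search.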

  1. A Program Transformation for Backwards Analysis of Logic Programs

    DEFF Research Database (Denmark)

    Gallagher, John Patrick

    2003-01-01

    The input to backwards analysis is a program together with properties that are required to hold at given program points. The purpose of the analysis is to derive initial goals or pre-conditions that guarantee that, when the program is executed, the given properties hold. The solution for logic...... programs presented here is based on a transformation of the input program, which makes explicit the dependencies of the given program points on the initial goals. The transformation is derived from the resultants semantics of logic programs. The transformed program is then analysed using a standard...

  2. Flexible Human Behavior Analysis Framework for Video Surveillance Applications

    Directory of Open Access Journals (Sweden)

    Weilun Lao

    2010-01-01

    Full Text Available We study a flexible framework for semantic analysis of human motion from surveillance video. Successful trajectory estimation and human-body modeling facilitate the semantic analysis of human activities in video sequences. Although human motion is widely investigated, we have extended such research in three aspects. By adding a second camera, not only is more reliable behavior analysis possible, but the ongoing scene events can also be mapped onto a 3D setting to facilitate further semantic analysis. The second contribution is the introduction of a 3D reconstruction scheme for scene understanding. Thirdly, we present a fast scheme to detect different body parts and generate a fitting skeleton model, without using the explicit assumption of an upright body posture. The extension to multiple-view fusion improves the event-based semantic analysis by 15%–30%. Our proposed framework proves its effectiveness as it achieves near real-time performance (13–15 frames/second and 6–8 frames/second for monocular and two-view video sequences, respectively).

  3. A framework for biodynamic feedthrough analysis--part I: theoretical foundations.

    Science.gov (United States)

    Venrooij, Joost; van Paassen, Marinus M; Mulder, Mark; Abbink, David A; Mulder, Max; van der Helm, Frans C T; Bulthoff, Heinrich H

    2014-09-01

    Biodynamic feedthrough (BDFT) is a complex phenomenon, which has been studied for several decades. However, there is little consensus on how to approach the BDFT problem in terms of definitions, nomenclature, and mathematical descriptions. In this paper, a framework for biodynamic feedthrough analysis is presented. The goal of this framework is two-fold. First, it provides some common ground between the seemingly large range of different approaches existing in the BDFT literature. Second, the framework itself allows for gaining new insights into BDFT phenomena. It will be shown how relevant signals can be obtained from measurement, how different BDFT dynamics can be derived from them, and how these different dynamics are related. Using the framework, BDFT can be dissected into several dynamical relationships, each relevant in understanding BDFT phenomena in more detail. The presentation of the BDFT framework is divided into two parts. This paper, Part I, addresses the theoretical foundations of the framework. Part II, which is also published in this issue, addresses the validation of the framework. The work is presented in two separate papers to allow for a detailed discussion of both the framework's theoretical background and its validation.

  4. A framework for sensitivity analysis of decision trees.

    Science.gov (United States)

    Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław

    2018-01-01

    In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
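The stability check described in point (1) can be illustrated with a minimal one-stage decision tree in which a pessimistic perturbation of a single probability flips the expected-value-maximizing strategy. This is a toy sketch, not the paper's software; all payoffs and probabilities are hypothetical:

```python
# Minimal one-stage decision tree: each strategy is a lottery of
# (probability, payoff) pairs. We test whether the expected-value-
# maximizing strategy survives a pessimistic perturbation of the
# success probability. All numbers are hypothetical.

def expected_value(lottery):
    return sum(p * payoff for p, payoff in lottery)

def strategies(p_success):
    return {
        "risky": [(p_success, 100.0), (1 - p_success, -40.0)],
        "safe":  [(1.0, 20.0)],
    }

def best_strategy(p_success):
    opts = strategies(p_success)
    return max(opts, key=lambda name: expected_value(opts[name]))

nominal = best_strategy(0.60)            # EV(risky) = 44 > EV(safe) = 20
perturbed = best_strategy(0.60 - 0.20)   # EV(risky) = 16 < EV(safe) = 20
print(nominal, perturbed)
```

The optimal strategy under the nominal probabilities is not robust here: a 0.2 downward perturbation flips the choice, which is exactly the kind of instability a sensitivity analysis of this sort is designed to expose.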

  5. The Impact of a "Framework"-Aligned Science Professional Development Program on Literacy and Mathematics Achievement of K-3 Students

    Science.gov (United States)

    Paprzycki, Peter; Tuttle, Nicole; Czerniak, Charlene M.; Molitor, Scott; Kadervaek, Joan; Mendenhall, Robert

    2017-01-01

    This study investigates the effect of a Framework-aligned professional development program at the PreK-3 level. The NSF funded program integrated science with literacy and mathematics learning and provided teacher professional development, along with materials and programming for parents to encourage science investigations and discourse around…

  6. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  7. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  8. Using the Five Senses of Success framework to understand the experiences of midwifery students enrolled in an undergraduate degree program.

    Science.gov (United States)

    Sidebotham, M; Fenwick, J; Carter, A; Gamble, J

    2015-01-01

    developing a student's sense of capability, purpose, resourcefulness, identity and connectedness (five senses of success) are key factors that may be important in predicting student satisfaction and progression within their university program. the study aimed to examine the expectations and experiences of second- and third-year midwifery students enrolled in a Bachelor of Midwifery program and identify barriers and enablers to success. a descriptive exploratory qualitative design was used. Fifty-six students enrolled in either year 2 or 3 of the Bachelor of Midwifery program in SE Queensland participated in an anonymous survey using open-ended questions. In addition, 16 students participated in two year-level focus groups. Template analysis, using the Five Senses Framework, was used to analyse the data set. early exposure to 'hands on' clinical midwifery practice as well as continuity of care experiences provided students with an opportunity to link theory to practice and increased their perception of capability as they transitioned through the program. Students' sense of identity, purpose, resourcefulness, and capability was strongly influenced by the program's embedded meta-values, including a 'woman centred' approach. In addition, a student's ability to form strong positive relationships with women, peers, lecturers and supportive clinicians was central to developing connections and ultimately a sense of success. A sense of connection not only fostered an ongoing belief that challenges could be overcome but that students themselves could initiate or influence change. the five senses framework provided a useful lens through which to analyse the student experience. 
Key factors to student satisfaction and retention within a Bachelor of Midwifery program include: a clearly articulated midwifery philosophy, strategies to promote student connectedness including the use of social media, and further development of clinicians' skills in preceptorship, clinical teaching and

  9. A model-based framework for the analysis of team communication in nuclear power plants

    International Nuclear Information System (INIS)

    Chung, Yun Hyung; Yoon, Wan Chul; Min, Daihwan

    2009-01-01

    Advanced human-machine interfaces are rapidly changing the interaction between humans and systems, with the level of abstraction of the presented information, the human task characteristics, and the modes of communication all affected. To accommodate the changes in the human/system co-working environment, an extended communication analysis framework is needed that can describe and relate the tasks, verbal exchanges, and information interface. This paper proposes an extended analytic framework, referred to as the H-H-S (human-human-system) communication analysis framework, which can model the changes in team communication that are emerging in these new working environments. The stage-specific decision-making model and analysis tool of the proposed framework make the analysis of team communication easier by providing visual clues. The usefulness of the proposed framework is demonstrated with an in-depth comparison of the characteristics of communication in the conventional and advanced main control rooms of nuclear power plants

  10. Supportive supervision and constructive relationships with healthcare workers support CHW performance: Use of a qualitative framework to evaluate CHW programming in Uganda.

    Science.gov (United States)

    Ludwick, Teralynn; Turyakira, Eleanor; Kyomuhangi, Teddy; Manalili, Kimberly; Robinson, Sheila; Brenner, Jennifer L

    2018-02-13

    While evidence supports community health worker (CHW) capacity to improve maternal and newborn health in less-resourced countries, key implementation gaps remain. Tools for assessing CHW performance and evidence on what programmatic components affect performance are lacking. This study developed and tested a qualitative evaluative framework and tool to assess CHW team performance in a district program in rural Uganda. A new assessment framework was developed to collect and analyze qualitative evidence based on CHW perspectives on seven program components associated with effectiveness (selection; training; community embeddedness; peer support; supportive supervision; relationship with other healthcare workers; retention and incentive structures). Focus groups were conducted with four high/medium-performing CHW teams and four low-performing CHW teams selected through random, stratified sampling. Content analysis involved organizing focus group transcripts according to the seven program effectiveness components, and assigning scores to each component per focus group. Four components, 'supportive supervision', 'good relationships with other healthcare workers', 'peer support', and 'retention and incentive structures' received the lowest overall scores. Variances in scores between 'high'/'medium'- and 'low'-performing CHW teams were largest for 'supportive supervision' and 'good relationships with other healthcare workers.' Our analysis suggests that in the Bushenyi intervention context, CHW team performance is highly correlated with the quality of supervision and relationships with other healthcare workers. CHWs identified key performance-related issues of absentee supervisors, referral system challenges, and lack of engagement/respect by health workers. Other less-correlated program components warrant further study and may have been impacted by relatively consistent program implementation within our limited study area. 
Applying process-oriented measurement tools are

  11. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW methodology is a success-oriented system analysis technique, and is capable of evaluating a large system with complex operational sequences. Recently an integrated analysis framework of the GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism, Japanese Government. This paper describes (a) an overview of the GO-FLOW methodology, (b) the procedure for treating a phased mission problem, (c) common cause failure analysis, (d) uncertainty analysis, and (e) the integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)
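The success-oriented evaluation and the phased mission treatment mentioned in items (a) and (b) can be illustrated with a deliberately simplified sketch. This is not the GO-FLOW operator model itself: component reliabilities and phase demands are hypothetical, and failures are assumed independent, so the common cause failure analysis of item (c) is ignored here:

```python
# Minimal success-oriented phased-mission sketch: the mission succeeds
# only if every component demanded in a phase operates through that
# phase. Per-phase component reliabilities are hypothetical and are
# treated as independent across components and phases (a deliberate
# simplification that excludes common-cause failures).

reliability = {"pump": 0.99, "valve": 0.98, "sensor": 0.95}  # per phase

phases = [
    {"pump", "valve"},            # phase 1: start-up
    {"pump", "valve", "sensor"},  # phase 2: monitored operation
]

def mission_success_probability(phases, reliability):
    """Product of the reliabilities of all demanded component-phases."""
    p = 1.0
    for demanded in phases:
        for component in demanded:
            p *= reliability[component]
    return p

p = mission_success_probability(phases, reliability)
print(round(p, 4))
```

Because the calculation multiplies success probabilities rather than enumerating failure combinations, it reflects the success-oriented viewpoint the abstract describes, at the cost of the dependency modeling a real GO-FLOW analysis provides.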

  12. A metric and frameworks for resilience analysis of engineered and infrastructure systems

    International Nuclear Information System (INIS)

    Francis, Royce; Bekera, Behailu

    2014-01-01

    In this paper, we have reviewed various approaches to defining resilience and the assessment of resilience. We have seen that while resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. In this paper, we have proposed a resilience analysis framework and a metric for measuring resilience. Our analysis framework consists of system identification, resilience objective setting, vulnerability analysis, and stakeholder engagement. The implementation of this framework is focused on the achievement of three resilience capacities: adaptive capacity, absorptive capacity, and recoverability. These three capacities also form the basis of our proposed resilience factor and uncertainty-weighted resilience metric. We have also identified two important unresolved discussions emerging in the literature: the idea of resilience as an epistemological versus inherent property of the system, and design for ecological versus engineered resilience in socio-technical systems. While we have not resolved this tension, we have shown that our framework and metric promote the development of methodologies for investigating “deep” uncertainties in resilience assessment while retaining the use of probability for expressing uncertainties about highly uncertain, unforeseeable, or unknowable hazards in design and management activities. - Highlights: • While resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. • We proposed a resilience analysis framework whose implementation is encapsulated within a resilience metric incorporating absorptive, adaptive, and restorative capacities. • We have shown that our framework and metric can support the investigation of “deep” uncertainties in resilience assessment or analysis. • We have discussed the role of quantitative metrics in design for ecological versus engineered resilience in socio-technical systems. • Our resilience metric supports
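As a purely illustrative reading of the three capacities named in the highlights (not the paper's actual uncertainty-weighted metric), one can combine an absorption ratio, a recovery ratio, and a recovery-speed factor into a single composite resilience factor:

```python
# Illustrative composite resilience factor built from the three
# capacities named above: absorptive (how little performance drops),
# restorative (how fully it recovers), and a speed term standing in
# for adaptive capacity. This is a hedged sketch under assumed
# definitions; the paper's metric may be defined differently.

def resilience_factor(f_original, f_degraded, f_recovered,
                      t_recovery, t_acceptable):
    absorption = f_degraded / f_original         # 1.0 = shock fully absorbed
    recovery = f_recovered / f_original          # 1.0 = fully restored
    speed = min(1.0, t_acceptable / t_recovery)  # 1.0 = recovered in time
    return speed * absorption * recovery

# Hypothetical disruption: performance drops from 100 to 60,
# then recovers to 90 in 20 days against a 30-day target.
rho = resilience_factor(100.0, 60.0, 90.0, t_recovery=20.0, t_acceptable=30.0)
print(round(rho, 3))
```

A factor of 1.0 would mean the disruption was fully absorbed and performance fully restored within the acceptable time; the hypothetical disruption above scores 0.54.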

  13. Liquid Effluents Program mission analysis

    International Nuclear Information System (INIS)

    Lowe, S.S.

    1994-01-01

    Systems engineering is being used to identify work to cleanup the Hanford Site. The systems engineering process transforms an identified mission need into a set of performance parameters and a preferred system configuration. Mission analysis is the first step in the process. Mission analysis supports early decision-making by clearly defining the program objectives, and evaluating the feasibility and risks associated with achieving those objectives. The results of the mission analysis provide a consistent basis for subsequent systems engineering work. A mission analysis was performed earlier for the overall Hanford Site. This work was continued by a ''capstone'' team which developed a top-level functional analysis. Continuing in a top-down manner, systems engineering is now being applied at the program and project levels. A mission analysis was conducted for the Liquid Effluents Program. The results are described herein. This report identifies the initial conditions and acceptable final conditions, defines the programmatic and physical interfaces and sources of constraints, estimates the resources to carry out the mission, and establishes measures of success. The mission analysis reflects current program planning for the Liquid Effluents Program as described in Liquid Effluents FY 1995 Multi-Year Program Plan

  14. An Examination of New Counselor Mentor Programs

    Science.gov (United States)

    Bass, Erin; Gardner, Lauren; Onwukaeme, Chika; Revere, Dawn; Shepherd, Denise; Parrish, Mark S.

    2013-01-01

    An analysis of current new counselor mentor programs reveals the need for such programs, but information regarding established programs is limited. A review of the literature addresses program characteristics and data obtained from existing mentor program participants. An overview of four programs explaining the framework outlined for mentoring…

  15. Value Frameworks in Oncology: Comparative Analysis and Implications to the Pharmaceutical Industry.

    Science.gov (United States)

    Slomiany, Mark; Madhavan, Priya; Kuehn, Michael; Richardson, Sasha

    2017-07-01

    As the cost of oncology care continues to rise, composite value models that variably capture the diverse concerns of patients, physicians, payers, policymakers, and the pharmaceutical industry have begun to take shape. To review the capabilities and limitations of 5 of the most notable value frameworks in oncology that have emerged in recent years and to compare their relative value and application among the intended stakeholders. We compared the methodology of the American Society of Clinical Oncology (ASCO) Value Framework (version 2.0), the National Comprehensive Cancer Network Evidence Blocks, Memorial Sloan Kettering Cancer Center DrugAbacus, the Institute for Clinical and Economic Review Value Assessment Framework, and the European Society for Medical Oncology Magnitude of Clinical Benefit Scale, using a side-by-side comparative approach in terms of the input, scoring methodology, and output of each framework. In addition, we gleaned stakeholder insights about these frameworks and their potential real-world applications through dialogues with physicians and payers, as well as through secondary research and an aggregate analysis of previously published survey results. The analysis identified several framework-specific themes in their respective focus on clinical trial elements, breadth of evidence, evidence weighting, scoring methodology, and value to stakeholders. Our dialogues with physicians and our aggregate analysis of previous surveys revealed a varying level of awareness of, and use of, each of the value frameworks in clinical practice. For example, although the ASCO Value Framework appears nascent in clinical practice, physicians believe that the frameworks will be more useful in practice in the future as they become more established and as their outputs are more widely accepted. 
Along with patients and payers, who bear the burden of treatment costs, physicians and policymakers have waded into the discussion of defining value in oncology care, as well

  16. A mathematical programming framework for early stage design of wastewater treatment plants

    DEFF Research Database (Denmark)

    Bozkurt, Hande; Quaglia, Alberto; Gernaey, Krist

    2015-01-01

    The increasing number of alternative wastewater treatment technologies and stricter effluent requirements make the optimal treatment process selection for wastewater treatment plant design a complicated problem. This task, defined as wastewater treatment process synthesis, is currently based on expert decisions and previous experiences. This paper proposes a new approach based on mathematical programming to manage the complexity of the problem. The approach generates/identifies novel and optimal wastewater treatment process selection, and the interconnection between unit operations to create... The design problem is formulated as a Mixed Integer (Non)linear Programming problem – MI(N)LP – and solved. A case study is formulated and solved to highlight the application of the framework. © 2014 Elsevier Ltd. All rights reserved.

  17. Accuracy of an efficient framework for structural analysis of wind turbine blades

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert D.; Fedorov, Vladimir

    2016-01-01

    This paper presents a novel framework for the structural design and analysis of wind turbine blades and establishes its accuracy. The framework is based on a beam model composed of two parts—a 2D finite element-based cross-section analysis tool and a 3D beam finite element model. The cross-section analysis tool is able to capture the effects stemming from material anisotropy and inhomogeneity for sections of arbitrary geometry. The proposed framework is very efficient and therefore ideally suited for integration within wind turbine aeroelastic design and analysis tools. A number of benchmark examples are presented comparing the results from the proposed beam model to 3D shell and solid finite element models. The examples considered include a square prismatic beam, an entire wind turbine rotor blade and a detailed wind turbine blade cross section. Phenomena at both the blade length scale...

  18. Toward an Evaluation Framework for Doctoral Education in Social Work: A 10-Year Retrospective of One PhD Program's Assessment Experiences

    Science.gov (United States)

    Bentley, Kia J.

    2013-01-01

    This article presents a framework for evaluation in social work doctoral education and details 10 years of successes and challenges in one PhD program's use of the framework, including planning and implementing specific assessment activities around student learning outcomes and larger program goals. The article argues that a range of…

  19. Frameworks in CS1

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Caspersen, Michael Edelgaard

    2002-01-01

    In this paper we argue that introducing object-oriented frameworks as a subject already in the CS1 curriculum is important if we are to train the programmers of tomorrow to become just as much software reusers as software producers. We present a simple, graphical framework that we have successfully used to introduce the principles of object-oriented frameworks to students at the introductory programming level. Our framework, while simple, introduces central abstractions such as inversion of control, event-driven programming, and variability points/hot-spots. This has provided a good starting point for introducing graphical user interface frameworks such as Java Swing and AWT, as the students are not overwhelmed by all the details of such frameworks right away but are given a conceptual road-map and practical experience that allow them to cope with the complexity.

  20. Innovation and entrepreneurship programs in US medical education: a landscape review and thematic analysis.

    Science.gov (United States)

    Niccum, Blake A; Sarker, Arnab; Wolf, Stephen J; Trowbridge, Matthew J

    2017-01-01

    Training in innovation and entrepreneurship (I&E) in medical education has become increasingly prevalent among medical schools to train students in complex problem solving and solution design. We aim to characterize I&E education in US allopathic medical schools to provide insight into the features and objectives of this growing field. I&E programs were identified in 2016 via structured searches of 158 US allopathic medical school websites. Program characteristics were identified from public program resources and structured phone interviews with program directors. Curricular themes were identified via thematic analysis of program resources, and themes referenced by >50% of programs were analyzed. Thirteen programs were identified. Programs had a median age of four years, and contained a median of 13 students. Programs were led by faculty from diverse professional backgrounds, and all awarded formal recognition to graduates. Nine programs spanned all four years of medical school and ten programs required a capstone project. Thematic analysis revealed seven educational themes (innovation, entrepreneurship, technology, leadership, healthcare systems, business of medicine, and enhanced adaptability) and two teaching method themes (active learning, interdisciplinary teaching) referenced by >50% of programs. The landscape of medical school I&E programs is rapidly expanding to address newfound skills needed by physicians due to ongoing changes in healthcare, but programs remain relatively few and small compared to class size. This landscape analysis is the first review of I&E in medical education and may contribute to development of a formal educational framework or competency model for current or future programs. AAMC: Association of American Medical Colleges; AMA: American Medical Association; I&E: Innovation and entrepreneurship.

  1. A Probabilistic Analysis Framework for Malicious Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Kammuller, Florian; Nemli, Ibrahim

    2015-01-01

    Malicious insider threats are difficult to detect and to mitigate. Many approaches for explaining behaviour exist, but there is little work to relate them to formal approaches to insider threat detection. In this work we present a general formal framework to perform analysis for malicious insider...

  2. Insight into dynamic genome imaging: Canonical framework identification and high-throughput analysis.

    Science.gov (United States)

    Ronquist, Scott; Meixner, Walter; Rajapakse, Indika; Snyder, John

    2017-07-01

The human genome is dynamic in structure, complicating researchers' attempts to fully understand it. Time series "Fluorescent in situ Hybridization" (FISH) imaging has increased our ability to observe genome structure, but due to cell type and experimental variability these data are often noisy and difficult to analyze. Furthermore, computational analysis techniques are needed for homolog discrimination and canonical framework detection in the case of time-series images. In this paper we introduce novel ideas for nucleus imaging analysis, present findings extracted using dynamic genome imaging, and propose an objective algorithm for high-throughput, time-series FISH imaging. While a canonical framework could not be detected beyond statistical significance in the analyzed dataset, a mathematical framework for detection has been outlined with extension to 3D image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. UNC-Utah NA-MIC Framework for DTI Fiber Tract Analysis

    Directory of Open Access Journals (Sweden)

    Audrey Rose Verde

    2014-01-01

Full Text Available Diffusion tensor imaging has become an important modality in the field of neuroimaging to capture changes in micro-organization and to assess white matter integrity or development. While a number of tractography toolsets exist, these usually lack tools for preprocessing or for analyzing diffusion properties along the fiber tracts. Currently, the field is in critical need of a coherent end-to-end toolset for performing an along-fiber tract analysis, accessible to non-technical neuroimaging researchers. The UNC-Utah NA-MIC DTI framework represents a coherent, open source, end-to-end toolset for atlas fiber tract based DTI analysis encompassing DICOM data conversion, quality control, atlas building, fiber tractography, fiber parameterization, and statistical analysis of diffusion properties. Most steps utilize graphical user interfaces (GUIs) to simplify interaction and provide an extensive DTI analysis framework for non-technical researchers/investigators. We illustrate the use of our framework on a small sample, cross sectional neuroimaging study of 8 healthy 1-year-old children from the Infant Brain Imaging Study (IBIS) Network. In this limited test study, we illustrate the power of our method by quantifying the diffusion properties at 1 year of age on the genu and splenium fiber tracts.

  4. Crisis Reliability Indicators Supporting Emergency Services (CRISES): A Framework for Developing Performance Measures for Behavioral Health Crisis and Psychiatric Emergency Programs.

    Science.gov (United States)

    Balfour, Margaret E; Tanner, Kathleen; Jurica, Paul J; Rhoads, Richard; Carson, Chris A

    2016-01-01

    Crisis and emergency psychiatric services are an integral part of the healthcare system, yet there are no standardized measures for programs providing these services. We developed the Crisis Reliability Indicators Supporting Emergency Services (CRISES) framework to create measures that inform internal performance improvement initiatives and allow comparison across programs. The framework consists of two components-the CRISES domains (timely, safe, accessible, least-restrictive, effective, consumer/family centered, and partnership) and the measures supporting each domain. The CRISES framework provides a foundation for development of standardized measures for the crisis field. This will become increasingly important as pay-for-performance initiatives expand with healthcare reform.

  5. Combinatorial-topological framework for the analysis of global dynamics

    Science.gov (United States)

    Bush, Justin; Gameiro, Marcio; Harker, Shaun; Kokubu, Hiroshi; Mischaikow, Konstantin; Obayashi, Ippei; Pilarczyk, Paweł

    2012-12-01

    We discuss an algorithmic framework based on efficient graph algorithms and algebraic-topological computational tools. The framework is aimed at automatic computation of a database of global dynamics of a given m-parameter semidynamical system with discrete time on a bounded subset of the n-dimensional phase space. We introduce the mathematical background, which is based upon Conley's topological approach to dynamics, describe the algorithms for the analysis of the dynamics using rectangular grids both in phase space and parameter space, and show two sample applications.
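The graph-algorithmic core described in this abstract can be sketched compactly: cover the phase space with rectangular boxes, outer-approximate the map as a directed graph on boxes, and extract recurrent sets as strongly connected components. The toy below (a logistic map and grid size chosen purely for illustration, not the authors' database software) shows the idea:

```python
# Sketch of the combinatorial step: outer-approximate a map on a rectangular
# grid, then take strongly connected components (SCCs) of the resulting box
# digraph as candidate recurrent ("Morse") sets.
N = 64
f = lambda x: 3.2 * x * (1.0 - x)  # toy dynamical system on [0, 1]

# Box i covers [i/N, (i+1)/N); every box overlapping the image interval of a
# box becomes a successor, giving an outer approximation of the dynamics.
adj = {i: set() for i in range(N)}
for i in range(N):
    lo, hi = sorted((f(i / N), f((i + 1) / N)))
    for j in range(max(0, int(lo * N)), min(N - 1, int(hi * N)) + 1):
        adj[i].add(j)

def sccs(adj):
    """Kosaraju's algorithm (iterative); returns a representative per node."""
    order, seen = [], set()
    for s in adj:                      # first pass: record finish order
        if s in seen:
            continue
        seen.add(s)
        stack = [(s, iter(adj[s]))]
        while stack:
            v, it = stack[-1]
            for w in it:
                if w not in seen:
                    seen.add(w)
                    stack.append((w, iter(adj[w])))
                    break
            else:
                order.append(v)
                stack.pop()
    radj = {v: set() for v in adj}     # second pass: reversed graph
    for v in adj:
        for w in adj[v]:
            radj[w].add(v)
    comp = {}
    for s in reversed(order):
        if s in comp:
            continue
        comp[s], todo = s, [s]
        while todo:
            v = todo.pop()
            for w in radj[v]:
                if w not in comp:
                    comp[w] = s
                    todo.append(w)
    return comp

comp = sccs(adj)
groups = {}
for v, c in comp.items():
    groups.setdefault(c, []).append(v)
# A component is recurrent if it has more than one box or a self-loop.
morse_sets = [sorted(g) for g in groups.values()
              if len(g) > 1 or g[0] in adj[g[0]]]
```

Repeating this over finer grids and over a grid in parameter space, and recording how the recurrent sets continue between parameter boxes, is the essence of the database construction the abstract describes.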

  7. Tracking and Analysis Framework (TAF) model documentation and user's guide

    Energy Technology Data Exchange (ETDEWEB)

Bloyd, C.; Camp, J.; Conzelmann, G. [and others]

    1996-12-01

With passage of the 1990 Clean Air Act Amendments, the United States embarked on a policy for controlling acid deposition that has been estimated to cost at least $2 billion. Title IV of the Act created a major innovation in environmental regulation by introducing market-based incentives - specifically, by allowing electric utility companies to trade allowances to emit sulfur dioxide (SO₂). The National Acid Precipitation Assessment Program (NAPAP) has been tasked by Congress to assess what Senator Moynihan has termed this "grand experiment." Such a comprehensive assessment of the economic and environmental effects of this legislation has been a major challenge. To help NAPAP face this challenge, the U.S. Department of Energy (DOE) has sponsored development of an integrated assessment model, known as the Tracking and Analysis Framework (TAF). This section summarizes TAF's objectives and its overall design.

  8. A Stochastic Hybrid Systems framework for analysis of Markov reward models

    International Nuclear Information System (INIS)

    Dhople, S.V.; DeVille, L.; Domínguez-García, A.D.

    2014-01-01

In this paper, we propose a framework to analyze Markov reward models, which are commonly used in system performability analysis. The framework builds on a set of analytical tools developed for a class of stochastic processes referred to as Stochastic Hybrid Systems (SHS). The state space of an SHS comprises: (i) a discrete state that describes the possible configurations/modes that a system can adopt, which includes the nominal (non-faulty) operational mode, but also those operational modes that arise due to component faults, and (ii) a continuous state that describes the reward. Discrete state transitions are stochastic, and governed by transition rates that are (in general) a function of time and the value of the continuous state. The evolution of the continuous state is described by a stochastic differential equation, and reward measures are defined as functions of the continuous state. Additionally, each transition is associated with a reset map that defines the mapping between the pre- and post-transition values of the discrete and continuous states; these mappings enable the definition of impulses and losses in the reward. The proposed SHS-based framework unifies the analysis of a variety of previously studied reward models. We illustrate the application of the framework to performability analysis via analytical and numerical examples.
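The kind of model this abstract describes can be illustrated with a minimal two-mode example: a component that fails and is repaired at exponential rates, with reward accruing at rate 1 while it is up. The sketch below is a Monte Carlo illustration under assumed rates, not the authors' analytical SHS machinery:

```python
import random

# Minimal two-mode Markov reward model, simulated by Monte Carlo: a component
# fails at rate lam_fail and is repaired at rate mu_repair; the reward (here,
# accumulated up-time) grows at rate 1 in the "up" mode and 0 in the "down"
# mode. The rates and horizon are assumed purely for illustration.
def simulate_reward(lam_fail=0.1, mu_repair=1.0, horizon=10_000.0, seed=1):
    rng = random.Random(seed)
    t, up, reward = 0.0, True, 0.0
    while t < horizon:
        rate = lam_fail if up else mu_repair
        dwell = min(rng.expovariate(rate), horizon - t)
        if up:
            reward += dwell  # reward accrues only while operational
        t += dwell
        up = not up          # stochastic mode transition
    return reward / horizon  # time-averaged reward = availability estimate

availability = simulate_reward()
# For this two-state chain, the analytic steady-state availability is
# mu_repair / (lam_fail + mu_repair), which the long-horizon simulation
# should approach.
```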

  9. A Framework for Understanding Physics Students' Computational Modeling Practices

    Science.gov (United States)

    Lunk, Brandon Robert

With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code.
Analysis of this video data showed that the students' programming practices were highly influenced by

  10. Talking Cure Models: A Framework of Analysis

    Directory of Open Access Journals (Sweden)

    Christopher Marx

    2017-09-01

Full Text Available Psychotherapy is commonly described as a “talking cure,” a treatment method that operates through linguistic action and interaction. The operative specifics of therapeutic language use, however, are insufficiently understood, mainly due to a multitude of disparate approaches that advance different notions of what “talking” means and what “cure” implies in the respective context. Accordingly, a clarification of the basic theoretical structure of “talking cure models,” i.e., models that describe therapeutic processes with a focus on language use, is a desideratum of language-oriented psychotherapy research. Against this background the present paper suggests a theoretical framework of analysis which distinguishes four basic components of “talking cure models”: (1) a foundational theory (which suggests how linguistic activity can affect and transform human experience), (2) an experiential problem state (which defines the problem or pathology of the patient), (3) a curative linguistic activity (which defines linguistic activities that are supposed to effectuate a curative transformation of the experiential problem state), and (4) a change mechanism (which defines the processes and effects involved in such transformations). The purpose of the framework is to establish a terminological foundation that allows for systematically reconstructing basic properties and operative mechanisms of “talking cure models.” To demonstrate the applicability and utility of the framework, five distinct “talking cure models” which spell out the details of curative “talking” processes in terms of (1) catharsis, (2) symbolization, (3) narrative, (4) metaphor, and (5) neurocognitive inhibition are introduced and discussed in terms of the framework components. In summary, we hope that our framework will prove useful for the objective of clarifying the theoretical underpinnings of language-oriented psychotherapy research and help to establish a more

  11. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  12. Evaluation of the suitability of root cause analysis frameworks for the investigation of community-acquired pressure ulcers: a systematic review and documentary analysis.

    Science.gov (United States)

    McGraw, Caroline; Drennan, Vari M

    2015-02-01

    To evaluate the suitability of root cause analysis frameworks for the investigation of community-acquired pressure ulcers. The objective was to identify the extent to which these frameworks take account of the setting where the ulcer originated as being the person's home rather than a hospital setting. Pressure ulcers involving full-thickness skin loss are increasingly being regarded as indicators of nursing patient safety failure, requiring investigation using root cause analysis frameworks. Evidence suggests that root cause analysis frameworks developed in hospital settings ignore the unique dimensions of risk in home healthcare settings. A systematic literature review and documentary analysis of frameworks used to investigate community-acquired grade three and four pressure ulcers by home nursing services in England. No published papers were identified for inclusion in the review. Fifteen patient safety investigative frameworks were collected and analysed. Twelve of the retrieved frameworks were intended for the investigation of community-acquired pressure ulcers; seven of which took account of the setting where the ulcer originated as being the patient's home. This study provides evidence to suggest that many of the root cause analysis frameworks used to investigate community-acquired pressure ulcers in England are unsuitable for this purpose. This study provides researchers and practitioners with evidence of the need to develop appropriate home nursing root cause analysis frameworks to investigate community-acquired pressure ulcers. © 2014 John Wiley & Sons Ltd.

  13. A simplified, result oriented supplier performance management system testing framework for SME

    Directory of Open Access Journals (Sweden)

    Satya Parkash Kaushik

    2014-06-01

Full Text Available Background: Supplier performance management continues to be a significant concern for small & medium enterprises (SMEs). How can small & medium enterprises better position themselves to check and sustain actual supplier performance improvement? A key framework is the establishment of a value-added supplier performance audit program that places significant emphasis on supplier performance controls. A value-added supplier audit program can help SMEs mitigate business and regulatory risk while reducing the cost of poor quality (COPQ). Thus a good supplier performance audit program is the cornerstone of supplier performance management integrity. Methods: By acknowledging and addressing the challenges to an effective supplier performance audit program, this paper proposes an objective framework for a supplier performance audit program, built on a strong, yet versatile statistical methodology: analysis of variance (ANOVA). This performance audit framework covers process definition, standardization, a review of the contemporary literature on ANOVA, and its practical application to the supplier performance scorecard of a reputed sports goods company in India. Results and conclusions: The advantages of this framework are that it simultaneously considers multiple suppliers' performance across multiple time frames and effectively identifies differences among suppliers in terms of their performance. Through this framework the organization will be able to increase the odds of a predictable and successful implementation of a value-added supplier performance audit.
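The ANOVA computation underpinning the proposed audit framework can be sketched as follows; the supplier names and defect counts are hypothetical, and in practice the resulting F statistic would be compared with the critical F value at the chosen significance level:

```python
from statistics import fmean

def one_way_anova_f(groups):
    """F statistic: between-group mean square over within-group mean square."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand = fmean([x for g in groups for x in g])
    ss_between = sum(len(g) * (fmean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - fmean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

# Hypothetical monthly defect counts for three suppliers.
scores = {
    "supplier_a": [10, 12, 11, 13],
    "supplier_b": [14, 16, 15, 17],
    "supplier_c": [20, 22, 21, 23],
}
f_stat = one_way_anova_f(list(scores.values()))
# Compare f_stat with the critical F(k-1, N-k) value at the chosen alpha to
# decide whether supplier performance differs across the scorecard period.
```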

  14. iOS game development : Mobile game development with Swift programming language and SceneKit framework

    OpenAIRE

    Koskenseppä, Juuso

    2016-01-01

The purpose of the thesis was to create an iOS game that could be deemed complete enough, so it could be published in Apple’s App Store. This meant fulfilling different guidelines specified by Apple. The project was carried out by using Apple’s new Swift programming language and the SceneKit framework, with an intention to see how they work for iOS game development. The immaturity of the Swift programming language led to several code rewrites, every time a newer Swift version was released. T...

  15. Energy pathway analysis - a hydrogen fuel cycle framework for system studies

    International Nuclear Information System (INIS)

    Badin, J.S.; Tagore, S.

    1997-01-01

    An analytical framework has been developed that can be used to estimate a range of life-cycle costs and impacts that result from the incremental production, storage, transport, and use of different fuels or energy carriers, such as hydrogen, electricity, natural gas, and gasoline. This information is used in a comparative analysis of energy pathways. The pathways provide the U.S. Department of Energy (DOE) with an indication of near-, mid-, and long-term technologies that have the greatest potential for advancement and can meet the cost goals. The methodology and conceptual issues are discussed. Also presented are results for selected pathways from the E3 (Energy, Economics, Emissions) Pathway Analysis Model. This model will be expanded to consider networks of pathways and to be compatible with a linear programming optimization processor. Scenarios and sets of constraints (energy demands, sources, emissions) will be defined so the effects on energy transformation activities included in the solution and on the total optimized system cost can be investigated. This evaluation will be used as a guide to eliminate technically feasible pathways if they are not cost effective or do not meet the threshold requirements for the market acceptance. (Author)
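The pathway cost roll-up this abstract describes can be sketched as a simple stage-wise sum; all pathway names and stage costs below are invented placeholders, not E3 model inputs or outputs:

```python
# Stage-wise cost roll-up in the spirit of the pathway comparison described
# above: each pathway is a chain of stages (production, storage, transport,
# end use) and its delivered cost is the sum of per-stage costs.
pathways = {
    "hydrogen_via_electrolysis": {"production": 12.0, "storage": 4.0,
                                  "transport": 3.5, "end_use": 2.0},
    "natural_gas":               {"production": 4.0, "storage": 1.0,
                                  "transport": 1.5, "end_use": 2.5},
    "gasoline":                  {"production": 6.0, "storage": 0.5,
                                  "transport": 1.0, "end_use": 3.0},
}

def delivered_cost(stages):
    """Total cost per delivered unit of energy along one pathway."""
    return sum(stages.values())

# Rank pathways by delivered cost; a linear-programming processor would later
# optimize such costs over networks of pathways under demand and emission
# constraints, as the abstract outlines.
ranked = sorted(pathways, key=lambda p: delivered_cost(pathways[p]))
```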

  16. Policy analysis and advocacy in nursing education: the Nursing Education Council of British Columbia framework.

    Science.gov (United States)

    Duncan, Susan M; Thorne, Sally; Van Neste-Kenny, Jocelyne; Tate, Betty

    2012-05-01

Academic nursing leaders play a crucial role in the policy context for nursing education. Effectiveness in this role requires that they work together in presenting nursing education issues from a position of strength, informed by a critical analysis of policy pertaining to the delivery of quality nursing education and scholarship. We describe a collective process of dialog and critical analysis whereby nurse leaders in one Canadian province addressed pressing policy issues facing governments, nursing programs, faculty, and students. Consensus among academic nurse leaders, formalized through the development of a policy action framework, has enabled us to take a stand, at times highly contested, in the politicized arena of the nursing shortage. We present the components of a policy action framework for nursing education and share examples of how we have used a critical approach to analyze and frame policy issues in nursing education for inclusion on policy agendas. Our belief that our work has influenced provincial and national thinking about policy in nursing education is the foundation of our conclusion that political presence and shared strategy among academic nursing leaders is undeniably critical in the global context of nursing today. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.

  17. Expenditure Analysis of HIV Testing and Counseling Services Using the Cascade Framework in Vietnam.

    Directory of Open Access Journals (Sweden)

    Van Thu Nguyen

Full Text Available Currently, HIV testing and counseling (HTC) services in Vietnam are primarily funded by international sources. However, international funders are now planning to withdraw their support and the Government of Vietnam (GVN) is seeking to identify domestic funding and generate client fees to continue services. A clear understanding of the cost to sustain current HTC services is becoming increasingly important to facilitate planning that can lead to making HTC and other HIV services more affordable and sustainable in Vietnam. The objectives of this analysis were to provide a snapshot of current program costs to achieve key program outcomes including (1) testing and identifying PLHIV unaware of their HIV status and (2) successfully enrolling HIV(+) clients in care. We reviewed expenditure data reported by 34 HTC sites in nine Vietnamese provinces over a one-year period from October 2012 to September 2013. Data on program outcomes were extracted from the HTC database of 42,390 client records. Analysis was carried out from the service providers' perspective. The mean expenditure for a single client provided HTC services (testing, receiving results, and referral for care/treatment) was US $7.6. The unit expenditure per PLHIV identified through these services varied widely from US $22.8 to $741.5 (median: $131.8). Excluding repeat tests, the range for expenditure to newly diagnose a PLHIV was even wider (from US $30.8 to $1483.0). The mean expenditure for one successfully referred HIV client to care services was US $466.6. Personnel costs contributed most to the total cost. Our analysis found a wide range of expenditures by site for achieving the same outcomes. Re-designing systems to provide services at the lowest feasible cost is essential to making HIV services more affordable and treatment for prevention programs feasible in Vietnam. The analysis also found that understanding the determinants and reasons for variance in service costs by site is an important

  18. Measuring the performance of vaccination programs using cross-sectional surveys: a likelihood framework and retrospective analysis.

    Directory of Open Access Journals (Sweden)

    Justin Lessler

    2011-10-01

Full Text Available The performance of routine and supplemental immunization activities is usually measured by the administrative method: dividing the number of doses distributed by the size of the target population. This method leads to coverage estimates that are sometimes impossible (e.g., vaccination of 102% of the target population), and are generally inconsistent with the proportion found to be vaccinated in Demographic and Health Surveys (DHS). We describe a method that estimates the fraction of the population accessible to vaccination activities, as well as within-campaign inefficiencies, thus providing a consistent estimate of vaccination coverage. We developed a likelihood framework for estimating the effective coverage of vaccination programs using cross-sectional surveys of vaccine coverage combined with administrative data. We applied our method to measles vaccination in three African countries: Ghana, Madagascar, and Sierra Leone, using data from each country's most recent DHS survey and administrative coverage data reported to the World Health Organization. We estimate that 93% (95% CI: 91, 94) of the population in Ghana was ever covered by any measles vaccination activity, 77% (95% CI: 78, 81) in Madagascar, and 69% (95% CI: 67, 70) in Sierra Leone. "Within-activity" inefficiencies were estimated to be low in Ghana, and higher in Sierra Leone and Madagascar. Our model successfully fits age-specific vaccination coverage levels seen in DHS data, which differ markedly from those predicted by naïve extrapolation from country-reported and World Health Organization-adjusted vaccination coverage. Combining administrative data with survey data substantially improves estimates of vaccination coverage. Estimates of the inefficiency of past vaccination activities and the proportion not covered by any activity allow us to more accurately predict the results of future activities and provide insight into the ways in which vaccination programs are failing to meet their
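A stripped-down version of the likelihood idea can be sketched as follows. Suppose a fraction `p_acc` of the population is accessible and each campaign independently reaches an accessible person with probability `e`; fitting both parameters to survey counts from cohorts exposed to different numbers of campaigns is a toy stand-in for the paper's model, with all parameter names and data hypothetical:

```python
import math

# Toy likelihood: coverage after k campaigns is p_acc * (1 - (1 - e)**k).
# Fit (p_acc, e) by grid-search maximum likelihood to hypothetical survey
# counts (k campaigns, x vaccinated, n surveyed) from three cohorts.
surveys = [(1, 630, 1000), (2, 819, 1000), (3, 876, 1000)]

def log_lik(p_acc, e):
    ll = 0.0
    for k, x, n in surveys:
        cov = p_acc * (1.0 - (1.0 - e) ** k)
        cov = min(max(cov, 1e-9), 1.0 - 1e-9)  # guard the log
        ll += x * math.log(cov) + (n - x) * math.log(1.0 - cov)
    return ll

grid = [i / 100 for i in range(1, 100)]
p_hat, e_hat = max(((p, e) for p in grid for e in grid),
                   key=lambda pe: log_lik(*pe))
# These counts were generated from p_acc = 0.9, e = 0.7, so the grid search
# should recover estimates close to the generating values.
```

Cohorts exposed to different numbers of campaigns are what make the accessible fraction and the per-campaign inefficiency separately identifiable, mirroring the age-specific fits described in the abstract.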

  19. Using a Mixed-Methods RE-AIM Framework to Evaluate Community Health Programs for Older Latinas.

    Science.gov (United States)

    Schwingel, Andiara; Gálvez, Patricia; Linares, Deborah; Sebastião, Emerson

    2017-06-01

    This study used the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework to evaluate a promotora-led community health program designed for Latinas ages 50 and older that sought to improve physical activity, nutrition, and stress management. A mixed-methods evaluation approach was administered at participant and organizational levels with a focus on the efficacy, adoption, implementation, and maintenance components of the RE-AIM theoretical model. The program was shown to be effective at improving participants' eating behaviors, increasing their physical activity levels, and lowering their depressive symptoms. Promotoras felt motivated and sufficiently prepared to deliver the program. Some implementation challenges were reported. More child care opportunities and an increased focus on mental well-being were suggested. The promotora delivery model has promise for program sustainability with both promotoras and participants alike expressing interest in leading future programs.

  20. An in-depth analysis of theoretical frameworks for the study of care coordination

    Directory of Open Access Journals (Sweden)

    Sabine Van Houdt

    2013-06-01

Full Text Available Introduction: Complex chronic conditions often require long-term care from various healthcare professionals. Thus, maintaining quality care requires care coordination. Concepts for the study of care coordination require clarification to develop, study and evaluate coordination strategies. In 2007, the Agency for Healthcare Research and Quality defined care coordination and proposed five theoretical frameworks for exploring care coordination. This study aimed to update current theoretical frameworks and clarify key concepts related to care coordination. Methods: We performed a literature review to update existing theoretical frameworks. An in-depth analysis of these theoretical frameworks was conducted to formulate key concepts related to care coordination. Results: Our literature review found seven previously unidentified theoretical frameworks for studying care coordination. The in-depth analysis identified fourteen key concepts that the theoretical frameworks addressed. These were ‘external factors’, ‘structure’, ‘tasks characteristics’, ‘cultural factors’, ‘knowledge and technology’, ‘need for coordination’, ‘administrative operational processes’, ‘exchange of information’, ‘goals’, ‘roles’, ‘quality of relationship’, ‘patient outcome’, ‘team outcome’, and ‘(inter)organizational outcome’. Conclusion: These 14 interrelated key concepts provide a base to develop or choose a framework for studying care coordination. The relational coordination theory and the multi-level framework are interesting as these are the most comprehensive.

  1. The Soldier-Cyborg Transformation: A Framework for Analysis of Social and Ethical Issues of Future Warfare

    Science.gov (United States)

    1998-05-26

USAWC Strategy Research Project: The Soldier-Cyborg Transformation: A Framework for Analysis of Social and Ethical Issues of Future Warfare. Donald A. Gagliano, M.D.

  2. iOS Game Development using SpriteKit Framework with Swift Programming Language

    OpenAIRE

    Gurung, Lal

    2016-01-01

iOS is a mobile operating system for Apple manufactured phones and tablets. Mobile gaming industries are growing very fast, and compatibility with iOS is becoming very popular among game developers. The aim of this Bachelor’s thesis was to find the best available game development tools for the iOS platform. The 2D game named Lapland was developed using Apple’s own native framework, SpriteKit. The game was written in the Swift programming language. The combination of SpriteKit and Swift...

  3. An Analysis of Massachusetts Department of Elementary and Secondary Education Vocational Technical Education Framework for Culinary Arts and Its Effectiveness on Students Enrolled in Post-Secondary Culinary Programs

    Science.gov (United States)

    D'Addario, Albert S.

    2011-01-01

    This field-based action research practicum investigated how students who have completed culinary training programs in Massachusetts public secondary schools perform in post-secondary coursework. The Department of Elementary and Secondary Education has developed the Vocational Technical Education (VTE) Framework for Culinary Arts that outlines…

  4. A benchmarking program to reduce red blood cell outdating: implementation, evaluation, and a conceptual framework.

    Science.gov (United States)

    Barty, Rebecca L; Gagliardi, Kathleen; Owens, Wendy; Lauzon, Deborah; Scheuermann, Sheena; Liu, Yang; Wang, Grace; Pai, Menaka; Heddle, Nancy M

    2015-07-01

    Benchmarking is a quality improvement tool that compares an organization's performance to that of its peers for selected indicators, to improve practice. Processes to develop evidence-based benchmarks for red blood cell (RBC) outdating in Ontario hospitals, based on RBC hospital disposition data from Canadian Blood Services, have been previously reported. These benchmarks were implemented in 160 hospitals provincewide with a multifaceted approach, which included hospital education, inventory management tools and resources, summaries of best practice recommendations, recognition of high-performing sites, and audit tools on the Transfusion Ontario website (http://transfusionontario.org). In this study we describe the implementation process and the impact of the benchmarking program on RBC outdating. A conceptual framework for continuous quality improvement of a benchmarking program was also developed. The RBC outdating rate for all hospitals trended downward continuously from April 2006 to February 2012, irrespective of hospitals' transfusion rates or their distance from the blood supplier. The highest annual outdating rate was 2.82%, at the beginning of the observation period. Each year brought further reductions, with a nadir outdating rate of 1.02% achieved in 2011. The key elements of the successful benchmarking strategy included dynamic targets, a comprehensive and evidence-based implementation strategy, ongoing information sharing, and a robust data system to track information. The Ontario benchmarking program for RBC outdating resulted in continuous and sustained quality improvement. Our conceptual iterative framework for benchmarking provides a guide for institutions implementing a benchmarking program. © 2015 AABB.

  5. Analysis of Worldwide Regulatory Framework for On-Line Maintenance

    International Nuclear Information System (INIS)

    Ahn, Sang Kyu; Oh, Kyu Myung; Lee, Chang Ju

    2010-01-01

    With increasing economic pressure and the potential to shorten outage times in deregulated electricity markets worldwide, licensees are motivated to perform a growing share of maintenance as on-line maintenance (OLM), that is, planned maintenance of nuclear reactor facilities, including structures, systems, and components (SSCs), during power operation. Korea faces a similar situation and therefore needs to establish its own regulatory framework for OLM. A few years ago, foreign practices related to OLM were surveyed by the Working Group on Inspection Practices (WGIP) of the OECD/NEA/CNRA. This paper analyzes those survey results, together with more recent information on each country's status, as input to a Korean regulatory framework for OLM. From the analysis, several points that must be addressed when establishing such a framework are suggested.

  6. Fit Analysis of Different Framework Fabrication Techniques for Implant-Supported Partial Prostheses.

    Science.gov (United States)

    Spazzin, Aloísio Oro; Bacchi, Atais; Trevisani, Alexandre; Farina, Ana Paula; Dos Santos, Mateus Bertolini

    2016-01-01

    This study evaluated the vertical misfit of implant-supported frameworks made using different techniques to obtain passive fit. Thirty 3-unit fixed partial dentures were fabricated in cobalt-chromium alloy (n = 10 per group) using three fabrication methods: one-piece casting, framework cemented on prepared abutments, and laser welding. The vertical misfit between the frameworks and the abutments was evaluated with an optical microscope using the single-screw test. Data were analyzed using one-way analysis of variance and the Tukey test (α = .05). The one-piece cast frameworks presented significantly higher vertical misfit values than those found for the cemented and laser-welded frameworks (P < .05). Laser welding and cementing the framework on prepared abutments are effective techniques to improve the adaptation of 3-unit implant-supported prostheses, and both produced similar fit.
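    The statistical recipe the abstract describes (one-way ANOVA followed by Tukey's test at α = .05) can be reproduced on synthetic data. The sketch below assumes SciPy ≥ 1.8 for `tukey_hsd`; the misfit values are illustrative, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic vertical-misfit values (micrometres) for the three techniques.
one_piece = rng.normal(200, 20, 10)  # one-piece casting
cemented = rng.normal(90, 15, 10)    # cemented on prepared abutments
welded = rng.normal(85, 15, 10)      # laser welding

# One-way ANOVA tests whether any group means differ.
f_stat, p_value = stats.f_oneway(one_piece, cemented, welded)
print(f"ANOVA: F = {f_stat:.1f}, p = {p_value:.2g}")

# Tukey's HSD then identifies which specific pairs differ at alpha = .05.
tukey = stats.tukey_hsd(one_piece, cemented, welded)
print(tukey)
```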

  7. Structural Analysis in a Conceptual Design Framework

    Science.gov (United States)

    Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.

    2012-01-01

    Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.

  8. A Practical Framework for Evaluating Health Services Management Educational Program: The Application of The Mixed-Method Sequential Explanatory Design

    Directory of Open Access Journals (Sweden)

    Bazrafshan Azam

    2015-07-01

    Full Text Available Introduction: Health services managers are responsible for improving the efficiency and quality of healthcare service delivery. In this regard, Health Services Management (HSM) programs have been widely established to provide health providers with skilled, professional managers to address those needs. It is therefore important to ascertain the quality of these programs. The purpose of this study was to synthesize and develop a framework to evaluate the quality of the HSM program at Kerman University of Medical Sciences. Methods: This study followed a mixed-method sequential explanatory approach in which data were collected through a CIPP survey and semi-structured interviews. In phase 1, participants included 10 faculty members, 64 students and 90 alumni. In phase 2, in-depth semi-structured interviews were conducted, using purposeful sampling, with 27 participants to better understand their perceptions of the HSM program. All interviews were audio-taped and transcribed verbatim. NVivo N8 was used to analyze the qualitative data and extract the themes. Results: The data analysis revealed both positive and negative attitudes toward the HSM program. According to the CIPP survey, program objectives (74%), curriculum content (59.5%) and graduate skills (79%) were the major sources of dissatisfaction. However, most respondents (n=48) reported that the classes are well equipped, and many (n=41) reported that learning resources are well prepared. Most respondents (n=41) reported that the students are actively involved in classroom activities. The majority of respondents (n=43) pointed out that the instructors implemented appropriate teaching strategies. Qualitative analysis of interviews revealed that a regular community needs assessment, content revision and directing attention to graduate skills and expertise are the key solutions to improve the program’s quality. Conclusion: This study revealed to what extent the HSM program objectives are being

  9. Deterministic Design Optimization of Structures in OpenMDAO Framework

    Science.gov (United States)

    Coroneos, Rula M.; Pai, Shantaram S.

    2012-01-01

    Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process of analyzing and optimizing structural components using the framework's structural solvers and several gradient-based optimizers, along with a multi-objective genetic algorithm. For comparison purposes, the same structural components were analyzed and optimized using CometBoards, a code developed at NASA GRC. The reliability and efficiency of the OpenMDAO framework were compared and are reported here.

  10. The Tracking and Analysis Framework (TAF): A tool for the integrated assessment of acid deposition

    International Nuclear Information System (INIS)

    Bloyd, C.N.; Henrion, M.; Marnicio, R.J.

    1995-01-01

    A major challenge that has faced policy makers concerned with acid deposition is obtaining an integrated view of the underlying science related to acid deposition. In response to this challenge, the US Department of Energy is sponsoring the development of an integrated Tracking and Analysis Framework (TAF) which links together the key acid deposition components of emissions, air transport, atmospheric deposition, and aquatic effects in a single modeling structure. The goal of TAF is to integrate credible models of the scientific and technical issues into an assessment framework that can directly address key policy issues, and in doing so act as a bridge between science and policy. Key objectives of TAF are to support coordination and communication among scientific researchers; to support communication with policy makers and provide rapid responses to newly emerging policy issues; and to provide guidance for prioritizing research programs. This paper briefly describes how TAF was formulated to meet those objectives and the underlying principles that form the basis for its development.

  11. C++QEDv2 Milestone 10: A C++/Python application-programming framework for simulating open quantum dynamics

    Science.gov (United States)

    Sandner, Raimar; Vukics, András

    2014-09-01

    The v2 Milestone 10 release of C++QED is primarily a feature release, which also corrects some problems of the previous release, especially as regards the build system. The adoption of C++11 features has led to many simplifications in the codebase. A full doxygen-based API manual [1] is now provided together with updated user guides. A largely automated, versatile new testsuite directed both towards computational and physics features allows for quickly spotting arising errors. The states of trajectories are now savable and recoverable with full binary precision, allowing for trajectory continuation regardless of evolution method (single/ensemble Monte Carlo wave-function or Master equation trajectory). As the main new feature, the framework now presents Python bindings to the highest-level programming interface, so that actual simulations for given composite quantum systems can now be performed from Python. Catalogue identifier: AELU_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELU_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: yes No. of lines in distributed program, including test data, etc.: 492422 No. of bytes in distributed program, including test data, etc.: 8070987 Distribution format: tar.gz Programming language: C++/Python. Computer: i386-i686, x86_64. Operating system: In principle cross-platform, as yet tested only on UNIX-like systems (including Mac OS X). RAM: The framework itself takes about 60MB, which is fully shared. The additional memory taken by the program which defines the actual physical system (script) is typically less than 1MB. The memory storing the actual data scales with the system dimension for state-vector manipulations, and the square of the dimension for density-operator manipulations. This might easily be GBs, and often the memory of the machine limits the size of the simulated system. Classification: 4.3, 4.13, 6.2. External routines: Boost C

  12. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper provides a description of the analysis framework and evaluation metrics, example performance-prediction results, and the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.

  13. Framework for sequential approximate optimization

    NARCIS (Netherlands)

    Jacobs, J.H.; Etman, L.F.P.; Keulen, van F.; Rooda, J.E.

    2004-01-01

    An object-oriented framework for Sequential Approximate Optimization (SAO) is proposed. The framework aims to provide an open environment for the specification and implementation of SAO strategies. The framework is based on the Python programming language and contains a toolbox of Python
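    The abstract does not spell out a specific SAO strategy, so the following is a hedged toy illustration of the general idea: an "expensive" objective is replaced by a local linear approximation that is minimized within move limits, and the move limits shrink when the step stops improving:

```python
import numpy as np

# Toy Sequential Approximate Optimization loop on a cheap stand-in for an
# expensive simulation. All choices (linear approximation, move limits,
# shrink factor) are illustrative assumptions, not the paper's framework.
def expensive_objective(x):
    return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2

def gradient(f, x, h=1e-6):
    """Central-difference gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

x = np.array([0.0, 0.0])
move_limit = 1.0
for _ in range(50):
    g = gradient(expensive_objective, x)
    # The linear subproblem min g.(x_new - x) s.t. |x_new - x| <= move_limit
    # has the closed-form solution "step to the move limit against g".
    step = -move_limit * np.sign(g)
    step[g == 0] = 0.0
    candidate = x + step
    if expensive_objective(candidate) < expensive_objective(x):
        x = candidate            # accept the approximate subproblem solution
    else:
        move_limit *= 0.5        # shrink the move limits and retry
    if move_limit < 1e-6:
        break
print(x)  # approaches the true minimum at (3, -1)
```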

  14. Metacognition and evidence analysis instruction: an educational framework and practical experience.

    Science.gov (United States)

    Parrott, J Scott; Rubinstein, Matthew L

    2015-08-21

    The role of metacognitive skills in the evidence analysis process has received little attention in the research literature. While the steps of the evidence analysis process are well defined, the role of higher-level cognitive operations (metacognitive strategies) in integrating the steps of the process is not well understood. In part, this is because it is not clear where and how metacognition is implicated in the evidence analysis process nor how these skills might be taught. The purposes of this paper are to (a) suggest a model for identifying critical thinking and metacognitive skills in evidence analysis instruction grounded in current educational theory and research and (b) demonstrate how freely available systematic review/meta-analysis tools can be used to focus on higher-order metacognitive skills, while providing a framework for addressing common student weaknesses. The final goal of this paper is to provide an instructional framework that can generate critique and elaboration while providing the conceptual basis and rationale for future research agendas on this topic.

  15. A planning framework for transferring building energy technologies

    Energy Technology Data Exchange (ETDEWEB)

    Farhar, B C; Brown, M A; Mohler, B L; Wilde, M; Abel, F H

    1990-07-01

    Accelerating the adoption of new and existing cost-effective technologies has significant potential to reduce the energy consumed in US buildings. This report presents key results of an interlaboratory technology transfer planning effort in support of the US Department of Energy's Office of Building Technologies (OBT). A guiding assumption for planning was that OBT's R&D program should forge linkages with existing programs whose goals involved enhancing energy efficiency in buildings. An ad hoc Technology Transfer Advisory Group reviewed the existing analysis and technology transfer program, brainstormed technology transfer approaches, interviewed DOE program managers, identified applicable research results, and developed a framework that management could use in deciding on the best investments of technology transfer resources. Representatives of 22 organizations were interviewed on their views of the potential for transferring energy efficiency technologies through active linking with OBT. The report describes these programs and the interview results; outlines OBT tools, technologies, and practices to be transferred; defines OBT audiences; identifies technology transfer functions and presents a framework built from those functions and audiences; presents some 60 example technology transfer activities; and documents the Advisory Group's recommendations. 37 refs., 3 figs., 12 tabs.

  16. 77 FR 11785 - Energy Conservation Program: Public Meeting and Availability of the Framework Document for High...

    Science.gov (United States)

    2012-02-28

    ... standards for high-intensity discharge (HID) lamps. Accordingly, DOE will hold a public meeting to discuss..._standards/commercial/high_intensity_discharge_lamps.html . DATES: The Department will hold a public meeting... Technologies Program, Mailstop EE-2J, Framework Document for High-Intensity Discharge Lamps, EERE-2010-BT-STD...

  17. Data collection and analysis to improve the quality and effectiveness of recycling education programs

    Energy Technology Data Exchange (ETDEWEB)

    Shapek, Raymond A [Department of Public Administration, University of Central Florida, Orlando, FL (United States)

    1993-10-01

    Although recycling participation rates and the success of recycling programs are determined by a multitude of social, economic, and political factors, community participation is paramount in determining whether recycling programs will accomplish their objectives. Research has tended to focus on the scientific aspects of waste reduction and management, but attention is now turning to cost-effectiveness. A considerable amount of money has been spent on recycling education/information programs with little or no measurement of the effects of these expenditures. This article reports the results of a survey of Florida's 67 counties to determine whether advertising media choices for recycling education/information programs were related to recycling rates. A mathematical model was developed that indicated correlations as well as predictability. Florida's recycling information/education effort is still new and does not yet provide a sufficient historical record of trends. This research provided some trend information through regression analysis techniques but, more importantly, suggests a framework for future analysis. It also revealed deficiencies in current county and state data collection methods. Some of the lessons learned will permit more accurate charting of the long-term results of dollar expenditures on media advertising for each county.
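    The kind of spend-versus-participation model the article describes can be sketched with a simple correlation and linear fit. The county figures below are hypothetical, not the Florida survey data:

```python
import numpy as np

# Hypothetical county data: per-capita spending on recycling education
# advertising (dollars) and observed recycling participation rate (%).
spend = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
rate = np.array([12.0, 15.0, 17.0, 22.0, 24.0, 25.0, 29.0, 31.0])

r = np.corrcoef(spend, rate)[0, 1]             # strength of association
slope, intercept = np.polyfit(spend, rate, 1)  # simple predictive model
predicted = slope * 5.0 + intercept            # extrapolate to $5 per capita
print(f"r = {r:.2f}, predicted rate at $5/capita = {predicted:.1f}%")
```

    A fuller analysis would regress on several media channels at once, which is where the multi-county survey data becomes essential.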

  18. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
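    The landscape-based cumulative-effects step, followed by a scenario prediction, might look like the following sketch. The stressor names, coefficients, and data are all synthetic stand-ins, not the study's watershed data:

```python
import numpy as np

# Fit a multiple linear regression of an aquatic-condition score on two
# land-use stressors, then predict the score under a future scenario.
rng = np.random.default_rng(1)
mining = rng.uniform(0, 40, 30)       # % of catchment mined (synthetic)
residential = rng.uniform(0, 25, 30)  # % residential cover (synthetic)
condition = 90 - 1.2 * mining - 0.8 * residential + rng.normal(0, 2, 30)

# Ordinary least squares via the normal equations (numpy lstsq).
X = np.column_stack([np.ones_like(mining), mining, residential])
coef, *_ = np.linalg.lstsq(X, condition, rcond=None)

# Scenario analysis: project the fitted model onto a proposed land use.
scenario = np.array([1.0, 30.0, 10.0])  # 30% mined, 10% residential
print(f"predicted condition score: {scenario @ coef:.1f}")
```

    Managers can then compare predicted scores across candidate development scenarios before permitting decisions are made.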

  19. A new kernel discriminant analysis framework for electronic nose recognition

    International Nuclear Information System (INIS)

    Zhang, Lei; Tian, Feng-Chun

    2014-01-01

    Highlights: • This paper proposes a new discriminant analysis framework for feature extraction and recognition. • The principle of the proposed NDA is derived mathematically. • The NDA framework is coupled with kernel PCA for classification. • The proposed KNDA is compared with state-of-the-art e-Nose recognition methods. • The proposed KNDA shows the best performance in e-Nose experiments. - Abstract: Electronic nose (e-Nose) technology based on metal oxide semiconductor gas sensor arrays is widely studied for the detection of gas components. This paper proposes a new discriminant analysis framework (NDA) for dimension reduction and e-Nose recognition. In the NDA, between-class and within-class Laplacian scatter matrices are designed from sample to sample to characterize between-class separability and within-class compactness, by seeking a discriminant matrix that simultaneously maximizes the between-class Laplacian scatter and minimizes the within-class Laplacian scatter. Exploiting the linear separability of the high-dimensional kernel mapping space and the dimension reduction of principal component analysis (PCA), an effective kernel PCA plus NDA method (KNDA) is proposed for rapid detection of gas mixture components by an e-Nose. The NDA framework is derived in this paper, as are the specific implementations of the proposed KNDA method in the training and recognition process. The KNDA is examined on e-Nose datasets of six kinds of gas components and compared with state-of-the-art e-Nose classification methods. Experimental results demonstrate that the proposed KNDA method shows the best performance, with an average recognition rate of 94.14% and a total recognition rate of 95.06%, making it a promising approach to feature extraction and multi-class recognition in e-Nose applications.
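    The paper's NDA derivation is not reproduced here; as a hedged sketch of the kernel-PCA front end it builds on, the following numpy code projects synthetic two-class sensor data onto the leading kernel principal components and classifies by nearest class centroid (a simple stand-in for the discriminant step):

```python
import numpy as np

# Synthetic two-class "sensor array" data; real e-Nose features differ.
rng = np.random.default_rng(2)
gas_a = rng.normal([0, 0, 0], 0.3, (20, 3))  # responses to gas A
gas_b = rng.normal([2, 2, 2], 0.3, (20, 3))  # responses to gas B
X = np.vstack([gas_a, gas_b])
y = np.array([0] * 20 + [1] * 20)

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf_kernel(X, X)
n = len(X)
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J                         # double-center the kernel matrix
vals, vecs = np.linalg.eigh(Kc)        # ascending eigenvalues
top = vecs[:, -2:] * np.sqrt(np.abs(vals[-2:]))  # 2 leading components

# Nearest-centroid classification in the reduced space.
centroids = np.array([top[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((top[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```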

  20. A Framework-Based Environment for Object-Oriented Scientific Codes

    Directory of Open Access Journals (Sweden)

    Robert A. Ballance

    1993-01-01

    Full Text Available Frameworks are reusable object-oriented designs for domain-specific programs. In our estimation, frameworks are the key to productivity and reuse. However, frameworks require increased support from the programming environment. A framework-based environment must include design aides and project browsers that can mediate between the user and the framework. A framework-based approach also places new requirements on conventional tools such as compilers. This article explores the impact of object-oriented frameworks upon a programming environment, in the context of object-oriented finite element and finite difference codes. The role of tools such as design aides and project browsers is discussed, and the impact of a framework-based approach upon compilers is examined. Examples are drawn from our prototype C++ based environment.

  1. An analysis of a national strategic framework to promote tourism ...

    African Journals Online (AJOL)

    An analysis of a national strategic framework to promote tourism, leisure, sport and ... is to highlight the extent to which selected macro policy components namely, ... tourism growth, tourism safety and security, environmental management and ...

  2. A framework for smartphone-enabled, patient-generated health data analysis

    Directory of Open Access Journals (Sweden)

    Shreya S. Gollamudi

    2016-08-01

    Full Text Available Background: Digital medicine and smartphone-enabled health technologies provide a novel source of human health and human biology data. However, in part due to its intricacies, few methods have been established to analyze and interpret data in this domain. We previously conducted a six-month interventional trial examining the efficacy of a comprehensive smartphone-based health monitoring program for individuals with chronic disease. This included 38 individuals with hypertension who recorded 6,290 blood pressure readings over the trial. Methods: In the present study, we provide a hypothesis testing framework for unstructured time series data, typical of patient-generated mobile device data. We used a mixed model approach for unequally spaced repeated measures using autoregressive and generalized autoregressive models, and applied this to the blood pressure data generated in this trial. Results: We were able to detect a decrease of roughly 2 mmHg in both systolic and diastolic blood pressure over the course of the trial, despite considerable intra- and inter-individual variation. Furthermore, by supplementing this finding with a sequential analysis approach, we observed this result over three months prior to the official study end, highlighting the effectiveness of leveraging the digital nature of this data source to form timely conclusions. Conclusions: Health data generated through the use of smartphones and other mobile devices allow individuals the opportunity to make informed health decisions, and provide researchers the opportunity to address innovative health and biology questions. The hypothesis testing framework we present can be applied in future studies utilizing digital medicine technology or implemented in the technology itself to support the quantified self.
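    A greatly simplified stand-in for the mixed-model analysis (per-participant least-squares slopes, then averaged) illustrates how such a decline could be estimated from unequally spaced readings. The simulation only mimics the trial's shape: 38 participants, roughly six months, a built-in ~2 mmHg decline:

```python
import numpy as np

# Simulate unequally spaced systolic readings per participant, fit each
# participant's trend against time, and average the per-person changes.
rng = np.random.default_rng(3)
trial_changes = []
for _ in range(38):                       # 38 hypertensive participants
    days = np.sort(rng.uniform(0, 180, rng.integers(50, 200)))
    baseline = rng.normal(140, 10)        # inter-individual variation
    systolic = baseline - (2.0 / 180) * days + rng.normal(0, 8, len(days))
    slope, _ = np.polyfit(days, systolic, 1)
    trial_changes.append(slope * 180)     # change over the whole trial
mean_change = float(np.mean(trial_changes))
print(f"estimated change over trial: {mean_change:.2f} mmHg")
```

    A mixed model with autoregressive errors, as the paper uses, additionally accounts for within-person serial correlation that this naive averaging ignores.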

  3. Design and implementation of the reconstruction software for the photon multiplicity detector in object oriented programming framework

    International Nuclear Information System (INIS)

    Chattopadhayay, Subhasis; Ghosh, Premomoy; Gupta, R.; Mishra, D.; Phatak, S.C.; Sood, G.

    2002-01-01

    The high-granularity photon multiplicity detector (PMD) is scheduled to take data at the Relativistic Heavy Ion Collider (RHIC) this year. A detailed scheme has been designed and implemented in an object-oriented programming framework using C++ for the monitoring and reconstruction of PMD data

  4. A framework for the economic analysis of data collection methods for vital statistics.

    Science.gov (United States)

    Jimenez-Soto, Eliana; Hodge, Andrew; Nguyen, Kim-Huong; Dettrick, Zoe; Lopez, Alan D

    2014-01-01

    Over recent years there has been a strong movement towards the improvement of vital statistics and other types of health data that inform evidence-based policies. Collecting such data is not cost free. To date there is no systematic framework to guide investment decisions on methods of data collection for vital statistics or health information in general. We developed a framework to systematically assess the comparative costs and outcomes/benefits of the various data collection methods (DCMs) for vital statistics. The proposed framework is four-pronged and utilises two major economic approaches to systematically assess the available DCMs: cost-effectiveness analysis and efficiency analysis. We built a stylised example of a hypothetical low-income country to perform a simulation exercise illustrating an application of the framework. Using simulated data, the results from the stylised example show that the rankings of the DCMs are not affected by the choice between cost-effectiveness and efficiency analysis; however, the rankings are affected by how quantities are measured. There have been several calls for global improvements in collecting usable data, including vital statistics, from health information systems to inform public health policies. Ours is the first study to propose a systematic framework to assist countries in undertaking an economic evaluation of DCMs. Despite numerous challenges, we demonstrate that a systematic assessment of the outputs and costs of DCMs is not only necessary but also feasible. The proposed framework is general enough to be easily extended to other areas of health information.
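    The cost-effectiveness leg of such a framework reduces to a cost-per-outcome ranking. A minimal sketch with invented figures for three hypothetical data collection methods:

```python
# Invented costs and yields for three hypothetical DCMs; a real analysis
# would also weigh data quality, timeliness, and coverage.
methods = {
    "civil_registration":  {"cost": 5_000_000, "events_captured": 900_000},
    "household_survey":    {"cost": 1_200_000, "events_captured": 150_000},
    "sample_registration": {"cost": 2_000_000, "events_captured": 450_000},
}

# Cost per correctly registered vital event, lower is better.
cost_per_event = {
    name: m["cost"] / m["events_captured"] for name, m in methods.items()
}
ranked = sorted(cost_per_event, key=cost_per_event.get)
print(ranked)  # cheapest cost per captured event first
```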

  5. FIND--a unified framework for neural data analysis.

    Science.gov (United States)

    Meier, Ralph; Egert, Ulrich; Aertsen, Ad; Nawrot, Martin P

    2008-10-01

    The complexity of neurophysiology data has increased tremendously over the last years, especially due to the widespread availability of multi-channel recording techniques. With adequate computing power the current limit for computational neuroscience is the effort and time it takes for scientists to translate their ideas into working code. Advanced analysis methods are complex and often lack reproducibility on the basis of published descriptions. To overcome this limitation we develop FIND (Finding Information in Neural Data) as a platform-independent, open source framework for the analysis of neuronal activity data based on Matlab (MathWorks). Here, we outline the structure of the FIND framework and describe its functionality, our measures of quality control, and the policies for developers and users. Within FIND we have developed a unified data import from various proprietary formats, simplifying standardized interfacing with tools for analysis and simulation. The FIND toolbox covers a steadily increasing number of tools. These analysis tools address various types of neural activity data, including discrete series of spike events, continuous time series and imaging data. Additionally, the toolbox provides solutions for the simulation of parallel stochastic point processes to model multi-channel spiking activity. We illustrate two examples of complex analyses with FIND tools: First, we present a time-resolved characterization of the spiking irregularity in an in vivo extracellular recording from a mushroom-body extrinsic neuron in the honeybee during odor stimulation. Second, we describe layer specific input dynamics in the rat primary visual cortex in vivo in response to visual flash stimulation on the basis of multi-channel spiking activity.
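    FIND itself is Matlab-based; as a language-neutral illustration of one analysis it supports (the spiking-irregularity characterization mentioned above), here is the classic coefficient of variation of inter-spike intervals computed on two synthetic spike trains:

```python
import numpy as np

# Two synthetic spike trains at the same mean rate (100 Hz): a perfectly
# regular train and an irregular Poisson-like train.
rng = np.random.default_rng(4)
regular = np.cumsum(np.full(200, 0.01))          # metronome-like spiking
poisson = np.cumsum(rng.exponential(0.01, 200))  # exponential ISIs

def isi_cv(spike_times):
    """CV = std/mean of inter-spike intervals: ~0 regular, ~1 Poisson."""
    isi = np.diff(spike_times)
    return isi.std() / isi.mean()

print(f"regular CV = {isi_cv(regular):.2f}, "
      f"poisson CV = {isi_cv(poisson):.2f}")
```

    A time-resolved version, as in the honeybee example, would compute such a measure in a sliding window across the recording.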

  6. Legal framework for a nuclear program

    International Nuclear Information System (INIS)

    Santos, A. de los; Corretjer, L.

    1977-01-01

    Introduction of a nuclear program requires the establishment of an adequate legal framework, as solutions to the problems posed by the use of nuclear energy are not found in the common law. As far as Spain is concerned, legislation is capable of dealing with the main problems posed in this field. Spain is a Contracting Party to several International Conventions and participates in International Organizations related to this area, taking their recommendations into account when revising its national legislation. Specific Spanish legislation is constituted by Law 25/1964, of April 29th, on Nuclear Energy, which outlines the legal system for nuclear energy and regulates all related aspects, from the competent organisms and authorities to the sanctions to be imposed for non-fulfilment of its provisions. To offer sufficient flexibility, so that it can be adapted to specific circumstances, the Law's provisions are broad, with detailed development left to regulations. So far, two Regulations have been published: the Regulation relating to Coverage of Risk of Nuclear Damage, which addresses civil responsibility and its coverage; and the Regulation relating to Nuclear and Radioactive Installations, which addresses the authorization and license system. At the present time, the Regulation relating to Radiation Protection is being drafted and will replace the present Radiation Protection Ordinances. In addition to the foregoing, reference is made to other texts which, although not specifically ''nuclear'', include related precepts, such as the Regulation regarding Nuisance, Unhealthy or Dangerous Industries and certain Labor Law provisions

  7. An intersectionality-based policy analysis framework: critical reflections on a methodology for advancing equity.

    Science.gov (United States)

    Hankivsky, Olena; Grace, Daniel; Hunting, Gemma; Giesbrecht, Melissa; Fridkin, Alycia; Rudrum, Sarah; Ferlatte, Olivier; Clark, Natalie

    2014-12-10

    In the field of health, numerous frameworks have emerged that advance understandings of the differential impacts of health policies to produce inclusive and socially just health outcomes. In this paper, we present the development of an important contribution to these efforts - an Intersectionality-Based Policy Analysis (IBPA) Framework. Developed over the course of two years in consultation with key stakeholders and drawing on best and promising practices of other equity-informed approaches, this participatory and iterative IBPA Framework provides guidance and direction for researchers, civil society, public health professionals and policy actors seeking to address the challenges of health inequities across diverse populations. Importantly, we present the application of the IBPA Framework in seven priority health-related policy case studies. The analysis of each case study is focused on explaining how IBPA: 1) provides an innovative structure for critical policy analysis; 2) captures the different dimensions of policy contexts including history, politics, everyday lived experiences, diverse knowledges and intersecting social locations; and 3) generates transformative insights, knowledge, policy solutions and actions that cannot be gleaned from other equity-focused policy frameworks. The aim of this paper is to inspire a range of policy actors to recognize the potential of IBPA to foreground the complex contexts of health and social problems, and ultimately to transform how policy analysis is undertaken.

  8. A Framework for Assessment of Aviation Safety Technology Portfolios

    Science.gov (United States)

    Jones, Sharon M.; Reveley, Mary S.

    2014-01-01

    The programs within NASA's Aeronautics Research Mission Directorate (ARMD) conduct research and development to improve the national air transportation system so that Americans can travel as safely as possible. NASA aviation safety systems analysis personnel support various levels of ARMD management in their fulfillment of system analysis and technology prioritization as defined in the agency's program and project requirements. This paper provides a framework for the assessment of aviation safety research and technology portfolios that includes metrics such as projected impact on current and future safety, technical development risk and implementation risk. The paper also contains methods for presenting portfolio analysis and aviation safety Bayesian Belief Network (BBN) output results to management using bubble charts and quantitative decision analysis techniques.

  9. A proposed benefits evaluation framework for health information systems in Canada.

    Science.gov (United States)

    Lau, Francis; Hagens, Simon; Muttitt, Sarah

    2007-01-01

    This article describes a benefits evaluation framework for the health information systems currently being implemented across Canada through Canada Health Infoway with its jurisdictional partners and investment programs. This framework is based on the information systems success model by DeLone and McLean, the empirical analysis by van der Meijden on the use of this model in the health setting and our own review of evaluation studies and systematic review articles in health information systems. The current framework includes three dimensions of quality (system, information and service), two dimensions of system usage (use and user satisfaction) and three dimensions of net benefits (quality, access and productivity). Measures have been developed and work is under way to establish detailed evaluation plans and instruments for the individual investment programs to launch a series of benefits evaluation field studies across jurisdictions later this year.

  10. Framework for Interactive Parallel Dataset Analysis on the Grid

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, David A.; Ananthan, Balamurali; /Tech-X Corp.; Johnson, Tony; Serbo, Victor; /SLAC

    2007-01-10

    We present a framework for use at a typical Grid site to facilitate custom interactive parallel dataset analysis targeting terabyte-scale datasets of the type typically produced by large multi-institutional science experiments. We summarize the needs for interactive analysis and show a prototype solution that satisfies those needs. The solution consists of a desktop client tool and a set of Web Services that allow scientists to sign onto a Grid site, compose analysis script code to carry out physics analysis on datasets, distribute the code and datasets to worker nodes, collect the results back to the client, and construct professional-quality visualizations of the results.
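    The scatter/gather pattern described above (distribute code and dataset chunks to worker nodes, collect partial results back to the client) can be sketched in miniature. The dataset, the energy cut, and `analyze_chunk` are invented for illustration, and a local thread pool stands in for the Grid worker nodes:

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_chunk(chunk):
    """Hypothetical per-worker analysis: count events above an energy cut."""
    return sum(1 for energy in chunk if energy > 2.0)

def run_distributed_analysis(dataset, n_workers=4):
    # Scatter: split the dataset into one chunk per worker node.
    chunks = [dataset[i::n_workers] for i in range(n_workers)]
    # Gather: collect per-chunk results back to the client and merge them.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partial = pool.map(analyze_chunk, chunks)
    return sum(partial)

events = [0.5, 2.5, 3.1, 1.9, 4.2, 0.1, 2.01]
selected = run_distributed_analysis(events)
```

    In a real Grid deployment, the chunking and merging stay the same shape; only the executor is replaced by job submission to remote nodes.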

  11. Interactive Safety Analysis Framework of Autonomous Intelligent Vehicles

    Directory of Open Access Journals (Sweden)

    Cui You Xiang

    2016-01-01

    More than 100,000 people were killed and around 2.6 million injured in road accidents in the People’s Republic of China (PRC), four to eight times the toll of developed countries and equivalent to 6.2 deaths per 10 thousand vehicles, the highest rate in the world. There are more than 1,700 fatalities and 840,000 injuries yearly due to vehicle crashes off public highways. In this paper, we propose an interactive safety situation and threat analysis framework built on driver behaviour and vehicle dynamics risk analysis based on ISO26262…

  12. A decision framework for coordinating bioterrorism planning: lessons from the BioNet program.

    Science.gov (United States)

    Manley, Dawn K; Bravata, Dena M

    2009-01-01

    Effective disaster preparedness requires coordination across multiple organizations. This article describes a detailed framework developed through the BioNet program to facilitate coordination of bioterrorism preparedness planning among military and civilian decision makers. The authors and colleagues conducted a series of semistructured interviews with civilian and military decision makers from public health, emergency management, hazardous material response, law enforcement, and military health in the San Diego area. Decision makers used a software tool that simulated a hypothetical anthrax attack, which allowed them to assess the effects of a variety of response actions (eg, issuing warnings to the public, establishing prophylaxis distribution centers) on performance metrics. From these interviews, the authors characterized the information sources, technologies, plans, and communication channels that would be used for bioterrorism planning and responses. The authors used influence diagram notation to describe the key bioterrorism response decisions, the probabilistic factors affecting these decisions, and the response outcomes. The authors present an overview of the response framework and provide a detailed assessment of two key phases of the decision-making process: (1) pre-event planning and investment and (2) incident characterization and initial responsive measures. The framework enables planners to articulate current conditions; identify gaps in existing policies, technologies, information resources, and relationships with other response organizations; and explore the implications of potential system enhancements. Use of this framework could help decision makers execute a locally coordinated response by identifying the critical cues of a potential bioterrorism event, the information needed to make effective response decisions, and the potential effects of various decision alternatives.

  13. A Framework for Professional Ethics Courses in Teacher Education

    Science.gov (United States)

    Warnick, Bryan R.; Silverman, Sarah K.

    2011-01-01

    Evidence suggests that professional ethics is currently a neglected topic in teacher education programs. In this article, the authors revisit the question of ethics education for teachers. The authors propose an approach to the professional ethics of teaching that employs a case-analysis framework specifically tailored to address the practice of…

  14. A threat analysis framework as applied to critical infrastructures in the Energy Sector.

    Energy Technology Data Exchange (ETDEWEB)

    Michalski, John T.; Duggan, David Patrick

    2007-09-01

    The need to protect national critical infrastructure has led to the development of a threat analysis framework. The threat analysis framework can be used to identify the elements required to quantify threats against critical infrastructure assets and provide a means of distributing actionable threat information to critical infrastructure entities for the protection of infrastructure assets. This document identifies and describes five key elements needed to perform a comprehensive analysis of threat: the identification of an adversary, the development of generic threat profiles, the identification of generic attack paths, the discovery of adversary intent, and the identification of mitigation strategies.

  15. Defining Smart City. A Conceptual Framework Based on Keyword Analysis

    Directory of Open Access Journals (Sweden)

    Farnaz Mosannenzadeh

    2014-05-01

    “Smart city” is a concept that has received increasing attention in urban planning and governance in recent years. The first step in creating Smart Cities is to understand the concept itself; however, a brief review of the literature shows that the concept of Smart City is the subject of controversy. Thus, the main purpose of this paper is to provide a conceptual framework to define Smart City. To this aim, an extensive literature review was conducted. A keyword analysis of the literature was then performed against the main research questions (why, what, who, when, where, how) and across the three main domains involved in policy decision making and Smart City plan development: academic, industrial and governmental. This resulted in a conceptual framework for Smart City. The result clarifies the definition of Smart City while providing a framework to define each of Smart City’s sub-systems. Moreover, urban authorities can apply this framework in Smart City initiatives in order to recognize their main goals, main components, and key stakeholders.
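    A keyword analysis of this kind can be sketched as tallying, per research question, how often indicator keywords occur in the literature. The mini-corpus and keyword sets below are invented for illustration, not the study's actual data:

```python
from collections import Counter

# Invented definition fragments standing in for the reviewed literature.
abstracts = [
    "smart city uses ICT infrastructure to improve quality of life",
    "governance and citizens drive the smart city through technology",
    "sustainable urban planning and ICT define the smart city",
]

# Hypothetical indicator keywords mapped to three of the research questions.
questions = {
    "what": {"ict", "technology", "infrastructure"},
    "who": {"citizens", "governance"},
    "why": {"sustainable", "quality"},
}

counts = Counter()
for text in abstracts:
    words = set(text.lower().split())
    for question, keywords in questions.items():
        # Count distinct indicator keywords appearing in this abstract.
        counts[question] += len(words & keywords)
```

    The resulting tallies suggest which question each strand of the literature emphasizes; the real study performed this across academic, industrial and governmental sources.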

  16. Object Persistence: A Framework Based On Design Patterns

    OpenAIRE

    Kienzle, Jörg; Romanovsky, Alexander

    2000-01-01

    The poster presents a framework for providing object persistence in object-oriented programming languages without modifying the run-time system or the language itself. The framework does not rely on any kind of special programming language features. It only uses basic object-oriented programming techniques, and is therefore implementable in any object-oriented programming language.
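    The idea of adding persistence with only ordinary object-oriented techniques can be illustrated with a small mixin. This Python sketch, using `pickle` for serialization, is an analogue of the approach rather than the authors' actual framework; the `Account` class and its fields are invented:

```python
import os
import pickle
import tempfile

class Persistent:
    """Persistence mixin built from plain OO techniques alone,
    with no changes to the run-time system or the language."""

    def save(self, path):
        # Serialize the object's attribute dictionary to stable storage.
        with open(path, "wb") as fh:
            pickle.dump(self.__dict__, fh)

    def load(self, path):
        # Restore previously saved attributes into this instance.
        with open(path, "rb") as fh:
            self.__dict__.update(pickle.load(fh))

class Account(Persistent):
    def __init__(self, owner, balance=0):
        self.owner = owner
        self.balance = balance

# Usage: persist an object's state and restore it into a fresh instance.
fd, path = tempfile.mkstemp()
os.close(fd)
Account("ada", 42).save(path)
restored = Account("nobody")
restored.load(path)
os.remove(path)
```

    Any class gains persistence simply by inheriting the mixin, which is the kind of language-feature-free reuse the poster advocates.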

  17. The Measurand Framework: Scaling Exploratory Data Analysis

    Science.gov (United States)

    Schneider, D.; MacLean, L. S.; Kappler, K. N.; Bleier, T.

    2017-12-01

    Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of earth's time varying magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. In order to analyze this sizable dataset, QF has developed an analytical framework to support processing the time series input data and hypothesis testing to evaluate the statistical significance of potential precursory signals. The framework was developed with a need to support legacy, in-house processing but with an eye towards big-data processing with Apache Spark and other modern big data technologies. In this presentation, we describe our framework, which supports rapid experimentation and iteration of candidate signal processing techniques via modular data transformation stages, tracking of provenance, and automatic re-computation of downstream data when upstream data is updated. Furthermore, we discuss how the processing modules can be ported to big data platforms like Apache Spark and demonstrate a migration path from local, in-house processing to cloud-friendly processing.

  18. Modeling Phase-transitions Using a High-performance, Isogeometric Analysis Framework

    KAUST Repository

    Vignal, Philippe; Dalcin, Lisandro; Collier, Nathan; Calo, Victor M.

    2014-01-01

    In this paper, we present a high-performance framework for solving partial differential equations using Isogeometric Analysis, called PetIGA, and show how it can be used to solve phase-field problems. We specifically chose the Cahn-Hilliard equation
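    The abstract is cut off at the Cahn-Hilliard equation; for reference, that equation is commonly written for a phase field $u$, interface parameter $\epsilon$, and double-well potential $f$ (mobility taken as unity) as:

```latex
\frac{\partial u}{\partial t} = \nabla^2\left( f'(u) - \epsilon^2 \nabla^2 u \right),
\qquad f(u) = \tfrac{1}{4}\left(u^2 - 1\right)^2 .
```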

  19. Economic and Nonproliferation Analysis Framework for Assessing Reliable Nuclear Fuel Service Arrangements

    International Nuclear Information System (INIS)

    Phillips, Jon R.; Kreyling, Sean J.; Short, Steven M.; Weimar, Mark R.

    2010-01-01

    Nuclear power is now broadly recognized as an essential technology in national strategies to provide energy security while meeting carbon management goals. Yet a long-standing conundrum remains: how to enable rapid growth in the global nuclear power infrastructure while controlling the spread of sensitive enrichment and reprocessing technologies that lie at the heart of nuclear fuel supply and nuclear weapons programs. Reducing the latent proliferation risk posed by a broader horizontal spread of enrichment and reprocessing technology has been a primary goal of national nuclear supplier policies since the beginning of the nuclear power age. Attempts to control the spread of sensitive nuclear technology have been the subject of numerous initiatives in the intervening decades, sometimes taking the form of calls to develop fuel supply and service assurances to reduce market pull to increase the number of states with fuel cycle capabilities. A clear understanding of what characteristics of specific reliable nuclear fuel service (RNFS) and supply arrangements qualify them as 'attractive offers' is critical to the success of current and future efforts. At a minimum, RNFS arrangements should provide economic value to all participants and help reduce latent proliferation risks posed by the global expansion of nuclear power. In order to inform the technical debate and the development of policy, Pacific Northwest National Laboratory has been developing an analytical framework to evaluate the economics and nonproliferation merits of alternative approaches to RNFS arrangements. This paper provides a brief overview of the economic analysis framework developed and applied to a model problem of current interest: full-service nuclear fuel leasing arrangements. Furthermore, this paper presents an extended outline of a proposed analysis approach to evaluate the non-proliferation merits of various RNFS alternatives.

  20. Introduction of new technologies and decision making processes: a framework to adapt a Local Health Technology Decision Support Program for other local settings.

    Science.gov (United States)

    Poulin, Paule; Austen, Lea; Scott, Catherine M; Poulin, Michelle; Gall, Nadine; Seidel, Judy; Lafrenière, René

    2013-01-01

    Introducing new health technologies, including medical devices, into a local setting in a safe, effective, and transparent manner is a complex process, involving many disciplines and players within an organization. Decision making should be systematic, consistent, and transparent. It should involve translating and integrating scientific evidence, such as health technology assessment (HTA) reports, with context-sensitive evidence to develop recommendations on whether and under what conditions a new technology will be introduced. However, the development of a program to support such decision making can require considerable time and resources. An alternative is to adapt a preexisting program to the new setting. We describe a framework for adapting the Local HTA Decision Support Program, originally developed by the Department of Surgery and Surgical Services (Calgary, AB, Canada), for use by other departments. The framework consists of six steps: 1) development of a program review and adaptation manual, 2) education and readiness assessment of interested departments, 3) evaluation of the program by individual departments, 4) joint evaluation via retreats, 5) synthesis of feedback and program revision, and 6) evaluation of the adaptation process. Nine departments revised the Local HTA Decision Support Program and expressed strong satisfaction with the adaptation process. Key elements for success were identified. Adaptation of a preexisting program may reduce duplication of effort, save resources, raise the health care providers' awareness of HTA, and foster constructive stakeholder engagement, which enhances the legitimacy of evidence-informed recommendations for introducing new health technologies. We encourage others to use this framework for program adaptation and to report their experiences.

  1. Project Assessment Framework through Design (PAFTD) - A Project Assessment Framework in Support of Strategic Decision Making

    Science.gov (United States)

    Depenbrock, Brett T.; Balint, Tibor S.; Sheehy, Jeffrey A.

    2014-01-01

    Research and development organizations that push the innovation edge of technology frequently encounter challenges when attempting to identify an investment strategy and to accurately forecast the cost and schedule performance of selected projects. Fast moving and complex environments require managers to quickly analyze and diagnose the value of returns on investment versus allocated resources. Our Project Assessment Framework through Design (PAFTD) tool facilitates decision making for NASA senior leadership to enable more strategic and consistent technology development investment analysis, beginning at implementation and continuing through the project life cycle. The framework takes an integrated approach by leveraging design principles of usability, feasibility, and viability and aligns them with methods employed by NASA's Independent Program Assessment Office for project performance assessment. The need exists to periodically revisit the justification and prioritization of technology development investments as changes occur over project life cycles. The framework informs management rapidly and comprehensively about diagnosed internal and external root causes of project performance.

  2. Using Campinha-Bacote's Framework to Examine Cultural Competence from an Interdisciplinary International Service Learning Program

    Science.gov (United States)

    Wall-Bassett, Elizabeth DeVane; Hegde, Archana Vasudeva; Craft, Katelyn; Oberlin, Amber Louise

    2018-01-01

    The purpose of this study was to investigate an interdisciplinary international service learning program and its impact on student sense of cultural awareness and competence using the Campinha-Bacote's (2002) framework of cultural competency model. Seven undergraduate and one graduate student from Human Development and Nutrition Science…

  3. PageRank, HITS and a unified framework for link analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Chris; He, Xiaofeng; Husbands, Parry; Zha, Hongyuan; Simon, Horst

    2001-10-01

    Two popular webpage ranking algorithms are HITS and PageRank. HITS emphasizes mutual reinforcement between authority and hub webpages, while PageRank emphasizes hyperlink weight normalization and web surfing based on random walk models. We systematically generalize and combine these concepts into a unified framework. The ranking framework contains a large algorithm space; HITS and PageRank are two extreme ends in this space. We study several normalized ranking algorithms which are intermediate between HITS and PageRank, and obtain closed-form solutions. We show that, to first order approximation, all ranking algorithms in this framework, including PageRank and HITS, lead to the same ranking, which is highly correlated with ranking by indegree. These results support the notion that in web resource ranking, indegree and outdegree are of fundamental importance. Rankings of webgraphs of different sizes and queries are presented to illustrate our analysis.
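    The random-walk model underlying PageRank can be sketched with a few lines of power iteration. The three-page link graph is invented for illustration; note that the page with the highest indegree also receives the highest rank, consistent with the abstract's observation:

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10):
    """Power iteration for PageRank with damping factor d.

    adj[i, j] = 1 if page j links to page i."""
    n = adj.shape[0]
    out = adj.sum(axis=0)
    out[out == 0] = 1  # dangling nodes: avoid division by zero
    M = adj / out      # column-stochastic: each page spreads rank over its out-links
    r = np.full(n, 1.0 / n)
    while True:
        r_next = (1 - d) / n + d * M @ r
        if np.abs(r_next - r).sum() < tol:
            return r_next
        r = r_next

# Tiny 3-page web: 0 -> 1, 1 -> 2, 2 -> 0 and 2 -> 1.
A = np.array([[0, 0, 1],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
ranks = pagerank(A)
```

    Page 1, with indegree 2, ends up with the largest rank; the ranks sum to one because the iteration preserves the probability mass of the random surfer.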

  4. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    Energy Technology Data Exchange (ETDEWEB)

    Alioli, Simone [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Nason, Paolo [INFN, Milano-Bicocca (Italy); Oleari, Carlo [INFN, Milano-Bicocca (Italy); Milano-Bicocca Univ. (Italy); Re, Emanuele [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology

    2010-02-15

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized, and a description of what a user should provide in order to use it. (orig.)

  5. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    International Nuclear Information System (INIS)

    Alioli, Simone; Nason, Paolo; Oleari, Carlo; Re, Emanuele

    2010-02-01

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized, and a description of what a user should provide in order to use it. (orig.)

  6. Evaluation Framework for NASA's Educational Outreach Programs

    Science.gov (United States)

    Berg, Rick; Booker, Angela; Linde, Charlotte; Preston, Connie

    1999-01-01

    The objective of the proposed work is to develop an evaluation framework for NASA's educational outreach efforts. We focus on public (rather than technical or scientific) dissemination efforts, specifically on Internet-based outreach sites for children. The outcome of this work is to propose both methods and criteria for evaluation, which would enable NASA to do a more analytic evaluation of its outreach efforts. The proposed framework is based on IRL's ethnographic and video-based observational methods, which allow us to analyze how these sites are actually used.

  7. Agricultural Value Chains in Developing Countries; a Framework for Analysis

    NARCIS (Netherlands)

    Trienekens, J.H.

    2011-01-01

    The paper presents a framework for developing country value chain analysis made up of three components. The first consists of identifying major constraints for value chain upgrading: market access restrictions, weak infrastructures, lacking resources and institutional voids. In the second component

  8. The PandaRoot framework for simulation, reconstruction and analysis

    International Nuclear Information System (INIS)

    Spataro, Stefano

    2011-01-01

    The PANDA experiment at the future facility FAIR will study anti-proton proton and anti-proton nucleus collisions in a beam momentum range from 2 GeV/c up to 15 GeV/c. The PandaRoot framework is part of the FairRoot project, a common software framework for the future FAIR experiments, and is currently used to simulate detector performance and to evaluate different detector concepts. It is based on the packages ROOT and Virtual MonteCarlo with Geant3 and Geant4. Different reconstruction algorithms for tracking and particle identification are under development and optimization in order to achieve the performance requirements of the experiment. In the central tracker, a first track fit is performed using a conformal map transformation based on a helix assumption; the track is then used as input for a Kalman filter (package genfit), using GEANE as track follower. The track is then correlated with the PID detectors (e.g. Cerenkov detectors, EM calorimeter or muon chambers) to evaluate a global particle identification probability, using a Bayesian approach or multivariate methods. Further packages implemented in PandaRoot are the analysis tools framework Rho, the kinematic fitter package for vertex and mass constraint fits, and a fast simulation code based upon parametrized detector responses. PandaRoot was also tested on an Alien-based GRID infrastructure. This contribution reports on the status of PandaRoot and shows example results for the analysis of physics benchmark channels.

  9. Leverage hadoop framework for large scale clinical informatics applications.

    Science.gov (United States)

    Dong, Xiao; Bahroos, Neil; Sadhu, Eugene; Jackson, Tommie; Chukhman, Morris; Johnson, Robert; Boyd, Andrew; Hynes, Denise

    2013-01-01

    In this manuscript, we present our experiences using the Apache Hadoop framework for high data volume and computationally intensive applications, and discuss some best practice guidelines in a clinical informatics setting. There are three main aspects in our approach: (a) process and integrate diverse, heterogeneous data sources using standard Hadoop programming tools and customized MapReduce programs; (b) after fine-grained aggregate results are obtained, perform data analysis using the Mahout data mining library; (c) leverage the column oriented features in HBase for patient centric modeling and complex temporal reasoning. This framework provides a scalable solution to meet the rapidly increasing, imperative "Big Data" needs of clinical and translational research. The intrinsic advantage of fault tolerance, high availability and scalability of Hadoop platform makes these applications readily deployable at the enterprise level cluster environment.
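    The map/shuffle/reduce pipeline at the core of aspect (a) can be sketched in plain Python. The toy records of hypothetical diagnosis codes are invented; a real deployment would express the same shape as a customized Hadoop MapReduce job:

```python
from collections import defaultdict
from itertools import chain

def mapper(record):
    # Emit (key, 1) pairs, e.g. one per code token in a clinical record.
    return [(token, 1) for token in record.split()]

def shuffle(pairs):
    # Group intermediate values by key, as Hadoop does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reducer(key, values):
    # Fine-grained aggregation: total occurrences per key.
    return key, sum(values)

records = ["icd9 250 icd9 401", "icd9 250"]
mapped = chain.from_iterable(mapper(r) for r in records)
results = dict(reducer(k, v) for k, v in shuffle(mapped).items())
```

    The aggregate counts produced here correspond to the fine-grained results that the manuscript then feeds into Mahout for data mining.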

  10. THE ANALYSIS DEVELOPMENT PROGRAM OF CORPORATE SOCIAL RESPONSIBILITY AT NATIONAL PARK MANAGEMENT CIANJUR NATIONAL PARK OF MOUNT GEDE PANGRANGO

    Directory of Open Access Journals (Sweden)

    Tun Susdiyanti

    2017-04-01

    This study aims to analyze the development of Corporate Social Responsibility (CSR) programs based on field observations and to recommend appropriate strategies for implementing CSR in the National Park Management (PTN) Cianjur office of Gunung Gede Pangrango National Park. The working methods in this study comprise an evaluation stage using a conceptual framework for descriptive analysis and recommendations, and a technical and strategy-drafting stage using SWOT analysis. According to the SWOT analysis, the CSR program in PTN Cianjur occupies an aggressive strategic position (scores 2.22; 1.74). The proposed development strategies that can be implemented are to increase public understanding, increase community participation, optimize the use of funds, and improve the performance of extension workers, Polhut, PEH and operators in the implementation of CSR activities.

  11. A Framework for the Game-theoretic Analysis of Censorship Resistance

    Directory of Open Access Journals (Sweden)

    Elahi Tariq

    2016-10-01

    We present a game-theoretic analysis of optimal solutions for interactions between censors and censorship resistance systems (CRSs) by focusing on the data channel used by the CRS to smuggle clients’ data past the censors. This analysis leverages the inherent errors (false positives and negatives) made by the censor when trying to classify traffic as either non-circumvention traffic or as CRS traffic, as well as the underlying rate of CRS traffic. We identify Nash equilibrium solutions for several simple censorship scenarios and then extend those findings to more complex scenarios, where we find that the deployment of a censorship apparatus does not qualitatively change the equilibrium solutions but rather only affects the amount of traffic a CRS can support before being blocked. By leveraging these findings, we describe a general framework for exploring and identifying optimal strategies for the censorship circumventor, in order to maximize the amount of CRS traffic not blocked by the censor. We use this framework to analyze several scenarios with multiple data-channel protocols used as cover for the CRS. We show that it is possible to gain insights through this framework even without perfect knowledge of the censor’s (secret) values for the parameters in their utility function.
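    The role of the censor's classification errors can be illustrated with a base-rate calculation: given the false-positive and false-negative rates and the underlying rate of CRS traffic (the quantities the analysis leverages), Bayes' rule gives the posterior probability that a flagged flow is actually CRS traffic. The rates below are invented for illustration:

```python
def posterior_crs(base_rate, fp_rate, fn_rate):
    """Probability that a flow flagged as circumvention traffic really is CRS
    traffic, given the censor's error rates and the CRS traffic base rate."""
    tp = (1 - fn_rate) * base_rate      # CRS flow correctly flagged
    fp = fp_rate * (1 - base_rate)      # non-circumvention flow wrongly flagged
    return tp / (tp + fp)

# With rare CRS traffic, even a fairly accurate censor mostly flags innocent flows.
p = posterior_crs(base_rate=0.01, fp_rate=0.05, fn_rate=0.10)
```

    This base-rate effect is what lets a CRS hide in cover protocols: blocking all flagged traffic imposes heavy collateral damage on the censor, which the game-theoretic utility functions capture.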

  12. Cost-effectiveness analysis of the diarrhea alleviation through zinc and oral rehydration therapy (DAZT) program in rural Gujarat India: an application of the net-benefit regression framework.

    Science.gov (United States)

    Shillcutt, Samuel D; LeFevre, Amnesty E; Fischer-Walker, Christa L; Taneja, Sunita; Black, Robert E; Mazumder, Sarmila

    2017-01-01

    This study evaluates the cost-effectiveness of the DAZT program for scaling up treatment of acute child diarrhea in Gujarat, India, using a net-benefit regression framework. Costs were calculated from the societal and caregivers' perspectives, and effectiveness was assessed in terms of coverage of zinc and of both zinc and Oral Rehydration Salts. Regression models were tested as simple linear regressions, with a specified set of covariates, and with covariates plus interaction terms; linear regression with endogenous treatment effects was used as the reference case. The DAZT program was cost-effective with over 95% certainty above $5.50 and $7.50 per appropriately treated child in the unadjusted and adjusted models respectively, with specifications including interaction terms being cost-effective with 85-97% certainty. Findings from this study should be combined with other evidence when considering decisions to scale up programs such as DAZT to promote the use of ORS and zinc to treat child diarrhea.
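    A net-benefit regression can be sketched as follows: for a willingness-to-pay lambda per appropriately treated child, each observation's net benefit is nb = lambda * effect - cost, and regressing nb on the program indicator estimates the incremental net benefit. All numbers below are simulated for illustration, not the DAZT data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
treated = rng.integers(0, 2, n)                          # program indicator
effect = 0.2 + 0.3 * treated + rng.normal(0, 0.05, n)    # coverage of zinc + ORS
cost = 4.0 + 1.5 * treated + rng.normal(0, 0.5, n)       # cost per child (USD)

lam = 10.0  # hypothetical willingness to pay per appropriately treated child
nb = lam * effect - cost                                 # per-child net benefit

# OLS of net benefit on the treatment indicator; the slope is the
# incremental net benefit of the program at this lambda.
X = np.column_stack([np.ones(n), treated])
beta, *_ = np.linalg.lstsq(X, nb, rcond=None)
incremental_net_benefit = beta[1]  # > 0 suggests cost-effectiveness at lambda
```

    Sweeping lambda and recording how often the estimated slope is positive traces out the cost-effectiveness acceptability curve implied by statements like "cost-effective with over 95% certainty above $5.50 per appropriately treated child".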

  13. The Need for Killer Examples for Object-Oriented Frameworks

    DEFF Research Database (Denmark)

    Caspersen, Michael Edelgaard; Christensen, Henrik Bærbak

    2003-01-01

    In this paper, we argue in favor of introducing object-oriented frameworks as an important topic in our software engineering teaching. Frameworks provide a basis for students to build interesting and impressive programs even with small programming effort at the introductory level. Frameworks...

  14. High-Fidelity Aerothermal Engineering Analysis for Planetary Probes Using DOTNET Framework and OLAP Cubes Database

    Directory of Open Access Journals (Sweden)

    Prabhakar Subrahmanyam

    2009-01-01

    This publication presents the architecture, integration and implementation of the various modules in the Sparta framework. Sparta is a trajectory engine hooked to an Online Analytical Processing (OLAP) database for multi-dimensional analysis capability. The OLAP database holds a comprehensive list of atmospheric entry probes and their vehicle dimensions, trajectory data, aero-thermal data and material properties of Carbon, Silicon and Carbon-Phenolic based ablators. An approach is presented for dynamic TPS design. OLAP has the capability to run several different trajectory conditions in one simulation; the output is stored back into the database and can be queried by trajectory type. An OLAP simulation can be set up by spawning individual threads to run three types of trajectory: nominal, undershoot and overshoot. The Sparta graphical user interface provides capabilities to choose from a list of flight vehicles or to enter trajectory and geometry information for a vehicle under design. The DOTNET framework acts as a middleware layer between the trajectory engine and the user interface, and between the web user interface and the OLAP database. Trajectory output can be obtained in TecPlot, Excel or KML (Keyhole Markup Language) format. The framework employs an API (application programming interface) to convert trajectory data into a formatted KML file that is used by Google Earth for simulating Earth-entry fly-by visualizations.

  15. Benchmarking JavaScript Frameworks

    OpenAIRE

    Mariano, Carl Lawrence

    2017-01-01

    The JavaScript programming language has been in existence for many years and is one of the most widely known, if not the most used, front-end programming languages in web development. However, JavaScript is still evolving, and with the emergence of JavaScript Frameworks (JSF) there has been a major change in how developers develop software nowadays. Developers these days often use more than one framework in order to fulfil their job, which has given rise to the problem for developers when i...

  16. Tatool: a Java-based open-source programming framework for psychological studies.

    Science.gov (United States)

    von Bastian, Claudia C; Locher, André; Ruflin, Michael

    2013-03-01

    Tatool (Training and Testing Tool) was developed to assist researchers with programming training software, experiments, and questionnaires. Tatool is Java-based, and thus is a platform-independent and object-oriented framework. The architecture was designed to meet the requirements of experimental designs and provides a large number of predefined functions that are useful in psychological studies. Tatool comprises features crucial for training studies (e.g., configurable training schedules, adaptive training algorithms, and individual training statistics) and allows for running studies online via Java Web Start. The accompanying "Tatool Online" platform provides the possibility to manage studies and participants' data easily with a Web-based interface. Tatool is published open source under the GNU Lesser General Public License, and is available at www.tatool.ch.

  17. PRE: A framework for enterprise integration

    Energy Technology Data Exchange (ETDEWEB)

    Whiteside, R.A.; Friedman-Hill, E.J. [Sandia National Labs., Livermore, CA (United States); Detry, R.J. [Sandia National Labs., Albuquerque, NM (United States)

    1998-03-01

    Sandia National Laboratories' Product Realization Environment (PRE) is a lightweight, CORBA-based framework for the integration of a broad variety of applications. These applications are wrapped for use in the PRE framework as reusable components. For example, some of the PRE components currently available include: (1) a product data management (PDM) system, (2) a human resources database, (3) several finite element analysis programs, and (4) a variety of image and document format converters. PRE enables the development of end-user applications (as Java applets, for example) that use these components as building blocks. To aid such development, the PreLib library (available in both C++ and Java) permits both wrapping and using these components without knowledge of either CORBA or the security mechanisms used.

  18. XML Graphs in Program Analysis

    DEFF Research Database (Denmark)

    Møller, Anders; Schwartzbach, Michael I.

    2011-01-01

    XML graphs have shown to be a simple and effective formalism for representing sets of XML documents in program analysis. The formalism has evolved through a six-year period with variants tailored for a range of applications. We present a unified definition, outline the key properties including validation of XML graphs against different XML schema languages, and provide a software package that enables others to make use of these ideas. We also survey the use of XML graphs for program analysis with four very different languages: XACT (XML in Java), Java Servlets (Web application programming), XSugar...

  19. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    International Nuclear Information System (INIS)

    Agostini, M; Pandola, L; Zavarise, P; Volynets, O

    2011-01-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.
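    The multi-level, modular flow described above can be sketched as a chain of modules, each condensing the data one level further; the module names and toy pulse below are invented for illustration and are not GELATIO's actual modules.

    ```python
    # Sketch of a modular analysis chain: raw digitized trace -> baseline-
    # corrected trace -> condensed analysis parameter. Module names and the
    # toy pulse are illustrative, not GELATIO's actual modules.

    def subtract_baseline(trace, n_baseline=4):
        """Level 1: remove the mean of the first n_baseline samples."""
        base = sum(trace[:n_baseline]) / n_baseline
        return [s - base for s in trace]

    def extract_amplitude(trace):
        """Level 2: condense the corrected trace to a single parameter."""
        return max(trace)

    pipeline = [subtract_baseline, extract_amplitude]

    data = [10, 10, 10, 10, 12, 25, 40, 30, 15, 10]  # toy digitized pulse
    for module in pipeline:
        data = module(data)
    # data is now the pulse amplitude above baseline: 30
    ```

    The appeal of the design is that modules can be reordered, swapped, or configured per channel without touching the data-handling machinery.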

  20. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    Science.gov (United States)

    Agostini, M.; Pandola, L.; Zavarise, P.; Volynets, O.

    2011-08-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.

  1. Credibilistic programming an introduction to models and applications

    CERN Document Server

    2014-01-01

    This book provides a fuzzy programming approach to solving real-life decision problems in a fuzzy environment. Within the framework of credibility theory, it provides a self-contained, comprehensive, and up-to-date presentation of fuzzy programming models, algorithms, and applications in portfolio analysis.

  2. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanism of actions, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. 
Our integrative analytical approach provides novel means for a systematic integrative...

  3. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  4. Stress analysis program system for nuclear vessel: STANSAS

    International Nuclear Information System (INIS)

    Okamoto, Asao; Michikami, Shinsuke

    1979-01-01

    IHI has developed a computer system of stress analysis and evaluation for nuclear vessels: STANSAS (STress ANalysis System for Axi-symmetric Structure). The system consists of more than twenty independent programs divided into the following six parts. 1. Programs for opening design by code rule. 2. Calculation model generating programs. 3. Load defining programs. 4. Structural analysis programs. 5. Load data/calculation results plotting programs. 6. Stress evaluation programs. Each program is connected with its pre- or post-processor through three databases which enable automatic data transfer. The user can make his choice of structural analysis programs in accordance with the problem to be solved. The interface to STANSAS can be easily installed in generalized structural analysis programs such as NASTRAN and MARC. For almost all tables and figures in the stress report, STANSAS has the function to print or plot them out. The complicated procedures of "Design by Analysis" for pressure vessels have been well standardized by STANSAS. The system will give a high degree of efficiency and confidence to the design work. (author)

  5. The Development of a Program Engagement Theory for Group Offending Behavior Programs.

    Science.gov (United States)

    Holdsworth, Emma; Bowen, Erica; Brown, Sarah; Howat, Douglas

    2017-10-01

    Offender engagement in group offending behavior programs is poorly understood and under-theorized. In addition, there is no research on facilitators' engagement. This article presents the first ever theory to address this gap. A Program Engagement Theory (PET) was derived from a constructivist grounded theory analysis that accounts for both facilitators' and offenders' engagement in group offending behavior programs (GOBPs). Interviews and session observations were used to collect data from 23 program facilitators and 28 offenders (group members). The analysis revealed that group members' engagement involved shared identities and moving on as a group. In turn, this was dependent on facilitators personalising treatment frameworks and establishing a hook to help group members move on. The PET emphasizes the importance of considering change during treatment as a process rather than simply a program outcome. Solution-focused (SF) programs were more conducive to engagement and the change process than offence-focused programs.

  6. V&V framework

    Energy Technology Data Exchange (ETDEWEB)

    Hills, Richard G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Maniaci, David Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Naughton, Jonathan W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations, including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: (1) program planning based on expert elicitation of the modeling physics requirements, (2) experimental design for model assessment, (3) uncertainty quantification for experimental observations and computational model simulations, and (4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.
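    Topic 4 (assessing predictive capability) can be illustrated with a minimal, generic comparison of a model prediction and an experimental observation against their combined uncertainty; this acceptance rule is a common convention, not necessarily the specific metric the framework prescribes.

    ```python
    import math

    # Generic model-vs-experiment check: accept the prediction if the
    # difference lies within k combined standard uncertainties. Values and
    # the rule itself are illustrative, not the framework's actual metric.

    def within_tolerance(y_model, u_model, y_exp, u_exp, k=2.0):
        """True if |model - experiment| <= k * combined std. uncertainty."""
        u_c = math.sqrt(u_model**2 + u_exp**2)
        return abs(y_model - y_exp) <= k * u_c

    ok = within_tolerance(y_model=101.3, u_model=1.5, y_exp=98.8, u_exp=1.0)
    ```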

  7. HiggsToFourLeptonsEV in the ATLAS EventView Analysis Framework

    CERN Document Server

    Lagouri, T; Del Peso, J

    2008-01-01

    ATLAS is one of the four experiments at the Large Hadron Collider (LHC) at CERN. This experiment has been designed to study a large range of physics topics, including searches for previously unobserved phenomena such as the Higgs boson and supersymmetry. The physics analysis package HiggsToFourLeptonsEV for the Standard Model (SM) Higgs to four leptons channel with ATLAS is presented. The physics goal is to investigate with the ATLAS detector the SM Higgs boson discovery potential through its observation in the four-lepton (electron and muon) final state. HiggsToFourLeptonsEV is based on the official ATLAS software ATHENA and the EventView (EV) analysis framework. EventView is a highly flexible and modular analysis framework in ATHENA and is one of several analysis schemes for ATLAS physics user analysis. At the core of EventView is the representative "view" of an event, which defines the contents of the event data suitable for event-level physics analysis. The HiggsToFourLeptonsEV package, presented in ...

  8. A complexity science-based framework for global joint operations analysis to support force projection: LDRD Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). System Sustainment & Readiness Technologies Dept.

    2015-01-01

    The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional Cold War era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level, because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents a significant opportunity for Sandia to support the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering systems of systems (SoS) and Complex Adaptive Systems of Systems (CASoS), significant fundamental research is required to develop modeling, simulation, and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior-level decision makers to better understand their enterprise and required future investments.

  9. Compiling Scientific Programs for Scalable Parallel Systems

    National Research Council Canada - National Science Library

    Kennedy, Ken

    2001-01-01

    ...). The research performed in this project included new techniques for recognizing implicit parallelism in sequential programs, a powerful and precise set-based framework for analysis and transformation...

  10. Conceptual framework for development of comprehensive e-health evaluation tool.

    Science.gov (United States)

    Khoja, Shariq; Durrani, Hammad; Scott, Richard E; Sajwani, Afroz; Piryani, Usha

    2013-01-01

    The main objective of this study was to develop an e-health evaluation tool based on a conceptual framework including relevant theories for evaluating use of technology in health programs. This article presents the development of an evaluation framework for e-health programs. The study was divided into three stages: Stage 1 involved a detailed literature search of different theories and concepts on evaluation of e-health, Stage 2 plotted e-health theories to identify relevant themes, and Stage 3 developed a matrix of evaluation themes and stages of e-health programs. The framework identifies and defines different stages of e-health programs and then applies evaluation theories to each of these stages for development of the evaluation tool. This framework builds on existing theories of health and technology evaluation and presents a conceptual framework for developing an e-health evaluation tool to examine and measure different factors that play a definite role in the success of e-health programs. The framework on the horizontal axis divides e-health into different stages of program implementation, while the vertical axis identifies different themes and areas of consideration for e-health evaluation. The framework helps understand various aspects of e-health programs and their impact that require evaluation at different stages of the life cycle. The study led to the development of a new and comprehensive e-health evaluation tool, named the Khoja-Durrani-Scott Framework for e-Health Evaluation.

  11. Generic Formal Framework for Compositional Analysis of Hierarchical Scheduling Systems

    DEFF Research Database (Denmark)

    Boudjadar, Jalil; Hyun Kim, Jin; Thi Xuan Phan, Linh

    We present a compositional framework for the specification and analysis of hierarchical scheduling systems (HSS). Firstly we provide a generic formal model, which can be used to describe any type of scheduling system. The concept of Job automata is introduced in order to model job instantiation...

  12. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck, because the massive number of entities makes space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.
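    The key-value pair approach to linking entities can be sketched as a tiny map-and-reduce over posts; ConceptSearch itself and the distributed execution are elided, and the data below are invented.

    ```python
    from itertools import combinations
    from collections import defaultdict

    # Toy key-value storytelling: map each post to entity-pair keys, reduce
    # by counting, keep strong pairs as storyline edges. Posts are invented;
    # ConceptSearch and the distributed runtime are elided.

    posts = [
        {"entities": ["acme", "riverton", "flood"]},
        {"entities": ["acme", "flood"]},
        {"entities": ["riverton", "election"]},
    ]

    # Map: emit (entity_a, entity_b) -> 1 for each co-occurrence in a post.
    counts = defaultdict(int)
    for post in posts:
        for pair in combinations(sorted(post["entities"]), 2):
            counts[pair] += 1

    # Reduce/filter: keep pairs seen at least twice as storyline edges.
    edges = {pair for pair, n in counts.items() if n >= 2}
    ```

    In a distributed setting each mapper would emit these key-value pairs locally and the reducers would aggregate them, which is what makes the sequential bottleneck go away.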

  13. Continuous quality improvement in a Maltese hospital using logical framework analysis.

    Science.gov (United States)

    Buttigieg, Sandra C; Gauci, Dorothy; Dey, Prasanta

    2016-10-10

    Purpose The purpose of this paper is to present the application of logical framework analysis (LFA) for implementing continuous quality improvement (CQI) across multiple settings in a tertiary care hospital. Design/methodology/approach This study adopts a multiple case study approach. LFA is implemented within three diverse settings, namely, intensive care unit, surgical ward, and acute in-patient psychiatric ward. First, problem trees are developed in order to determine the root causes of quality issues, specific to the three settings. Second, objective trees are formed suggesting solutions to the quality issues. Third, project plan template using logical framework (LOGFRAME) is created for each setting. Findings This study shows substantial improvement in quality across the three settings. LFA proved to be effective to analyse quality issues and suggest improvement measures objectively. Research limitations/implications This paper applies LFA in specific, albeit, diverse settings in one hospital. For validation purposes, it would be ideal to analyse in other settings within the same hospital, as well as in several hospitals. It also adopts a bottom-up approach when this can be triangulated with other sources of data. Practical implications LFA enables top management to obtain an integrated view of performance. It also provides a basis for further quantitative research on quality management through the identification of key performance indicators and facilitates the development of a business case for improvement. Originality/value LFA is a novel approach for the implementation of CQI programs. Although LFA has been used extensively for project development to source funds from development banks, its application in quality improvement within healthcare projects is scant.

  14. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

    Dependability tools are becoming indispensable for modeling and analyzing (critical) systems. However, the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need to not only capture the complex dynamic behavior of the system components, but must also be easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings, including lack of modeling power, incapacity to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems, and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN-based reliability formalism is a powerful potential solution to modeling and analyzing various kinds of system component behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and provides a basis for more advanced and useful analyses such as system diagnosis.
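    A toy example in the spirit of the discrete-time formalism: components that fail independently in each time slice, with a series system working only while all components work. The numbers are illustrative, not the authors' case studies.

    ```python
    # Discrete-time reliability of a toy series system: each component fails
    # in any time slice with a fixed probability, independently; the system
    # works only while every component works. Numbers are illustrative.

    def series_system_reliability(p_fail, steps):
        """P(system still working after `steps` discrete time slices)."""
        working = 1.0
        for _ in range(steps):
            survive_slice = 1.0
            for p in p_fail:
                survive_slice *= (1.0 - p)  # this component survives the slice
            working *= survive_slice
        return working

    r = series_system_reliability([0.01, 0.02], steps=10)
    # matches the closed form (0.99 * 0.98) ** 10
    ```

    A full BN formulation would replace the independence assumption with conditional probability tables linking component states across time slices, which is where the formalism's extra modeling power comes from.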

  15. Energy Analysis Program 1990 annual report

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    The Energy Analysis Program has played an active role in the analysis and discussion of energy and environmental issues at several levels: (1) at the international level, with programs such as developing scenarios for long-term energy demand in developing countries and organizing and leading an analytic effort, "Energy Efficiency, Developing Countries, and Eastern Europe," part of a major effort to increase support for energy efficiency programs worldwide; (2) at the national level, the Program has been responsible for assessing energy forecasts and policies affecting energy use (e.g., appliance standards, National Energy Strategy scenarios); and (3) at the state and utility levels, the Program has been a leader in promoting utility integrated resource planning; the collaborative process has led to agreement on a new generation of utility demand-side programs in California, providing an opportunity to use the knowledge and analytic techniques of the Program's researchers. We continue to place the highest priority on analyzing energy efficiency, with particular attention given to energy use in buildings. The Program continues its active analysis of international energy issues in Asia (including China), the Soviet Union, South America, and Western Europe. Analyzing the costs and benefits of different levels of standards for residential appliances continues to be the largest single area of research within the Program. The group has developed and applied techniques for forecasting energy demand (or constructing scenarios) for the United States. We have built a new model of industrial energy demand, are in the process of making major changes in our tools for forecasting residential energy demand, have built an extensive and documented energy conservation supply curve of residential energy use, and are beginning an analysis of energy-demand forecasting for commercial buildings.

  16. Suicide Risk Assessment Training for Psychology Doctoral Programs: Core Competencies and a Framework for Training

    OpenAIRE

    Cramer, Robert J.; Johnson, Shara M.; McLaughlin, Jennifer; Rausch, Emilie M.; Conroy, Mary Alice

    2013-01-01

    Clinical and counseling psychology programs currently lack adequate evidence-based competency goals and training in suicide risk assessment. To begin to address this problem, this article proposes core competencies and an integrated training framework that can form the basis for training and research in this area. First, we evaluate the extent to which current training is effective in preparing trainees for suicide risk assessment. Within this discussion, sample and methodological issues are ...

  17. Conducting a SWOT Analysis for Program Improvement

    Science.gov (United States)

    Orr, Betsy

    2013-01-01

    A SWOT (strengths, weaknesses, opportunities, and threats) analysis of a teacher education program, or any program, can be the driving force for implementing change. A SWOT analysis is used to assist faculty in initiating meaningful change in a program and to use the data for program improvement. This tool is useful in any undergraduate or degree…

  18. Framework for generating expert systems to perform computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1985-01-01

    At Los Alamos we are developing a framework to generate knowledge-based expert systems for performing automated risk analyses upon a subject system. The expert system is a computer program that models experts' knowledge about a topic, including facts, assumptions, insights, and decision rationale. The subject system, defined as the collection of information, procedures, devices, and real property upon which the risk analysis is to be performed, is a member of the class of systems that have three identifying characteristics: a set of desirable assets (or targets), a set of adversaries (or threats) desiring to obtain or to do harm to the assets, and a set of protective mechanisms to safeguard the assets from the adversaries. Risk analysis evaluates both vulnerability to and the impact of successful threats against the targets by determining the overall effectiveness of the subject system safeguards, identifying vulnerabilities in that set of safeguards, and determining cost-effective improvements to the safeguards. As a testbed, we evaluate the inherent vulnerabilities and risks in a system of computer security safeguards. The method considers safeguards protecting four generic targets (the physical plant of the computer installation, its hardware, its software, and its documents and displays) against three generic threats (natural hazards, direct human actions requiring the presence of the adversary, and indirect human actions wherein the adversary is not on the premises, perhaps using such access tools as wiretaps, dialup lines, and so forth). Our automated procedure to assess the effectiveness of computer security safeguards differs from traditional risk analysis methods.
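    The targets/threats/safeguards structure lends itself to a simple residual-risk sketch; the scoring rule, names, and values below are hypothetical illustrations, not the Los Alamos method.

    ```python
    # Hypothetical residual-risk scoring over (target, threat) pairs:
    # risk = threat likelihood * target impact * (1 - safeguard effectiveness).
    # All names and numbers are invented for illustration.

    targets = {"hardware": 0.8, "software": 0.9}        # impact if compromised
    threats = {"natural": 0.1, "direct_human": 0.3}     # likelihood
    effectiveness = {                                   # safeguard effectiveness
        ("hardware", "natural"): 0.7,
        ("hardware", "direct_human"): 0.5,
        ("software", "natural"): 0.9,
        ("software", "direct_human"): 0.4,
    }

    risk = {
        (t, a): threats[a] * targets[t] * (1 - effectiveness[(t, a)])
        for t in targets for a in threats
    }
    worst = max(risk, key=risk.get)  # the most vulnerable (target, threat) pair
    ```

    Ranking the residual risks is what points the analyst at the most cost-effective safeguard improvement.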

  19. Sustainability assessment of nuclear power: Discourse analysis of IAEA and IPCC frameworks

    International Nuclear Information System (INIS)

    Verbruggen, Aviel; Laes, Erik

    2015-01-01

    Highlights: • Sustainability assessments (SAs) are methodologically precarious. • Discourse analysis reveals how the meaning of sustainability is constructed in SAs. • Discourse analysis is applied on the SAs of nuclear power of IAEA and IPCC. • For IAEA ‘sustainable’ equals ‘complying with best international practices’. • The IAEA framework largely inspires IPCC Fifth Assessment Report. - Abstract: Sustainability assessments (SAs) are methodologically precarious. Value-based judgments inevitably play a role in setting the scope of the SA, selecting assessment criteria and indicators, collecting adequate data, and developing and using models of considered systems. Discourse analysis can reveal how the meaning and operationalization of sustainability is constructed in and through SAs. Our discourse-analytical approach investigates how sustainability is channeled from ‘manifest image’ (broad but shallow), to ‘vision’, to ‘policy targets’ (specific and practical). This approach is applied on the SA frameworks used by IAEA and IPCC to assess the sustainability of the nuclear power option. The essentially problematic conclusion is that both SA frameworks are constructed in order to obtain answers that do not conflict with prior commitments adopted by the two institutes. For IAEA ‘sustainable’ equals ‘complying with best international practices and standards’. IPCC wrestles with its mission as a provider of “policy-relevant and yet policy-neutral, never policy-prescriptive” knowledge to decision-makers. IPCC avoids the assessment of different visions on the role of nuclear power in a low-carbon energy future, and skips most literature critical of nuclear power. The IAEA framework largely inspires IPCC AR5

  20. A computer program for activation analysis

    International Nuclear Information System (INIS)

    Rantanen, J.; Rosenberg, R.J.

    1983-01-01

    A computer program for calculating the results of activation analysis is described. The program comprises two gamma spectrum analysis programs, STOAV and SAMPO, and one program for calculating elemental concentrations, KVANT. STOAV is based on a simple summation of channels, while SAMPO is based on fitting of mathematical functions. The programs are tested by analyzing the IAEA G-1 test spectra. In the determination of peak location, SAMPO is somewhat better than STOAV, and in the determination of peak area, SAMPO is more than twice as accurate as STOAV. On the other hand, SAMPO is three times as expensive as STOAV with the use of a Cyber 170 computer. (author)
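    The simple channel-summation approach attributed to STOAV can be sketched as a net peak area with a side-channel background estimate; the spectrum values below are invented.

    ```python
    # Channel-summation net peak area: sum counts over the peak region and
    # subtract a background estimated from channels on either side. The
    # spectrum is invented; this is a generic sketch, not STOAV's code.

    def peak_area(spectrum, lo, hi, bg_width=3):
        """Net peak area between channels lo..hi (inclusive)."""
        gross = sum(spectrum[lo:hi + 1])
        left = sum(spectrum[lo - bg_width:lo]) / bg_width    # mean left of peak
        right = sum(spectrum[hi + 1:hi + 1 + bg_width]) / bg_width
        background = (left + right) / 2 * (hi - lo + 1)      # flat background
        return gross - background

    spectrum = [10, 10, 10, 12, 30, 80, 32, 11, 10, 10, 10]
    area = peak_area(spectrum, lo=3, hi=7)
    # gross = 165, background = 10 * 5 = 50, net area = 115
    ```

    A fitting approach like SAMPO's instead models the peak as a mathematical function (typically a Gaussian plus background), which is slower but more accurate for overlapping peaks.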

  1. DRUG EVALUATION AND DECISION MAKING IN CATALONIA: DEVELOPMENT AND VALIDATION OF A METHODOLOGICAL FRAMEWORK BASED ON MULTI-CRITERIA DECISION ANALYSIS (MCDA) FOR ORPHAN DRUGS.

    Science.gov (United States)

    Gilabert-Perramon, Antoni; Torrent-Farnell, Josep; Catalan, Arancha; Prat, Alba; Fontanet, Manel; Puig-Peiró, Ruth; Merino-Montero, Sandra; Khoury, Hanane; Goetghebeur, Mireille M; Badia, Xavier

    2017-01-01

    The aim of this study was to adapt and assess the value of a Multi-Criteria Decision Analysis (MCDA) framework (EVIDEM) for the evaluation of orphan drugs in Catalonia (Catalan Health Service). The standard evaluation and decision-making procedures of CatSalut were compared with the EVIDEM methodology and contents. The EVIDEM framework was adapted to the Catalan context, focusing on the evaluation of orphan drugs (PASFTAC program), during a workshop with sixteen PASFTAC members. The criteria weighting was done using two different techniques (nonhierarchical and hierarchical). Reliability was assessed by re-test. The EVIDEM framework and methodology were found useful and feasible for orphan drug evaluation and decision making in Catalonia. All the criteria considered for the development of the CatSalut Technical Reports and decision making were considered in the framework. Nevertheless, the framework could improve the reporting of some of these criteria (i.e., "unmet needs" or "nonmedical costs"). Some contextual criteria were removed (i.e., "mandate and scope of healthcare system", "environmental impact") or adapted ("population priorities and access") for CatSalut purposes. Independently of the weighting technique considered, the most important evaluation criteria identified for orphan drugs were "disease severity", "unmet needs", and "comparative effectiveness", while the "size of the population" had the lowest relevance for decision making. Test-retest analysis showed weight consistency among techniques, supporting reliability over time. MCDA (the EVIDEM framework) could be a useful tool to complement the current evaluation methods of CatSalut, contributing to standardization and pragmatism, providing a method to tackle ethical dilemmas and facilitating discussions related to decision making.
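    The MCDA scoring step, normalized criterion weights combined with per-drug criterion scores, can be sketched as follows; the criteria echo the abstract, but the weights and scores are invented, not EVIDEM's actual values.

    ```python
    # Illustrative MCDA value computation: normalized criterion weights times
    # per-drug criterion scores. Criteria names follow the abstract; the
    # weights and scores are invented, not EVIDEM's actual values.

    weights = {"disease_severity": 5, "unmet_needs": 5,
               "comparative_effectiveness": 4, "population_size": 1}
    scores = {"disease_severity": 0.9, "unmet_needs": 0.8,
              "comparative_effectiveness": 0.6, "population_size": 0.2}

    total_w = sum(weights.values())
    value = sum(weights[c] / total_w * scores[c] for c in weights)
    # value is a weighted average in [0, 1], comparable across drugs
    ```

    Drugs scored on the same criteria set can then be ranked by this value, which is what makes the weighting technique (hierarchical vs. nonhierarchical) matter for the final ordering.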

  2. A comparative analysis of protected area planning and management frameworks

    Science.gov (United States)

    Per Nilsen; Grant Tayler

    1997-01-01

    A comparative analysis of the Recreation Opportunity Spectrum (ROS), Limits of Acceptable Change (LAC), a Process for Visitor Impact Management (VIM), Visitor Experience and Resource Protection (VERP), and the Management Process for Visitor Activities (known as VAMP) decision frameworks examines their origins; methodology; use of factors, indicators, and standards;...

  3. A Demonstrative Analysis of News Articles Using Fairclough’s Critical Discourse Analysis Framework

    Directory of Open Access Journals (Sweden)

    Roy Randy Y. Briones

    2017-07-01

    Full Text Available This paper attempts to demonstrate Norman Fairclough’s Critical Discourse Analysis (CDA) framework by conducting internal and external level analyses on two online news articles that report on the Moro Islamic Liberation Front’s (MILF) submission of its findings on the “Mamasapano Incident” that happened in the Philippines in 2015. In performing analyses using this framework, the social context and background for these texts, as well as the relationship between the internal discourse features and the external social practices and structures in which the texts were produced, are thoroughly examined. As a result, it can be noted that from the texts’ internal discourse features, the news articles portray ideological and social distinctions among social actors such as the Philippine Senate, the SAF troopers, the MILF, the MILF fighters, and the civilians. Moreover, from the viewpoint of the texts as external social practices, the texts maintain institutional identities as news reports, but they also reveal some evaluative stance, as exemplified by the adjectival phrases that the writers employed. Having both the internal and external features examined, it can be said that the way these texts were written seems to portray power relations that exist between the Philippine government and the MILF. Key words: Critical Discourse Analysis, discourse analysis, news articles, social practices, social structures, power relations

  4. FRAMEWORK FOR ENVIRONMENTAL DECISION-MAKING, FRED: A TOOL FOR ENVIRONMENTALLY-PREFERABLE PURCHASING

    Science.gov (United States)

    In support of the Environmentally Preferable Purchasing Program of the US EPA, the Systems Analysis Branch has developed a decision-making tool based on life cycle assessment. This tool, the Framework for Responsible Environmental Decision-making, or FRED, streamlines LCA by choosi...

  5. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    Science.gov (United States)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. The tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems, as well as to develop system requirements and determine their effect on the sizing of the integrated vehicle. The development of such a framework builds upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  6. Probabilistic Output Analysis by Program Manipulation

    DEFF Research Database (Denmark)

    Rosendahl, Mads; Kirkeby, Maja Hanne

    2015-01-01

    The aim of a probabilistic output analysis is to derive a probability distribution of possible output values for a program from a probability distribution of its input. We present a method for performing static output analysis, based on program transformation techniques. It generates a probability...
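
    For a discrete input distribution, the core idea of such an output analysis can be sketched directly: push each input's probability mass through the program and accumulate the masses of inputs that map to the same output. The example program below is an invented toy, not one from the paper.

```python
# Minimal sketch of probabilistic output analysis for a discrete input
# distribution over a side-effect-free program; the program is hypothetical.

from collections import defaultdict

def output_distribution(program, input_dist):
    """Map {input: probability} through `program` to {output: probability}."""
    out = defaultdict(float)
    for x, p in input_dist.items():
        out[program(x)] += p   # masses of inputs with equal outputs accumulate
    return dict(out)

# Uniform input over 0..3, pushed through a program that clamps values at 2.
uniform = {x: 0.25 for x in range(4)}
dist = output_distribution(lambda x: min(x, 2), uniform)
# dist == {0: 0.25, 1: 0.25, 2: 0.5}
```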

  7. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller-scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent of the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location, and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and planning and decision making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision making methodologies and facilitates scenario based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  8. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2012-01-01

    This text provides an excellent bridge between principal theories and concepts and their practical implementation. Topics include convex programming, duality, generalized convexity, analysis of selected nonlinear programs, techniques for numerical solutions, and unconstrained optimization methods.
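
    As a small worked example of the unconstrained optimization methods such a text covers, here is plain gradient descent on a convex quadratic; this is an illustrative sketch, not code from the book.

```python
# Gradient descent on a convex quadratic; the step size and target
# function are illustrative choices.

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Minimize a differentiable function given its gradient `grad`."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:       # stationary point reached
            break
        x -= step * g
    return x

# f(x) = (x - 3)^2 has gradient 2(x - 3) and unique minimizer x* = 3.
x_star = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

    For this quadratic, each iteration contracts the distance to the minimizer by a constant factor (0.8 here), which is the linear convergence rate such texts derive for strongly convex objectives.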

  9. Framework for SEM contour analysis

    Science.gov (United States)

    Schneider, L.; Farys, V.; Serret, E.; Fenouillet-Beranger, C.

    2017-03-01

    SEM images provide valuable information about patterning capability. Geometrical properties such as Critical Dimension (CD) can be extracted from them and are used to calibrate OPC models, thus making OPC more robust and reliable. However, there is currently a shortage of appropriate metrology tools to inspect complex two-dimensional patterns in the same way as one would work with simple one-dimensional patterns. In this article we present a full framework for the analysis of SEM images. It has been proven to be fast, reliable and robust for every type of structure, and particularly for two-dimensional structures. To achieve this result, several innovative solutions have been developed and will be presented in the following pages. Firstly, we will present a new noise filter which is used to reduce noise on SEM images, followed by an efficient topography identifier, and finally we will describe the use of a topological skeleton as a measurement tool that can extend CD measurements on all kinds of patterns.
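
    A one-dimensional analogue of the noise-filter-then-measure pipeline can be sketched as follows. The profile data, window size, and threshold are illustrative assumptions; the actual framework operates on full two-dimensional SEM images with a topological skeleton.

```python
# 1-D sketch: median-filter a noisy intensity profile, then measure a
# critical dimension (CD) as the span between threshold crossings.
# Profile values, window, and threshold are invented for illustration.

from statistics import median

def median_filter(signal, window=3):
    """Simple noise filter: replace each sample with its local median."""
    half = window // 2
    return [median(signal[max(0, i - half):i + half + 1])
            for i in range(len(signal))]

def measure_cd(profile, threshold):
    """CD in pixels: span between first and last samples above threshold."""
    above = [i for i, v in enumerate(profile) if v > threshold]
    return (above[-1] - above[0] + 1) if above else 0

raw = [0, 0, 9, 0, 1, 8, 9, 8, 9, 1, 0, 0]   # spike at index 2 is noise
clean = median_filter(raw)                    # spike suppressed by the median
cd = measure_cd(clean, threshold=5)           # width of the real feature
```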

  10. Energy Analysis Program 1990 annual report

    International Nuclear Information System (INIS)

    1992-01-01

    The Energy Analysis Program has played an active role in the analysis and discussion of energy and environmental issues at several levels: (1) at the international level, with programs such as developing scenarios for long-term energy demand in developing countries and organizing and leading an analytic effort, ''Energy Efficiency, Developing Countries, and Eastern Europe,'' part of a major effort to increase support for energy efficiency programs worldwide; (2) at the national level, the Program has been responsible for assessing energy forecasts and policies affecting energy use (e.g., appliance standards, National Energy Strategy scenarios); and (3) at the state and utility levels, the Program has been a leader in promoting integrated utility resource planning; the collaborative process has led to agreement on a new generation of utility demand-side programs in California, providing an opportunity to use the knowledge and analytic techniques of the Program's researchers. We continue to place the highest priority on analyzing energy efficiency, with particular attention given to energy use in buildings. The Program continues its active analysis of international energy issues in Asia (including China), the Soviet Union, South America, and Western Europe. Analyzing the costs and benefits of different levels of standards for residential appliances continues to be the largest single area of research within the Program. The group has developed and applied techniques for forecasting energy demand (or constructing scenarios) for the United States. We have built a new model of industrial energy demand, are in the process of making major changes in our tools for forecasting residential energy demand, have built an extensive and documented energy conservation supply curve of residential energy use, and are beginning an analysis of energy-demand forecasting for commercial buildings.

  11. An expert system framework for nondestructive waste assay

    International Nuclear Information System (INIS)

    Becker, G.K.

    1996-01-01

    Management and disposition of transuranic (TRU) waste forms necessitates determining entrained TRU and associated radioactive material quantities as per National TRU Waste Characterization Program requirements. Technical justification and demonstration of a given NDA method used to determine TRU mass and uncertainty in accordance with program quality assurance is difficult for many waste forms. Difficulties typically stem from waste NDA methods that employ standards compensation and/or simplifying assumptions about waste form configurations. The capability to determine and justify TRU mass and mass uncertainty can be enhanced through integration of waste container data/information, using expert system and empirical data-driven techniques with conventional data acquisition and analysis. Presented is a preliminary expert system framework that integrates the waste form database, algorithmic techniques, statistical analyses, expert domain knowledge bases, and empirical artificial intelligence modules into a cohesive system. The framework design and bases, in addition to module development activities, are discussed

  12. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims to identify methods of syntax analysis which can be used for computer programming languages, while setting aside the computer devices which influence the choice of the programming language and of the methods of analysis and compilation. In a first part, the author proposes attempts at formalization of Chomsky grammar languages. In a second part, he studies analytical grammars, and then studies a compiler or analytic grammar for the Fortran language
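
    To make the notion of syntax analysis concrete, here is a minimal recursive-descent analyzer for a small expression grammar; the grammar below is an illustrative stand-in, not the thesis's Fortran grammar.

```python
# Recursive-descent syntax analysis for an illustrative grammar:
#   expr   ::= term (('+' | '-') term)*
#   term   ::= factor (('*' | '/') factor)*
#   factor ::= NUMBER | '(' expr ')'

import re

def tokenize(src):
    return re.findall(r"\d+|[-+*/()]", src)

def parse(tokens):
    """Return True iff the token list derives exactly from `expr`."""
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat(tok):
        nonlocal pos
        if peek() != tok:
            raise SyntaxError(f"expected {tok!r} at position {pos}")
        pos += 1

    def factor():
        nonlocal pos
        if peek() == "(":
            eat("("); expr(); eat(")")
        elif peek() and peek().isdigit():
            pos += 1
        else:
            raise SyntaxError(f"unexpected token {peek()!r}")

    def term():
        factor()
        while peek() in ("*", "/"):
            eat(peek()); factor()

    def expr():
        term()
        while peek() in ("+", "-"):
            eat(peek()); term()

    expr()
    return pos == len(tokens)

ok = parse(tokenize("(1+2)*3"))   # a well-formed sentence of the grammar
```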

  13. Model-based Computer Aided Framework for Design of Process Monitoring and Analysis Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    In the manufacturing industry, for example, the pharmaceutical industry, a thorough understanding of the process is necessary in addition to a properly designed monitoring and analysis system (PAT system) to consistently obtain the desired end-product properties. A model-based computer-aided framework, including the methods and tools through which the design of monitoring and analysis systems for product quality control can be generated, analyzed and/or validated, has been developed. Two important supporting tools developed as part of the framework are a knowledge base and a model library. The knowledge base provides the necessary information/data during the design of the PAT system, while the model library generates additional or missing data needed for design and analysis. Optimization of the PAT system design is achieved in terms of product data analysis time and/or cost of monitoring equipment.

  14. Teaching and Learning Numerical Analysis and Optimization: A Didactic Framework and Applications of Inquiry-Based Learning

    Science.gov (United States)

    Lappas, Pantelis Z.; Kritikos, Manolis N.

    2018-01-01

    The main objective of this paper is to propose a didactic framework for teaching Applied Mathematics in higher education. After describing the structure of the framework, several applications of inquiry-based learning in teaching numerical analysis and optimization are provided to illustrate the potential of the proposed framework. The framework…

  15. Protocol Analysis of Group Problem Solving in Mathematics: A Cognitive-Metacognitive Framework for Assessment.

    Science.gov (United States)

    Artzt, Alice F.; Armour-Thomas, Eleanor

    The roles of cognition and metacognition were examined in the mathematical problem-solving behaviors of students as they worked in small groups. As an outcome, a framework that links the literature of cognitive science and mathematical problem solving was developed for protocol analysis of mathematical problem solving. Within this framework, each…

  16. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity tests, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper
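
    The parallel evaluation pattern the abstract describes can be sketched generically: farm independent model runs out to workers, then form sensitivity coefficients by central finite differences. Everything below (the toy model, step size, thread pool) is an illustrative assumption; PAPIRUS itself drives an external simulation code.

```python
# Parallel finite-difference sensitivity analysis; the model function
# stands in for an expensive external simulation run.

from concurrent.futures import ThreadPoolExecutor

def model(params):
    """Stand-in for a simulation code run: y = a^2 + 3b."""
    a, b = params
    return a * a + 3.0 * b

def sensitivities(model, params, h=1e-5):
    """dy/dp_i for each parameter, evaluating perturbed cases in parallel."""
    cases = []
    for i in range(len(params)):
        for delta in (+h, -h):
            p = list(params)
            p[i] += delta
            cases.append(p)
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(model, cases))   # returned in submission order
    # results are ordered [p0+h, p0-h, p1+h, p1-h, ...]
    return [(results[2 * i] - results[2 * i + 1]) / (2 * h)
            for i in range(len(params))]

grads = sensitivities(model, [2.0, 1.0])   # analytic answer: [2a, 3] = [4, 3]
```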

  17. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity tests, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  18. From bricks to buildings: adapting the Medical Research Council framework to develop programs of research in simulation education and training for the health professions.

    Science.gov (United States)

    Haji, Faizal A; Da Silva, Celina; Daigle, Delton T; Dubrowski, Adam

    2014-08-01

    Presently, health care simulation research is largely conducted on a study-by-study basis. Although such "project-based" research generates a plethora of evidence, it can be chaotic and contradictory. A move toward sustained, thematic, theory-based programs of research is necessary to advance knowledge in the field. Recognizing that simulation is a complex intervention, we present a framework for developing research programs in simulation-based education adapted from the Medical Research Council (MRC) guidance. This framework calls for an iterative approach to developing, refining, evaluating, and implementing simulation interventions. The adapted framework guidance emphasizes: (1) identification of theory and existing evidence; (2) modeling and piloting interventions to clarify active ingredients and identify mechanisms linking the context, intervention, and outcomes; and (3) evaluation of intervention processes and outcomes in both the laboratory and real-world setting. The proposed framework will aid simulation researchers in developing more robust interventions that optimize simulation-based education and advance our understanding of simulation pedagogy.

  19. Introduction of new technologies and decision making processes: a framework to adapt a Local Health Technology Decision Support Program for other local settings

    Directory of Open Access Journals (Sweden)

    Poulin P

    2013-11-01

    Full Text Available Paule Poulin,1 Lea Austen,1 Catherine M Scott,2 Michelle Poulin,1 Nadine Gall,2 Judy Seidel,3 René Lafrenière1 1Department of Surgery, 2Knowledge Management, 3Public Health Innovation and Decision Support, Alberta Health Services, Calgary, AB, Canada Purpose: Introducing new health technologies, including medical devices, into a local setting in a safe, effective, and transparent manner is a complex process, involving many disciplines and players within an organization. Decision making should be systematic, consistent, and transparent. It should involve translating and integrating scientific evidence, such as health technology assessment (HTA) reports, with context-sensitive evidence to develop recommendations on whether and under what conditions a new technology will be introduced. However, the development of a program to support such decision making can require considerable time and resources. An alternative is to adapt a preexisting program to the new setting. Materials and methods: We describe a framework for adapting the Local HTA Decision Support Program, originally developed by the Department of Surgery and Surgical Services (Calgary, AB, Canada), for use by other departments. The framework consists of six steps: (1) development of a program review and adaptation manual; (2) education and readiness assessment of interested departments; (3) evaluation of the program by individual departments; (4) joint evaluation via retreats; (5) synthesis of feedback and program revision; and (6) evaluation of the adaptation process. Results: Nine departments revised the Local HTA Decision Support Program and expressed strong satisfaction with the adaptation process. Key elements for success were identified. Conclusion: Adaptation of a preexisting program may reduce duplication of effort, save resources, raise the health care providers' awareness of HTA, and foster constructive stakeholder engagement, which enhances the legitimacy of evidence

  20. The Utility of the Memorable Messages Framework as an Intermediary Evaluation Tool for Fruit and Vegetable Consumption in a Nutrition Education Program

    Science.gov (United States)

    Davis, LaShara A.; Morgan, Susan E.; Mobley, Amy R.

    2016-01-01

    Additional strategies to evaluate the impact of community nutrition education programs on low-income individuals are needed. The objective of this qualitative study was to examine the use of the Memorable Messages Framework as an intermediary nutrition education program evaluation tool to determine what fruit and vegetable messages were reported…

  1. GENOVA: a generalized perturbation theory program for various applications to CANDU core physics analysis (II) - a user's manual

    International Nuclear Information System (INIS)

    Kim, Do Heon; Choi, Hang Bok

    2001-03-01

    A user's guide for GENOVA, a GENeralized perturbation theory (GPT)-based Optimization and uncertainty analysis program for Canada deuterium uranium (CANDU) physics VAriables, was prepared. The program was developed under the framework of the CANDU physics design and analysis code RFSP. The generalized perturbation method was implemented in GENOVA to estimate the zone controller unit (ZCU) level upon refueling operation and to calculate various sensitivity coefficients for fuel management studies and uncertainty analyses, respectively. This documentation contains descriptions and directions for the four major modules of GENOVA, namely ADJOINT, GADJINT, PERTURB, and PERTXS, so that it can be used as a practical guide by GENOVA users. It includes sample inputs for the ZCU level estimation and sensitivity coefficient calculation, which are the main applications of GENOVA. GENOVA can be used as a supplementary tool to the current CANDU physics design code for advanced CANDU core analysis and fuel development

  2. Digital Trade Infrastructures: A Framework for Analysis

    Directory of Open Access Journals (Sweden)

    Boriana Boriana

    2018-04-01

    Full Text Available In global supply chains, information about transactions resides in fragmented pockets within business and government systems. The lack of reliable, accurate, and complete information makes it hard to detect risks (such as safety, security, compliance, and commercial risks) and at the same time makes international trade inefficient. The introduction of digital infrastructures that transcend organizational and system domains is driven by the prospect of reducing this fragmentation of information, thereby enabling improved security and efficiency in the trading process. This article develops a digital trade infrastructure framework through an empirically grounded analysis of four digital infrastructures in the trade domain, using the conceptual lens of digital infrastructure.

  3. Energy Analysis Program 1990 annual report

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    The Energy Analysis Program has played an active role in the analysis and discussion of energy and environmental issues at several levels: (1) at the international level, with programs such as developing scenarios for long-term energy demand in developing countries and organizing and leading an analytic effort, ''Energy Efficiency, Developing Countries, and Eastern Europe,'' part of a major effort to increase support for energy efficiency programs worldwide; (2) at the national level, the Program has been responsible for assessing energy forecasts and policies affecting energy use (e.g., appliance standards, National Energy Strategy scenarios); and (3) at the state and utility levels, the Program has been a leader in promoting integrated utility resource planning; the collaborative process has led to agreement on a new generation of utility demand-side programs in California, providing an opportunity to use the knowledge and analytic techniques of the Program's researchers. We continue to place the highest priority on analyzing energy efficiency, with particular attention given to energy use in buildings. The Program continues its active analysis of international energy issues in Asia (including China), the Soviet Union, South America, and Western Europe. Analyzing the costs and benefits of different levels of standards for residential appliances continues to be the largest single area of research within the Program. The group has developed and applied techniques for forecasting energy demand (or constructing scenarios) for the United States. We have built a new model of industrial energy demand, are in the process of making major changes in our tools for forecasting residential energy demand, have built an extensive and documented energy conservation supply curve of residential energy use, and are beginning an analysis of energy-demand forecasting for commercial buildings.

  4. The Coronal Analysis of SHocks and Waves (CASHeW) framework

    Science.gov (United States)

    Kozarev, Kamen A.; Davey, Alisdair; Kendrick, Alexander; Hammer, Michael; Keith, Celeste

    2017-11-01

    Coronal bright fronts (CBFs) are large-scale wavelike disturbances in the solar corona, related to solar eruptions. They are observed (mostly in extreme ultraviolet (EUV) light) as transient bright fronts of finite width, propagating away from the eruption source location. Recent studies of individual solar eruptive events have used EUV observations of CBFs and metric radio type II burst observations to show the intimate connection between waves in the low corona and coronal mass ejection (CME)-driven shocks. EUV imaging with the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory has proven particularly useful for detecting large-scale short-lived CBFs, which, combined with radio and in situ observations, holds great promise for early CME-driven shock characterization capability. This characterization can further be automated and related to models of particle acceleration to produce estimates of particle fluxes in the corona and in the near-Earth environment early in events. We present a framework for the Coronal Analysis of SHocks and Waves (CASHeW). It combines analysis of NASA Heliophysics System Observatory data products and relevant data-driven models into an automated system for the characterization of off-limb coronal waves and shocks and the evaluation of their capability to accelerate solar energetic particles (SEPs). The system utilizes EUV observations and models written in the Interactive Data Language (IDL). In addition, it leverages analysis tools from the SolarSoft package of libraries, as well as third-party libraries. We have tested the CASHeW framework on a representative list of coronal bright front events. Here we present its features, as well as initial results. With this framework, we hope to contribute to the overall understanding of coronal shock waves and their importance for energetic particle acceleration, as well as to a better ability to forecast SEP event fluxes.

  5. Proposal of a Methodology of Stakeholder Analysis for the Brazilian Satellite Space Program

    Directory of Open Access Journals (Sweden)

    Mônica Elizabeth Rocha de Oliveira

    2012-03-01

    Full Text Available To ensure the continuity and growth of space activities in Brazil, it is fundamental to persuade Brazilian society and its representatives in Government of the importance of investments in space activities. It is also important to convince talented professionals to take an interest in space activities; the best schools to offer courses related to the space sector; and innovative companies to take part in space-sector activities, looking to returns mainly in terms of market differentiation and qualification, as a path to participation in high-technology, high-complexity projects. On the one hand, this process of convincing, or more importantly committing, these actors to space activities implies a thorough understanding of their expectations and needs, in order to plan how the system/organization can meet them. On the other hand, if stakeholders understand how much they can benefit from this relationship, their consequent commitment will greatly strengthen the action of the system/organization. With this framework in perspective, this paper proposes a methodology of stakeholder analysis for the Brazilian satellite space program. In the exercise developed in the article, stakeholders have been identified from a study of the legal framework of the Brazilian space program. Subsequently, the proposed methodology has been applied to the planning of actions by a public organization.

  6. The Y2K program for scientific-analysis computer programs at AECL

    International Nuclear Information System (INIS)

    Popovic, J.; Gaver, C.; Chapman, D.

    1999-01-01

    The evaluation of scientific-analysis computer programs for year-2000 compliance is part of AECL's year-2000 (Y2K) initiative, which addresses both the infrastructure systems at AECL and AECL's products and services. This paper describes the Y2K-compliance program for scientific-analysis computer codes. This program involves the integrated evaluation of the computer hardware, middleware, and third-party software in addition to the scientific codes developed in-house. The project involves several steps: the assessment of the scientific computer programs for Y2K compliance, performing any required corrective actions, porting the programs to Y2K-compliant platforms, and verification of the programs after porting. Some programs or program versions, deemed no longer required in the year 2000 and beyond, will be retired and archived. (author)

  8. Complexity and Intensionality in a Type-1 Framework for Computable Analysis

    DEFF Research Database (Denmark)

    Lambov, Branimir Zdravkov

    2005-01-01

    This paper describes a type-1 framework for computable analysis designed to facilitate efficient implementations and discusses properties that have not been well studied before for type-1 approaches: the introduction of complexity measures for type-1 representations of real functions, and ways...

  9. A Framework for Bioacoustic Vocalization Analysis Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Ebenezer Out-Nyarko

    2009-11-01

    Full Text Available Using Hidden Markov Models (HMMs) as a recognition framework for automatic classification of animal vocalizations has a number of benefits, including the ability to handle duration variability through nonlinear time alignment, the ability to incorporate complex language or recognition constraints, and easy extendibility to continuous recognition and detection domains. In this work, we apply HMMs to several different species and bioacoustic tasks using generalized spectral features that can be easily adjusted across species and HMM network topologies suited to each task. This experimental work includes a simple call type classification task using one HMM per vocalization for repertoire analysis of Asian elephants, a language-constrained song recognition task using syllable models as base units for ortolan bunting vocalizations, and a stress stimulus differentiation task in poultry vocalizations using a non-sequential model via a one-state HMM with Gaussian mixtures. Results show strong performance across all tasks and illustrate the flexibility of the HMM framework for a variety of species, vocalization types, and analysis tasks.
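
    The recognition scheme described above (one model per vocalization type, classification by maximum likelihood) can be sketched with discrete-observation HMMs. The forward algorithm below is standard; the two toy "call type" models are invented for illustration and are much simpler than the paper's Gaussian-mixture HMMs:

    ```python
    import math

    def forward_loglik(obs, start, trans, emit):
        """Log-likelihood of an observation sequence under a discrete HMM
        computed with the forward algorithm."""
        n = len(start)
        # initialise the forward variables with the first observation
        alpha = [math.log(start[i]) + math.log(emit[i][obs[0]]) for i in range(n)]
        for o in obs[1:]:
            alpha = [
                math.log(sum(math.exp(alpha[i]) * trans[i][j] for i in range(n)))
                + math.log(emit[j][o])
                for j in range(n)
            ]
        return math.log(sum(math.exp(a) for a in alpha))

    # Two hypothetical call types over a binary feature symbol:
    # "rumble" favours symbol 0, "trumpet" favours symbol 1.
    # Each model is (start probs, transition matrix, emission matrix).
    MODELS = {
        "rumble":  ([0.9, 0.1], [[0.8, 0.2], [0.3, 0.7]], [[0.9, 0.1], [0.6, 0.4]]),
        "trumpet": ([0.5, 0.5], [[0.6, 0.4], [0.4, 0.6]], [[0.2, 0.8], [0.1, 0.9]]),
    }

    def classify(obs):
        # one HMM per call type: pick the model with the highest likelihood
        return max(MODELS, key=lambda m: forward_loglik(obs, *MODELS[m]))

    print(classify([0, 0, 1, 0, 0]))  # a 0-heavy sequence -> rumble
    ```

    The same structure extends to the paper's other tasks by swapping the per-class topologies (e.g. syllable models chained under language constraints).
    
    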

  10. RIPOSTE: a framework for improving the design and analysis of laboratory-based research.

    Science.gov (United States)

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn

    2015-05-07

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  12. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    Science.gov (United States)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
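
    The inlier/outlier idea at the core of this framework, rare events standing out against a background process, can be sketched in a simplified form. The snippet below replaces the paper's eigenvector analysis of an affinity matrix over subsequence models with a plain Gaussian background model over window means, and ranks windows by their departure from it; the window size, data, and scoring are invented for illustration:

    ```python
    def outlier_ranking(series, win):
        """Split a 1-D feature series into windows, fit a single Gaussian
        'background' model to the window means, and rank windows by their
        departure from that background."""
        windows = [series[i:i + win] for i in range(0, len(series) - win + 1, win)]
        means = [sum(w) / win for w in windows]
        mu = sum(means) / len(means)
        var = sum((m - mu) ** 2 for m in means) / len(means) or 1e-9
        # departure = squared z-score of the window mean under the background
        scores = [(m - mu) ** 2 / var for m in means]
        ranked = sorted(range(len(windows)), key=lambda i: -scores[i])
        return ranked, scores

    # mostly quiet background with one loud 'highlight' burst (windows of 5)
    series = [0.0] * 15 + [5.0] * 5 + [0.0] * 10
    ranked, scores = outlier_ranking(series, 5)
    print(ranked[0])  # index of the most outlying window -> 3
    ```

    Ranking by departure is what allows summaries of any desired length: keep the top-k outliers.
    
    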

  13. DXC'09 Framework

    Data.gov (United States)

    National Aeronautics and Space Administration — The DXC Framework is a collection of programs and APIs for running and evaluating diagnostic algorithms (DAs). It is complementary to system XML catalogs and...

  14. Programs for nuclear data analysis

    International Nuclear Information System (INIS)

    Bell, R.A.I.

    1975-01-01

    The following report details a number of programs and subroutines which are useful for analysis of data from nuclear physics experiments. Most of them are available from pool pack 005 on the IBM1800 computer. All of these programs are stored there as core loads, and the subroutines and functions in relocatable format. The nature and location of other programs are specified as appropriate. (author)

  15. Framework for the impact analysis and implementation of Clinical Prediction Rules (CPRs)

    LENUS (Irish Health Repository)

    Wallace, Emma

    2011-10-14

    Abstract Clinical Prediction Rules (CPRs) are tools that quantify the contribution of symptoms, clinical signs and available diagnostic tests, and in doing so stratify patients according to the probability of having a target outcome or need for a specified treatment. Most focus on the derivation stage with only a minority progressing to validation and very few undergoing impact analysis. Impact analysis studies remain the most efficient way of assessing whether incorporating CPRs into a decision making process improves patient care. However there is a lack of clear methodology for the design of high quality impact analysis studies. We have developed a sequential four-phased framework based on the literature and the collective experience of our international working group to help researchers identify and overcome the specific challenges in designing and conducting an impact analysis of a CPR. There is a need to shift emphasis from deriving new CPRs to validating and implementing existing CPRs. The proposed framework provides a structured approach to this topical and complex area of research.
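
    A CPR, as defined above, quantifies the contribution of individual findings and stratifies patients by score. A minimal sketch with entirely hypothetical weights and cut-offs (not any validated rule):

    ```python
    # Hypothetical point weights for a made-up rule (not a validated CPR).
    WEIGHTS = {"age_over_65": 1, "abnormal_vital_signs": 2, "positive_test": 3}
    STRATA = [(0, "low"), (3, "moderate"), (5, "high")]  # (minimum score, stratum)

    def stratify(findings):
        """Sum the points of the present findings and map the total score
        to a risk stratum (the highest stratum whose threshold is met)."""
        score = sum(WEIGHTS[f] for f in findings)
        stratum = max((s for s in STRATA if score >= s[0]), key=lambda s: s[0])[1]
        return score, stratum

    print(stratify({"age_over_65", "positive_test"}))  # (4, 'moderate')
    ```

    An impact analysis then asks whether decisions guided by such a score improve patient care relative to unaided judgement.
    
    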

  16. The Pentagon's Military Analyst Program

    Science.gov (United States)

    Valeri, Andy

    2014-01-01

    This article provides an investigatory overview of the Pentagon's military analyst program, what it is, how it was implemented, and how it constitutes a form of propaganda. A technical analysis of the program is applied using the theoretical framework of the propaganda model first developed by Noam Chomsky and Edward S. Herman. Definitions…

  17. A Novel Framework for Interactive Visualization and Analysis of Hyperspectral Image Data

    Directory of Open Access Journals (Sweden)

    Johannes Jordan

    2016-01-01

    Full Text Available Multispectral and hyperspectral images are well established in various fields of application like remote sensing, astronomy, and microscopic spectroscopy. In recent years, the availability of new sensor designs, more powerful processors, and high-capacity storage further opened this imaging modality to a wider array of applications like medical diagnosis, agriculture, and cultural heritage. This necessitates new tools that allow general analysis of the image data and are intuitive to users who are new to hyperspectral imaging. We introduce a novel framework that bundles new interactive visualization techniques with powerful algorithms and is accessible through an efficient and intuitive graphical user interface. We visualize the spectral distribution of an image via parallel coordinates with a strong link to traditional visualization techniques, enabling new paradigms in hyperspectral image analysis that focus on interactive raw data exploration. We combine novel methods for supervised segmentation, global clustering, and nonlinear false-color coding to assist in the visual inspection. Our framework, coined Gerbil, is open source and highly modular, building on established methods and being easily extensible for application-specific needs. It satisfies the need for a general, consistent software framework that tightly integrates analysis algorithms with an intuitive, modern interface to the raw image data and algorithmic results. Gerbil finds its worldwide use in academia and industry alike with several thousand downloads originating from 45 countries.

  18. Dynamic analysis program for frame structure

    International Nuclear Information System (INIS)

    Ando, Kozo; Chiba, Toshio

    1975-01-01

    A general purpose computer program named ISTRAN/FD (IHI STRucture ANalysis / Frame structure, Dynamic analysis) has been developed for dynamic analysis of three-dimensional frame structures. This program has functions of free vibration analysis, seismic response analysis, graphic display by plotter and CRT, etc. This paper introduces ISTRAN/FD; examples of its application are shown with various problems : idealization of the cantilever, dynamic analysis of the main tower of the suspension bridge, three-dimensional vibration in the plate girder bridge, seismic response in the boiler steel structure, and dynamic properties of the underground LNG tank. In this last example, solid elements, in addition to beam elements, are especially used for the analysis. (auth.)
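
    Free vibration analysis of the kind ISTRAN/FD performs reduces, for a discretized structure, to the generalized eigenvalue problem det(K - w^2 M) = 0 in the stiffness matrix K and mass matrix M. A minimal sketch for a hypothetical two-storey shear frame, solving the characteristic quadratic directly:

    ```python
    import math

    def natural_frequencies_2dof(K, M):
        """Natural frequencies (rad/s) of an undamped 2-DOF system from
        det(K - lam*M) = 0, expanded as a quadratic a*lam^2 + b*lam + c = 0
        with lam = w^2."""
        a = M[0][0] * M[1][1] - M[0][1] * M[1][0]
        b = -(K[0][0] * M[1][1] + K[1][1] * M[0][0]
              - K[0][1] * M[1][0] - K[1][0] * M[0][1])
        c = K[0][0] * K[1][1] - K[0][1] * K[1][0]
        disc = math.sqrt(b * b - 4 * a * c)
        lams = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
        return [math.sqrt(lam) for lam in lams]

    # hypothetical 2-storey shear frame: equal masses m, equal storey stiffness k
    k, m = 1000.0, 1.0
    K = [[2 * k, -k], [-k, k]]
    M = [[m, 0.0], [0.0, m]]
    print([round(w, 2) for w in natural_frequencies_2dof(K, M)])  # [19.54, 51.17]
    ```

    Production codes solve the same eigenproblem for thousands of degrees of freedom with sparse numerical eigensolvers rather than a closed form.
    
    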

  19. Analyzing and modeling interdisciplinary product development a framework for the analysis of knowledge characteristics and design support

    CERN Document Server

    Neumann, Frank

    2015-01-01

    Frank Neumann focuses on establishing a theoretical basis that allows a description of the interplay between individual and collective processes in product development. For this purpose, he introduces the integrated descriptive model of knowledge creation as the first constituent of his research framework. As a second part of the research framework, an analysis and modeling method is proposed that captures the various knowledge conversion activities described by the integrated descriptive model of knowledge creation. Subsequently, this research framework is applied to the analysis of knowledge characteristics of mechatronic product development (MPD). Finally, the results gained from the previous steps are used within a design support system that aims at federating the information and knowledge resources contained in the models published in the various development activities of MPD. Contents Descriptive Model of Knowledge Creation in Interdisciplinary Product Development Research Framework for the Analysis of ...

  20. A program for activation analysis data processing

    International Nuclear Information System (INIS)

    Janczyszyn, J.; Loska, L.; Taczanowski, S.

    1978-01-01

    An ALGOL program for activation analysis data handling is presented. The program may be used either for single channel spectrometry data or for multichannel spectrometry. The calculation of instrumental error and of analysis standard deviation is carried out. The outliers are tested, and the regression line diagram, with the related observations, is plotted by the program. (author)
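
    The regression and outlier-testing steps mentioned above can be sketched as follows. The closed-form least-squares fit is standard; the simple residual screen (flagging residuals beyond k residual standard deviations) and the sample data are invented for illustration, since the original ALGOL program's exact tests are not specified:

    ```python
    def fit_line(xs, ys):
        """Ordinary least-squares fit of y = a + b*x (closed form)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        return my - b * mx, b

    def flag_outliers(xs, ys, k=1.5):
        """Flag observations whose residual exceeds k residual standard
        deviations (a crude screen; gross errors inflate s, so k is modest)."""
        a, b = fit_line(xs, ys)
        res = [y - (a + b * x) for x, y in zip(xs, ys)]
        s = (sum(r * r for r in res) / (len(res) - 2)) ** 0.5
        return [i for i, r in enumerate(res) if abs(r) > k * s]

    # calibration-style data following y ~ 2x with one gross error at index 3
    xs = [1, 2, 3, 4, 5, 6]
    ys = [2.0, 4.1, 6.0, 13.0, 10.1, 12.0]
    print(flag_outliers(xs, ys))  # [3]
    ```
    
    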

  1. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  2. PetIGA: A framework for high-performance isogeometric analysis

    KAUST Repository

    Dalcin, Lisandro; Collier, N.; Vignal, Philippe; Cortes, Adriano Mauricio; Calo, Victor M.

    2016-01-01

    We present PetIGA, a code framework to approximate the solution of partial differential equations using isogeometric analysis. PetIGA can be used to assemble matrices and vectors which come from a Galerkin weak form, discretized with Non-Uniform Rational B-spline basis functions. We base our framework on PETSc, a high-performance library for the scalable solution of partial differential equations, which simplifies the development of large-scale scientific codes, provides a rich environment for prototyping, and separates parallelism from algorithm choice. We describe the implementation of PetIGA, and exemplify its use by solving a model nonlinear problem. To illustrate the robustness and flexibility of PetIGA, we solve some challenging nonlinear partial differential equations that include problems in both solid and fluid mechanics. We show strong scaling results on up to 4096 cores, which confirm the suitability of PetIGA for large scale simulations.
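
    PetIGA itself is a C framework built on PETSc. As a language-neutral illustration of the B-spline basis functions underlying the NURBS discretizations it assembles, the sketch below evaluates basis functions via the Cox-de Boor recursion and checks their partition of unity, a property Galerkin assembly relies on. This is not PetIGA's API; the knot vector and degree are chosen for illustration:

    ```python
    def bspline_basis(i, p, u, knots):
        """Value of the i-th B-spline basis function of degree p at parameter u,
        by the Cox-de Boor recursion (0/0 terms treated as zero)."""
        if p == 0:
            return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
        left = right = 0.0
        if knots[i + p] != knots[i]:
            left = ((u - knots[i]) / (knots[i + p] - knots[i])
                    * bspline_basis(i, p - 1, u, knots))
        if knots[i + p + 1] != knots[i + 1]:
            right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                     * bspline_basis(i + 1, p - 1, u, knots))
        return left + right

    # open (clamped) knot vector for a quadratic basis on [0, 1]
    knots = [0, 0, 0, 0.5, 1, 1, 1]
    p, n = 2, 4  # degree and number of basis functions
    u = 0.3
    vals = [bspline_basis(i, p, u, knots) for i in range(n)]
    print(round(sum(vals), 10))  # partition of unity: the values sum to 1.0
    ```

    In an isogeometric code these basis values (and their derivatives) are evaluated at quadrature points to assemble the Galerkin matrices and vectors.
    
    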

  4. Development of an Analysis and Design Optimization Framework for Marine Propellers

    Science.gov (United States)

    Tamhane, Ashish C.

    In this thesis, a framework for the analysis and design optimization of ship propellers is developed. This framework can be utilized as an efficient synthesis tool in order to determine the main geometric characteristics of the propeller but also to provide the designer with the capability to optimize the shape of the blade sections based on their specific criteria. A hybrid lifting-line method with lifting-surface corrections to account for the three-dimensional flow effects has been developed. The prediction of the correction factors is achieved using Artificial Neural Networks and Support Vector Regression. This approach results in increased approximation accuracy compared to existing methods and allows for extrapolation of the correction factor values. The effect of viscosity is implemented in the framework via the coupling of the lifting-line method with the open-source RANSE solver OpenFOAM for the calculation of lift, drag and pressure distribution on the blade sections using a transition k-ω SST turbulence model. Case studies of benchmark high-speed propulsors are utilized in order to validate the proposed framework for propeller operation in open-water conditions but also in a ship's wake.

  5. Muon g-2 Reconstruction and Analysis Framework for the Muon Anomalous Precession Frequency

    Energy Technology Data Exchange (ETDEWEB)

    Khaw, Kim Siang [Washington U., Seattle

    2017-10-21

    The Muon g-2 experiment at Fermilab, with the aim to measure the muon anomalous magnetic moment to an unprecedented level of 140 ppb, has started beam and detector commissioning in Summer 2017. To deal with incoming data projected to be around tens of petabytes, a robust data reconstruction and analysis chain based on Fermilab's art event-processing framework is developed. Herein, I report the current status of the framework, together with its novel features such as multi-threaded algorithms for online data quality monitoring (DQM) and fast-turnaround operation (nearline). Performance of the framework during the commissioning run is also discussed.

  6. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    Hu Jifeng; Lu Xiaorui; Zhang Yangheng

    2011-01-01

    The letter describes an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks of data analysis: event selection, kinematic fitting, particle identification, etc., and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML-interface file, setting parameters at run-time and running dynamically. An analysis coded in XML instead of C++, easy to understand and use, effectively reduces the workload and enables users to carry out their analyses quickly. The framework has been developed on the BESIII offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are implemented either with standard modules or inherited from modules in BOSS. The interface and its framework have been tested to perform physics analysis. (authors)
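
    The processing logic described above (each step runs only if the previous one succeeds, driven by an XML description of the chain) can be sketched as follows. The XML schema, step names, handlers, and event record are invented for illustration and are not the actual BOSS interface:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical XML describing an analysis chain.
    SPEC = """
    <analysis>
      <step name="event-selection" min_tracks="2"/>
      <step name="kinematic-fit" max_chi2="10"/>
      <step name="particle-id" type="pion"/>
    </analysis>
    """

    # One handler per task; each returns True on success for a given event.
    HANDLERS = {
        "event-selection": lambda attrs, ev: len(ev["tracks"]) >= int(attrs["min_tracks"]),
        "kinematic-fit":   lambda attrs, ev: ev["chi2"] <= float(attrs["max_chi2"]),
        "particle-id":     lambda attrs, ev: attrs["type"] in ev["pid"],
    }

    def run(spec, event):
        """Run the steps in order; the next step runs only if this one succeeds."""
        for step in ET.fromstring(spec).findall("step"):
            if not HANDLERS[step.get("name")](step.attrib, event):
                return step.get("name")  # name of the first failing step
        return "accepted"

    event = {"tracks": [1, 2, 3], "chi2": 4.2, "pid": {"pion", "kaon"}}
    print(run(SPEC, event))  # accepted
    ```

    Because the chain lives in XML rather than code, changing a cut means editing the file and re-running, with no recompilation.
    
    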

  7. Mexico's "Telesecundaria" Program and Equitable Access to Resources

    Science.gov (United States)

    Craig, Dana; Etcheverry, Jose; Ferris, Stefan

    2016-01-01

    This Note provides an analysis of Mexico's "Telesecundaria" program within the context of Mexico's new education reform framework offering a succinct background of the project, as well as key policy lessons that can be useful for other jurisdictions interested in the development of distance education programs. This Note uses a literature…

  8. A Bayesian framework based on a Gaussian mixture model and radial-basis-function Fisher discriminant analysis (BayGmmKda V1.1) for spatial prediction of floods

    Science.gov (United States)

    Tien Bui, Dieu; Hoang, Nhat-Duc

    2017-09-01

    In this study, a probabilistic model, named BayGmmKda, is proposed for flood susceptibility assessment in a study area in central Vietnam. The new model is a Bayesian framework constructed by a combination of a Gaussian mixture model (GMM), radial-basis-function Fisher discriminant analysis (RBFDA), and a geographic information system (GIS) database. In the Bayesian framework, GMM is used for modeling the data distribution of flood-influencing factors in the GIS database, whereas RBFDA is utilized to construct a latent variable that aims at enhancing the model performance. As a result, the posterior probabilistic output of the BayGmmKda model is used as flood susceptibility index. Experiment results showed that the proposed hybrid framework is superior to other benchmark models, including the adaptive neuro-fuzzy inference system and the support vector machine. To facilitate the model implementation, a software program of BayGmmKda has been developed in MATLAB. The BayGmmKda program can accurately establish a flood susceptibility map for the study region. Accordingly, local authorities can overlay this susceptibility map onto various land-use maps for the purpose of land-use planning or management.
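
    The Bayesian core of such a model, a posterior flood-susceptibility index computed from class-conditional densities, can be sketched in a heavily simplified form. The sketch uses single Gaussians over one factor with invented parameters, whereas BayGmmKda uses Gaussian mixtures over many GIS factors plus an RBFDA latent variable:

    ```python
    import math

    def gauss_pdf(x, mu, sigma):
        """Density of a univariate Gaussian at x."""
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    def flood_posterior(x, prior_flood=0.5):
        """Posterior P(flood | x) by Bayes' rule with single-Gaussian
        class-conditional models; all parameters are invented."""
        p_flood = gauss_pdf(x, mu=0.8, sigma=0.15)   # e.g. a terrain-wetness factor
        p_dry = gauss_pdf(x, mu=0.3, sigma=0.20)
        num = p_flood * prior_flood
        return num / (num + p_dry * (1 - prior_flood))

    print(round(flood_posterior(0.75), 3))  # high susceptibility, about 0.94
    ```

    Mapping this posterior over every cell of a GIS raster yields the susceptibility map described above.
    
    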

  9. Static Analysis of Functional Programs

    NARCIS (Netherlands)

    van den Berg, Klaas; van den Broek, P.M.

    1994-01-01

    In this paper, the static analysis of programs in the functional programming language Miranda is described based on two graph models. A new control-flow graph model of Miranda definitions is presented, and a model with four classes of callgraphs. Standard software metrics are applicable to these
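
    Static analysis over graph models of programs, as described above for Miranda, can be illustrated in Python using its ast module: the sketch below builds a simple callgraph approximation (which top-level function calls which names) from source text. The analyzed snippet is invented for illustration, and Miranda-specific metrics are out of scope:

    ```python
    import ast

    SOURCE = """
    def parse(s): return tokenize(s)
    def tokenize(s): return s.split()
    def main(): print(parse("a b"))
    """

    def call_graph(source):
        """Map each top-level function to the sorted names it calls
        (direct name calls only, a static approximation)."""
        tree = ast.parse(source)
        graph = {}
        for fn in (n for n in tree.body if isinstance(n, ast.FunctionDef)):
            calls = {
                node.func.id
                for node in ast.walk(fn)
                if isinstance(node, ast.Call) and isinstance(node.func, ast.Name)
            }
            graph[fn.name] = sorted(calls)
        return graph

    print(call_graph(SOURCE))
    ```

    Metrics such as fan-out fall out directly, e.g. `len(graph["main"])`.
    
    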

  10. Surveyor Management of Hospital Accreditation Program: A Thematic Analysis Conducted in Iran.

    Science.gov (United States)

    Teymourzadeh, Ehsan; Ramezani, Mozhdeh; Arab, Mohammad; Rahimi Foroushani, Abbas; Akbari Sari, Ali

    2016-05-01

    The surveyors in hospital accreditation program are considered as the core of accreditation programs. So, the reliability and validity of the accreditation program heavily depend on their performance. This study aimed to identify the dimensions and factors affecting surveyor management of hospital accreditation programs in Iran. This qualitative study used a thematic analysis method, and was performed in Iran in 2014. The study participants included experts in the field of hospital accreditation, and were derived from three groups: 1. Policy-makers, administrators, and surveyors of the accreditation bureau, the ministry of health and medical education, Iranian universities of medical science; 2. Healthcare service providers, and 3. University professors and faculty members. The data were collected using semi-structured in-depth interviews. Following text transcription and control of compliance with the original text, MAXQDA10 software was used to code, classify, and organize the interviews in six stages. The findings from the analysis of 21 interviews were first classified in the form of 1347 semantic units, 11 themes, 17 sub-themes, and 248 codes. These were further discussed by an expert panel, which then resulted in the emergence of seven main themes - selection and recruitment of the surveyor team, organization of the surveyor team, planning to perform surveys, surveyor motivation and retention, surveyor training, surveyor assessment, and recommendations - as well as 27 sub-themes, and 112 codes. The dimensions and variables affecting the surveyors' management were identified and classified on the basis of existing scientific methods in the form of a conceptual framework. Using the results of this study, it would certainly be possible to take a great step toward enhancing the reliability of surveys and the quality and safety of services, while effectively managing accreditation program surveyors.

  11. A Regularized Linear Dynamical System Framework for Multivariate Time Series Analysis.

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2015-01-01

    Linear Dynamical System (LDS) is an elegant mathematical framework for modeling and learning Multivariate Time Series (MTS). However, in general, it is difficult to set the dimension of an LDS's hidden state space. A small number of hidden states may not be able to model the complexities of an MTS, while a large number of hidden states can lead to overfitting. In this paper, we study learning methods that impose various regularization penalties on the transition matrix of the LDS model and propose a regularized LDS learning framework (rLDS) which aims to (1) automatically shut down LDSs' spurious and unnecessary dimensions, and consequently, address the problem of choosing the optimal number of hidden states; (2) prevent the overfitting problem given a small amount of MTS data; and (3) support accurate MTS forecasting. To learn the regularized LDS from data we incorporate a second order cone program and a generalized gradient descent method into the Maximum a Posteriori framework and use Expectation Maximization to obtain a low-rank transition matrix of the LDS model. We propose two priors for modeling the matrix which lead to two instances of our rLDS. We show that our rLDS is able to recover well the intrinsic dimensionality of the time series dynamics and it improves the predictive performance when compared to baselines on both synthetic and real-world MTS datasets.
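
    The role of a regularization penalty on the transition matrix can be illustrated in the simplest possible LDS, a scalar one, where the ridge-penalized estimate has a closed form. This 1-D sketch (invented data, no hidden state) only illustrates the regularized-transition idea, not the full rLDS learning procedure with its cone program, EM, and low-rank priors:

    ```python
    def fit_transition(series, lam=0.1):
        """Ridge-regularized estimate of a scalar transition coefficient:
        a = argmin sum_t (x[t+1] - a*x[t])^2 + lam*a^2  (closed form).
        A 1-D stand-in for rLDS's penalized transition *matrix*."""
        num = sum(x0 * x1 for x0, x1 in zip(series, series[1:]))
        den = sum(x0 * x0 for x0 in series[:-1]) + lam
        return num / den

    def forecast(series, steps, lam=0.1):
        """Roll the fitted dynamics forward from the last observation."""
        a = fit_transition(series, lam)
        x, out = series[-1], []
        for _ in range(steps):
            x = a * x
            out.append(x)
        return out

    series = [1.0, 0.8, 0.64, 0.512, 0.4096]  # decays by a factor of 0.8
    print([round(v, 3) for v in forecast(series, 2, lam=0.0)])  # [0.328, 0.262]
    ```

    Increasing `lam` shrinks the estimated coefficient toward zero, the scalar analogue of shutting down spurious dimensions of the transition matrix.
    
    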

  12. Supportive supervision and constructive relationships with healthcare workers support CHW performance: Use of a qualitative framework to evaluate CHW programming in Uganda

    OpenAIRE

    Ludwick, Teralynn; Turyakira, Eleanor; Kyomuhangi, Teddy; Manalili, Kimberly; Robinson, Sheila; Brenner, Jennifer L.

    2018-01-01

    Background While evidence supports community health worker (CHW) capacity to improve maternal and newborn health in less-resourced countries, key implementation gaps remain. Tools for assessing CHW performance and evidence on what programmatic components affect performance are lacking. This study developed and tested a qualitative evaluative framework and tool to assess CHW team performance in a district program in rural Uganda. Methods A new assessment framework was developed to collect and ...

  13. A Framework for Security Analysis of Mobile Wireless Networks

    DEFF Research Database (Denmark)

    Nanz, Sebastian; Hankin, Chris

    2006-01-01

    We present a framework for specification and security analysis of communication protocols for mobile wireless networks. This setting introduces new challenges which are not being addressed by classical protocol analysis techniques. The main complication stems from the fact that the actions of intermediate nodes and their connectivity can no longer be abstracted into a single unstructured adversarial environment, as they form an inherent part of the system's security. In order to model this scenario faithfully, we present a broadcast calculus which makes a clear distinction between the protocol processes and the network's connectivity graph, which may change independently from protocol actions. We identify a property characterising an important aspect of security in this setting and express it using behavioural equivalences of the calculus. We complement this approach with a control flow analysis...

  14. Programming Entity Framework Code First

    CERN Document Server

    Lerman, Julia

    2011-01-01

    Take advantage of the Code First data modeling approach in ADO.NET Entity Framework, and learn how to build and configure a model based on existing classes in your business domain. With this concise book, you'll work hands-on with examples to learn how Code First can create an in-memory model and database by default, and how you can exert more control over the model through further configuration. Code First provides an alternative to the database first and model first approaches to the Entity Data Model. Learn the benefits of defining your model with code, whether you're working with an exis

  15. Implementation and Evaluation of Technology Mentoring Program Developed for Teacher Educators: A 6M-Framework

    Directory of Open Access Journals (Sweden)

    Selim Gunuc

    2015-06-01

    Full Text Available The purpose of this basic research is to determine the problems experienced in the Technology Mentoring Program (TMP), and the study discusses how these problems affect the process in general. The implementation was carried out with teacher educators in the education faculty. 8 doctorate students (mentors) provided technology mentoring implementation for one academic term to 9 teacher educators (mentees) employed in the Education Faculty. The data were collected via the mentee and the mentor interview forms, mentor reflections and organization meeting reflections. As a result, the problems based on the mentor, on the mentee and on the organization/institution were determined. In order to carry out TMP more effectively and successfully, a 6M-framework (Modifying, Meeting, Matching, Managing, Mentoring, Monitoring) was suggested within the scope of this study. It could be stated that fewer problems will be encountered and that the process will be carried out more effectively and successfully when the structure in this framework is taken into consideration.

  16. Analysis of Logic Programs Using Regular Tree Languages

    DEFF Research Database (Denmark)

    Gallagher, John Patrick

    2012-01-01

    The field of finite tree automata provides fundamental notations and tools for reasoning about sets of terms called regular or recognizable tree languages. We consider two kinds of analysis using regular tree languages, applied to logic programs. The first approach is to try to discover automatically...... a tree automaton from a logic program, approximating its minimal Herbrand model. In this case the input for the analysis is a program, and the output is a tree automaton. The second approach is to expose or check properties of the program that can be expressed by a given tree automaton. The input...... to the analysis is a program and a tree automaton, and the output is an abstract model of the program. These two contrasting abstract interpretations can be used in a wide range of analysis and verification problems....
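
    The second kind of analysis, checking terms against a given tree automaton, can be illustrated with a minimal bottom-up automaton. The automaton and terms below are invented for illustration; in the approaches described above the automaton would be derived from, or supplied alongside, a logic program:

```python
# A deterministic bottom-up tree automaton as a transition map:
# (function symbol, tuple of child states) -> state.
# This one accepts lists of naturals, a regular tree language that could
# approximate part of a logic program's Herbrand model (illustrative choice).
DELTA = {
    ("zero", ()): "num",
    ("s", ("num",)): "num",
    ("nil", ()): "list",
    ("cons", ("num", "list")): "list",
}

def run(term, delta=DELTA):
    """Evaluate a term bottom-up; return its final state, or None if rejected."""
    symbol, args = term
    child_states = tuple(run(a, delta) for a in args)
    return delta.get((symbol, child_states))

nil = ("nil", ())
one = ("s", (("zero", ()),))
print(run(("cons", (one, nil))))   # 'list': the term is accepted
print(run(("cons", (nil, nil))))   # None: a list cannot be a list element here
```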

  17. Pointer Analysis for JavaScript Programming Tools

    DEFF Research Database (Denmark)

    Feldthaus, Asger

    Tools that can assist the programmer with tasks such as refactoring or code navigation have proven popular for Java, C#, and other programming languages. JavaScript is a widely used programming language, and its users could likewise benefit from such tools, but the dynamic nature of the language...... is an obstacle for the development of these. Because of this, tools for JavaScript have long remained ineffective compared to those for many other programming languages. Static pointer analysis can provide a foundation for more powerful tools, although the design of this analysis is itself a complicated endeavor....... In this work, we explore techniques for performing pointer analysis of JavaScript programs, and we find novel applications of these techniques. In particular, we demonstrate how these can be used for code navigation, automatic refactoring, semi-automatic refactoring of incomplete programs, and checking of type...
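
    The flavor of an inclusion-based (Andersen-style) pointer analysis can be conveyed with a tiny fixpoint computation over two statement forms. All variable and object names below are invented, and a real JavaScript analysis must additionally model object properties, prototypes, and higher-order functions:

```python
# A minimal inclusion-based points-to analysis over two statement forms:
# ("alloc", x, o) models x = new object o, and ("copy", x, y) models x = y.
# Points-to sets grow monotonically until a fixpoint is reached.
def points_to(stmts):
    pts = {}
    changed = True
    while changed:                        # iterate to a fixpoint
        changed = False
        for kind, lhs, rhs in stmts:
            before = set(pts.get(lhs, set()))
            if kind == "alloc":           # x = new object o
                pts.setdefault(lhs, set()).add(rhs)
            elif kind == "copy":          # x = y implies pts(x) contains pts(y)
                pts.setdefault(lhs, set()).update(pts.get(rhs, set()))
            if pts.get(lhs, set()) != before:
                changed = True
    return pts

stmts = [("alloc", "a", "o1"), ("copy", "b", "a"),
         ("alloc", "b", "o2"), ("copy", "c", "b")]
pts = points_to(stmts)
print(sorted(pts["c"]))   # ['o1', 'o2']
```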

  18. The stapl Skeleton Framework

    KAUST Repository

    Zandifar, Mani

    2015-01-01

    © Springer International Publishing Switzerland 2015. This paper describes the stapl Skeleton Framework, a high-level skeletal approach for parallel programming. This framework abstracts the underlying details of data distribution and parallelism from programmers and enables them to express parallel programs as a composition of existing elementary skeletons such as map, map-reduce, scan, zip, butterfly, allreduce, alltoall and user-defined custom skeletons. Skeletons in this framework are defined as parametric data flow graphs, and their compositions are defined in terms of data flow graph compositions. Defining the composition in this manner allows dependencies between skeletons to be defined in terms of point-to-point dependencies, avoiding unnecessary global synchronizations. To show the ease of composability and expressivity, we implemented the NAS Integer Sort (IS) and Embarrassingly Parallel (EP) benchmarks using skeletons and demonstrate comparable performance to the hand-optimized reference implementations. To demonstrate scalable performance, we show a transformation which enables applications written in terms of skeletons to run on more than 100,000 cores.
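
    The compositional style described above can be imitated with ordinary higher-order functions. This Python sketch only conveys the idea of composing elementary skeletons; stapl's skeletons are C++ parametric data flow graphs with point-to-point dependencies and distributed execution, which this sequential sketch does not attempt to model:

```python
from functools import reduce

# Elementary skeletons as composable higher-order functions.
def map_skel(f):
    return lambda xs: [f(x) for x in xs]

def zip_skel(op):
    return lambda pair: [op(a, b) for a, b in zip(*pair)]

def reduce_skel(op, init):
    return lambda xs: reduce(op, xs, init)

def compose(*skels):
    """Chain skeletons so the output of one feeds the next."""
    def run(data):
        for s in skels:
            data = s(data)
        return data
    return run

# An inner product built by composing zip(*) with reduce(+):
inner_product = compose(zip_skel(lambda a, b: a * b),
                        reduce_skel(lambda a, b: a + b, 0))
print(inner_product(([1, 2, 3], [4, 5, 6])))   # 32
```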

  19. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities in the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. It discovered the relationship between water quality in the mainstream and its tributaries, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine the hidden and valuable knowledge from water quality historical time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
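
    A much-simplified sketch of the two-part framework: granulate the series into windows summarized by two statistics (a stand-in for the paper's two-dimensional normal clouds), build a similarity matrix, then use it for anomaly detection. The data below are synthetic; the paper used DO readings from the Yangtze River:

```python
import numpy as np

# Granulate a weekly series into fixed-length windows, summarizing each
# window by (mean, std); this is a simplified stand-in for normal-cloud
# granulation, which also estimates a hyper-entropy term.
rng = np.random.default_rng(0)
do = 8.0 + 0.3 * rng.standard_normal(104)   # two years of weekly DO (synthetic)
do[60:65] -= 3.0                            # inject a pollution-like drop

win = 13                                    # one quarter per granule
granules = np.array([[seg.mean(), seg.std()] for seg in do.reshape(-1, win)])

# Pairwise similarity matrix (an inverse-distance form; 1 on the diagonal).
dist = np.linalg.norm(granules[:, None] - granules[None, :], axis=-1)
sim = 1.0 / (1.0 + dist)

# Anomaly detection: the granule least similar, on average, to the others.
avg_sim = (sim.sum(axis=1) - 1.0) / (len(granules) - 1)
anomaly = int(avg_sim.argmin())
print(anomaly)        # index of the quarter containing the injected drop
```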

  20. A framework of analysis for field experiments with alternative materials in road construction.

    Science.gov (United States)

    François, D; Jullien, A

    2009-01-01

    In France, a wide variety of alternative materials is produced or exists in the form of stockpiles built up over time. Such materials are distributed over various regions of the territory depending on local industrial development and urbanisation trends. The use of alternative materials at a national scale implies sharing local knowledge and experience. Building a national database on alternative materials for road construction is useful in gathering and sharing information. An analysis of feedback from onsite experiences (back analysis) is essential to improve knowledge on alternative material use in road construction. Back analysis of field studies has to be conducted in accordance with a single common framework. This could enable drawing comparisons between alternative materials and between road applications. A framework for the identification and classification of data used in back analyses is proposed. Since the road structure is an open system, this framework has been based on a stress-response approach at both the material and structural levels and includes a description of external factors applying during the road service life. The proposal has been shaped from a review of the essential characteristics of road materials and structures, as well as from the state of knowledge specific to alternative material characterisation.

  1. Politics of energy and the NEP: a framework and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Toner, G B

    1984-01-01

    This dissertation examines the nature and evolution of Canadian energy politics, with the focus on the 1973-1983 period and on the oil and gas aspects of energy. The conceptual basis for undertaking the analysis is the development and application of an integrated framework for the study of energy politics in Canada. The introduction of the National Energy Program (NEP) by the federal Liberal government in October 1980 marked a significant conjuncture in the development of Canadian energy politics. The NEP was intended to be a signal of a revitalized central government as well as a bargaining stance in the ongoing price and revenue sharing negotiations. Thus, the NEP must be understood as first and foremost a political act. This research suggests that energy politics must be understood as the outcome of conflict and consensus within the government-industry and intergovernmental relationships of power, over the ability to influence and control energy developments. To attempt to explain energy politics as essentially the outcome of interaction between government and industry, with intergovernmental relations simply reflecting intra-industry competition, or conversely, to explain energy politics as merely the toing and froing of competing governments, is to present a fundamentally flawed portrayal of Canadian energy politics. That is, the dynamic force driving energy politics in Canada is a three-sided set of competitive relations between governments and the industry.

  2. An ovine in vivo framework for tracheobronchial stent analysis.

    Science.gov (United States)

    McGrath, Donnacha J; Thiebes, Anja Lena; Cornelissen, Christian G; O'Shea, Mary B; O'Brien, Barry; Jockenhoevel, Stefan; Bruzzi, Mark; McHugh, Peter E

    2017-10-01

    Tracheobronchial stents are most commonly used to restore patency to airways stenosed by tumour growth. Currently all tracheobronchial stents are associated with complications such as stent migration, granulation tissue formation, mucous plugging and stent strut fracture. The present work develops a computational framework to evaluate tracheobronchial stent designs in vivo. Pressurised computed tomography is used to create a biomechanical lung model which takes into account the in vivo stress state, global lung deformation and local loading from pressure variation. Stent interaction with the airway is then evaluated for a number of loading conditions including normal breathing, coughing and ventilation. Results of the analysis indicate that three of the major complications associated with tracheobronchial stents can potentially be analysed with this framework, which can be readily applied to the human case. Airway deformation caused by lung motion is shown to have a significant effect on stent mechanical performance, including implications for stent migration, granulation formation and stent fracture.

  3. CREATION OF IT-ORIENTED ONTOLOGICAL FRAMEWORK FOR THE PURPOSE OF MAKING EDUCATIONAL PROGRAMS ON THE BASE OF COMPETENCIES

    Directory of Open Access Journals (Sweden)

    G. M. Korotenko

    2017-08-01

    Full Text Available Purpose. Taking into account the expansion of computing application scopes, there is a need to identify the links and features of the constantly emerging professional competencies of the new sections of computing knowledge in order to improve the process of forming new curricula. Methodology. The authors propose a new approach aimed at building specialized knowledge bases that are generated using artificial intelligence technology and focused on the use of multiple heterogeneous resources or data sources on specific educational topics. The Protégé 4.2 ontology editor is used as the tool ensuring the formation of the base ontology. As one of the modules of the developed system of semantic analysis, which provides access to the ontology and the possibility of its processing, the Apache Jena Java framework should be used, which forms the software environment for working with data in RDF, RDFS and OWL formats, and also supports the ability to form queries to ontologies in the SPARQL language. The peculiarity of this approach is the binding of information resources of the three-platform presentation of the disciplinary structure in the context of identifying the links of professional competencies. Findings. The model and structure of the IT-oriented ontological framework designed to ensure the convergence of components of the university's three-platform information and communication environment are developed. The structure of the ontology underlying the knowledge base, describing the main essence of the educational standards of the "Information Technologies" branch, is formed. Originality. Within the framework of the design and formation of the disciplinary structure of the "Information Technologies" knowledge sector in the context of the competence approach to education, the architecture of the competence descriptors of the semantic analysis system is proposed. It implements the algorithm for integrating the ontological and product models of knowledge representation about the subject domain

  4. Framework for Architecture Trade Study Using MBSE and Performance Simulation

    Science.gov (United States)

    Ryan, Jessica; Sarkani, Shahram; Mazzuchi, Thomas

    2012-01-01

    Increasing complexity in modern systems as well as cost and schedule constraints require a new paradigm of system engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters) and requirements flowdown. A recent trend toward Model Based System Engineering (MBSE) includes flexible architecture definition, program documentation, requirements traceability and system engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is provided, followed by a specific example including a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements and finally a weighted decision analysis to optimize system objectives.

  5. Biodiesel Emissions Analysis Program

    Science.gov (United States)

    Using existing data, the EPA's biodiesel emissions analysis program sought to quantify the air pollution emission effects of biodiesel for diesel engines that have not been specifically modified to operate on biodiesel.

  6. Disease Management, Case Management, Care Management, and Care Coordination: A Framework and a Brief Manual for Care Programs and Staff.

    Science.gov (United States)

    Ahmed, Osman I

    2016-01-01

    With the changing landscape of health care delivery in the United States since the passage of the Patient Protection and Affordable Care Act in 2010, health care organizations have struggled to keep pace with the evolving paradigm, particularly as it pertains to population health management. New nomenclature emerged to describe components of the new environment, and familiar words were put to use in an entirely different context. This article proposes a working framework for activities performed in case management, disease management, care management, and care coordination. The author offers standard working definitions for some of the most frequently used words in the health care industry with the goal of increasing consistency for their use, especially in the backdrop of the Centers for Medicare & Medicaid Services offering a "chronic case management fee" to primary care providers for managing the sickest, high-cost Medicare patients. Health care organizations performing case management, care management, disease management, and care coordination. Road map for consistency among users, in reporting, comparison, and for success of care management/coordination programs. This article offers a working framework for disease managers, case and care managers, and care coordinators. It suggests standard definitions to use for disease management, case management, care management, and care coordination. Moreover, the use of clear terminology will facilitate comparing, contrasting, and evaluating all care programs and increase consistency. The article can improve understanding of care program components and success factors, estimate program value and effectiveness, heighten awareness of consumer engagement tools, recognize current state and challenges for care programs, understand the role of health information technology solutions in care programs, and use information and knowledge gained to assess and improve care programs to design the "next generation" of programs.

  7. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part I: Template-Based Generic Programming

    Directory of Open Access Journals (Sweden)

    Roger P. Pawlowski

    2012-01-01

    Full Text Available An approach for incorporating embedded simulation and analysis capabilities in complex simulation codes through template-based generic programming is presented. This approach relies on templating and operator overloading within the C++ language to transform a given calculation into one that can compute a variety of additional quantities that are necessary for many state-of-the-art simulation and analysis algorithms. An approach for incorporating these ideas into complex simulation codes through general graph-based assembly is also presented. These ideas have been implemented within a set of packages in the Trilinos framework and are demonstrated on a simple problem from chemical engineering.
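
    The core mechanism, operator overloading that lets a single code base also compute derivative quantities, has a compact analogue in any language with overloading. The dual-number sketch below is illustrative only; the paper implements the idea with C++ templates inside the Trilinos packages, not in Python:

```python
# A dual number carries a value and a derivative; overloaded arithmetic
# propagates d/dx through code written for plain numbers, unchanged.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.val * o.dot + self.dot * o.val)  # product rule
    __rmul__ = __mul__

def residual(x):
    # Written once, for plain numbers; also works on Dual inputs.
    return 3 * x * x + 2 * x + 1

x = Dual(2.0, 1.0)         # seed dx/dx = 1
r = residual(x)
print(r.val, r.dot)        # 17.0 14.0  (value, and derivative 6x + 2 at x = 2)
```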

  8. The Policy Formation Process: A Conceptual Framework for Analysis. Ph.D. Thesis

    Science.gov (United States)

    Fuchs, E. F.

    1972-01-01

    A conceptual framework for analysis which is intended to assist both the policy analyst and the policy researcher in their empirical investigations into policy phenomena is developed. It is meant to facilitate understanding of the policy formation process by focusing attention on the basic forces shaping the main features of policy formation as a dynamic social-political-organizational process. The primary contribution of the framework lies in its capability to suggest useful ways of looking at policy formation reality. It provides the analyst and the researcher with a group of indicators which suggest where to look and what to look for when attempting to analyze and understand the mix of forces which energize, maintain, and direct the operation of strategic level policy systems. The framework also highlights interconnections, linkage, and relational patterns between and among important variables. The framework offers an integrated set of conceptual tools which facilitate understanding of and research on the complex and dynamic set of variables which interact in any major strategic level policy formation process.

  9. A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures

    Science.gov (United States)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2012-01-01

    A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.

  10. Three-dimensional finite element analysis of zirconia all-ceramic cantilevered fixed partial dentures with different framework designs.

    Science.gov (United States)

    Miura, Shoko; Kasahara, Shin; Yamauchi, Shinobu; Egusa, Hiroshi

    2017-06-01

    The purposes of this study were: to perform stress analyses using three-dimensional finite element analysis methods; to analyze the mechanical stress of different framework designs; and to investigate framework designs that will provide for the long-term stability of both cantilevered fixed partial dentures (FPDs) and abutment teeth. An analysis model was prepared for three units of cantilevered FPDs that assume a missing mandibular first molar. Four types of framework design (Design 1, basic type; Design 2, framework width expanded buccolingually by 2 mm; Design 3, framework height expanded by 0.5 mm to the occlusal surface side from the end abutment to the connector area; and Design 4, a combination of Designs 2 and 3) were created. Two types of framework material (yttrium-oxide partially stabilized zirconia and a high precious noble metal gold alloy) and two types of abutment material (dentin and brass) were used. In the framework designs, Design 1 exhibited the highest maximum principal stress value for both zirconia and gold alloy. In the abutment tooth, Design 3 exhibited the highest maximum principal stress value for all abutment teeth. In the present study, Design 4 (the design with expanded framework height and framework width) could contribute to preventing the concentration of stress and protecting abutment teeth. © 2017 Eur J Oral Sci.

  11. A Bayesian framework based on a Gaussian mixture model and radial-basis-function Fisher discriminant analysis (BayGmmKda V1.1) for spatial prediction of floods

    Directory of Open Access Journals (Sweden)

    D. Tien Bui

    2017-09-01

    Full Text Available In this study, a probabilistic model, named BayGmmKda, is proposed for flood susceptibility assessment in a study area in central Vietnam. The new model is a Bayesian framework constructed by a combination of a Gaussian mixture model (GMM), radial-basis-function Fisher discriminant analysis (RBFDA), and a geographic information system (GIS) database. In the Bayesian framework, GMM is used for modeling the data distribution of flood-influencing factors in the GIS database, whereas RBFDA is utilized to construct a latent variable that aims at enhancing the model performance. As a result, the posterior probabilistic output of the BayGmmKda model is used as a flood susceptibility index. Experiment results showed that the proposed hybrid framework is superior to other benchmark models, including the adaptive neuro-fuzzy inference system and the support vector machine. To facilitate the model implementation, a software program of BayGmmKda has been developed in MATLAB. The BayGmmKda program can accurately establish a flood susceptibility map for the study region. Accordingly, local authorities can overlay this susceptibility map onto various land-use maps for the purpose of land-use planning or management.
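
    The Bayesian core of such a model can be sketched with a deliberate simplification: a single Gaussian per class instead of a full mixture, and no discriminant-analysis latent variable. The factor values and class centers below are invented, not taken from the study:

```python
import numpy as np

# Two synthetic classes over two flood-influencing factors (e.g. slope,
# rainfall); the posterior P(flood | x) plays the role of the
# susceptibility index described in the abstract.
rng = np.random.default_rng(1)
flood    = rng.normal([2.0, 5.0], 0.8, size=(200, 2))
no_flood = rng.normal([5.0, 2.0], 0.8, size=(200, 2))

def gaussian_pdf(x, mean, cov):
    d = x - mean
    norm = np.sqrt((2 * np.pi) ** len(mean) * np.linalg.det(cov))
    return np.exp(-0.5 * d @ np.linalg.inv(cov) @ d) / norm

# Fit one Gaussian per class (a GMM would fit several per class).
params = [(c.mean(axis=0), np.cov(c.T)) for c in (flood, no_flood)]

def susceptibility(x, prior=0.5):
    pf = prior * gaussian_pdf(x, *params[0])
    pn = (1 - prior) * gaussian_pdf(x, *params[1])
    return pf / (pf + pn)          # posterior probability of the flood class

print(round(susceptibility(np.array([2.2, 4.8])), 3))   # near 1: high susceptibility
```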

  12. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
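
    The quantity NESSUS computes, a probability of failure under uncertain inputs, can be illustrated with plain Monte Carlo sampling on a stress-strength model. The distributions below are invented, and NESSUS's own algorithms (advanced mean value, adaptive importance sampling) are far more efficient than this brute-force approach:

```python
import numpy as np

# Stress-strength reliability: failure occurs when applied load exceeds
# material strength. Both are uncertain, so sample and count failures.
rng = np.random.default_rng(42)
n = 1_000_000
strength = rng.normal(500.0, 40.0, n)   # illustrative strength, MPa
load     = rng.normal(350.0, 30.0, n)   # illustrative applied stress, MPa

g = strength - load                     # limit state: failure when g < 0
p_fail = np.mean(g < 0.0)
print(f"P(failure) ~ {p_fail:.5f}")

# Analytic check: g ~ N(150, sqrt(40^2 + 30^2)) = N(150, 50), so
# P(g < 0) = Phi(-3), about 0.00135; the estimate should be close.
```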

  13. Visualization framework for CAVE virtual reality systems

    OpenAIRE

    Kageyama, Akira; Tomiyama, Asako

    2016-01-01

    We have developed a software framework for scientific visualization in immersive-type, room-sized virtual reality (VR) systems, or Cave automatic virtual environment (CAVEs). This program, called Multiverse, allows users to select and invoke visualization programs without leaving CAVE’s VR space. Multiverse is a kind of immersive “desktop environment” for users, with a three-dimensional graphical user interface. For application developers, Multiverse is a software framework with useful class ...

  14. Towards Interactive Visual Exploration of Parallel Programs using a Domain-Specific Language

    KAUST Repository

    Klein, Tobias

    2016-04-19

    The use of GPUs and the massively parallel computing paradigm have become widespread. We describe a framework for the interactive visualization and visual analysis of the run-time behavior of massively parallel programs, especially OpenCL kernels. This facilitates understanding a program's function and structure, finding the causes of possible slowdowns, locating program bugs, and interactively exploring and visually comparing different code variants in order to improve performance and correctness. Our approach enables very specific, user-centered analysis, both in terms of the recording of the run-time behavior and the visualization itself. Instead of having to manually write instrumented code to record data, simple code annotations tell the source-to-source compiler which code instrumentation to generate automatically. The visualization part of our framework then enables the interactive analysis of kernel run-time behavior in a way that can be very specific to a particular problem or optimization goal, such as analyzing the causes of memory bank conflicts or understanding an entire parallel algorithm.

  15. Post-disaster psychosocial support and quality improvement: A conceptual framework for understanding and improving the quality of psychosocial support programs

    NARCIS (Netherlands)

    Dückers, Michel L. A.; Thormar, Sigridur B.

    2015-01-01

    This article is original in that it addresses post-disaster psychosocial support programs from a quality-improvement perspective, not from the traditional viewpoint of mental health services. Based on a combination of renowned quality models, a framework is sketched that offers chances to better

  16. The nuclear analysis program at MURR

    International Nuclear Information System (INIS)

    Glascock, M.D.

    1993-01-01

    The University of Missouri-Columbia (MU) has continually upgraded research facilities and programs at the MU research reactor (MURR) throughout its 26-yr history. The Nuclear Analysis Program (NAP) area has participated in these upgrades over the years. As one of the largest activation analysis laboratories on a university campus, the activities of the NAP are broadly representative of the diversity of applications for activation analysis and related nuclear science. This paper describes the MURR's NAP and several of the research, education, and service projects in which the laboratory is currently engaged

  17. Evolution of the ATLAS Software Framework towards Concurrency

    CERN Document Server

    Jones, Roger; The ATLAS collaboration; Leggett, Charles; Wynne, Benjamin

    2015-01-01

    The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from early 2000 and the software and the physics code has been written using a single threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. Maximising performance per watt will be a key metric, so all of these cores must be used as efficiently as possible. In order to address the deficiencies of the current framework, ATLAS has embarked upon two projects: first, a practical demonstration of the use of multi-threading in our reconstruction software, using the GaudiHive framework; second, an exercise to gather r...

  18. SPANDE, Stress Analysis of General Space-frame and Pipework. SPATAM, Tilt Angle Calculation of Framework for Program SPANDE

    International Nuclear Information System (INIS)

    Davies, D.C.; Enderby, J.A.; Knowles, J.A.

    1984-01-01

    1 - Nature of physical problem solved: The programme is intended to analyse almost any type of space-frame. Members of the frame may be either straight or of constant curvature between nodes provided that, in the case of curved members, one of the principal axes of the cross-section of the member lies in the same plane as the member. Loading may comprise concentrated loads, distributed loads, thermal loads or may take the form of specified displacements. The programme calculates the forces and moments in all the members of the framework, the reactions at all external restraints and the displacements of all the nodes. For pipework problems, the maximum stress difference in the pipe, calculated in accordance with the code of practice, is also quoted. 2 - Method of solution: The framework is solved by displacement methods involving stiffness matrices, making it possible to analyse space-frames with virtually any number of redundancies. 3 - Restrictions on the complexity of the problem: For ICL 4/70, the framework is limited to 1000 nodes, 2000 members, 100 different member types or 200 specified nodal displacements
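
    The displacement method at the heart of the programme can be shown in miniature: assemble member stiffnesses into a global matrix K, partition by restrained and free degrees of freedom, and solve K u = f. A 1-D spring chain stands in here for a general space-frame (SPANDE handles 3-D members, curvature and thermal loads, but the assemble-and-solve structure is the same):

```python
import numpy as np

# Three collinear nodes joined by two axial members (illustrative values).
members = [(0, 1, 2000.0), (1, 2, 1000.0)]   # (node i, node j, stiffness N/mm)

K = np.zeros((3, 3))
for i, j, k in members:                      # standard stiffness assembly
    K[np.ix_([i, j], [i, j])] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])

f = np.array([0.0, 0.0, 500.0])              # 500 N applied at node 2
fixed, free = [0], [1, 2]                    # node 0 is externally restrained

u = np.zeros(3)
u[free] = np.linalg.solve(K[np.ix_(free, free)], f[free])
reaction = K[fixed] @ u                      # reaction at the external restraint

print(u)          # nodal displacements: [0.   0.25 0.75] mm
print(reaction)   # support reaction:    [-500.] N
```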

  19. Program Analysis and Its Relevance for Educational Research

    Directory of Open Access Journals (Sweden)

    Bernd Käpplinger

    2008-01-01

    Full Text Available Program analyses are frequently used in research on continuing education. The use of such analyses will be described in this article. Existing data sources, research topics, qualitative, quantitative and mixed methods, will be discussed. Three types of program analysis will be developed. The article ends with a discussion of the advantages and disadvantages of program analysis in contrast to questionnaires. Future developments and challenges will be sketched in the conclusion. Recommendations for the future development of program analysis will be given. URN: urn:nbn:de:0114-fqs0801379

  20. Evaluation of capacity-building program of district health managers in India: a contextualized theoretical framework.

    Science.gov (United States)

    Prashanth, N S; Marchal, Bruno; Kegels, Guy; Criel, Bart

    2014-01-01

    Performance of local health services managers at district level is crucial to ensure that health services are of good quality and cater to the health needs of the population in the area. In many low- and middle-income countries, health services managers are poorly equipped with public health management capacities needed for planning and managing their local health system. In the south Indian Tumkur district, a consortium of five non-governmental organizations partnered with the state government to organize a capacity-building program for health managers. The program consisted of a mix of periodic contact classes, mentoring and assignments and was spread over 30 months. In this paper, we develop a theoretical framework in the form of a refined program theory to understand how such a capacity-building program could bring about organizational change. A well-formulated program theory enables an understanding of how interventions could bring about improvements and an evaluation of the intervention. In the refined program theory of the intervention, we identified various factors at individual, institutional, and environmental levels that could interact with the hypothesized mechanisms of organizational change, such as staff's perceived self-efficacy and commitment to their organizations. Based on this program theory, we formulated context-mechanism-outcome configurations that can be used to evaluate the intervention and, more specifically, to understand what worked, for whom and under what conditions. We discuss the application of program theory development in conducting a realist evaluation. Realist evaluation embraces principles of systems thinking by providing a method for understanding how elements of the system interact with one another in producing a given outcome.

  1. Environmental risk analysis for nanomaterials: Review and evaluation of frameworks

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    2012-01-01

    to occupational settings with minor environmental considerations, and most have not been thoroughly tested on a wide range of NM. Care should also be taken when selecting the most appropriate risk analysis strategy for a given risk context. Given this, we recommend a multi-faceted approach to assess...... the environmental risks of NM as well as increased applications and testing of the proposed frameworks for different NM....

  2. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    A third-generation system for integrating computer programs for engineering analysis and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for this computer-aided engineering system. It serves as a repository for design data communicated between analysis programs, as a dictionary that describes these design data, as a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Integrity mechanisms also exist to maintain database correctness during multidisciplinary design tasks carried out by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  3. Integrating Poverty and Environmental Concerns into Value-Chain Analysis: A Strategic Framework and Practical Guide

    DEFF Research Database (Denmark)

    Riisgaard, Lone; Bolwig, Simon; Ponte, Stefano

    2010-01-01

    This article aims to guide the design and implementation of action-research projects in value-chain analysis by presenting a strategic framework focused on small producers and trading and processing firms in developing countries. Its stepwise approach – building on the conceptual framework set ou...... purpose of increasing the rewards and/or reducing the risks....

  4. California Curriculum Frameworks: A Handbook for Production, Implementation, and Evaluation Activities.

    Science.gov (United States)

    California State Dept. of Education, Sacramento.

    This booklet describes the characteristics and role of curriculum frameworks and describes how they can be used in developing educational programs. It is designed as a guide for writers of frameworks, for educators who are responsible for implementing frameworks, or for evaluators of educational programs. It provides a concise description of the…

  5. Bricklayer Static Analysis

    Science.gov (United States)

    Harris, Christopher

    In the U.S., science and math are taking the spotlight in education, and rightfully so, as they directly impact economic progress. Curiously absent is computer science, which, despite its numerous job opportunities and growth, does not receive as much focus. This thesis develops a source code analysis framework that uses language translation and machine learning classifiers to analyze programs written in Bricklayer, with the goals of programmatically identifying the relative success or failure of a student's Bricklayer program, helping teachers scale the number of students they can support, and providing better messaging. The thesis uses a set of student programs as a case study to demonstrate the possibilities of the framework.
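
    The classify-by-source-features idea described above can be sketched in a few lines of Python. Everything here is an illustrative assumption, not the thesis's actual pipeline: the feature set (line count, call count) and the centroid values are invented, and a nearest-centroid rule stands in for whatever classifiers the thesis uses.

```python
import re

def features(src):
    """Toy feature vector for a program: (non-blank line count, call-site count)."""
    lines = [l for l in src.splitlines() if l.strip()]
    calls = len(re.findall(r"\w+\s*\(", src))
    return (float(len(lines)), float(calls))

def nearest_centroid(x, centroids):
    """Classify a feature vector by its nearest class centroid (squared distance)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

# Hypothetical centroids, as if learned from labeled student programs
centroids = {"success": (12.0, 8.0), "struggling": (3.0, 1.0)}
label = nearest_centroid(features("put(BLUE, 1, 2)\nput(RED, 2, 2)\n"), centroids)
```

    A short two-call program lands far from the "success" centroid, so the sketch labels it "struggling"; a real classifier would be trained on many labeled programs rather than fixed centroids.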

  6. Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets

    Science.gov (United States)

    Juric, Mario

    2011-01-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ('column groups'), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping ``cells'' by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce ``kernels'' that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 Grows of Pan-STARRS+SDSS data (220 GB) in less than 15 minutes on a dual-CPU machine. In a cluster environment, we achieved bandwidths of 17 Gbits/s (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.
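
    The per-cell map-reduce model described above can be illustrated with a small pure-Python sketch. The data, cell size, and function names below are hypothetical; LSD's actual query language and kernel API differ, but the shape is the same: partition rows into spatial cells, run a kernel per cell, combine the partial results.

```python
from collections import defaultdict

def partition(rows, cell_size=10.0):
    """Group (lon, lat, mag) rows into spatial 'cells', as LSD does horizontally."""
    cells = defaultdict(list)
    for lon, lat, mag in rows:
        key = (int(lon // cell_size), int(lat // cell_size))
        cells[key].append((lon, lat, mag))
    return cells

def mapper(cell_rows):
    """Per-cell kernel: emit (count, sum of magnitudes) for one cell."""
    mags = [m for _, _, m in cell_rows]
    return len(mags), sum(mags)

def reducer(partials):
    """Combine per-cell partial results into a global mean magnitude."""
    n = sum(c for c, _ in partials)
    s = sum(t for _, t in partials)
    return s / n if n else float("nan")

rows = [(12.3, 45.1, 20.0), (12.9, 45.7, 22.0), (101.2, -3.4, 18.0)]
cells = partition(rows)
mean_mag = reducer([mapper(r) for r in cells.values()])
```

    Because each mapper touches only one cell, the framework can schedule cells on different nodes independently, which is what makes the sweep embarrassingly parallel.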

  7. A unified framework for risk and vulnerability analysis covering both safety and security

    International Nuclear Information System (INIS)

    Aven, Terje

    2007-01-01

    Recently, we have seen several attempts to establish adequate risk and vulnerability analysis tools and related management frameworks dealing not only with accidental events but also with security problems. These attempts have been based on different analysis approaches and alternative building blocks. In this paper, we discuss some of these and show how a unified framework for such analyses and management tasks can be developed. The framework is based on the use of probability as a measure of uncertainty, as seen through the eyes of the assessor, and defines risk as the combination of possible consequences and related uncertainties. Risk and vulnerability characterizations are introduced, incorporating ideas both from the vulnerability analysis literature and from the risk classification scheme introduced by Renn and Klinke.

  8. Status of CHAP: composite HTGR analysis program

    International Nuclear Information System (INIS)

    Secker, P.A.; Gilbert, J.S.

    1975-12-01

    Development of an HTGR accident simulation program is in progress for the prediction of the overall HTGR plant transient response to various initiating events. The status of the digital computer program named CHAP (Composite HTGR Analysis Program) as of June 30, 1975, is given. The philosophy, structure, and capabilities of the CHAP code are discussed. Mathematical descriptions are given for those HTGR components that have been modeled. Component model validation and evaluation using auxiliary analysis codes are also discussed

  9. Single-shell tank retrieval program mission analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Stokes, W.J.

    1998-08-11

    This Mission Analysis Report was prepared to provide the foundation for the Single-Shell Tank (SST) Retrieval Program, a new program responsible for waste removal from the SSTs. The SST Retrieval Program is integrated with other Tank Waste Remediation System activities that provide the management, technical, and operations elements associated with planning and execution of SST and SST Farm retrieval and closure. This Mission Analysis Report provides the basis and strategy for developing a program plan for SST retrieval. This Mission Analysis Report responds to a US Department of Energy request for an alternative single-shell tank retrieval approach (Taylor 1997).

  10. Single-shell tank retrieval program mission analysis report

    International Nuclear Information System (INIS)

    Stokes, W.J.

    1998-01-01

    This Mission Analysis Report was prepared to provide the foundation for the Single-Shell Tank (SST) Retrieval Program, a new program responsible for waste removal from the SSTs. The SST Retrieval Program is integrated with other Tank Waste Remediation System activities that provide the management, technical, and operations elements associated with planning and execution of SST and SST Farm retrieval and closure. This Mission Analysis Report provides the basis and strategy for developing a program plan for SST retrieval. This Mission Analysis Report responds to a US Department of Energy request for an alternative single-shell tank retrieval approach (Taylor 1997)

  11. A unified framework of descent algorithms for nonlinear programs and variational inequalities

    International Nuclear Information System (INIS)

    Patriksson, M.

    1993-01-01

    We present a framework of algorithms for the solution of continuous optimization and variational inequality problems. In the general algorithm, a search direction is obtained by solving an auxiliary problem, constructed by replacing the original cost function with an approximating monotone cost function. The proposed framework encompasses algorithm classes presented earlier by Cohen, Dafermos, Migdalas, and Tseng, and includes numerous descent and successive approximation type methods, such as Newton methods, Jacobi and Gauss-Seidel type decomposition methods for problems defined over Cartesian product sets, and proximal point methods, among others. The auxiliary problem of the general algorithm also induces equivalent optimization reformulations and descent methods for asymmetric variational inequalities. We study the convergence properties of the general algorithm when applied to unconstrained optimization, nondifferentiable optimization, constrained differentiable optimization, and variational inequalities; the emphasis of the convergence analyses is placed on basic convergence results, convergence using different line search strategies and truncated subproblem solutions, and convergence rate results. This analysis offers a unification of known results; moreover, it strengthens convergence results for many existing algorithms and indicates possible improvements of their realizations. 482 refs
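
    The cost-approximation idea can be sketched numerically: at each iterate the original cost is replaced by a simple monotone model whose minimizer supplies the search direction. The quadratic model below reproduces plain gradient descent and is only an illustrative special case of the paper's general scheme, with an invented function name and a fixed unit step.

```python
def cost_approx_descent(grad, x0, step=0.1, iters=200):
    """Cost-approximation iteration: the auxiliary problem
    min_y  grad(x)^T (y - x) + (1/(2*step)) * ||y - x||^2
    has the closed-form minimizer y = x - step * grad(x); the
    search direction is d = y - x, and here we take a unit step."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        y = [xi - step * gi for xi, gi in zip(x, g)]  # auxiliary-problem minimizer
        d = [yi - xi for yi, xi in zip(y, x)]         # descent direction
        x = [xi + di for xi, di in zip(x, d)]         # unit step along d
    return x

# Minimize f(x) = (x0 - 3)^2 + (x1 + 1)^2 via its gradient
grad_f = lambda x: [2 * (x[0] - 3), 2 * (x[1] + 1)]
x_star = cost_approx_descent(grad_f, [0.0, 0.0])
```

    Swapping in a different monotone approximation (e.g. a Newton or Jacobi-style model) changes the auxiliary problem but not the outer iteration, which is the unification the paper formalizes.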

  12. Performance Analysis of Untraceability Protocols for Mobile Agents Using an Adaptable Framework

    OpenAIRE

    LESZCZYNA RAFAL; GORSKI Janusz Kazimierz

    2006-01-01

    Recently we proposed two untraceability protocols for mobile agents and began investigating their quality. We believe that quality evaluation of security protocols should extend beyond a sole validation of their security and cover other quality aspects, primarily their efficiency. Thus, after conducting a security analysis, we wanted to complement it with a performance analysis. For this purpose we developed a performance evaluation framework, which, as we realised, with certain adjustments, can ...

  13. OpenElectrophy: an electrophysiological data- and analysis-sharing framework

    Directory of Open Access Journals (Sweden)

    Samuel Garcia

    2009-05-01

    Progress in experimental tools and design is allowing the acquisition of increasingly large datasets. Storage, manipulation and efficient analysis of such large amounts of data are now a primary issue. We present OpenElectrophy, an electrophysiological data- and analysis-sharing framework developed to fill this niche. It stores all experiment data and metadata in a single central MySQL database, provides a graphical user interface to visualize and explore the data, and offers a library of functions for user analysis scripting in Python. It implements multiple spike sorting methods, and oscillation detection based on the ridge extraction method of Roux et al. (2007). OpenElectrophy is open source and is freely available for download at http://neuralensemble.org/trac/OpenElectrophy.

  14. From fatalism to mitigation: a conceptual framework for mitigating fetal programming of chronic disease by maternal obesity

    OpenAIRE

    Boone-Heinonen, Janne; Messer, Lynne C.; Fortmann, Stephen P.; Wallack, Lawrence; Thornburg, Kent L.

    2015-01-01

    Prenatal development is recognized as a critical period in the etiology of obesity and cardiometabolic disease. Potential strategies to reduce maternal obesity-induced risk later in life have been largely overlooked. In this paper, we first propose a conceptual framework for the role of public health and preventive medicine in mitigating the effects of fetal programming. Second, we review a small but growing body of research (through August 2015) that examines interactive effects of maternal ...

  15. Program Analysis Scenarios in Rascal

    NARCIS (Netherlands)

    M.A. Hills (Mark); P. Klint (Paul); J.J. Vinju (Jurgen); F. Durán

    2012-01-01

    Rascal is a meta-programming language focused on the implementation of domain-specific languages and on the rapid construction of tools for software analysis and software transformation. In this paper we focus on the use of Rascal for software analysis. We illustrate a range of scenarios.

  16. BEX Mejora continua framework

    OpenAIRE

    García Ramírez, David

    2014-01-01

    Report on the implementation of software that enables the management and control of the entire framework required by the continuous improvement department (BEX, Business Excellence). Master's thesis for the Free Software program.

  17. The SAFE FOODS Risk Analysis Framework suitable for GMOs? A case study

    NARCIS (Netherlands)

    Kuiper, H.A.; Davies, H.V.

    2010-01-01

    This paper describes the current EU regulatory framework for risk analysis of genetically modified (GM) crop cultivation and market introduction of derived food/feed. Furthermore the risk assessment strategies for GM crops and derived food/feed as designed by the European Food Safety Authority

  18. International Review of Frameworks for Standard Setting & Labeling Development

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Nan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Khanna, Nina Zheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fridley, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Romankiewicz, John [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-09-01

    As appliance energy efficiency standards and labeling (S&L) programs reach a broader geographic and product scope, countries around the world have adopted a series of sophisticated and complex technical and economic analyses to support and enhance these growing S&L programs. The supporting techno-economic and impact analyses for S&L development make up a defined framework and process for setting and developing appropriate appliance efficiency standards and labeling programs. This report reviews in depth the existing frameworks for standard setting and label development in the well-established programs of the U.S., Australia, and the EU, in order to identify and evaluate major trends in how and why key analyses are undertaken and to understand the major similarities and differences between the frameworks.

  19. Using a Strategic Planning Tool as a Framework for Case Analysis

    Science.gov (United States)

    Lai, Christine A.; Rivera, Julio C., Jr.

    2006-01-01

    In this article, the authors describe how they use a strategic planning tool known as SWOT as a framework for case analysis, using it to analyze the strengths, weaknesses, opportunities, and threats of a public works project intended to enhance regional economic development in Tempe, Arizona. Students consider the project in light of a variety of…
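
    As a toy illustration of using SWOT to organize a case, the four quadrants map naturally onto a simple data structure. The entries below are invented placeholders, not findings from the Tempe case study.

```python
# Hypothetical SWOT worksheet for a public works case analysis
swot = {
    "strengths":     ["regional transit access", "existing industrial base"],
    "weaknesses":    ["limited project funding"],
    "opportunities": ["state economic-development grants"],
    "threats":       ["competing projects in neighboring cities"],
}

def summarize(swot):
    """Tally the number of factors identified in each SWOT quadrant."""
    return {quadrant: len(factors) for quadrant, factors in swot.items()}

counts = summarize(swot)
```

    Keeping the four quadrants explicit forces students to fill each one, which is the pedagogical point of using SWOT as a case-analysis frame.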

  20. Matlab programming for numerical analysis

    CERN Document Server

    Lopez, Cesar

    2014-01-01

    MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. MATLAB Programming for Numerical Analysis introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. You will first become

  1. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    International Nuclear Information System (INIS)

    Matthews, Elizabeth C.; Sattler, Meredith; Friedland, Carol J.

    2014-01-01

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs

  2. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Matthews, Elizabeth C., E-mail: echiso1@lsu.edu [Louisiana State University, Baton Rouge, LA (United States); Sattler, Meredith, E-mail: msattler@lsu.edu [School of Architecture, Louisiana State University, Baton Rouge, LA (United States); Friedland, Carol J., E-mail: friedland@lsu.edu [Bert S. Turner Department of Construction Management, Louisiana State University, Baton Rouge, LA (United States)

    2014-11-15

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs.

  3. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.

    2010-12-01

    In this paper, we address the problem of reducing the occurrence of human errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21% of interruptions were caused by human error. We focus on Change Management, the process with the largest risk of human error, and identify the main instances of human errors as the 4 Wrongs: request, time, configuration item, and command. Analysis of change records revealed that human-error prevention by partial automation is highly relevant. We propose the HEP Framework, a framework for execution of IT Service Delivery operations that reduces human error by addressing the 4 Wrongs using content integration, contextualization of operation patterns, partial automation of command execution, and controlled access to resources.
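
    The "4 Wrongs" can be illustrated as pre-execution checks on a change request. The field names and rules below are invented for illustration and are not the paper's HEP Framework implementation; they only show the shape of guarding a change against wrong request, wrong time, wrong configuration item, and wrong command.

```python
def check_change(request, approved_requests, window, valid_cis, allowed_commands):
    """Flag each of the 4 Wrongs before a change is executed."""
    wrongs = []
    if request["id"] not in approved_requests:
        wrongs.append("wrong request")           # change was never approved
    if not (window[0] <= request["time"] <= window[1]):
        wrongs.append("wrong time")              # outside the change window
    if request["ci"] not in valid_cis:
        wrongs.append("wrong configuration item")  # unknown target system
    if request["command"] not in allowed_commands:
        wrongs.append("wrong command")           # command not whitelisted
    return wrongs

req = {"id": "CHG-1", "time": 23, "ci": "db01", "command": "restart"}
issues = check_change(req, {"CHG-1"}, (22, 24), {"db01", "web01"}, {"restart", "patch"})
```

    An empty result means the change may proceed; any non-empty list blocks execution, which is the "controlled access" half of the paper's partial-automation argument.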

  4. A framework for 2-stage global sensitivity analysis of GastroPlus™ compartmental models.

    Science.gov (United States)

    Scherholz, Megerle L; Forder, James; Androulakis, Ioannis P

    2018-04-01

    Parameter sensitivity and uncertainty analysis for physiologically based pharmacokinetic (PBPK) models is becoming an important consideration for regulatory submissions, requiring further evaluation to establish the need for global sensitivity analysis. To demonstrate the benefits of an extensive analysis, global sensitivity analysis was implemented for the GastroPlus™ model, a well-known commercially available platform, using four example drugs: acetaminophen, risperidone, atenolol, and furosemide. The capabilities of GastroPlus were expanded by developing an integrated framework that automates the GastroPlus graphical user interface with AutoIt and executes the sensitivity analysis in MATLAB®. Global sensitivity analysis was performed in two stages: the Morris method was used to screen over 50 parameters for significant factors, followed by quantitative assessment of variability using Sobol's sensitivity analysis. The two-stage approach significantly reduced the computational cost for the larger model without sacrificing interpretation of model behavior, and the sensitivity results were well aligned with the Biopharmaceutics Classification System. Both methods detected nonlinearities and parameter interactions that would otherwise have been missed by local approaches. Future work includes further exploration of how the input domain influences the calculated global sensitivity measures, as well as extending the framework to a whole-body PBPK model.
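
    A minimal sketch of the first (screening) stage, Morris elementary effects, on a toy three-parameter model. This is illustrative pure Python under simplified sampling assumptions, not the paper's GastroPlus/AutoIt/MATLAB pipeline: each parameter is perturbed one at a time and the mean absolute elementary effect (mu*) ranks its influence.

```python
import random

def morris_mu_star(model, n_params, trajectories=50, delta=0.1, seed=0):
    """Estimate mu* (mean absolute elementary effect) for each parameter:
    from random base points, perturb one parameter at a time and average
    |f(x + delta*e_i) - f(x)| / delta over all trajectories."""
    rng = random.Random(seed)
    effects = [[] for _ in range(n_params)]
    for _ in range(trajectories):
        x = [rng.random() for _ in range(n_params)]
        fx = model(x)
        for i in range(n_params):
            xp = list(x)
            xp[i] += delta
            effects[i].append(abs(model(xp) - fx) / delta)
    return [sum(e) / len(e) for e in effects]

# Toy model: output depends strongly on x0, weakly on x1, not at all on x2
model = lambda x: 10 * x[0] + 0.5 * x[1] + 0 * x[2]
mu_star = morris_mu_star(model, 3)
```

    Parameters with negligible mu* (here x2) would be fixed at nominal values, leaving only the influential ones for the far more expensive variance-based Sobol stage.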

  5. ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization

    Science.gov (United States)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; Ganis, G.; Gheata, A.; Maline, D. Gonzalez; Goto, M.; Iwaszkiewicz, J.; Kreshuk, A.; Segura, D. Marcos; Maunder, R.; Moneta, L.; Naumann, A.; Offermann, E.; Onuchin, V.; Panacek, S.; Rademakers, F.; Russo, P.; Tadel, M.

    2011-06-01

    A new stable version ("production version") v5.28.00 of ROOT [1] has been published [2]. It features several major improvements in many areas, most notably data storage performance as well as statistics and graphics features. Some of these improvements were already predicted in the original publication, Antcheva et al. (2009) [3]. This version will be maintained for at least 6 months; new minor revisions ("patch releases") will be published [4] to solve problems reported with this version. New version program summary: Program title: ROOT; Catalogue identifier: AEFA_v2_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFA_v2_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: GNU Lesser Public License v.2.1; No. of lines in distributed program, including test data, etc.: 2 934 693; No. of bytes in distributed program, including test data, etc.: 1009; Distribution format: tar.gz; Programming language: C++; Computer: Intel i386, Intel x86-64, Motorola PPC, Sun Sparc, HP PA-RISC; Operating system: GNU/Linux, Windows XP/Vista/7, Mac OS X, FreeBSD, OpenBSD, Solaris, HP-UX, AIX; Has the code been vectorized or parallelized?: Yes; RAM: > 55 Mbytes; Classification: 4, 9, 11.9, 14; Catalogue identifier of previous version: AEFA_v1_0; Journal reference of previous version: Comput. Phys. Commun. 180 (2009) 2499; Does the new version supersede the previous version?: Yes; Nature of problem: storage, analysis and visualization of scientific data; Solution method: object store, wide range of analysis algorithms and visualization methods; Reasons for new version: added features and corrections of deficiencies; Summary of revisions: the release notes at http://root.cern.ch/root/v528/Version528.news.html give a module-oriented overview of the changes in v5.28.00. Highlights include: File format: reading of TTrees has been improved dramatically with respect to CPU time (30%) and notably with respect to disk space. Histograms: A

  6. Overview of the Systems Analysis Framework for the EU Bioeconomy. Deliverable 1.4 of the EU FP 7 SAT-BBE project Systems Analysis Tools Framework for the EU Bio-Based Economy Strategy (SAT BBE)

    NARCIS (Netherlands)

    Leeuwen, van M.G.A.; Meijl, van H.; Smeets, E.M.W.; Tabeau-Kowalska, E.W.

    2014-01-01

    In November 2012 the Systems Analysis Tools Framework for the EU Bio-Based Economy Strategy project (SAT-BBE) was launched with the purpose of designing an analysis tool useful for monitoring the evolution and impacts of the bioeconomy. In the SAT-BBE project the development of the analysis tool for the

  7. The FairRoot framework

    International Nuclear Information System (INIS)

    Al-Turany, M; Bertini, D; Karabowicz, R; Kresan, D; Malzacher, P; Uhlig, F; Stockmanns, T

    2012-01-01

    The FairRoot framework is an object-oriented simulation, reconstruction and data analysis framework based on ROOT. It includes core services for detector simulation and offline analysis. The framework delivers base classes which enable users to construct their experimental setup in a fast and convenient way. By using the Virtual Monte Carlo concept it is possible to perform the simulations using either Geant3 or Geant4 without changing the user code or the geometry description. Using and extending the task mechanism of ROOT it is possible to implement complex analysis tasks in a convenient way. Moreover, using the FairCuda interface of the framework it is possible to run some of these tasks on GPUs as well. Data I/O, parameter handling and database connections are also handled by the framework. Since some of the experiments will not have an experimental setup with a conventional trigger system, the framework can also handle free-flowing input streams of detector data. For this mode of operation the framework provides classes to create the needed time-sorted input streams of detector data out of the event-based simulation data. There are also tools to do radiation studies and to visualize the simulated data. A CMake/CDash-based building and monitoring system is also part of the FairRoot services, which helps to build and test the framework on many different platforms in an automatic way, including continuous integration.

  8. Secure and Efficient Regression Analysis Using a Hybrid Cryptographic Framework: Development and Evaluation.

    Science.gov (United States)

    Sadat, Md Nazmus; Jiang, Xiaoqian; Aziz, Md Momin Al; Wang, Shuang; Mohammed, Noman

    2018-03-05

    Machine learning is an effective data-driven tool that is being widely used to extract valuable patterns and insights from data. Specifically, predictive machine learning models are very important in health care for clinical data analysis. The machine learning algorithms that generate predictive models often require pooling data from different sources to discover statistical patterns or correlations among different attributes of the input data. The primary challenge is to fulfill one major objective: preserving the privacy of individuals while discovering knowledge from data. Our objective was to develop a hybrid cryptographic framework for performing regression analysis over distributed data in a secure and efficient way. Existing secure computation schemes are not suitable for processing the large-scale data that are used in cutting-edge machine learning applications. We designed, developed, and evaluated a hybrid cryptographic framework that can securely perform regression analysis, a fundamental machine learning algorithm, using somewhat homomorphic encryption and the newly introduced secure hardware component Intel Software Guard Extensions (Intel SGX) to ensure both privacy and efficiency at the same time. Experimental results demonstrate that our proposed method provides a better trade-off between security and efficiency than solely secure hardware-based methods. Moreover, there is no approximation error: computed model parameters are identical to the plaintext results. To the best of our knowledge, this kind of secure computation model using a hybrid cryptographic framework, leveraging both somewhat homomorphic encryption and Intel SGX, has not been proposed or evaluated to date. Our proposed framework ensures data security and computational efficiency at the same time. ©Md Nazmus Sadat, Xiaoqian Jiang, Md Momin Al Aziz, Shuang Wang, Noman Mohammed. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 05.03.2018.

  9. A finite element framework for multiscale/multiphysics analysis of structures with complex microstructures

    Science.gov (United States)

    Varghese, Julian

    This research work has contributed in various ways to help develop a better understanding of textile composites and materials with complex microstructures in general. An instrumental part of this work was the development of an object-oriented framework that made it convenient to perform multiscale/multiphysics analyses of advanced materials with complex microstructures such as textile composites. In addition to the studies conducted in this work, this framework lays the groundwork for continued research of these materials. This framework enabled a detailed multiscale stress analysis of a woven DCB specimen that revealed the effect of the complex microstructure on the stress and strain energy release rate distribution along the crack front. In addition to implementing an oxidation model, the framework was also used to implement strategies that expedited the simulation of oxidation in textile composites so that it would take only a few hours. The simulation showed that the tow architecture played a significant role in the oxidation behavior in textile composites. Finally, a coupled diffusion/oxidation and damage progression analysis was implemented that was used to study the mechanical behavior of textile composites under mechanical loading as well as oxidation. A parametric study was performed to determine the effect of material properties and the number of plies in the laminate on its mechanical behavior. The analyses indicated a significant effect of the tow architecture and other parameters on the damage progression in the laminates.

  10. Seismic analysis program group: SSAP

    International Nuclear Information System (INIS)

    Uchida, Masaaki

    2002-05-01

    A group of programs, SSAP, has been developed, each member of which performs seismic calculation using a simple single-mass system model or a multi-mass system model. For the response of structures to a transverse S-wave, a single-mass model program calculating the response spectrum and a multi-mass model program are available. They perform calculation using the output of another program, which produces simulated earthquakes having the so-called Ohsaki-spectrum characteristic. Another program has been added, which calculates the response of one-dimensional multi-mass systems to vertical P-wave input. It places particular emphasis on the analysis of the phenomena observed in some shallow earthquakes in which stones jump off the ground. Through a series of test calculations using these programs, some interesting information has been derived concerning the validity of superimposing single-mass model calculations, and also the condition for stones to jump. (author)

  11. ATLAS Future Framework Requirements Group Report

    CERN Document Server

    The ATLAS collaboration

    2016-01-01

    The Future Frameworks Requirements Group was constituted in Summer 2013 to consider and summarise the framework requirements from trigger and offline for configuring, scheduling and monitoring the data processing software needed by the ATLAS experiment. The principal motivation for such a re-examination arises from the current and anticipated evolution of CPUs, where multiple cores, hyper-threading and wide vector registers require a shift to a concurrent programming model. Such a model requires extensive changes in the current Gaudi/Athena frameworks and offers the opportunity to consider how HLT and offline processing can be better accommodated within the ATLAS framework. This note contains the report of the Future Frameworks Requirements Group.

  12. Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    7.1.7 Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials. Khara D. Grieger1, Igor Linkov2, Steffen Foss Hansen1, Anders Baun1. 1Technical University of Denmark, Kgs. Lyngby, Denmark; 2Environmental Laboratory, U.S. Army Corps of Engineers, Brookline, USA. Email: kdg@env.dtu.dk. Scientists, organizations, governments, and policy-makers are currently involved in reviewing, adapting, and formulating risk assessment frameworks and strategies to understand and assess the potential environmental risks of engineered nanomaterials (NM). It is becoming… …frameworks and approaches which have been developed or proposed by large organizations or regulatory bodies for NM. These frameworks and approaches were evaluated and assessed based on a select number of criteria which have been previously proposed as important parameters for inclusion in successful risk assessment…

  13. An integrated framework for cost- benefit analysis in road safety projects using AHP method

    Directory of Open Access Journals (Sweden)

    Mahsa Mohamadian

    2011-10-01

    Full Text Available Cost benefit analysis (CBA) is a useful tool for investment decision-making from an economic point of view. When the decision involves conflicting goals, the multi-attribute analysis approach is more capable, because there are some social and environmental criteria that cannot be valued or monetized by cost benefit analysis. The complex nature of decision-making in road safety normally makes it difficult to reach a single alternative solution that can satisfy all decision-making problems. Generally, the application of multi-attribute analysis in the road sector is promising; however, the applications are at a preliminary stage. Some multi-attribute analysis techniques, such as the analytic hierarchy process (AHP), have been widely used in practice. This paper presents an integrated framework with CBA and AHP methods to select the proper alternative in road safety projects. The proposed model of this paper is implemented for a case study of improving a road to reduce accidents in Iran. The framework is used as an aid to the cost benefit tool in road safety projects.
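    The AHP priority step behind such a framework can be sketched with the common row-geometric-mean approximation of the principal eigenvector. This is a generic illustration, not the cited study's procedure; the criteria and judgment values below are invented.

```python
import math

def ahp_weights(pairwise):
    """Approximate the AHP priority vector via the row geometric mean."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical judgments: safety benefit is 3x as important as cost and
# 5x as important as environment; cost is 2x environment.
# Reciprocals fill the lower triangle, 1s fill the diagonal.
matrix = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(matrix)  # sums to 1; safety > cost > environment
```

    The resulting weights would then scale the non-monetized criteria alongside the CBA's monetary scores when ranking road-safety alternatives.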

  14. Modeling Phase-transitions Using a High-performance, Isogeometric Analysis Framework

    KAUST Repository

    Vignal, Philippe

    2014-06-06

    In this paper, we present a high-performance framework for solving partial differential equations using Isogeometric Analysis, called PetIGA, and show how it can be used to solve phase-field problems. We specifically chose the Cahn-Hilliard equation and the phase-field crystal equation as test cases. These two models allow us to highlight some of the main advantages of using PetIGA for scientific computing.

  15. RAWS II: A MULTIPLE REGRESSION ANALYSIS PROGRAM,

    Science.gov (United States)

    This memorandum gives instructions for the use and operation of a revised version of RAWS, a multiple regression analysis program. The program… of preprocessed data, the directed retention of variables, listing of the matrix of the normal equations and its inverse, and the bypassing of the regression analysis to provide the input variable statistics only. (Author)
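    The "input variable statistics only" pass that a program like RAWS II offers can be sketched as follows: per-variable means and standard deviations plus the correlation matrix, computed without running the regression itself. The data and names are illustrative, not from the memorandum.

```python
import math

def variable_statistics(columns):
    """Means, sample standard deviations, and correlation matrix by column."""
    n = len(columns[0])
    means = [sum(c) / n for c in columns]
    sds = [math.sqrt(sum((x - m) ** 2 for x in c) / (n - 1))
           for c, m in zip(columns, means)]
    corr = [[sum((x - mi) * (y - mj) for x, y in zip(ci, cj)) /
             ((n - 1) * si * sj)
             for cj, mj, sj in zip(columns, means, sds)]
            for ci, mi, si in zip(columns, means, sds)]
    return means, sds, corr

# Two perfectly correlated input variables.
cols = [[1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0]]
means, sds, corr = variable_statistics(cols)
```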

  16. A Cyber-ITS Framework for Massive Traffic Data Analysis Using Cyber Infrastructure

    Directory of Open Access Journals (Sweden)

    Yingjie Xia

    2013-01-01

    Full Text Available Traffic data is commonly collected from widely deployed sensors in urban areas. This brings up a new research topic, data-driven intelligent transportation systems (ITSs), which means integrating heterogeneous traffic data from different kinds of sensors and applying it to ITS applications. This research, taking into consideration the significant increase in the amount of traffic data and the complexity of data analysis, focuses mainly on the challenge of solving data-intensive and computation-intensive problems. As a solution, this paper proposes a Cyber-ITS framework to perform data analysis on Cyber Infrastructure (CI), inherently parallel computing hardware and software systems, in the context of ITS. The techniques of the framework include data representation, domain decomposition, resource allocation, and parallel processing. All these techniques are based on data-driven and application-oriented models and are organized in a component-and-workflow-based model in order to achieve technical interoperability and data reusability. A case study of the Cyber-ITS framework is then presented, based on a traffic state estimation application that fuses massive Sydney Coordinated Adaptive Traffic System (SCATS) data and GPS data. The results prove that the Cyber-ITS-based implementation can achieve a high accuracy rate of traffic state estimation and provide a significant computational speedup for the data fusion by parallel computing.

  17. Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data

    Science.gov (United States)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.

    2011-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecast of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which might constitute up to tens of terabytes for a single dataset, present studies in the area of climate and environmental change require special software support. A dedicated software framework for rapid development of information-computational systems based on Web-GIS technologies that provide such support has been created. The software framework consists of 3 basic parts: a computational kernel developed using the ITT VIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for development of typical components of a web mapping application's graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library aimed at graphical user interface development is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on the software framework, an information-computational system for complex analysis of large georeferenced data archives was developed.
Structured environmental datasets available for processing now include two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis

  18. The EMBL-EBI bioinformatics web and programmatic tools framework.

    Science.gov (United States)

    Li, Weizhong; Cowley, Andrew; Uludag, Mahmut; Gur, Tamer; McWilliam, Hamish; Squizzato, Silvano; Park, Young Mi; Buso, Nicola; Lopez, Rodrigo

    2015-07-01

    Since 2009 the EMBL-EBI Job Dispatcher framework has provided free access to a range of mainstream sequence analysis applications. These include sequence similarity search services (https://www.ebi.ac.uk/Tools/sss/) such as BLAST, FASTA and PSI-Search, multiple sequence alignment tools (https://www.ebi.ac.uk/Tools/msa/) such as Clustal Omega, MAFFT and T-Coffee, and other sequence analysis tools (https://www.ebi.ac.uk/Tools/pfa/) such as InterProScan. Through these services users can search mainstream sequence databases such as ENA, UniProt and Ensembl Genomes, utilising a uniform web interface or systematically through Web Services interfaces (https://www.ebi.ac.uk/Tools/webservices/) using common programming languages, and obtain enriched results with novel visualisations. Integration with EBI Search (https://www.ebi.ac.uk/ebisearch/) and the dbfetch retrieval service (https://www.ebi.ac.uk/Tools/dbfetch/) further expands the usefulness of the framework. New tools and updates such as NCBI BLAST+, InterProScan 5 and PfamScan, new categories such as RNA analysis tools (https://www.ebi.ac.uk/Tools/rna/), new databases such as ENA non-coding, WormBase ParaSite, Pfam and Rfam, and new workflow methods, together with the retirement of deprecated services, ensure that the framework remains relevant to today's biological community. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. Open Issues in Object-Oriented Programming

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    1995-01-01

    We discuss a number of open issues within object-oriented programming. The central mechanisms of object-oriented programming appeared with Simula, developed more than 30 years ago; these include class, subclass, virtual function, active object and the first application framework, Class Simulation. … The core parts of object-oriented programming should be well understood, but there are still a large number of issues where there is no consensus. The term object-orientation has been applied to many subjects, such as analysis, design, implementation, data modeling in databases, and distribution…

  20. Effective Analysis of C Programs by Rewriting Variability

    DEFF Research Database (Denmark)

    Iosif-Lazar, Alexandru Florin; Melo, Jean; Dimovski, Aleksandar

    2017-01-01

    Context. Variability-intensive programs (program families) appear in many application areas and for many reasons today. Different family members, called variants, are derived by switching statically configurable options (features) on and off, while reuse of the common code is maximized. Inquiry. Verification of program families is challenging since the number of variants is exponential in the number of features. Existing single-program analysis and verification tools cannot be applied directly to program families, and designing and implementing the corresponding variability-aware versions is tedious… …and effective analysis and verification of real-world C program families. Importance. We report some interesting variability-related bugs that we discovered using various state-of-the-art single-program C verification tools, such as Frama-C, Clang, LLBMC.

  1. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    Directory of Open Access Journals (Sweden)

    R. Schmidt

    2012-08-01

    Full Text Available The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. It comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of the different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high resolution DTM is showcased.

  2. Cost-effectiveness analysis for the implementation of the EU Water Framework Directive

    NARCIS (Netherlands)

    van Engelen, D.M.; Seidelin, Christian; van der Veeren, Rob; Barton, David N.; Queb, Kabir

    2008-01-01

    The EU Water Framework Directive (WFD) prescribes cost-effectiveness analysis (CEA) as an economic tool for the minimisation of costs when formulating programmes of measures to be implemented in the European river basins by the year 2009. The WFD does not specify, however, which approach to CEA has to be used.
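    One common CEA approach in this setting, shown here purely as an illustration and not as the specific method the paper compares, is to rank candidate measures by cost per unit of effect and select them in that order until the environmental target is reached. All measures, costs, and effects below are invented.

```python
def select_measures(measures, target):
    """measures: list of (name, cost, effect). Greedily pick measures in
    order of increasing cost-effectiveness ratio until target is met."""
    ranked = sorted(measures, key=lambda m: m[1] / m[2])
    chosen, total_cost, achieved = [], 0.0, 0.0
    for name, cost, effect in ranked:
        if achieved >= target:
            break
        chosen.append(name)
        total_cost += cost
        achieved += effect
    return chosen, total_cost

# Hypothetical measures: (name, cost in kEUR, nutrient-load reduction in tonnes)
measures = [
    ("buffer strips", 100.0, 50.0),       # ratio 2.0 per tonne
    ("wetland restoration", 90.0, 30.0),  # ratio 3.0 per tonne
    ("treatment upgrade", 400.0, 80.0),   # ratio 5.0 per tonne
]
chosen, cost = select_measures(measures, target=70.0)
```

    A real WFD programme of measures would also handle interactions between measures and non-monetizable effects, which is exactly where the choice of CEA approach matters.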

  3. A Comparative Analysis of PISA Scientific Literacy Framework in Finnish and Thai Science Curricula

    Science.gov (United States)

    Sothayapetch, Pavinee; Lavonen, Jari; Juuti, Kalle

    2013-01-01

    A curriculum is a master plan that regulates teaching and learning. This paper compares Finnish and Thai primary school level science curricula to the PISA 2006 Scientific Literacy Framework. Curriculum comparison was made following the procedure of deductive content analysis. In the analysis, there were four main categories adopted from PISA…

  4. Critical asset and portfolio risk analysis: an all-hazards framework.

    Science.gov (United States)

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
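    The notional all-hazards formula the article starts from, risk as the product of consequence, vulnerability, and threat likelihood, can be sketched at the portfolio level. The assets and probabilities below are invented for illustration and are not from the CAPRA article.

```python
def asset_risk(consequence, vulnerability, threat_likelihood):
    """Expected loss for one asset-hazard pair:
    consequence x P(damage | event) x P(event per year)."""
    return consequence * vulnerability * threat_likelihood

portfolio = [
    # (consequence in $M, P(damage | event), P(event per year))
    (500.0, 0.4, 0.01),
    (120.0, 0.7, 0.05),
]
total_risk = sum(asset_risk(c, v, t) for c, v, t in portfolio)  # $M per year
```

    CAPRA's contribution is precisely to assign clear meanings to each of these parameters and to add interdependency effects, which this first-order sum ignores.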

  5. R data analysis without programming

    CERN Document Server

    Gerbing, David W

    2013-01-01

    This book prepares readers to analyze data and interpret statistical results using R more quickly than other texts. R is a challenging program to learn because code must be created to get started. To alleviate that challenge, Professor Gerbing developed lessR. The lessR extensions remove the need to program. By introducing R through lessR, readers learn how to organize data for analysis, read the data into R, and produce output without performing numerous functions and programming exercises first. With lessR, readers can select the necessary procedure and change the relevant variables without programming.

  6. The stapl Skeleton Framework

    KAUST Repository

    Zandifar, Mani; Thomas, Nathan; Amato, Nancy M.; Rauchwerger, Lawrence

    2015-01-01

    …from programmers and enables them to express parallel programs as a composition of existing elementary skeletons such as map, map-reduce, scan, zip, butterfly, allreduce, alltoall and user-defined custom skeletons. Skeletons in this framework…
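    The skeleton idea can be rendered compactly in Python, although stapl itself is a C++ library and the names below are ours, not stapl's API: elementary skeletons are higher-order functions that compose into a parallel-program description.

```python
from functools import reduce

def skel_map(f):
    """'map' skeleton: apply f to every element."""
    return lambda xs: [f(x) for x in xs]

def skel_map_reduce(f, op, init):
    """'map-reduce' skeleton: apply f, then fold with op."""
    return lambda xs: reduce(op, (f(x) for x in xs), init)

def compose(*skeletons):
    """Chain skeletons left to right, as a framework scheduler might."""
    def run(xs):
        for s in skeletons:
            xs = s(xs)
        return xs
    return run

# Sum of squares as a composition: map(square), then map-reduce(identity, +).
pipeline = compose(skel_map(lambda x: x * x),
                   skel_map_reduce(lambda x: x, lambda a, b: a + b, 0))
result = pipeline([1, 2, 3, 4])
```

    In a real skeleton framework the same composition would be executed in parallel, with the scheduler free to fuse or distribute the stages.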

  7. Probabilistic Resource Analysis by Program Transformation

    DEFF Research Database (Denmark)

    Kirkeby, Maja Hanne; Rosendahl, Mads

    2016-01-01

    The aim of a probabilistic resource analysis is to derive a probability distribution of possible resource usage for a program from a probability distribution of its input. We present an automated multi-phase rewriting based method to analyze programs written in a subset of C. It generates...
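    The core idea can be shown on a toy example, under an assumed program and cost model that are not the paper's: given a probability distribution over inputs, derive the induced distribution of resource usage.

```python
def resource_usage(n):
    """Assumed cost model: a program whose loop performs n + 1 'steps'."""
    return n + 1

def usage_distribution(input_dist):
    """Map P(input) to P(resource usage), merging inputs with equal cost."""
    out = {}
    for n, p in input_dist.items():
        cost = resource_usage(n)
        out[cost] = out.get(cost, 0.0) + p
    return out

# Input sizes 0, 1, 2 with probabilities 0.25, 0.25, 0.5.
dist = usage_distribution({0: 0.25, 1: 0.25, 2: 0.5})
```

    The paper's method does this symbolically by program rewriting rather than by enumerating inputs, so it applies to unbounded input domains.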

  8. Programming Social Applications Building Viral Experiences with OpenSocial, OAuth, OpenID, and Distributed Web Frameworks

    CERN Document Server

    LeBlanc, Jonathan

    2011-01-01

    Social networking has made one thing clear: websites and applications need to provide users with experiences tailored to their preferences. This in-depth guide shows you how to build rich social frameworks, using open source technologies and specifications. You'll learn how to create third-party applications for existing sites, build engaging social graphs, and develop products to host your own socialized experience. Programming Social Apps focuses on the OpenSocial platform, along with Apache Shindig, OAuth, OpenID, and other tools, demonstrating how they work together to help you solve practical problems.

  9. Concepts of person-centred care: a framework analysis of five studies in daily care practices

    Directory of Open Access Journals (Sweden)

    Margreet

    2016-11-01

    Full Text Available Background: Person-centred care is used as a term to indicate a ‘made to measure’ approach in care. But what does this look like in daily practice? The person-centred nursing framework developed by McCormack and McCance (2010) offers specific concepts but these are still described in rather general terms. Empirical studies, therefore, could help to clarify them and make person-centredness more tangible for nurses. Aims: This paper describes how a framework analysis aimed to clarify the concepts described in the model of McCormack and McCance in order to guide professionals using them in practice. Methods: Five separate empirical studies focusing on older adults in the Netherlands were used in the framework analysis. The research question was: ‘How are concepts of person-centred care made tangible where empirical data are used to describe them?’ Analysis was done in five steps, leading to a comparison between the description of the concepts and the empirical significance found in the studies. Findings: Suitable illustrations were found for the majority of concepts. The results show that an empirically derived specification emerges from the data. In the concept of ‘caring relationship’, for example, it is shown that the personal character of each relationship is expressed by what the nurse and the older person know about each other. Other findings show the importance of values being present in care practices. Conclusions: The framework analysis shows that concepts can be clarified when empirical studies are used to make person-centred care tangible so nurses can understand and apply it in practice. Implications for practice: The concepts of the person-centred nursing framework are recognised when: Nurses know unique characteristics of the person they care for and what is important to them, and act accordingly; Nurses use values such as trust, involvement and humour in their care practice; Acknowledgement of emotions and compassion create…

  10. A framework for the organization and delivery of systemic treatment.

    Science.gov (United States)

    Vandenberg, T; Coakley, N; Nayler, J; Degrasse, C; Green, E; Mackay, J A; McLennan, C; Smith, A; Wilcock, L; Trudeau, M E

    2009-01-01

    Increasing systemic treatment and shortages of oncology professionals in Canada require innovative approaches to the safe and effective delivery of intravenous (IV) cancer treatment. We conducted a systematic review of the clinical and scientific literature, and an environmental scan of models in Canada, the United Kingdom, Australia, and New Zealand. We then developed a framework for the organization and delivery of IV systemic treatment. The systematic review covered the MEDLINE, EMBASE, CINAHL, and HealthSTAR databases. The environmental scan retrieved published and unpublished sources, coupled with a free key word search using the Google search engine. The Systemic Treatment Working Group reviewed the evidence and developed a draft framework using evidence-based analysis, existing recommendations from various jurisdictions, and expert opinion based on experience and consensus. The draft was assessed by Ontario stakeholders and reviewed and approved by Cancer Care Ontario. The poor quantity and quality of the evidence necessitated a consensus-derived model. That model comprises four levels of care determined by a regional systemic treatment program and three integrated structures (integrated cancer programs, affiliate institutions, and satellite institutions), each with a defined scope of practice and a specific organizational framework. New models of care are urgently required beyond large centres, particularly in geographically remote or rural areas. Despite limited applicable evidence, the development and successful implementation of this framework is intended to create sustainable, accessible, quality care and to measurably improve patient outcomes.

  11. A decision analysis framework to support long-term planning for nuclear fuel cycle technology research, development, demonstration and deployment

    International Nuclear Information System (INIS)

    Sowder, A.G.; Machiels, A.J.; Dykes, A.A.; Johnson, D.H.

    2013-01-01

    To address challenges and gaps in nuclear fuel cycle option assessment and to support research, development and demonstration programs oriented toward commercial deployment, EPRI (Electric Power Research Institute) is seeking to develop and maintain an independent analysis and assessment capability by building a suite of assessment tools based on a platform of software, simplified relationships, and explicit decision-making and evaluation guidelines. As a demonstration of the decision-support framework, EPRI examines a relatively near-term fuel cycle option, i.e., the use of reactor-grade mixed-oxide fuel (MOX) in U.S. light water reactors. The results appear as a list of significant concerns (like cooling of spent fuels, criticality risk...) that have to be taken into account in the final decision.

  12. Comparability of outcome frameworks in medical education: Implications for framework development.

    Science.gov (United States)

    Hautz, Stefanie C; Hautz, Wolf E; Feufel, Markus A; Spies, Claudia D

    2015-01-01

    Given the increasing mobility of medical students and practitioners, there is a growing need for harmonization of medical education and qualifications. Although several initiatives have sought to compare national outcome frameworks, this task has proven a challenge. Drawing on an analysis of existing outcome frameworks, we identify factors that hinder comparability and suggest ways of facilitating comparability during framework development and revisions. We searched MedLine, EmBase and the Internet for outcome frameworks in medical education published by national or governmental organizations. We analyzed these frameworks for differences and similarities that influence comparability. Of 1816 search results, 13 outcome frameworks met our inclusion criteria. These frameworks differ in five core features: history and origins, formal structure, medical education system, target audience and key terms. Many frameworks reference other frameworks without acknowledging these differences. Importantly, the level of detail of the outcomes specified differs both within and between frameworks. The differences identified explain some of the challenges involved in comparing outcome frameworks and medical qualifications. We propose a two-level model distinguishing between "core" competencies and culture-specific "secondary" competencies. This approach could strike a balance between local specifics and cross-national comparability of outcome frameworks and medical education.

  13. 76 FR 1440 - Notice of Revised Child Outcomes Framework

    Science.gov (United States)

    2011-01-10

    DEPARTMENT OF HEALTH AND HUMAN SERVICES, Administration for Children and Families. Notice of Revised… …Outcomes Framework, renamed The Head Start Child Development and Learning Framework: Promoting Positive Outcomes in Early Childhood Programs Serving Children 3-5 Years Old. The Framework was revised to give more…

  14. Using the RE-AIM Framework in formative evaluation and program planning for a nutrition intervention in the Lower Mississippi Delta.

    Science.gov (United States)

    Huye, Holly F; Connell, Carol L; Crook, LaShaundrea B; Yadrick, Kathy; Zoellner, Jamie

    2014-01-01

    Identification of prominent themes to be considered when planning a nutrition intervention using the Reach, Effectiveness, Adoption, Implementation, and Maintenance framework. Qualitative formative research. Women's social and civic organizations in the Lower Mississippi Delta. Thirty-seven (5 white and 32 black) women with a college degree or higher. Impact of dietary and contextual factors related to the Lower Mississippi Delta culture on intervention planning. Case analysis strategy using question-by-question coding. Major themes that emerged were "healthy eating focus" and "promoting a healthy lifestyle" when recruiting organizations (Reach); "positive health changes" as a result of the intervention (Effectiveness); "logistics: time commitment, location, and schedule" to initiate a program (Adoption); "expense of healthy foods" and "cooking and meal planning" as barriers to participation (Implementation); and "resources and training" and "motivation" as necessary for program continuation (Maintenance). The "health of the Delta" theme was found across all dimensions, which reflected participants' compassion for their community. Results were used to develop an implementation plan promoting optimal reach, effectiveness, adoption, implementation, and maintenance of a nutrition intervention. This research emphasizes the benefits of formative research using a systematic process at organizational and individual levels. Copyright © 2014 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  15. Exploring intellectual capital through social network analysis: a conceptual framework

    Directory of Open Access Journals (Sweden)

    Ivana Tichá

    2011-01-01

    Full Text Available The purpose of this paper is to develop a framework to assess intellectual capital. Intellectual capital is a key element in an organization's future earning potential. Theoretical and empirical studies show that it is the unique combination of the different elements of intellectual capital and tangible investments that determines an enterprise's competitive advantage. Intellectual capital has been defined as the combination of an organization's human, organizational and relational resources and activities. It includes the knowledge, skills, experience and abilities of the employees, its R&D activities, organizational routines, procedures, systems, databases and its Intellectual Property Rights, as well as all the resources linked to its external relationships, such as with its customers, suppliers, R&D partners, etc. This paper focuses on relational capital and attempts to suggest a conceptual framework to assess this part of intellectual capital by applying a social network analysis approach. The SNA approach allows for mapping and measuring of relationships and flows between people, groups, organizations, computers, URLs, and other connected information/knowledge entities. The conceptual framework is developed for the assessment of collaborative networks in the Czech higher education sector as a representation of its relational capital. It also builds on previous work aiming at a proposal of a methodology guiding efforts to report intellectual capital at Czech public universities.
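    Two of the most basic SNA measures used to characterize such collaboration networks, degree centrality and network density, can be computed directly. The mini-network below is invented for illustration and is unrelated to the Czech data.

```python
def degree_centrality(edges, nodes):
    """Fraction of possible ties each node actually has (undirected)."""
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return {n: d / (len(nodes) - 1) for n, d in deg.items()}

def density(edges, nodes):
    """Observed ties over possible ties in an undirected network."""
    n = len(nodes)
    return 2 * len(edges) / (n * (n - 1))

# Hypothetical collaboration network: UnivA is the hub.
nodes = ["UnivA", "UnivB", "UnivC", "FirmD"]
edges = [("UnivA", "UnivB"), ("UnivA", "UnivC"), ("UnivA", "FirmD")]
cent = degree_centrality(edges, nodes)  # UnivA: 1.0 (tied to every other node)
dens = density(edges, nodes)            # 3 of 6 possible ties
```

    In a relational-capital assessment, high centrality identifies the institutions that broker most collaborations, while density summarizes how connected the sector is overall.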

  16. Towards Interactive Visual Exploration of Parallel Programs using a Domain-Specific Language

    KAUST Repository

    Klein, Tobias; Bruckner, Stefan; Gröller, M. Eduard; Hadwiger, Markus; Rautek, Peter

    2016-01-01

    The use of GPUs and the massively parallel computing paradigm have become wide-spread. We describe a framework for the interactive visualization and visual analysis of the run-time behavior of massively parallel programs, especially OpenCL kernels. This facilitates understanding a program's function and structure, finding the causes of possible slowdowns, locating program bugs, and interactively exploring and visually comparing different code variants in order to improve performance and correctness. Our approach enables very specific, user-centered analysis, both in terms of the recording of the run-time behavior and the visualization itself. Instead of having to manually write instrumented code to record data, simple code annotations tell the source-to-source compiler which code instrumentation to generate automatically. The visualization part of our framework then enables the interactive analysis of kernel run-time behavior in a way that can be very specific to a particular problem or optimization goal, such as analyzing the causes of memory bank conflicts or understanding an entire parallel algorithm.

  17. Multiply controlled verbal operants: An analysis and extension to the picture exchange communication system

    OpenAIRE

    Bondy, Andy; Tincani, Matt; Frost, Lori

    2004-01-01

    This paper presents Skinner's (1957) analysis of verbal behavior as a framework for understanding language acquisition in children with autism. We describe Skinner's analysis of pure and impure verbal operants and illustrate how this analysis may be applied to the design of communication training programs. The picture exchange communication system (PECS) is a training program influenced by Skinner's framework. We describe the training sequence associated with PECS and illustrate how this sequence…

  18. BioQueue: a novel pipeline framework to accelerate bioinformatics analysis.

    Science.gov (United States)

    Yao, Li; Wang, Heming; Song, Yuanyuan; Sui, Guangchao

    2017-10-15

    With the rapid development of Next-Generation Sequencing, a large amount of data is now available for bioinformatics research. Meanwhile, the presence of many pipeline frameworks makes it possible to analyse these data. However, these tools concentrate mainly on their syntax and design paradigms, and dispatch jobs based on users' experience about the resources needed by the execution of a certain step in a protocol. As a result, it is difficult for these tools to maximize the potential of computing resources, and avoid errors caused by overload, such as memory overflow. Here, we have developed BioQueue, a web-based framework that contains a checkpoint before each step to automatically estimate the system resources (CPU, memory and disk) needed by the step and then dispatch jobs accordingly. BioQueue possesses a shell command-like syntax instead of implementing a new script language, which means most biologists without computer programming background can access the efficient queue system with ease. BioQueue is freely available at https://github.com/liyao001/BioQueue. The extensive documentation can be found at http://bioqueue.readthedocs.io. li_yao@outlook.com or gcsui@nefu.edu.cn. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
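
The checkpoint-and-dispatch idea described in the abstract can be illustrated with a small sketch. This is a hypothetical toy, not BioQueue's actual implementation: the class and method names are invented, and in the real system each step's resource estimate comes from a model trained on previous runs rather than from fixed numbers.

```python
from collections import deque

class ResourceAwareQueue:
    """Toy sketch of checkpoint-based dispatch: a job runs only when
    its estimated CPU/memory/disk needs fit the currently free resources,
    avoiding overload errors such as memory overflow."""

    def __init__(self, cpu, mem_gb, disk_gb):
        self.free = {"cpu": cpu, "mem_gb": mem_gb, "disk_gb": disk_gb}
        self.pending = deque()
        self.running = []

    def submit(self, name, estimate):
        # `estimate` stands in for the per-step checkpoint prediction.
        self.pending.append((name, estimate))
        self._dispatch()

    def _fits(self, est):
        return all(est[k] <= self.free[k] for k in self.free)

    def _dispatch(self):
        # Launch every queued job whose estimate fits; the rest wait.
        waiting = deque()
        while self.pending:
            name, est = self.pending.popleft()
            if self._fits(est):
                for k in est:
                    self.free[k] -= est[k]
                self.running.append(name)
            else:
                waiting.append((name, est))
        self.pending = waiting

    def finish(self, name, estimate):
        # Return resources when a job completes, then retry the queue.
        self.running.remove(name)
        for k in estimate:
            self.free[k] += estimate[k]
        self._dispatch()
```

With a 4-core, 16 GB machine, two jobs that each estimate 3 cores and 12 GB will run one after the other rather than both at once.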

  19. Work program analysis - defining the capability/risk plan

    International Nuclear Information System (INIS)

    Hrinivich, W.A.

    2004-01-01

    Bruce Power has developed and implemented an analysis methodology (Work Program Analysis) to assess and address corporate business risk associated with work group capability. Work Program Analysis is proving to be an excellent tool for identifying and supporting key business decisions facing the line and senior management at Bruce Power. The following describes the methodology, its application and the results achieved. (author)

  20. BioInt: an integrative biological object-oriented application framework and interpreter.

    Science.gov (United States)

    Desai, Sanket; Burra, Prasad

    2015-01-01

    BioInt, a biological programming application framework and interpreter, is an attempt to equip researchers with seamless integration, efficient extraction and effortless analysis of data from various biological databases and algorithms. Based on the type of biological data, algorithms and related functionalities, a biology-specific framework was developed which has nine modules. The modules are a compilation of numerous reusable BioADTs. This software ecosystem, containing more than 450 biological objects underneath the interpreter, makes it flexible, integrative and comprehensive. Similar to Python, BioInt eliminates the compilation and linking steps, cutting the time significantly. The researcher can write scripts using the available BioADTs (following C++ syntax) and execute them interactively or use them as a command-line application. It has features that enable automation, extension of the framework with new/external BioADTs/libraries, and deployment of complex workflows.

  1. Leveraging Parallel Data Processing Frameworks with Verified Lifting

    Directory of Open Access Journals (Sweden)

    Maaz Bin Safeer Ahmad

    2016-11-01

    Full Text Available Many parallel data frameworks have been proposed in recent years that let sequential programs access parallel processing. To capitalize on the benefits of such frameworks, existing code must often be rewritten to the domain-specific languages that each framework supports. This rewriting, which is tedious and error-prone, also requires developers to choose the framework that best optimizes performance given a specific workload. This paper describes Casper, a novel compiler that automatically retargets sequential Java code for execution on Hadoop, a parallel data processing framework that implements the MapReduce paradigm. Given a sequential code fragment, Casper uses verified lifting to infer a high-level summary expressed in our program specification language that is then compiled for execution on Hadoop. We demonstrate that Casper automatically translates Java benchmarks into Hadoop. The translated results execute on average 3.3x faster than the sequential implementations and scale better, as well, to larger datasets.
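
The kind of transformation the abstract describes can be shown in miniature (in Python rather than Java, and without Casper's formal verification step): a sequential accumulation loop next to the equivalent map/reduce summary that a backend such as Hadoop could execute in parallel. The function names here are illustrative only.

```python
from functools import reduce

# Sequential fragment, the shape of input Casper would analyze
# (shown in Python for brevity; Casper itself targets Java).
def word_count_sequential(words):
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return counts

# Lifted summary: the same computation expressed as map + reduce,
# the form a MapReduce framework can distribute across workers.
def mapper(w):
    return (w, 1)

def reducer(acc, pair):
    k, v = pair
    acc[k] = acc.get(k, 0) + v
    return acc

def word_count_mapreduce(words):
    return reduce(reducer, map(mapper, words), {})
```

Verified lifting proves (with an SMT solver, in Casper's case) that the two forms compute the same function; here one can at least check agreement on sample inputs.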

  2. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments.

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F; Bell, Scott

    2016-03-01

    This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the need for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change was assessed based on a suite of indicators that were subsequently rolled up into a single, multi-dimensional index that is easy to understand and communicate, used to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.

  3. Development and Implementation of a Telecommuting Evaluation Framework, and Modeling the Executive Telecommuting Adoption Process

    Science.gov (United States)

    Vora, V. P.; Mahmassani, H. S.

    2002-02-01

    This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data is examined through exploratory analysis and is compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate research.

  4. An Evaluation Use Framework and Empirical Assessment

    Science.gov (United States)

    Peck, Laura R.; Gorzalski, Lindsey M.

    2009-01-01

    Background: Research on evaluation use focuses on putting evaluation recommendations into practice. Prior theoretical research proposes varied frameworks for understanding the use (or lack) of program evaluation results. Purpose: Our purpose is to create and test a single, integrated framework for understanding evaluation use. This article relies…

  5. A Conceptual Framework for Primary Source Practices

    Science.gov (United States)

    Ensminger, David C.; Fry, Michelle L.

    2012-01-01

    This article introduces a descriptive conceptual framework to provide teachers with a means of recognizing and describing instructional activities that use primary sources. The framework provides structure for professional development programs that have been established to train teachers to access and integrate primary sources into lessons. The…

  6. A Flexible Framework for Magnetic Measurements

    CERN Document Server

    Inglese, V; Buzio, M

    2009-01-01

    The work presented in this Ph.D. thesis covers the specification, design, prototyping, and validation of a new version of a magnetic measurement control, acquisition, and data analysis software package: the Flexible Framework for Magnetic Measurements (FFMM). FFMM constitutes the software part of the new platform for magnetic measurements, including also new high-performance hardware, developed at the European Organization for Nuclear Research (CERN) in cooperation with the Department of Engineering of the University of Sannio. FFMM is conceived as a unified solution to drive all the existing and future park of measurement systems (mainly magnetic but also optical, mechanical, etc.). The effort for the series test of the LHC superconducting magnets highlighted limitations in the measurement control and acquisition programs, mainly associated with the relatively long time needed for a development iteration (the cycle of specification-programming-debugging-validation). Moreover, the software capabilities needed...

  7. Adaptable Value-Set Analysis for Low-Level Code

    OpenAIRE

    Brauer, Jörg; Hansen, René Rydhof; Kowalewski, Stefan; Larsen, Kim G.; Olesen, Mads Chr.

    2012-01-01

    This paper presents a framework for binary code analysis that uses only SAT-based algorithms. Within the framework, incremental SAT solving is used to perform a form of weakly relational value-set analysis in a novel way, connecting the expressiveness of the value sets to computational complexity. Another key feature of our framework is that it translates the semantics of binary code into an intermediate representation. This allows for a straightforward translation of the program semantics in...
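
The trade-off the abstract mentions, connecting the expressiveness of value sets to computational complexity, can be sketched with a toy abstract domain. This is an illustration of the general idea, not the paper's SAT-based construction; the names `TOP` and the bound `k` are invented for the example.

```python
TOP = None  # "unknown": the variable may hold any value

def vs_join(a, b, k=4):
    """Join two value sets in a toy value-set domain: keep an explicit
    set of at most k concrete values, widening to TOP beyond that.
    A larger k gives a more expressive (and more costly) analysis."""
    if a is TOP or b is TOP:
        return TOP
    u = a | b
    return u if len(u) <= k else TOP
```

Joining {1, 2} with {2, 3} stays precise, while a join that would exceed the size bound loses all information, which is exactly the precision/cost dial such analyses tune.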

  8. Critical evaluation of international health programs: Reframing global health and evaluation.

    Science.gov (United States)

    Chi, Chunhuei; Tuepker, Anaïs; Schoon, Rebecca; Núñez Mondaca, Alicia

    2018-01-05

    Striking changes in the funding and implementation of international health programs in recent decades have stimulated debate about the role of communities in deciding which health programs to implement. An important yet neglected piece of that discussion is the need to change norms in program evaluation so that analysis of community ownership, beyond various degrees of "participation," is seen as central to strong evaluation practices. This article challenges mainstream evaluation practices and proposes a framework of Critical Evaluation with 3 levels: upstream evaluation assessing the "who" and "how" of programming decisions; midstream evaluation focusing on the "who" and "how" of selecting program objectives; and downstream evaluation, the focus of current mainstream evaluation, which assesses whether the program achieved its stated objectives. A vital tenet of our framework is that a community possesses the right to determine the path of its health development. A prerequisite of success, regardless of technical outcomes, is that programs must address communities' high priority concerns. Current participatory methods still seldom practice community ownership of program selection because they are vulnerable to funding agencies' predetermined priorities. In addition to critiquing evaluation practices and proposing an alternative framework, we acknowledge likely challenges and propose directions for future research. Copyright © 2018 John Wiley & Sons, Ltd.

  9. DXC'11 Framework and Oracle

    Data.gov (United States)

    National Aeronautics and Space Administration — The DXC Framework is a collection of programs and APIs for running and evaluating diagnostic algorithms (DAs) under identical experimental conditions. It is...

  10. DXC'10 Framework and Oracle

    Data.gov (United States)

    National Aeronautics and Space Administration — The DXC Framework is a collection of programs and APIs for running and evaluating diagnostic algorithms (DAs) under identical experimental conditions. It is...

  11. Diversity training for the community aged care workers: A conceptual framework for evaluation.

    Science.gov (United States)

    Appannah, Arti; Meyer, Claudia; Ogrin, Rajna; McMillan, Sally; Barrett, Elizabeth; Browning, Colette

    2017-08-01

    Older Australians are an increasingly diverse population, with variable characteristics such as culture, sexual orientation, socioeconomic status, and physical capabilities potentially influencing their participation in healthcare. In response, community aged care workers may need to increase skills and uptake of knowledge into practice regarding diversity through appropriate training interventions. Diversity training (DT) programs have traditionally existed in the realm of business, with little research attention devoted to scientifically evaluating the outcomes of training directed at community aged care workers. A DT workshop has been developed for community aged care workers, and this paper focuses on the construction of a formative evaluative framework for the workshop. Key evaluation concepts and measures relating to DT have been identified in the literature and integrated into the framework, focusing on five categories: training needs analysis; reactions; learning outcomes; behavioural outcomes; and results. The use of a mixed-methods approach in the framework provides an additional strength, by evaluating long-term behavioural change and improvements in service delivery. As little is known about the effectiveness of DT programs for community aged care workers, the proposed framework will provide an empirical and consistent method of evaluation, to assess their impact on enhancing older people's experience of healthcare. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Communicative automata based programming. Society Framework

    Directory of Open Access Journals (Sweden)

    Andrei Micu

    2015-10-01

    Full Text Available One of the aims of this paper is to present a new programming paradigm based on paradigms intensively used in the IT industry. Implementation of these techniques can improve the quality of code through modularization, not only in terms of the entities used by a program, but also in terms of the states through which they pass. Another aspect addressed in this paper is that, in the development of software applications, the transition from the design to the source code is a very expensive step in terms of effort and time spent. Diagrams can hide very important details for simplicity of understanding, which can lead to incorrect or incomplete implementations. To improve this process, communicative-automata-based programming introduces an intermediate step: after the modeling diagrams are created, they are translated into communicative automata, and code is then written for each automaton. We show how the transition from one step to the next becomes much easier and more intuitive.

  13. Analysis and Implement of Broadcast Program Monitoring Data

    Directory of Open Access Journals (Sweden)

    Song Jin Bao

    2016-01-01

    Full Text Available With the rapid development of the radio and TV industry and the implementation of INT (the integration of telecommunications networks, cable TV networks and the Internet), the contents of programs and advertisements are showing massive, live and interactive trends. To ensure the security of radio and television, the broadcast of information has to be controlled and administered. To grasp the latest public opinion trends carried over the radio and television network, it is necessary to research industry-specific applications of broadcast program monitoring. In this paper, the importance of broadcast monitoring in public opinion analysis is first analysed. A system architecture for monitoring broadcast radio and television programs is then proposed, combined with practice, focusing on the technical requirements and implementation process of program broadcast, advertisement broadcast and TV station broadcast monitoring. More useful information is generated through statistical analysis, which provides the data for radio and television public opinion analysis.

  14. Professional Development and Use of Digital Technologies by Science Teachers: a Review of Theoretical Frameworks

    Science.gov (United States)

    Fernandes, Geraldo W. Rocha; Rodrigues, António M.; Ferreira, Carlos Alberto

    2018-03-01

    This article aims to characterise the research on science teachers' professional development programs that support the use of Information and Communication Technologies (ICTs) and the main trends concerning the theoretical frameworks (theoretical foundation, literature review or background) that underpin these studies. Through a systematic review of the literature, 76 articles were found and divided into two axes on training science teachers and the use of digital technologies, each with its categories. The first axis (characterisation of articles) presents the key features that characterise the selected articles (major subjects, training and actions for professional development, and major ICT tools and digital resources). The second axis (trends of theoretical frameworks) has three categories, organised around theoretical frameworks that emphasise the following: (a) the digital technologies, (b) prospects of curricular renewal and (c) cognitive processes. The review also identified a group of articles whose theoretical frameworks combine multiple elements without deepening any of them, or that lack a theoretical framework supporting the study altogether. In this review, we found that many professional development programs for teachers still use inadequate strategies for bringing about change in teacher practices. New professional development proposals are emerging with the objective of minimising such difficulties, and this analysis could be a helpful tool to restructure those proposals.

  15. Leveraging Data Analysis for Domain Experts: An Embeddable Framework for Basic Data Science Tasks

    Science.gov (United States)

    Lohrer, Johannes-Y.; Kaltenthaler, Daniel; Kröger, Peer

    2016-01-01

    In this paper, we describe a framework for data analysis that can be embedded into a base application. Since it is important to analyze the data directly inside the application where the data is entered, a tool that allows scientists to easily work with their data supports and motivates further analysis of their data, which…

  16. Development of Performance Analysis Program for an Axial Compressor with Meanline Analysis

    International Nuclear Information System (INIS)

    Park, Jun Young; Park, Moo Ryong; Choi, Bum Suk; Song, Je Wook

    2009-01-01

    Axial-flow compressor is one of the most important parts of gas turbine units with axial turbine and combustor. Therefore, precise prediction of performance is very important for the development of a new compressor or the modification of an existing one. Meanline analysis is a simple, fast and powerful method for performance prediction of axial-flow compressors with different geometries, so it is frequently used in the preliminary design stage and for performance analysis of a given geometry. Many correlations for meanline analysis have been developed, theoretically and experimentally, to estimate various types of losses and the flow deviation angle. In the present study, a meanline analysis program was developed to estimate compressor losses, incidence angles, deviation angles, and stall and surge conditions using these correlations. Performance prediction of single-stage axial compressors was conducted with this meanline analysis program. The comparison between experimental and numerical results shows good agreement. This meanline analysis program can be used for various types of single-stage axial-flow compressors with different geometries, as well as multistage axial-flow compressors.

  17. Program packages for dynamics systems analysis and design

    International Nuclear Information System (INIS)

    Athani, V.V.

    1976-01-01

    The development of computer program packages for dynamic system analysis and design is reported. The purpose of developing these program packages is to take the burden of writing computer programs off the mind of the system engineer and to enable him to concentrate on his main system analysis and design work. Towards this end, four standard computer program packages have been prepared: (1) TFANA - starting from the system transfer function, this program computes the transient response, frequency response, root locus and stability by the Routh-Hurwitz criterion; (2) TFSYN - classical synthesis using the algebraic method of Shipley; (3) MODANA - starting from the state equations of the system, this program computes the solution of the state equations, controllability, observability and stability; (4) OPTCON - this program obtains solutions of (i) the linear regulator problem, (ii) servomechanism problems and (iii) the pole placement problem. The paper describes these program packages with the help of flowcharts and illustrates their use with the help of examples. (author)
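
The Routh-Hurwitz stability test that TFANA applies can be sketched compactly. This is an illustrative re-implementation of the textbook algorithm, not TFANA's code, and the zero-pivot special cases are deliberately left unhandled.

```python
def routh_stable(coeffs):
    """Routh-Hurwitz stability test. `coeffs` lists polynomial
    coefficients [a_n, ..., a_0], highest degree first. Returns True
    when all roots lie in the open left half-plane. Special cases
    (a zero in the first column) are not handled in this sketch."""
    n = len(coeffs)
    # First two rows of the Routh array: alternating coefficients.
    rows = [list(coeffs[0::2]), list(coeffs[1::2])]
    while len(rows[1]) < len(rows[0]):
        rows[1].append(0.0)  # pad the second row to equal length
    for _ in range(n - 2):  # build the remaining rows
        prev, cur = rows[-2], rows[-1]
        if cur[0] == 0:
            raise ValueError("zero pivot: special case not handled")
        new = [(cur[0] * prev[j + 1] - prev[0] * cur[j + 1]) / cur[0]
               for j in range(len(cur) - 1)]
        new.append(0.0)
        rows.append(new)
    first_col = [r[0] for r in rows]
    # Stable iff every first-column entry has the same sign.
    return all(x > 0 for x in first_col) or all(x < 0 for x in first_col)
```

For example, s³ + 3s² + 3s + 1 = (s + 1)³ is stable, while s³ + s² + 2s + 8 produces a sign change in the first column and is not.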

  18. The Rocky Road to Change: Implications for Substance Abuse Programs on College Campuses.

    Science.gov (United States)

    Scott, Cynthia G.; Ambroson, DeAnn L.

    1994-01-01

    Examines college substance abuse prevention and intervention programs in the framework of the elaboration likelihood model. Discusses the role of persuasion and recommends careful analysis of the relevance, construction, and delivery of messages about substance use and subsequent program evaluation. Recommendations for increasing program…

  19. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    Science.gov (United States)

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
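
The core mechanic of literate programming, extracting the executable code from a document written for human readers, can be sketched in a few lines. This is a toy "tangler", not Lir itself; it assumes code lives in fenced blocks in document order, whereas real systems also track dependencies between steps and support multiple tools.

```python
import re

FENCE = "`" * 3  # the fence delimiter, built here to keep the example readable

def tangle(literate_source):
    """Extract all fenced code blocks from a literate document,
    in order, and join them into one executable program."""
    pattern = re.compile(FENCE + r"\w*\n(.*?)" + FENCE, re.DOTALL)
    return "\n".join(pattern.findall(literate_source))
```

Given a document whose prose is interleaved with two code blocks, `tangle` yields a single script that a computer can run while the original file stays organized for the reader.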

  20. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive querying, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
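
The flag-then-group workflow the abstract describes can be illustrated with a deliberately simple stand-in. The paper's detector is clustering-based and spatio-temporal; the sketch below substitutes a plain z-score test and the function names are invented, so it only shows the shape of steps (1) and (2): flag outliers, then merge neighbours into events.

```python
from statistics import mean, stdev

def flag_outliers(series, z=3.0):
    """Toy stand-in for the unsupervised detector: flag samples more
    than `z` standard deviations from the mean of the series."""
    m, s = mean(series), stdev(series)
    return [i for i, v in enumerate(series) if abs(v - m) > z * s]

def group_events(indices, gap=1):
    """Merge nearby outlier indices into events, mirroring the step
    of distinguishing meaningful events from individual outliers."""
    events = []
    for i in indices:
        if events and i - events[-1][-1] <= gap:
            events[-1].append(i)
        else:
            events.append([i])
    return events
```

Two adjacent spikes in an otherwise flat series are flagged individually but reported as a single event, which could then be ranked by a user-defined "interestingness" score.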

  1. Collaborative Communication in Work Based Learning Programs

    Science.gov (United States)

    Wagner, Stephen Allen

    2017-01-01

    This basic qualitative study, using interviews and document analysis, examined reflections from a Work Based Learning (WBL) program to understand how utilizing digital collaborative communication tools influence the educational experience. The Community of Inquiry (CoI) framework was used as a theoretical frame promoting the examination of the…

  2. PGSB/MIPS PlantsDB Database Framework for the Integration and Analysis of Plant Genome Data.

    Science.gov (United States)

    Spannagl, Manuel; Nussbaumer, Thomas; Bader, Kai; Gundlach, Heidrun; Mayer, Klaus F X

    2017-01-01

    Plant Genome and Systems Biology (PGSB), formerly Munich Institute for Protein Sequences (MIPS) PlantsDB, is a database framework for the integration and analysis of plant genome data, developed and maintained for more than a decade now. Major components of that framework are genome databases and analysis resources focusing on individual (reference) genomes providing flexible and intuitive access to data. Another main focus is the integration of genomes from both model and crop plants to form a scaffold for comparative genomics, assisted by specialized tools such as the CrowsNest viewer to explore conserved gene order (synteny). Data exchange and integrated search functionality with/over many plant genome databases is provided within the transPLANT project.

  3. GRDC. A Collaborative Framework for Radiological Background and Contextual Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Quiter, Brian J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ramakrishnan, Lavanya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bandstra, Mark S. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-12-01

    The Radiation Mobile Analysis Platform (RadMAP) is unique in its capability to collect both high-quality radiological data from gamma-ray and fast-neutron detectors and a broad array of contextual data that includes positioning and stance data and high-resolution 3D data from weather sensors, LiDAR, and visual and hyperspectral cameras. The datasets obtained from RadMAP are both voluminous and complex and require analyses from highly diverse communities within both the national laboratory and academic communities. Maintaining a high level of transparency will enable analysis products to further enrich the RadMAP dataset. It is in this spirit of open and collaborative data that the RadMAP team proposed to collect, calibrate, and make available online data from the RadMAP system. The Berkeley Data Cloud (BDC) is a cloud-based data management framework that enables web-based data browsing and visualization, and connects curated datasets to custom workflows such that analysis products can be managed and disseminated while maintaining user access rights. BDC enables cloud-based analyses of large datasets in a manner that simulates real-time data collection, such that BDC can be used to test algorithm performance on real and source-injected datasets. Using the BDC framework, a subset of the RadMAP datasets has been disseminated via the Gamma Ray Data Cloud (GRDC), hosted through the National Energy Research Scientific Computing (NERSC) Center, enabling data access to over 40 users at 10 institutions.

  4. ECTA/DaSy Framework Self-Assessment Comparison Tool

    Science.gov (United States)

    Center for IDEA Early Childhood Data Systems (DaSy), 2016

    2016-01-01

    The Self-Assessment Comparison (SAC) Tool is for state Part C and Section 619/Preschool programs to use to assess changes in the implementation of one or more components of the ECTA System Framework and/or subcomponents of the DaSy Data System Framework. It is a companion to the ECTA/DaSy Framework Self-Assessment. Key features of the SAC are…

  5. Projects and potentialities for Scientific and Technological Cooperation between Mexico and Thailand under the European Union's Seventh Framework Program (FP7: 2007-2013)

    Directory of Open Access Journals (Sweden)

    Jürgen Haberleithner

    2012-09-01

    Full Text Available The European Union and Mexico have been cooperating in the field of R&D since the partnership treaty between the EU and Mexico took effect in 2000. With the Lisbon Strategy put into operation that same year, Europe acknowledged the central role which will be played by knowledge in the economy and society of the future. Accordingly, innovation was emphasized in order to advance mutual efforts to establish innovative research and development projects with Third Countries such as Mexico and Thailand through diverse multilateral framework programs such as the Seventh Framework Program (FP7). A brief evaluation of the existing FP7 projects reveals a disposition for intraregional cooperation in spite of the disparities in the quantity and scope of projects. Moreover, the participants studied share a similar lack of know-how for coordinating projects, which is at times crucial for benefiting fully from the program. Potential exists for establishing the necessary links and coordination points between Mexico and Thailand under the given regional and bilateral cooperation mechanisms and the extensive research areas that the program covers. It is these specific potentialities enabled by FP7 in both regions that are intended to be further researched and developed into multiple successful projects.

  6. ROOT: A C++ framework for petabyte data storage, statistical analysis and visualization

    International Nuclear Information System (INIS)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Couet, O.; Franco, L.; Canal, Ph.; Casadei, D.; Fine, V.

    2009-01-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece of these analysis tools is the set of histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like PostScript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which will take care of optimally distributing the work over the available resources.
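The binning performed by histogram classes such as ROOT's TH1 can be sketched in plain Python. This is an illustration of fixed-width binning only, not ROOT's actual API; the function name and sample values are hypothetical.

```python
# Plain-Python sketch of fixed-width histogram binning (the idea behind
# histogram classes like ROOT's TH1; not ROOT code).
def histogram(data, nbins, lo, hi):
    """Count how many values fall into each of nbins equal-width bins on [lo, hi)."""
    counts = [0] * nbins
    width = (hi - lo) / nbins
    for x in data:
        if lo <= x < hi:                       # out-of-range values are dropped
            counts[int((x - lo) / width)] += 1
    return counts

print(histogram([0.1, 0.2, 0.25, 0.7, 0.95], nbins=4, lo=0.0, hi=1.0))  # [2, 1, 1, 1]
```

Real histogram classes additionally track under/overflow bins, weights, and errors per bin; this sketch keeps only the core counting step.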

  7. A decision analysis framework for stakeholder involvement and learning in groundwater management

    Science.gov (United States)

    Karjalainen, T. P.; Rossi, P. M.; Ala-aho, P.; Eskelinen, R.; Reinikainen, K.; Kløve, B.; Pulido-Velazquez, M.; Yang, H.

    2013-12-01

    Multi-criteria decision analysis (MCDA) methods are increasingly used to facilitate both rigorous analysis and stakeholder involvement in natural and water resource planning. Decision-making in that context is often complex and multi-faceted with numerous trade-offs between social, environmental and economic impacts. However, practical applications of decision-support methods are often too technically oriented and hard to use, understand or interpret for all participants. The learning of participants in these processes is seldom examined, even though successful deliberation depends on learning. This paper analyzes the potential of an interactive MCDA framework, the decision analysis interview (DAI) approach, for facilitating stakeholder involvement and learning in groundwater management. It evaluates the results of the MCDA process in assessing land-use management alternatives in a Finnish esker aquifer area where conflicting land uses affect the groundwater body and dependent ecosystems. In the assessment process, emphasis was placed on the interactive role of the MCDA tool in facilitating stakeholder participation and learning. The results confirmed that the structured decision analysis framework can foster learning and collaboration in a process where disputes and diverse interests are represented. Computer-aided interviews helped the participants to see how their preferences affected the desirability and ranking of alternatives. During the process, the participants' knowledge and preferences evolved as they assessed their initial knowledge with the help of fresh scientific information. The decision analysis process led to the opening of a dialogue, showing the overall picture of the problem context and the critical issues for the further process.
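The core of an MCDA ranking of land-use alternatives can be sketched with a simple weighted sum. The alternatives, criteria, and weights below are hypothetical illustrations, not data from the Finnish esker study, and real MCDA processes typically elicit the weights interactively from stakeholders.

```python
# Illustrative weighted-sum MCDA ranking; names and numbers are hypothetical.
def mcda_rank(alternatives, weights):
    """Score each alternative as the weighted sum of its normalized criterion values."""
    scores = {name: sum(weights[c] * v for c, v in criteria.items())
              for name, criteria in alternatives.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Criterion values normalized to [0, 1]; weights sum to 1.
weights = {"groundwater_quality": 0.5, "economic_benefit": 0.3, "ecosystem_health": 0.2}
alternatives = {
    "status_quo":     {"groundwater_quality": 0.4, "economic_benefit": 0.8, "ecosystem_health": 0.3},
    "restricted_use": {"groundwater_quality": 0.9, "economic_benefit": 0.4, "ecosystem_health": 0.8},
}
ranking = mcda_rank(alternatives, weights)
print(ranking[0][0])  # restricted_use
```

Changing the weights and re-ranking is exactly what lets participants in a decision analysis interview see how their preferences affect the desirability of alternatives.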

  8. Analytical framework and tool kit for SEA follow-up

    International Nuclear Information System (INIS)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran; Jonsson, Daniel K.; Lundberg, Kristina; Tyskeng, Sara; Wallgren, Oskar

    2009-01-01

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate

  9. National water, food, and trade modeling framework: The case of Egypt.

    Science.gov (United States)

    Abdelkader, A; Elshorbagy, A; Tuninetti, M; Laio, F; Ridolfi, L; Fahmy, H; Hoekstra, A Y

    2018-05-22

    This paper introduces a modeling framework for the analysis of real and virtual water flows at national scale. The framework has two components: (1) a national water model that simulates agricultural, industrial and municipal water uses, and available water and land resources; and (2) an international virtual water trade model that captures national virtual water exports and imports related to trade in crops and animal products. This National Water, Food & Trade (NWFT) modeling framework is applied to Egypt, a water-poor country and the world's largest importer of wheat. Egypt's food and water gaps and the country's food (virtual water) imports are estimated over a baseline period (1986-2013) and projected up to 2050 based on four scenarios. Egypt's food and water gaps are growing rapidly as a result of steep population growth and limited water resources. The NWFT modeling framework shows the nexus of population dynamics, water uses for different sectors, and their compounding effects on Egypt's food gap and water self-sufficiency. The sensitivity analysis reveals that, for solving Egypt's water and food problem, non-water-based solutions such as educational, health, and awareness programs aimed at lowering population growth will be an essential addition to the traditional water resources development solution. Both the national and the global models project similar trends for Egypt's food gap. The NWFT modeling framework can be easily adapted to other nations and regions. Copyright © 2018. Published by Elsevier B.V.
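The compounding effect of population growth on a national water gap can be sketched with a toy projection: demand grows with population while renewable supply stays fixed. All figures below are illustrative round numbers, not Egypt's actual data, and the real NWFT framework resolves demand by sector and couples it to trade.

```python
# Minimal sketch of a water-gap projection in the spirit of the NWFT
# framework; population, demand, and supply figures are hypothetical.
def project_water_gap(pop0, growth_rate, demand_per_capita, renewable_supply, years):
    """Return a list of (year, gap) where gap = demand - supply (same units as supply)."""
    gaps = []
    pop = pop0
    for year in range(years + 1):
        demand = pop * demand_per_capita
        gaps.append((year, demand - renewable_supply))
        pop *= 1 + growth_rate          # compound population growth
    return gaps

gaps = project_water_gap(pop0=100e6, growth_rate=0.02,
                         demand_per_capita=1000,    # m^3 per person per year
                         renewable_supply=60e9,     # m^3 per year
                         years=30)
print(f"gap now: {gaps[0][1]/1e9:.0f} bcm, in 30 yr: {gaps[-1][1]/1e9:.0f} bcm")
# gap now: 40 bcm, in 30 yr: 121 bcm
```

Even this toy model shows why the paper's sensitivity analysis points at population growth: the gap roughly triples over 30 years at 2% annual growth with supply held constant.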

  10. Framework Application for Core Edge Transport Simulation (FACETS)

    Energy Technology Data Exchange (ETDEWEB)

    Malony, Allen D; Shende, Sameer S; Huck, Kevin A; Mr. Alan Morris, and Mr. Wyatt Spear

    2012-03-14

    The goal of the FACETS project (Framework Application for Core-Edge Transport Simulations) was to provide a multiphysics, parallel framework application (FACETS) that will enable whole-device modeling for the U.S. fusion program, to provide the modeling infrastructure needed for ITER, the next step fusion confinement device. Through use of modern computational methods, including component technology and object oriented design, FACETS is able to switch from one model to another for a given aspect of the physics in a flexible manner. This enables use of simplified models for rapid turnaround or high-fidelity models that can take advantage of the largest supercomputer hardware. FACETS does so in a heterogeneous parallel context, where different parts of the application execute in parallel by utilizing task farming, domain decomposition, and/or pipelining as needed and applicable. ParaTools, Inc. was tasked with supporting the performance analysis and tuning of the FACETS components and framework in order to achieve the parallel scaling goals of the project. The TAU Performance System® was used for instrumentation, measurement, archiving, and profile / tracing analysis. ParaTools, Inc. also assisted in FACETS performance engineering efforts. Through the use of the TAU Performance System, ParaTools provided instrumentation, measurement, analysis and archival support for the FACETS project. Performance optimization of key components has yielded significant performance speedups. TAU was integrated into the FACETS build for both the full coupled application and the UEDGE component. The performance database provided archival storage of the performance regression testing data generated by the project, and helped to track improvements in the software development.

  11. Developing a theoretical framework for complex community-based interventions.

    Science.gov (United States)

    Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana

    2014-01-01

    Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms by which the independent variables will lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.

  12. Building a Conceptual Framework: Philosophy, Definitions, and Procedure

    OpenAIRE

    Yosef Jabareen

    2009-01-01

    In this paper the author proposes a new qualitative method for building conceptual frameworks for phenomena that are linked to multidisciplinary bodies of knowledge. First, he redefines the key terms of concept, conceptual framework, and conceptual framework analysis. A concept has components that define it. A conceptual framework is defined as a network or a “plane” of linked concepts. Conceptual framework analysis offers a procedure of theorization for building conceptual frameworks base...

  13. A general numerical analysis program for the superconducting quasiparticle mixer

    Science.gov (United States)

    Hicks, R. G.; Feldman, M. J.; Kerr, A. R.

    1986-01-01

    A user-oriented computer program SISCAP (SIS Computer Analysis Program) for analyzing SIS mixers is described. The program allows arbitrary impedance terminations to be specified at all LO harmonics and sideband frequencies. It is therefore able to treat a much more general class of SIS mixers than the widely used three-frequency analysis, for which the harmonics are assumed to be short-circuited. An additional program, GETCHI, provides the necessary input data to program SISCAP. The SISCAP program performs a nonlinear analysis to determine the SIS junction voltage waveform produced by the local oscillator. The quantum theory of mixing is used in its most general form, treating the large signal properties of the mixer in the time domain. A small signal linear analysis is then used to find the conversion loss and port impedances. The noise analysis includes thermal noise from the termination resistances and shot noise from the periodic LO current. Quantum noise is not considered. Many aspects of the program have been adequately verified and found accurate.

  14. Using the Knowledge to Action Framework in practice: a citation analysis and systematic review.

    Science.gov (United States)

    Field, Becky; Booth, Andrew; Ilott, Irene; Gerrish, Kate

    2014-11-23

    Conceptual frameworks are recommended as a way of applying theory to enhance implementation efforts. The Knowledge to Action (KTA) Framework was developed in Canada by Graham and colleagues in the 2000s, following a review of 31 planned action theories. The framework has two components: Knowledge Creation and an Action Cycle, each of which comprises multiple phases. This review sought to answer two questions: 'Is the KTA Framework used in practice? And if so, how?' This study is a citation analysis and systematic review. The index citation for the original paper was identified on three databases-Web of Science, Scopus and Google Scholar-with the facility for citation searching. Limitations of English language and year of publication 2006-June 2013 were set. A taxonomy categorising the continuum of usage was developed. Only studies applying the framework to implementation projects were included. Data were extracted and mapped against each phase of the framework for studies where it was integral to the implementation project. The citation search yielded 1,787 records. A total of 1,057 titles and abstracts were screened. One hundred and forty-six studies described usage to varying degrees, ranging from referenced to integrated. In ten studies, the KTA Framework was integral to the design, delivery and evaluation of the implementation activities. All ten described using the Action Cycle and seven referred to Knowledge Creation. The KTA Framework was enacted in different health care and academic settings with projects targeted at patients, the public, and nursing and allied health professionals. The KTA Framework is being used in practice with varying degrees of completeness. It is frequently cited, with usage ranging from simple attribution via a reference, through informing planning, to making an intellectual contribution. When the framework was integral to knowledge translation, it guided action in idiosyncratic ways and there was theory fidelity. 

  15. Academic Libraries and Quality: An Analysis and Evaluation Framework

    Science.gov (United States)

    Atkinson, Jeremy

    2017-01-01

    The paper proposes and describes a framework for academic library quality to be used by new and more experienced library practitioners and by others involved in considering the quality of academic libraries' services and provision. The framework consists of eight themes and a number of questions to examine within each theme. The framework was…

  16. A Framework to Survey the Energy Efficiency of Installed Motor Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Prakash; Hasanbeigi, Ali; McKane, Aimee

    2013-08-01

    While motors are ubiquitous throughout the globe, there is insufficient data to properly assess their level of energy efficiency across regional boundaries. Furthermore, many of the existing data sets focus on motor efficiency and neglect the connected drive and system. Without a comprehensive survey of the installed motor system base, a baseline energy efficiency of a country or region’s motor systems cannot be developed. The lack of data impedes government agencies, utilities, manufacturers, distributors, and energy managers when identifying where to invest resources to capture potential energy savings, creating programs aimed at reducing electrical energy consumption, or quantifying the impacts of such programs. This paper will outline a data collection framework for use when conducting a survey under a variety of execution models to characterize motor system energy efficiency within a country or region. The framework is intended to standardize the data collected, ensuring consistency across independently conducted surveys. Consistency allows the surveys to be leveraged against each other, enabling comparisons to motor system energy efficiencies from other regions. In creating the framework, an analysis of various motor driven systems, including compressed air, pumping, and fan systems, was conducted and relevant parameters characterizing the efficiency of these systems were identified. A database using the framework will enable policymakers and industry to better assess the improvement potential of their installed motor system base, particularly with respect to other regions, assisting in efforts to promote improvements to the energy efficiency of motor driven systems.

  17. Mastering openFrameworks creative coding demystified

    CERN Document Server

    Yanc, Chris

    2013-01-01

    This book gives clear and effective instructions, stuffed with practical examples, to build your own fun, stunning and highly interactive openFrameworks applications. Each chapter has its own focus and theme. The book targets visual artists, designers, programmers, and those interested in creative coding who are getting started with openFrameworks. It will help you understand the capabilities of openFrameworks so you can create visually stunning and fully interactive applications. You should have a basic knowledge of an object-oriented programming language such as C++, Java, or Python.

  18. A Framework for Analysis of Music Similarity Measures

    DEFF Research Database (Denmark)

    Jensen, Jesper Højvang; Christensen, Mads G.; Jensen, Søren Holdt

    2007-01-01

    To analyze specific properties of music similarity measures that the commonly used genre classification evaluation procedure does not reveal, we introduce a MIDI based test framework for music similarity measures. We introduce the framework by example and thus outline an experiment to analyze the...

  19. EVALUE : a computer program for evaluating investments in forest products industries

    Science.gov (United States)

    Peter J. Ince; Philip H. Steele

    1980-01-01

    EVALUE, a FORTRAN program, was developed to provide a framework for cash flow analysis of investment opportunities. EVALUE was designed to assist researchers in evaluating investment feasibility of new technology or new manufacturing processes. This report serves as user documentation for the EVALUE program. EVALUE is briefly described and notes on preparation of a...
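The core of the cash flow analysis that a program like EVALUE performs is a net-present-value calculation. The sketch below is a minimal plain-Python illustration, not EVALUE's actual code; the cash flows and discount rate are hypothetical.

```python
# Minimal discounted-cash-flow sketch of the kind of investment analysis
# EVALUE supports; figures are hypothetical.
def npv(rate, cashflows):
    """Net present value of cashflows[t] received at end of year t (t=0 is today)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Year 0: -100 investment (e.g. new process equipment); years 1-5: +30 net revenue.
flows = [-100, 30, 30, 30, 30, 30]
print(round(npv(0.10, flows), 2))  # 13.72  -> positive NPV, investment looks feasible
```

A full feasibility evaluation would add taxes, depreciation, salvage value, and sensitivity to the discount rate, which is where a dedicated program earns its keep.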

  20. Critical analysis of e-health readiness assessment frameworks: suitability for application in developing countries.

    Science.gov (United States)

    Mauco, Kabelo Leonard; Scott, Richard E; Mars, Maurice

    2018-02-01

    Introduction e-Health is an innovative way to make health services more effective and efficient, and its application is increasing worldwide. e-Health represents a substantial ICT investment and its failure usually results in substantial losses in time, money (including opportunity costs) and effort. Therefore it is important to assess e-health readiness prior to implementation. Several frameworks have been published on e-health readiness assessment, under various circumstances and geographical regions of the world. However, their utility for the developing world is unknown. Methods A literature review and analysis of published e-health readiness assessment frameworks or models was performed to determine if any are appropriate for broad assessment of e-health readiness in the developing world. A total of 13 papers described e-health readiness in different settings. Results and Discussion Eight types of e-health readiness were identified and no paper directly addressed all of these. The frameworks were based upon varying assumptions and perspectives. There was no unifying theory underpinning the frameworks. Few assessed government and societal readiness, and none cultural readiness; all are important in the developing world. While the shortcomings of existing frameworks have been highlighted, most contain aspects that are relevant and can be drawn on when developing a framework and assessment tools for the developing world. What emerged is the need to develop different assessment tools for the various stakeholder sectors. This is an area that needs further research before attempting to develop a more generic framework for the developing world.

  1. Multi-object segmentation framework using deformable models for medical imaging analysis.

    Science.gov (United States)

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing to select a suitable combination in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed

  2. Analyzing risks to protected areas using the human modification framework: a Colorado case study

    Science.gov (United States)

    David M. Theobald; Alisa Wade; Grant Wilcox; Nate. Peterson

    2010-01-01

    A framework that organizes natural and protected areas is often used to help understand the potential risks to natural areas and aspects of their ecological and human dimensions. The spatial (or landscape) context of these dynamics is also a critical, but rarely considered, factor. Common classification systems include the U.S. Geological Survey (USGS) Gap Analysis Program...

  3. ELPSA as A Lesson Design Framework

    Directory of Open Access Journals (Sweden)

    Tom Lowrie

    2015-07-01

    Full Text Available This paper offers a framework for mathematics lesson design that is consistent with the way we learn about, and discover, most things in life. In addition, the framework provides a structure for identifying how mathematical concepts and understanding are acquired and developed. This framework is called ELPSA and represents five learning components, namely: Experience, Language, Pictorial, Symbolic and Applications. This framework has been used in developing lessons and teacher professional programs in Indonesia since 2012 in cooperation with the World Bank. This paper describes the theory that underlines the framework in general and in relation to each inter-connected component. Two explicit learning sequences for classroom practice are described, associated with Pythagoras theorem and probability. This paper then concludes with recommendations for using ELPSA in various institutional contexts.

  4. The Utility of the Memorable Messages Framework as an Intermediary Evaluation Tool for Fruit and Vegetable Consumption in a Nutrition Education Program.

    Science.gov (United States)

    Davis, LaShara A; Morgan, Susan E; Mobley, Amy R

    2016-06-01

    Additional strategies to evaluate the impact of community nutrition education programs on low-income individuals are needed. The objective of this qualitative study was to examine the use of the Memorable Messages Framework as an intermediary nutrition education program evaluation tool to determine what fruit and vegetable messages were reported as memorable and the characteristics of those memorable messages. A convenience sample of low-income, primarily African American adults (N = 58) who previously completed a series of community nutrition education lessons within an urban area of Indiana participated in a focus group (N = 8 focus groups). A lead moderator using a semistructured script conducted the focus groups to determine what information about fruits and vegetables was most memorable from the participants' nutrition lessons and why this information was memorable. All focus group audiotapes were transcribed verbatim and ATLAS.ti software was used to code and identify themes within the data. Participants cited quantity, variety, and the positive nutritional impact of eating fruits and vegetables as most memorable. Information given in the form of recipes was also cited as most memorable. For example, participants referred to the recipe demonstrations as not only fun but also key components of the program that helped with message retention and memorability. Key characteristics of memorable messages included personal relevance and message vividness. These findings indicated that the Memorable Messages Framework may serve as an intermediary program evaluation tool to identify what information and messages are most influential to participants in community nutrition education programs. © 2015 Society for Public Health Education.

  5. CRITIC2: A program for real-space analysis of quantum chemical interactions in solids

    Science.gov (United States)

    Otero-de-la-Roza, A.; Johnson, Erin R.; Luaña, Víctor

    2014-03-01

    We present CRITIC2, a program for the analysis of quantum-mechanical atomic and molecular interactions in periodic solids. This code, a greatly improved version of the previous CRITIC program (Otero-de-la Roza et al., 2009), can: (i) find critical points of the electron density and related scalar fields such as the electron localization function (ELF), Laplacian, … (ii) integrate atomic properties in the framework of Bader’s Atoms-in-Molecules theory (QTAIM), (iii) visualize non-covalent interactions in crystals using the non-covalent interactions (NCI) index, (iv) generate relevant graphical representations including lines, planes, gradient paths, contour plots, atomic basins, … and (v) perform transformations between file formats describing scalar fields and crystal structures. CRITIC2 can interface with the output produced by a variety of electronic structure programs including WIEN2k, elk, PI, abinit, Quantum ESPRESSO, VASP, Gaussian, and, in general, any other code capable of writing the scalar field under study to a three-dimensional grid. CRITIC2 is parallelized, completely documented (including illustrative test cases) and publicly available under the GNU General Public License. Catalogue identifier: AECB_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AECB_v2_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: yes No. of lines in distributed program, including test data, etc.: 11686949 No. of bytes in distributed program, including test data, etc.: 337020731 Distribution format: tar.gz Programming language: Fortran 77 and 90. Computer: Workstations. Operating system: Unix, GNU/Linux. Has the code been vectorized or parallelized?: Shared-memory parallelization can be used for most tasks. Classification: 7.3. Catalogue identifier of previous version: AECB_v1_0 Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 157 Nature of problem: Analysis of quantum
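Conceptually, locating critical points of a scalar field sampled on a grid can be sketched by flagging interior points that are extrema among their neighbors. This toy scan is only an illustration of the idea; CRITIC2 itself works on interpolated fields with Newton-type searches, and the field below is a made-up example.

```python
# Toy critical-point search on a 2D scalar field sampled on a grid: flag
# interior points that are strict local maxima or minima among their 8
# neighbors. (CRITIC2 uses far more robust methods; this is illustrative.)
def grid_critical_points(f):
    nx, ny = len(f), len(f[0])
    found = []
    for i in range(1, nx - 1):
        for j in range(1, ny - 1):
            neigh = [f[i + di][j + dj]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if (di, dj) != (0, 0)]
            if all(f[i][j] > v for v in neigh):
                found.append(("max", i, j))
            elif all(f[i][j] < v for v in neigh):
                found.append(("min", i, j))
    return found

# Sample field: a single bump centered at grid point (2, 2) on a 5x5 grid.
field = [[-((i - 2) ** 2 + (j - 2) ** 2) for j in range(5)] for i in range(5)]
print(grid_critical_points(field))  # [('max', 2, 2)]
```

A production code must also locate saddle points (where the gradient vanishes but the Hessian is indefinite), which a simple neighbor comparison cannot classify; that is one reason CRITIC2 works with analytic gradients of interpolated fields.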

  6. "Back on Track": A Mobile App Observational Study Using Apple's ResearchKit Framework.

    Science.gov (United States)

    Zens, Martin; Woias, Peter; Suedkamp, Norbert P; Niemeyer, Philipp

    2017-02-28

    In March 2015, Apple Inc announced ResearchKit, a novel open-source framework intended to help medical researchers easily create apps for medical studies. With the announcement of this framework, Apple presented 5 apps built in a beta phase based on this framework. The objective of this study was to better understand decision making in patients with acute anterior cruciate ligament (ACL) ruptures. Here, we describe the development of a ResearchKit app for this study. A multilanguage observational study was conducted. First, a suitable research topic, target groups, participating territories, and programming method were carefully identified. The ResearchKit framework was used to program the app. A secure server connection was realized via Secure Sockets Layer. A data storage and security concept separating personal information and study data was proposed. Furthermore, an efficient method to allow multilanguage support and distribute the app in many territories was presented. Ethical implications were considered and taken into account regarding privacy policies. An app study based on ResearchKit was developed without comprehensive iPhone Operating System (iOS) development experience. The Apple App Store is a major distribution channel, producing significant download rates (>1,200/year) without active recruitment. Preliminary data analysis showed moderate dropout rates and a good quality of data. A total of 180 participants were enrolled, with 107 actively participating and producing 424 completed surveys in 9 of the 24 months. ResearchKit is an easy-to-use framework and powerful tool for creating medical studies. Advantages are the modular build, the extensive reach of iOS devices, and the convenient programming environment. ©Martin Zens, Peter Woias, Norbert P Suedkamp, Philipp Niemeyer. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 28.02.2017.

  7. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network

    Directory of Open Access Journals (Sweden)

    Kim Hyun

    2011-12-01

    Background: Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, precise network characterization has also become crucial for better understanding of cellular physiology. Results: We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to a specific perturbation using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. Conclusions: In addition, the framework provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.
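    The modularization step described in this abstract — grouping reactions whose flux variation patterns are positively correlated — can be sketched with standard hierarchical clustering. The following toy example is an illustration only, not the FMB implementation; the flux data and threshold are invented.

    ```python
    # Hypothetical sketch of the modularization step: reactions with
    # positively correlated flux-variation patterns are grouped into
    # modules via hierarchical clustering on 1 - correlation.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(0)
    # Toy flux data: rows = reactions, columns = flux samples under
    # control and perturbation conditions (values are made up).
    fluxes = rng.normal(size=(6, 20))
    fluxes[1] = fluxes[0] + 0.05 * rng.normal(size=20)   # correlated pair
    fluxes[3] = fluxes[2] + 0.05 * rng.normal(size=20)   # correlated pair

    corr = np.corrcoef(fluxes)        # reaction-by-reaction correlation
    dist = 1.0 - corr                 # positive correlation -> small distance
    np.fill_diagonal(dist, 0.0)
    condensed = squareform(dist, checks=False)
    modules = fcluster(linkage(condensed, method="average"),
                       t=0.5, criterion="distance")
    print(modules)  # one module label per reaction
    ```

    In FMB the resulting modules would then become the nodes of the Bayesian network; here the clustering alone is shown.
    
    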

  8. Framework for network modularization and Bayesian network analysis to investigate the perturbed metabolic network.

    Science.gov (United States)

    Kim, Hyun Uk; Kim, Tae Yong; Lee, Sang Yup

    2011-01-01

    Genome-scale metabolic network models have contributed to elucidating biological phenomena and to predicting gene targets to engineer for biotechnological applications. With their increasing importance, precise network characterization has also become crucial for better understanding of cellular physiology. We herein introduce a framework for network modularization and Bayesian network analysis (FMB) to investigate an organism's metabolism under perturbation. FMB reveals the direction of influences among metabolic modules, in which reactions with similar or positively correlated flux variation patterns are clustered, in response to a specific perturbation using metabolic flux data. With metabolic flux data calculated by constraints-based flux analysis under both control and perturbation conditions, FMB, in essence, reveals the effects of specific perturbations on the biological system through network modularization and Bayesian network analysis at the metabolic modular level. As a demonstration, this framework was applied to the genetically perturbed Escherichia coli metabolism, an lpdA gene knockout mutant, using its genome-scale metabolic network model. In addition, the framework provides alternative scenarios of metabolic flux distributions in response to the perturbation, which are complementary to the data obtained from conventionally available genome-wide high-throughput techniques or metabolic flux analysis.

  9. The ETLMR MapReduce-Based ETL Framework

    DEFF Research Database (Denmark)

    Xiufeng, Liu; Thomsen, Christian; Pedersen, Torben Bach

    2011-01-01

    This paper presents ETLMR, a parallel Extract-Transform-Load (ETL) programming framework based on MapReduce. It has built-in support for high-level ETL-specific constructs including star schemas, snowflake schemas, and slowly changing dimensions (SCDs). ETLMR gives both high programming
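    The core idea the abstract describes — splitting dimension processing across map and reduce phases — can be illustrated in miniature. This toy sketch is not ETLMR's actual API; the source rows, function names, and surrogate-key scheme are invented for illustration.

    ```python
    # Toy illustration (not ETLMR's API) of MapReduce-style dimension
    # processing: mappers emit (natural key, attributes) pairs per data
    # partition; the reducer deduplicates and assigns surrogate keys,
    # producing a star-schema dimension table.
    from collections import defaultdict
    from itertools import count

    source_rows = [
        {"book": "Alice", "genre": "novel"},
        {"book": "Alice", "genre": "novel"},    # duplicate from another partition
        {"book": "SICP", "genre": "textbook"},
    ]

    def map_phase(rows):
        # Emit (natural key, attributes) pairs, as a mapper would per partition.
        for row in rows:
            yield row["book"], {"genre": row["genre"]}

    def reduce_phase(pairs):
        # Group by natural key, deduplicate, and assign surrogate keys.
        grouped = defaultdict(list)
        for key, attrs in pairs:
            grouped[key].append(attrs)
        surrogate = count(1)
        return {key: {"id": next(surrogate), **versions[0]}
                for key, versions in grouped.items()}

    dim_book = reduce_phase(map_phase(source_rows))
    print(dim_book)
    ```

    A slowly changing dimension (SCD) would extend the reducer to keep multiple versions per key with validity dates rather than taking only the first.
    
    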

  10. A framework for the analysis of cognitive reliability in complex systems: a recovery centred approach

    International Nuclear Information System (INIS)

    Kontogiannis, Tom

    1997-01-01

    Managing complex industrial systems requires reliable performance of cognitive tasks undertaken by operating crews. The infrequent practice of cognitive skills and the reliance on operator performance in novel situations have made cognitive reliability an urgent and essential aspect of system design and risk analysis. The aim of this article is to contribute to the development of methods for the analysis of cognitive tasks in complex man-machine interactions. A practical framework is proposed for analysing cognitive errors and enhancing error recovery through interface design. Cognitive errors are viewed as failures in problem solving which are difficult to recover from under the task constraints imposed by complex systems. In this sense, the interaction between context and cognition, on the one hand, and the process of error recovery, on the other, become the focal points of the proposed framework, which is illustrated in an analysis of a simulated emergency

  11. Environmental Stewardship: A Conceptual Review and Analytical Framework

    Science.gov (United States)

    Bennett, Nathan J.; Whitty, Tara S.; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H.

    2018-04-01

    There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.

  12. Environmental Stewardship: A Conceptual Review and Analytical Framework.

    Science.gov (United States)

    Bennett, Nathan J; Whitty, Tara S; Finkbeiner, Elena; Pittman, Jeremy; Bassett, Hannah; Gelcich, Stefan; Allison, Edward H

    2018-04-01

    There has been increasing attention to and investment in local environmental stewardship in conservation and environmental management policies and programs globally. Yet environmental stewardship has not received adequate conceptual attention. Establishing a clear definition and comprehensive analytical framework could strengthen our ability to understand the factors that lead to the success or failure of environmental stewardship in different contexts and how to most effectively support and enable local efforts. Here we propose such a definition and framework. First, we define local environmental stewardship as the actions taken by individuals, groups or networks of actors, with various motivations and levels of capacity, to protect, care for or responsibly use the environment in pursuit of environmental and/or social outcomes in diverse social-ecological contexts. Next, drawing from a review of the environmental stewardship, management and governance literatures, we unpack the elements of this definition to develop an analytical framework that can facilitate research on local environmental stewardship. Finally, we discuss potential interventions and leverage points for promoting or supporting local stewardship and future applications of the framework to guide descriptive, evaluative, prescriptive or systematic analysis of environmental stewardship. Further application of this framework in diverse environmental and social contexts is recommended to refine the elements and develop insights that will guide and improve the outcomes of environmental stewardship initiatives and investments. Ultimately, our aim is to raise the profile of environmental stewardship as a valuable and holistic concept for guiding productive and sustained relationships with the environment.

  13. ELPSA AS A LESSON DESIGN FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Tom Lowrie

    2015-07-01

    This paper offers a framework for mathematics lesson design that is consistent with the way we learn about, and discover, most things in life. In addition, the framework provides a structure for identifying how mathematical concepts and understanding are acquired and developed. This framework is called ELPSA and represents five learning components, namely: Experience, Language, Pictorial, Symbolic and Applications. This framework has been used in developing lessons and teacher professional programs in Indonesia since 2012 in cooperation with the World Bank. This paper describes the theory that underlies the framework in general and in relation to each inter-connected component. Two explicit learning sequences for classroom practice are described, associated with Pythagoras' theorem and probability. This paper then concludes with recommendations for using ELPSA in various institutional contexts.
    Keywords: ELPSA, lesson design framework, Pythagoras theorem, probability
    DOI: dx.doi.org/10.22342/jme.62.77

  14. Improving regulatory oversight of maintenance programs

    International Nuclear Information System (INIS)

    Cook, S.

    2008-01-01

    Safe nuclear power plant operation requires that risks due to failure or unavailability of Structures, Systems and Components (SSCs) be minimized. Implementation of an effective maintenance program is a key means for achieving this goal. In its regulatory framework, the important relationship between maintenance and safety is acknowledged by the CNSC. A high level maintenance program requirement is included in the Class I Facilities Regulations. In addition, the operating licence contains a condition based on the principle that the design function and performance of SSCs needs to remain consistent with the plant's design and analysis documents. Nuclear power plant licensees have the primary responsibility for safe operation of their facilities and consequently for implementation of a successful maintenance program. The oversight role of the Canadian Nuclear Safety Commission (CNSC) is to ensure that the licensee carries out that responsibility. The challenge for the CNSC is how to do this consistently and efficiently. Three opportunities for improvement to regulatory maintenance oversight are being pursued. These are related to the regulatory framework, compliance verification inspection activities and monitoring of self-reporting. The regulatory framework has been improved by clarifying expectations through the issuance of S-210 'Maintenance Programs for Nuclear Power Plants'. Inspection activities have been improved by introducing new maintenance inspections into the baseline program. Monitoring is being improved by making better use of self-reported and industry produced maintenance related performance indicators. As with any type of program change, the challenge is to ensure the consistent and optimal application of regulatory activities and resources. This paper is a summary of the CNSC's approach to improving its maintenance oversight strategy. (author)

  15. A framework for understanding international medical graduate challenges during transition into fellowship programs.

    Science.gov (United States)

    Sockalingam, Sanjeev; Khan, Attia; Tan, Adrienne; Hawa, Raed; Abbey, Susan; Jackson, Timothy; Zaretsky, Ari; Okrainec, Allan

    2014-01-01

    Previous studies have highlighted unique needs of international medical graduates (IMG) during their transition into medical training programs; however, limited data exist on IMG needs specific to fellowship training. We conducted the following mixed-method study to determine IMG fellow training needs during the transition into fellowship training programs in psychiatry and surgery. The authors conducted a mixed-methods study consisting of an online survey of IMG fellows and their supervisors in psychiatry or surgery fellowship training programs and individual interviews of IMG fellows. The survey assessed (a) fellows' and supervisors' perceptions on IMG challenges in clinical communication, health systems, and education domains and (b) past orientation initiatives. In the second phase of the study, IMG fellows were interviewed during the latter half of their fellowship training, and perceptions regarding orientation and adaptation to fellowship in Canada were assessed. Survey data were analyzed using descriptive and Mann-Whitney U statistics. Qualitative interviews were analyzed using grounded theory methodology. The survey response rate was 76% (35/46) and 69% (35/51) for IMG fellows and supervisors, respectively. Fellows reported the greatest difficulty with adapting to the hospital system, medical documentation, and balancing one's professional and personal life. Supervisors believed that fellows had the greatest difficulty with managing language and slang in Canada, the healthcare system, and an interprofessional team. In Phase 2, fellows generated themes of disorientation, disconnection, interprofessional team challenges, a need for IMG fellow resources, and a benefit from training in a multicultural setting. Our study results highlight the need for IMG specific orientation resources for fellows and supervisors. Maslow's Hierarchy of Needs may be a useful framework for understanding IMG training needs.

  16. The PUMA test program and data analysis

    International Nuclear Information System (INIS)

    Han, J.T.; Morrison, D.L.

    1997-01-01

    The PUMA test program is sponsored by the U.S. Nuclear Regulatory Commission to provide data that are relevant to various Boiling Water Reactor phenomena. The authors briefly describe the PUMA test program and facility, present the objective of the program, provide data analysis for a large-break loss-of-coolant accident test, and compare the data with a RELAP5/MOD 3.1.2 calculation.

  17. Evaluation and Policy Analysis: A Communicative Framework

    Directory of Open Access Journals (Sweden)

    Cynthia Wallat

    1997-07-01

    A major challenge for the next generation of students of human development is to help shape the paradigms by which we analyze and evaluate public policies for children and families. Advocates of building research and policy connections point to health care and stress experiences across home, school, and community as critical policy issues that expand the scope of contexts and outcomes studied. At a minimum, development researchers and practitioners will need to be well versed in available methods of inquiry; they will need to be "methodologically multilingual" when conducting evaluation and policy analysis, producing reports, and reporting their interpretations to consumer and policy audiences. This article suggests how traditional approaches to policy inquiry can be reconsidered in light of these research inquiry and communicative skills needed by all policy researchers. A fifteen year review of both policy and discourse processes research is presented to suggest ways to conduct policy studies within a communicative framework.

  18. AIGO: Towards a unified framework for the Analysis and the Inter-comparison of GO functional annotations

    Directory of Open Access Journals (Sweden)

    Defoin-Platel Michael

    2011-11-01

    Background: In response to the rapid growth of available genome sequences, efforts have been made to develop automatic inference methods to functionally characterize them. Pipelines that infer functional annotation are now routinely used to produce new annotations at a genome scale and for a broad variety of species. These pipelines differ widely in their inference algorithms, confidence thresholds and data sources for reasoning. This heterogeneity makes a comparison of the relative merits of each approach extremely complex. The evaluation of the quality of the resultant annotations is also challenging given there is often no existing gold standard against which to evaluate precision and recall. Results: In this paper, we present a pragmatic approach to the study of functional annotations. An ensemble of 12 metrics, describing various aspects of functional annotations, is defined and implemented in a unified framework, which facilitates their systematic analysis and inter-comparison. The use of this framework is demonstrated on three illustrative examples: analysing the outputs of state-of-the-art inference pipelines, comparing electronic versus manual annotation methods, and monitoring the evolution of publicly available functional annotations. The framework is part of the AIGO library (http://code.google.com/p/aigo) for the Analysis and the Inter-comparison of the products of Gene Ontology (GO) annotation pipelines. The AIGO library also provides functionalities to easily load, analyse, manipulate and compare functional annotations, and to plot and export the results of the analysis in various formats. Conclusions: This work is a step toward developing a unified framework for the systematic study of GO functional annotations. This framework has been designed so that new metrics on GO functional annotations can be added in a very straightforward way.
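    One simple metric of the kind this abstract describes — comparing the annotation sets two pipelines assign to the same genes — can be sketched as a per-gene Jaccard similarity. This is an illustration only, not one of AIGO's 12 metrics or its API; the gene names and GO identifiers are invented.

    ```python
    # Illustrative sketch (not the AIGO library's API): compare two
    # functional-annotation sets by the per-gene Jaccard similarity of
    # the GO terms each pipeline assigned.
    pipeline_a = {
        "geneX": {"GO:0008150", "GO:0003674"},
        "geneY": {"GO:0005575"},
    }
    pipeline_b = {
        "geneX": {"GO:0008150"},
        "geneZ": {"GO:0003674"},
    }

    def jaccard(s1, s2):
        # Intersection over union; two empty sets count as identical.
        return len(s1 & s2) / len(s1 | s2) if s1 | s2 else 1.0

    def compare(a, b):
        genes = set(a) | set(b)
        # A gene missing from one pipeline scores 0 -- a coverage penalty.
        return {g: jaccard(a.get(g, set()), b.get(g, set())) for g in genes}

    scores = compare(pipeline_a, pipeline_b)
    print(scores)  # geneX -> 0.5; geneY and geneZ -> 0.0
    ```

    A fuller metric would propagate annotations up the GO hierarchy before comparing, so that a parent/child term pair is not scored as a complete mismatch.
    
    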

  19. Interdisciplinary research and education in the Vienna Doctoral Programme on Water Resource Systems: a framework for evaluation

    Science.gov (United States)

    Bloeschl, G.; Carr, G.; Loucks, D. P.

    2017-12-01

    Greater understanding of how interdisciplinary research and education evolves is critical for identifying and implementing appropriate programme management strategies. We propose a program evaluation framework that is based on social learning processes (individual learning, interdisciplinary research practices, and interaction between researchers with different backgrounds); social capital outcomes (ability to interact, interpersonal connectivity, and shared understanding); and knowledge and human capital outcomes (new knowledge that integrates multiple research fields). The framework is tested on an established case-study doctoral program: the Vienna Doctoral Program on Water Resource Systems. Data are collected via mixed qualitative/quantitative methods that include semi-structured interviews, publication co-author analysis, analysis of research proposals, categorisation of the interdisciplinarity of publications, and graduate analysis. Through the evaluation and analysis, several interesting findings emerge about how interdisciplinary research evolves and can be supported. Firstly, different aspects of individual learning seem to contribute to a researcher's ability to interact with researchers from other research fields and work collaboratively. These include learning new material from different research fields, learning how to learn new material, and learning how to integrate different material. Secondly, shared interdisciplinary research practices can be identified that may be common to other programs and support interaction and shared understanding between different researchers. They include clarification and questioning, harnessing differences and setting defensible research boundaries. Thirdly, intensive interaction between researchers from different backgrounds supports connectivity between the researchers, further enabling cross-disciplinary collaborative work. The case study data suggest that social learning processes and social capital outcomes

  20. Assessment of non-linear analysis finite element program (NONSAP) for inelastic analysis

    International Nuclear Information System (INIS)

    Chang, T.Y.; Prachuktam, S.; Reich, M.

    1976-11-01

    An assessment of a nonlinear structural analysis finite element program called NONSAP is given with respect to its inelastic analysis capability for pressure vessels and components. The assessment was made from a review of its theoretical basis and from benchmark problem runs. It was found that NONSAP has only limited capability for inelastic analysis. However, the program is written flexibly enough that it can easily be extended or modified to suit the user's needs. Moreover, some of the numerical difficulties in using NONSAP are pointed out

  1. VESUVIO Data Analysis Goes MANTID

    International Nuclear Information System (INIS)

    Jackson, S; Krzystyniak, M; Seel, A G; Gigg, M; Richards, S E; Fernandez-Alonso, F

    2014-01-01

    This paper describes ongoing efforts to implement the reduction and analysis of neutron Compton scattering data within the MANTID framework. Recently, extensive work has been carried out to integrate the bespoke data reduction and analysis routines written for VESUVIO with the MANTID framework. While the programs described in this document are designed to replicate the functionality of the Fortran and Genie routines already in use, most of them have been written from scratch and are not based on the original code base.

  2. VESUVIO Data Analysis Goes MANTID

    Science.gov (United States)

    Jackson, S.; Krzystyniak, M.; Seel, A. G.; Gigg, M.; Richards, S. E.; Fernandez-Alonso, F.

    2014-12-01

    This paper describes ongoing efforts to implement the reduction and analysis of neutron Compton scattering data within the MANTID framework. Recently, extensive work has been carried out to integrate the bespoke data reduction and analysis routines written for VESUVIO with the MANTID framework. While the programs described in this document are designed to replicate the functionality of the Fortran and Genie routines already in use, most of them have been written from scratch and are not based on the original code base.

  3. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management.

    Science.gov (United States)

    Convertino, Matteo; Valverde, L James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of
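    The Pareto-frontier construction this abstract mentions — finding portfolio allocations not dominated on cost and benefit — can be sketched directly. This is a conceptual illustration only, not the paper's PDA model; the portfolio names and (cost, benefit) numbers are invented.

    ```python
    # Hedged sketch: computing a Pareto frontier over hypothetical
    # restoration portfolios scored by (cost, environmental benefit).
    portfolios = [
        ("A", 10, 3), ("B", 20, 8), ("C", 20, 5),
        ("D", 35, 9), ("E", 50, 9),
    ]

    def pareto_frontier(items):
        # Keep portfolios not dominated by any other: a dominator costs
        # no more, yields at least as much benefit, and differs somewhere.
        front = []
        for name, cost, benefit in items:
            dominated = any(
                c <= cost and b >= benefit and (c, b) != (cost, benefit)
                for _, c, b in items
            )
            if not dominated:
                front.append(name)
        return front

    print(pareto_frontier(portfolios))  # -> ['A', 'B', 'D']
    ```

    C is dropped because B delivers more benefit at the same cost, and E because D matches its benefit more cheaply; a decision-maker then chooses among the remaining frontier points according to stakeholder preferences.
    
    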

  4. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management.

    Directory of Open Access Journals (Sweden)

    Matteo Convertino

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the

  5. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management

    Science.gov (United States)

    Convertino, Matteo; Valverde, L. James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of

  6. Analysis performed in cooperation with the SALE program, (1)

    International Nuclear Information System (INIS)

    Tuboya, Takao; Wada, Yukio; Suzuki, Takeshi

    1978-01-01

    One of the objectives of the SALE (Safeguard Analytical Laboratory Evaluation) program is the development of techniques in safeguards and accountability. The SALE program was established by the United States Atomic Energy Commission's New Brunswick Laboratory in 1970. Six years later, the SALE program had grown into a worldwide quality control program, receiving analysis results from about 60 laboratories, including 19 non-U.S. laboratories. All laboratories participating at present or in the past in the SALE program are listed in Table 1. By 1973, the program had been expanded to include six different materials: uranium dioxide (UO2), uranyl nitrate (U-NO3), plutonium dioxide (PuO2), plutonium nitrate (Pu-NO3), uranium-plutonium mixed oxides [(Pu,U)O2], and uranium-plutonium mixed nitrates (Pu-U-NO3). PNC joined this program in 1975 for the analysis of the samples shown in Table 2. SALE program participants analyze, on a bimonthly basis, materials supplied by the New Brunswick Laboratory (NBL) and report measurement results to NBL for evaluation and inclusion in the bimonthly reports. The present paper describes analysis results and evaluations for those samples measured in 1975-1976. (author)

  7. SQL Collaborative Learning Framework Based on SOA

    Science.gov (United States)

    Armiati, S.; Awangga, RM

    2018-04-01

    The research is focused on designing a collaborative-learning-oriented framework for service fulfilment in teaching SQL on Oracle 10g. The framework builds a foundation for academic fulfilment services performed by a working-unit layer in collaboration with Program Studi Manajemen Informatika. The design phase defined which collaboration models and information technology are proposed for Program Studi Manajemen Informatika, using a collaboration framework inspired by the modelling stages of a Service Oriented Architecture (SOA). The stages begin with analyzing subsystems; this activity determines the subsystems involved, the dependencies between them, and the workflow across them. After the services have been identified, the second phase designs the component specifications, detailing the components implemented in each service, including the data, rules, services, configurable profiles, and variations. The third stage allocates services, assigning each service and its components to the subsystems that have been identified. The implementation of the framework contributes teaching guides and an application architecture that can be used as a basis for improving services through the application of information technology.

  8. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2006-12-01

    The development of a program to compute probabilistic seismic hazard, based on a Graphical User Interface (GUI), has been completed. The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, a seismic zoning map, and an earthquake event catalog. The input procedures of previous programs, based on a text interface, took much time for data preparation, and the data could not be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize input errors as far as possible.
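    As a rough illustration of the computation such a program automates, the hazard at a site can be obtained by combining per-source event rates (from the earthquake catalog and zoning map) with exceedance probabilities (from the attenuation formulae) under a Poisson assumption. All numbers below are made-up placeholders, not values from the program described in the abstract:

```python
import math

# Hypothetical seismic sources: annual event rate, and probability that an
# event from that source exceeds the target ground motion at the site
# (the latter would come from an attenuation relation).
sources = [
    {"rate": 0.05, "p_exceed": 0.10},
    {"rate": 0.01, "p_exceed": 0.40},
]

# Annual rate of exceedance, summed over independent Poisson sources.
annual_rate = sum(s["rate"] * s["p_exceed"] for s in sources)

# Probability of at least one exceedance during a 50-year exposure time.
p_50yr = 1.0 - math.exp(-annual_rate * 50)

print(round(annual_rate, 4), round(p_50yr, 4))  # → 0.009 0.3624
```

    A real analysis repeats this for many ground-motion levels to build a hazard curve, which is the part of the workflow that benefits most from the graphical input checking the abstract describes.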

  9. Analysis Community’s Coping Strategies and Local Risk Governance Framework in Relation to Landslide

    Directory of Open Access Journals (Sweden)

    Heru Setiawan

    2014-12-01

    Full Text Available Analysis of people's perception and analysis of coping strategies for landslides are two elements that are essential to determine the level of community preparedness for landslides. To assess the preparedness of government and other stakeholders in facing landslides, an analysis of the risk governance framework was required. A survey using questionnaires with random sampling was applied to assess the level of people's perception and coping strategies related to landslides. Analysis of the risk governance framework was done at the district and sub-district levels. The study found that people's perception of landslides was dominated by the high and moderate levels. Age and education are the two factors that influence people's perception of landslides. Local people applied four types of coping strategy: economic, structural, social, and cultural. In total, 51.6% of respondents have a high level, 33.3% have a moderate level, and only 15.1% of respondents have a low level of coping strategy. The factors that influence the level of coping strategy are education, income, and building type. Analysis of the risk governance framework is limited to three components: stakeholder involvement, risk management, and risk communication. Based on the data analysis, the level of stakeholder involvement at the district scope was categorized as moderate to high, and the level of stakeholder involvement at the sub-district level was categorized as high. Generally, the risk management of Karanganyar was categorized as moderate to high, and the risk management in Tawangmangu was categorized as moderate. Some elements of the risk governance framework must be improved: data management, the pattern of relationships among stakeholders, increased participation of NGOs, a constructed and updated landslide risk map, and an enhanced role for microfinance in helping the community when disaster strikes

  10. Automata-Based Verification of Temporal Properties on Running Programs

    Science.gov (United States)

    Giannakopoulou, Dimitra; Havelund, Klaus; Lan, Sonie (Technical Monitor)

    2001-01-01

    This paper presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Buchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
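    The idea of using an automaton as an observer of program behavior can be illustrated with a hand-written monitor for the property G(request -> F ack) ("every request is eventually acknowledged"), checked over a finite trace. This sketch only illustrates the observer concept; it is not the paper's LTL-to-automaton translation algorithm or the JPaX implementation:

```python
# A two-state observer for G(request -> F ack) on finite traces.
def make_monitor():
    # state 0: no pending request; state 1: request seen, awaiting ack
    state = 0

    def step(event):
        nonlocal state
        if state == 0 and event == "request":
            state = 1
        elif state == 1 and event == "ack":
            state = 0
        return state

    def accepting():
        # On a *finite* trace the property holds iff no request is pending
        # when the trace ends -- this end-of-trace check is what the paper's
        # modified Buchi construction handles in general.
        return state == 0

    return step, accepting

step, accepting = make_monitor()
for event in ["request", "work", "ack", "request", "ack"]:
    step(event)
print(accepting())  # → True: every request was eventually acknowledged
```

    Feeding the observer one event at a time is exactly how a runtime-analysis framework checks a running program: the monitor consumes the execution trace as it is produced and reports a verdict when the run ends.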

  11. TRU Waste Management Program. Cost/schedule optimization analysis

    International Nuclear Information System (INIS)

    Detamore, J.A.; Raudenbush, M.H.; Wolaver, R.W.; Hastings, G.A.

    1985-10-01

    This Current Year Work Plan presents in detail a description of the activities to be performed by the Joint Integration Office/Rockwell International (JIO/RI) during FY86. It breaks the activities down into two major work areas: Program Management and Program Analysis. Program Management is performed by the JIO/RI by providing technical planning and guidance for the development of advanced TRU waste management capabilities. This includes equipment/facility design, engineering, construction, and operations. These functions are integrated to allow transition from interim storage to final disposition. JIO/RI tasks include program requirements identification, long-range technical planning, budget development, program planning document preparation, task guidance development, task monitoring, task progress information gathering and reporting to DOE, interfacing with other agencies and DOE lead programs, integrating public involvement with program efforts, and preparation of reports for DOE detailing program status. Program Analysis is performed by the JIO/RI to support identification and assessment of alternatives and development of long-term TRU waste program capabilities. These analyses include short-term analyses in response to DOE information requests, along with preparation of an RH Cost/Schedule Optimization report. Systems models will be developed, updated, and upgraded as needed to enhance the JIO/RI's capability to evaluate the adequacy of program efforts in various fields. A TRU program database will be maintained and updated to provide DOE with timely responses to inventory-related questions

  12. Attitudes of Business Students on the TARP Program: A Semantic Differential Analysis

    Science.gov (United States)

    Piotrowski, Chris; Guyette, Roger W., Jr.

    2011-01-01

    The TARP program, a federal response to the 2008 financial crisis, has generated much debate both inside and outside of academia. Since business ethics, corporate responsibility, and public policy form the basic educational framework of the undergraduate business school curriculum, we investigated attitudes toward the TARP in the Spring of 2009.…

  13. A program wide framework for evaluating data driven teaching and learning - earth analytics approaches, results and lessons learned

    Science.gov (United States)

    Wasser, L. A.; Gold, A. U.

    2017-12-01

    There is a deluge of earth systems data available to address cutting-edge science problems, yet specific skills are required to work with these data. The Earth Analytics education program, a core component of Earth Lab at the University of Colorado - Boulder, is building a data-intensive program that provides training in realms including (1) interdisciplinary communication and collaboration, (2) earth science domain knowledge including geospatial science and remote sensing, and (3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate and graduate level courses, and a professional certificate/degree program. All programs share the goal of preparing a STEM workforce for successful earth-analytics-driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data-intensive instruction combined with domain science learning, to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous, and synchronous learning. We are using targeted search engine optimization (SEO) to increase visibility and, in turn, program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present results from evaluation of both an interdisciplinary undergraduate/graduate-level earth analytics course and an undergraduate internship. Early results suggest that a blended approach to learning and teaching that includes both synchronous in-person teaching and active hands-on classroom learning, combined with asynchronous learning in the form of online materials, leads to student success. Further, we will present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO on online content reach and program visibility.

  14. Using the framework method for the analysis of qualitative data in multi-disciplinary health research.

    Science.gov (United States)

    Gale, Nicola K; Heath, Gemma; Cameron, Elaine; Rashid, Sabina; Redwood, Sabi

    2013-09-18

    The Framework Method is becoming an increasingly popular approach to the management and analysis of qualitative data in health research. However, there is confusion about its potential application and limitations. The article discusses when it is appropriate to adopt the Framework Method and explains the procedure for using it in multi-disciplinary health research teams, or those that involve clinicians, patients and lay people. The stages of the method are illustrated using examples from a published study. Used effectively, with the leadership of an experienced qualitative researcher, the Framework Method is a systematic and flexible approach to analysing qualitative data and is appropriate for use in research teams even where not all members have previous experience of conducting qualitative research.

  15. Celf – A Logical Framework for Deductive and Concurrent Systems (System Description)

    DEFF Research Database (Denmark)

    Schack-Nielsen, Anders; Schürmann, Carsten

    2008-01-01

    CLF (Concurrent LF) [CPWW02a] is a logical framework for specifying and implementing deductive and concurrent systems from areas, such as programming language theory, security protocol analysis, process algebras, and logics. Celf is an implementation of the CLF type theory that extends the LF type...... ML and compiles with MLton, MLKit, and SML/NJ. The source code and a collection of examples are available from http://www.twelf.org/~celf ....

  16. A methodological approach and framework for sustainability assessment in NGO-implemented primary health care programs.

    Science.gov (United States)

    Sarriot, Eric G; Winch, Peter J; Ryan, Leo J; Bowie, Janice; Kouletio, Michelle; Swedberg, Eric; LeBan, Karen; Edison, Jay; Welch, Rikki; Pacqué, Michel C

    2004-01-01

    An estimated 10.8 million children under 5 continue to die each year in developing countries from causes easily treatable or preventable. Non governmental organizations (NGOs) are frontline implementers of low-cost and effective child health interventions, but their progress toward sustainable child health gains is a challenge to evaluate. This paper presents the Child Survival Sustainability Assessment (CSSA) methodology--a framework and process--to map progress towards sustainable child health from the community level and upward. The CSSA was developed with NGOs through a participatory process of research and dialogue. Commitment to sustainability requires a systematic and systemic consideration of human, social and organizational processes beyond a purely biomedical perspective. The CSSA is organized around three interrelated dimensions of evaluation: (1) health and health services; (2) capacity and viability of local organizations; (3) capacity of the community in its social ecological context. The CSSA uses a participatory, action-planning process, engaging a 'local system' of stakeholders in the contextual definition of objectives and indicators. Improved conditions measured in the three dimensions correspond to progress toward a sustainable health situation for the population. This framework opens new opportunities for evaluation and research design and places sustainability at the center of primary health care programming.

  17. Validation of a Framework for Measuring Hospital Disaster Resilience Using Factor Analysis

    Directory of Open Access Journals (Sweden)

    Shuang Zhong

    2014-06-01

    Full Text Available Hospital disaster resilience can be defined as “the ability of hospitals to resist, absorb, and respond to the shock of disasters while maintaining and surging essential health services, and then to recover to its original state or adapt to a new one.” This article aims to provide a framework which can be used to comprehensively measure hospital disaster resilience. An evaluation framework for assessing hospital resilience was initially proposed through a systematic literature review and Modified-Delphi consultation. Eight key domains were identified: hospital safety; command, communication, and cooperation system; disaster plan; resource stockpile; staff capability; disaster training and drills; emergency services and surge capability; and recovery and adaptation. The data for this study were collected from 41 tertiary hospitals in Shandong Province in China, using a specially designed questionnaire. Factor analysis was conducted to determine the underpinning structure of the framework. It identified a four-factor structure of hospital resilience, namely, emergency medical response capability (F1), disaster management mechanisms (F2), hospital infrastructural safety (F3), and disaster resources (F4). These factors displayed good internal consistency. The overall level of hospital disaster resilience (F) was calculated using the scoring model: F = 0.615F1 + 0.202F2 + 0.103F3 + 0.080F4. This validated framework provides a new way to operationalise the concept of hospital resilience, and it is also a foundation for the further development of the measurement instrument in future studies.
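    The reported scoring model is a simple weighted sum of the four factor scores and can be computed directly. Only the weights below come from the abstract; the example factor values are hypothetical:

```python
# Factor weights from the abstract's scoring model:
# F = 0.615*F1 + 0.202*F2 + 0.103*F3 + 0.080*F4
WEIGHTS = {"F1": 0.615, "F2": 0.202, "F3": 0.103, "F4": 0.080}

def resilience_score(factors):
    """Overall hospital disaster resilience F as a weighted factor sum."""
    return sum(WEIGHTS[key] * factors[key] for key in WEIGHTS)

# Hypothetical factor scores for one hospital (illustrative scale 0-100).
example = {"F1": 80.0, "F2": 70.0, "F3": 90.0, "F4": 60.0}
print(round(resilience_score(example), 2))  # → 77.41
```

    Because emergency medical response capability (F1) carries more than 60% of the weight, changes in that factor dominate the overall score, which matches its identification as the primary factor.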

  18. EU Science Diplomacy and Framework Programs as Instruments of STI Cooperation

    Directory of Open Access Journals (Sweden)

    К. А. Ibragimova

    2017-01-01

    Full Text Available This article examines the tools that the EU uses in its interactions with third countries in the field of STI. The EU is a pioneer in the use of science and technology in the international arena, in the creation of strategic bilateral agreements on science and technology, and in the conduct of political dialogues at the highest political level (at the country and regional levels. The EU actively uses its foreign policy instruments of influence, including the provision of access to its framework programs for researchers from third countries, as well as scientific diplomacy. The success of these programs and of scientific diplomacy shows the effectiveness of the EU as a global actor. In its foreign policy global innovation strategy, the EU proceeds from the premise that no state in the world today can cope independently with modern global challenges such as climate change, migration, terrorism, etc. Therefore, the solution of these issues requires both expert evaluation by an independent world scientific community and the perseverance of diplomats and officials of the branch ministries of national states capable of conveying the views of their governments in international negotiations and defending the national interests of their countries to find a solution that suits everyone. The EU has the resources to create a "cumulative effect" by developing and applying common norms on the territory of the Union, analyzing the innovation policies of member states, and enabling the sharing of best practices. At the same time, the EU shares its vision of problems, values, and priorities with partners and uses the tools of "soft power" (including its smart and normative force and scientific diplomacy in the field of STI. The soft power of the EU in the field of STI lies in the attractiveness of the EU as a research area in which it is possible to conduct modern high-quality international research with the involvement of scientific teams from different countries in both physical

  19. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.
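    As a minimal illustration of dynamic analysis (not a technique from the thesis itself), Python's sys.settrace hook can record which functions a program executes during a run, which is the kind of execution data such analyses study:

```python
import sys

calls = []  # names of functions entered during the traced run

def tracer(frame, event, arg):
    # The global trace function receives a 'call' event for every new frame.
    if event == "call":
        calls.append(frame.f_code.co_name)
    return None  # no per-line tracing needed

def helper():
    return 2

def main():
    return helper() + helper()

sys.settrace(tracer)
main()
sys.settrace(None)  # stop tracing before inspecting the results

print(calls)  # → ['main', 'helper', 'helper']
```

    Real dynamic-analysis tooling collects far richer traces (arguments, object identities, timing), but the principle is the same: instrument the execution and study the recorded behavior to understand the program.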

  20. JWIG: Yet Another Framework for Maintainable and Secure Web Applications

    DEFF Research Database (Denmark)

    Møller, Anders; Schwarz, Mathias Romme

    2009-01-01

    Although numerous frameworks for web application programming have been developed in recent years, writing web applications remains a challenging task. Guided by a collection of classical design principles, we propose yet another framework. It is based on a simple but flexible server-oriented architecture that coherently supports general aspects of modern web applications, including dynamic XML construction, session management, data persistence, caching, and authentication, but it also simplifies programming of server-push communication and integration of XHTML-based applications and XML-based web services. The resulting framework provides a novel foundation for developing maintainable and secure web applications.