WorldWideScience

Sample records for program analysis framework

  1. Programming Entity Framework

    CERN Document Server

    Lerman, Julia

    2010-01-01

    Get a thorough introduction to ADO.NET Entity Framework 4 -- Microsoft's core framework for modeling and interacting with data in .NET applications. The second edition of this acclaimed guide provides a hands-on tour of the framework's latest version in Visual Studio 2010 and .NET Framework 4. Not only will you learn how to use EF4 in a variety of applications, you'll also gain a deep understanding of its architecture and APIs. Written by Julia Lerman, the leading independent authority on the framework, Programming Entity Framework covers it all -- from the Entity Data Model and Object Service

  2. Programming Entity Framework

    CERN Document Server

    Lerman, Julia

    2009-01-01

    Programming Entity Framework is a thorough introduction to Microsoft's new core framework for modeling and interacting with data in .NET applications. This highly-acclaimed book not only gives experienced developers a hands-on tour of the Entity Framework and explains its use in a variety of applications, it also provides a deep understanding of its architecture and APIs -- knowledge that will be extremely valuable as you shift to the Entity Framework version in .NET Framework 4.0 and Visual Studio 2010. From the Entity Data Model (EDM) and Object Services to EntityClient and the Metadata Work

  3. An evaluation framework and comparative analysis of the widely used first programming languages.

    Directory of Open Access Journals (Sweden)

    Muhammad Shoaib Farooq

    Computer programming is the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such a language is referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as the FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue in the presence of many choices. Many efforts have been made to design a good FPL; however, there is no adequate way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate the existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework, we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have also evaluated the conformance of the widely used FPLs to the proposed framework, and have computed their suitability scores.

  4. An evaluation framework and comparative analysis of the widely used first programming languages.

    Science.gov (United States)

    Farooq, Muhammad Shoaib; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed; Abid, Adnan

    2014-01-01

    Computer programming is the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such a language is referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as the FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue in the presence of many choices. Many efforts have been made to design a good FPL; however, there is no adequate way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate the existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework, we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have also evaluated the conformance of the widely used FPLs to the proposed framework, and have computed their suitability scores.
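
    The customizable scoring function the abstract describes can be illustrated as a weighted aggregate over evaluation criteria. The criteria names, weights, and ratings below are invented for illustration and are not taken from the paper; a minimal sketch:

```python
# Hypothetical sketch of a customizable suitability-scoring function for a
# first programming language (FPL). Criteria, weights, and ratings are
# illustrative, not values from the paper.

def suitability_score(ratings, weights):
    """Weighted average of per-criterion ratings (0-10 scale)."""
    total_weight = sum(weights.values())
    return sum(ratings[c] * w for c, w in weights.items()) / total_weight

# Example: rate two candidate FPLs against three illustrative criteria.
weights = {"readability": 3, "simple_io": 2, "error_messages": 1}
python_ratings = {"readability": 9, "simple_io": 9, "error_messages": 7}
cpp_ratings    = {"readability": 5, "simple_io": 4, "error_messages": 3}

print(suitability_score(python_ratings, weights))  # higher = more suitable
print(suitability_score(cpp_ratings, weights))
```

    Changing the weights customizes the function to an institution's priorities, which is the sense in which the paper's scoring function is "customizable".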

  5. Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, Douglas B.; Anderson, Dave M.; Belzer, David B.; Cort, Katherine A.; Dirks, James A.; Hostick, Donna J.

    2004-06-18

    The requirements of the Government Performance and Results Act (GPRA) of 1993 mandate the reporting of outcomes expected to result from programs of the Federal government. The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official metrics for its 11 major programs using its Office of Planning, Budget Formulation, and Analysis (OPBFA). OPBFA conducts an annual integrated modeling analysis to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. Two of EERE’s major programs are the Building Technologies Program (BT) and the Office of Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports the OPBFA effort by developing the program characterizations and other market information affecting these programs that are necessary as input to the EERE integrated modeling analysis. Throughout the report we refer to these programs as “buildings-related” programs, because the approach is not limited in application to BT or WIP. To adequately support OPBFA in the development of official GPRA metrics, PNNL communicates with the various activities and projects in BT and WIP to determine how best to characterize their activities planned for the upcoming budget request. PNNL then analyzes these projects to determine what the results of the characterizations would imply for energy markets, technology markets, and consumer behavior. This is accomplished by developing nonintegrated estimates of the energy, environmental, and financial benefits (i.e., outcomes) of the technologies and practices expected to result from the budget request. These characterizations and nonintegrated modeling results are provided to OPBFA as inputs to the official benefits estimates developed for the Federal Budget. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits.

  6. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends.

    Science.gov (United States)

    Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework, and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g., grid computing and graphical processing units (GPUs)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing, by replicating the computing tasks and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by
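
    The Map and Reduce pattern the abstract refers to can be sketched in a few lines. This toy word count runs on one machine and only illustrates the functional shape of the two tasks; Hadoop would distribute them across cluster nodes and HDFS:

```python
# Minimal single-machine sketch of the Map and Reduce tasks (word count),
# illustrating the functional-programming pattern MapReduce borrows.
from collections import defaultdict

def map_phase(records):
    """Map: emit (key, 1) pairs -- here, one pair per word."""
    for record in records:
        for word in record.split():
            yield (word, 1)

def shuffle(pairs):
    """Group values by key, as the framework does between Map and Reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each key's values -- here, summing the counts."""
    return {key: sum(values) for key, values in groups.items()}

records = ["patient admitted", "patient discharged", "patient admitted"]
counts = reduce_phase(shuffle(map_phase(records)))
print(counts)  # {'patient': 3, 'admitted': 2, 'discharged': 1}
```

    Fault tolerance and throughput come from the framework running many independent map and reduce tasks on replicated data chunks, not from the user code, which is why the pattern scales to clinical-sized datasets.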

  7. DEFENSE PROGRAMS RISK MANAGEMENT FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Constantin PREDA

    2012-01-01

    In recent years, defense programs have faced delays in delivering defense capabilities and budget overruns. Stakeholders are looking for ways to improve program management and the decision-making process given the very fluid and uncertain economic and political environment. Consequently, they have increasingly resorted to risk management as the main management tool for achieving defense program objectives and for delivering the defense capabilities strongly needed by soldiers on the ground, on time and within limited defense budgets. Following a risk-management-based decision-making approach, the stakeholders are expected not only to protect program objectives against a wide range of risks but, at the same time, to take advantage of opportunities to increase the likelihood of program success. The prerequisite for making risk management the main tool for achieving defense program objectives is the design and implementation of a strong risk management framework as a foundation, providing an efficient and effective application of the best risk management practices. The aim of this paper is to examine the risk management framework for defense programs based on the ISO 31000:2009 standard, best risk management practices, and the needs and particularities of defense programs. For the purposes of this article, the term defense programs refers to joint defense programs.

  8. The US federal framework for research on endocrine disrupters and an analysis of research programs supported during fiscal year 1996

    Science.gov (United States)

    Reiter, L.W.; DeRosa, C.; Kavlock, R.J.; Lucier, G.; Mac, M.J.; Melillo, J.; Melnick, R.L.; Sinks, T.; Walton, B.T.

    1998-01-01

    The potential health and ecological effects of endocrine disrupting chemicals has become a high visibility environmental issue. The 1990s have witnessed a growing concern, both on the part of the scientific community and the public, that environmental chemicals may be causing widespread effects in humans and in a variety of fish and wildlife species. This growing concern led the Committee on the Environment and Natural Resources (CENR) of the National Science and Technology Council to identify the endocrine disrupter issue as a major research initiative in early 1995 and subsequently establish an ad hoc Working Group on Endocrine Disrupters. The objectives of the working group are to 1) develop a planning framework for federal research related to human and ecological health effects of endocrine disrupting chemicals; 2) conduct an inventory of ongoing federal research programs; and 3) identify research gaps and develop a coordinated interagency plan to address priority research needs. This communication summarizes the activities of the federal government in defining a common framework for planning an endocrine disrupter research program and in assessing the status of the current effort. After developing the research framework and compiling an inventory of active research projects supported by the federal government in fiscal year 1996, the CENR working group evaluated the current federal effort by comparing the ongoing activities with the research needs identified in the framework. The analysis showed that the federal government supports considerable research on human health effects, ecological effects, and exposure assessment, with a predominance of activity occurring under human health effects. The analysis also indicates that studies on reproductive development and carcinogenesis are more prevalent than studies on neurotoxicity and immunotoxicity, that mammals (mostly laboratory animals) are the main species under study, and that chlorinated dibenzodioxins and

  9. Theoretical numerical analysis a functional analysis framework

    CERN Document Server

    Atkinson, Kendall

    2005-01-01

    This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu

  10. A framework for telehealth program evaluation.

    Science.gov (United States)

    Nepal, Surya; Li, Jane; Jang-Jaccard, Julian; Alem, Leila

    2014-04-01

    Evaluating telehealth programs is a challenging task, yet it is the most sensible first step when embarking on a telehealth study. How can we frame and report on telehealth studies? What are the health services elements to select based on the application needs? What are the appropriate terms to use to refer to such elements? Various frameworks have been proposed in the literature to answer these questions, and each framework is defined by a set of properties covering different aspects of telehealth systems. The most common properties include application, technology, and functionality. With the proliferation of telehealth, it is important not only to understand these properties, but also to define new properties to account for a wider range of contexts of use and evaluation outcomes. This article presents a comprehensive framework for the delivery design, implementation, and evaluation of telehealth services. We first survey existing frameworks proposed in the literature and then present our proposed comprehensive multidimensional framework for telehealth. The six key dimensions of the proposed framework are health domains, health services, delivery technologies, communication infrastructure, environment setting, and socioeconomic analysis. We define a set of example properties for each dimension. We then demonstrate how we have used our framework to evaluate telehealth programs in rural and remote Australia. A few major international studies have also been mapped to demonstrate the feasibility of the framework. The key characteristics of the framework are as follows: (a) it is loosely coupled and hence easy to use; (b) it provides a basis for describing a wide range of telehealth programs; and (c) it is extensible to future developments and needs.

  11. A PROOF Analysis Framework

    International Nuclear Information System (INIS)

    González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E

    2012-01-01

    The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed, which can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load, by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare off new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.

  12. Analysis framework for GLORIA

    Science.gov (United States)

    Żarnecki, Aleksander F.; Piotrowski, Lech W.; Mankiewicz, Lech; Małek, Sebastian

    2012-05-01

    GLORIA stands for “GLObal Robotic-telescopes Intelligent Array”. GLORIA will be the first free and open-access network of robotic telescopes in the world. It will be a Web 2.0 environment where users can do research in astronomy by observing with robotic telescopes and/or by analyzing data that other users have acquired with GLORIA, or from other free-access databases, such as the European Virtual Observatory. The GLORIA project will define free standards, protocols and methodology for controlling robotic telescopes and related instrumentation, for conducting so-called on-line experiments by scheduling observations in the telescope network, and for conducting so-called off-line experiments based on the analysis of astronomical meta-data produced by GLORIA or other databases. The Luiza analysis framework for GLORIA is based on the Marlin package developed for International Linear Collider (ILC) data analysis. HEP experiments have to deal with enormous amounts of data, and distributed data analysis is a must, so the Marlin framework concept seemed well suited to GLORIA's needs. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure; any additional output created is added back to that collection. The advantage of such a modular approach is that it keeps things as simple as possible: every single step of the full analysis chain, going e.g. from raw images to light curves, can be processed separately, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
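
    The Marlin-style processor chain described above can be sketched as follows. The class and collection names are illustrative, not Luiza's actual API; the point is that each module reads from a shared data store and appends its output to the same store:

```python
# Sketch of the Marlin/Luiza processor idea: each analysis step is a module
# that reads collections from a shared event store and appends its output,
# so steps can be chained or run separately. Names are illustrative.

class Processor:
    def process(self, store):  # each module transforms the shared store
        raise NotImplementedError

class DarkSubtraction(Processor):
    def process(self, store):
        dark = store["dark_level"]
        store["calibrated"] = [x - dark for x in store["raw_image"]]

class Photometry(Processor):
    def process(self, store):
        store["flux"] = sum(store["calibrated"])

# Chain the modules: each step's output stays in the store for the next.
store = {"raw_image": [10, 12, 11], "dark_level": 2}
for module in [DarkSubtraction(), Photometry()]:
    module.process(store)
print(store["flux"])  # (10-2) + (12-2) + (11-2) = 27
```

    Because each processor leaves the store self-consistent, the chain can be cut anywhere: the output after any step can be saved and fed into the next step later, which is exactly the property the abstract highlights.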

  13. Replicating MISTERS: an epidemiological criminology framework analysis of a program for criminal justice-involved minority males in the community.

    Science.gov (United States)

    Potter, Roberto Hugh; Akers, Timothy A; Bowman, Daniel Richard

    2013-01-01

    The Men in STD Training and Empowerment Research Study (MISTERS) program and epidemiological criminology began their development in Atlanta at about the same time. MISTERS focuses on men recently released from jail to reduce both HIV/STD and crime-related risk factors through a brief educational intervention. This article examines ways in which MISTERS and epidemiological criminology have been used to inform one another in the replication of the MISTERS program in Orange County, Florida. Data from 110 MISTERS participants during the first 10 months of operation are analyzed to examine the overlapping occurrence of health and criminal risk behaviors in the men's lives. This provides a test of core hypotheses from the epidemiological criminology framework. This article also examines application of the epidemiological criminology framework to develop interventions to address health and crime risk factors simultaneously in Criminal Justice-Involved populations in the community.

  14. Programming Entity Framework Code First

    CERN Document Server

    Lerman, Julia

    2011-01-01

    Take advantage of the Code First data modeling approach in ADO.NET Entity Framework, and learn how to build and configure a model based on existing classes in your business domain. With this concise book, you'll work hands-on with examples to learn how Code First can create an in-memory model and database by default, and how you can exert more control over the model through further configuration. Code First provides an alternative to the database first and model first approaches to the Entity Data Model. Learn the benefits of defining your model with code, whether you're working with an exis

  15. Measuring the performance of vaccination programs using cross-sectional surveys: a likelihood framework and retrospective analysis.

    Directory of Open Access Journals (Sweden)

    Justin Lessler

    2011-10-01

    The performance of routine and supplemental immunization activities is usually measured by the administrative method: dividing the number of doses distributed by the size of the target population. This method leads to coverage estimates that are sometimes impossible (e.g., vaccination of 102% of the target population) and are generally inconsistent with the proportion found to be vaccinated in Demographic and Health Surveys (DHS). We describe a method that estimates the fraction of the population accessible to vaccination activities, as well as within-campaign inefficiencies, thus providing a consistent estimate of vaccination coverage. We developed a likelihood framework for estimating the effective coverage of vaccination programs using cross-sectional surveys of vaccine coverage combined with administrative data. We applied our method to measles vaccination in three African countries: Ghana, Madagascar, and Sierra Leone, using data from each country's most recent DHS survey and administrative coverage data reported to the World Health Organization. We estimate that 93% (95% CI: 91, 94) of the population in Ghana was ever covered by any measles vaccination activity, 77% (95% CI: 78, 81) in Madagascar, and 69% (95% CI: 67, 70) in Sierra Leone. "Within-activity" inefficiencies were estimated to be low in Ghana, and higher in Sierra Leone and Madagascar. Our model successfully fits age-specific vaccination coverage levels seen in DHS data, which differ markedly from those predicted by naïve extrapolation from country-reported and World Health Organization-adjusted vaccination coverage. Combining administrative data with survey data substantially improves estimates of vaccination coverage. Estimates of the inefficiency of past vaccination activities and the proportion not covered by any activity allow us to more accurately predict the results of future activities and provide insight into the ways in which vaccination programs are failing to meet their
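
    A toy version of such a likelihood framework, with an accessible fraction f and a per-campaign efficiency e (so P(vaccinated after k campaigns) = f(1 - (1 - e)^k)), can be fitted by grid search. The model and the survey counts below are invented for illustration and are far simpler than the paper's actual model:

```python
# Toy likelihood model in the spirit of the paper (not its actual model):
# a fraction f of the population is accessible to campaigns, and each
# campaign reaches an accessible person with probability e. We fit (f, e)
# by maximizing a binomial log-likelihood over age cohorts by grid search.
import math

# (campaigns experienced, surveyed, found vaccinated) -- illustrative data
cohorts = [(1, 200, 110), (2, 200, 150), (3, 200, 165)]

def log_lik(f, e):
    total = 0.0
    for k, n, x in cohorts:
        p = f * (1 - (1 - e) ** k)       # model coverage after k campaigns
        p = min(max(p, 1e-9), 1 - 1e-9)  # guard the log
        total += x * math.log(p) + (n - x) * math.log(1 - p)
    return total

grid = [i / 100 for i in range(1, 100)]
f_hat, e_hat = max(((f, e) for f in grid for e in grid),
                   key=lambda fe: log_lik(*fe))
print(f_hat, e_hat)  # estimated accessible fraction, per-campaign efficiency
```

    Separating f from e is what lets this kind of model distinguish people no activity can reach from within-campaign inefficiency, the two quantities the paper estimates.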

  16. A Typology Framework of Loyalty Reward Programs

    Science.gov (United States)

    Cao, Yuheng; Nsakanda, Aaron Luntala; Mann, Inder Jit Singh

    Loyalty reward programs (LRPs), initially developed as marketing programs to enhance customer retention, have now become an important part of customer-focused business strategy. With the proliferation and increasing economic impact of these programs, their management complexity has also increased. However, despite the widespread adoption of LRPs in business, academic research in the field seems to lag behind their practical application. Even fundamental questions such as what LRPs are and how to classify them have not yet been fully addressed. In this paper, a comprehensive framework for LRP classification is proposed, which provides a foundation for further study of LRP design and planning issues.

  17. A quality framework for addiction treatment programs

    NARCIS (Netherlands)

    Nabitz, Udo; van den Brink, Wim; Walburg, Jan

    2005-01-01

    AIM: To identify and specify the structure and the elements of a quality framework for addiction treatment programs. METHOD: A concept mapping strategy was applied. In brainstorming sessions, 70 statements were generated and rated by 90 representatives of three stakeholder groups. Using multivariate

  18. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
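
    The response-surface idea, replacing an expensive FEA evaluation with a cheap fitted surrogate and then propagating input variability by Monte Carlo, can be sketched as follows. The stress function, distributions, and limit are invented stand-ins, not PRODAF's models:

```python
# Sketch of the response-surface method: fit a cheap surrogate to a few
# "FEA" runs, then push input variability through it by Monte Carlo. The
# fea_stress function stands in for an expensive external solver call.
import random

def fea_stress(thickness):          # stand-in for an expensive FEA call
    return 120.0 / thickness        # stress drops as the part thickens

# Fit a simple linear surrogate (response surface) from two "FEA" samples.
t0, t1 = 4.0, 6.0
s0, s1 = fea_stress(t0), fea_stress(t1)
slope = (s1 - s0) / (t1 - t0)
def surrogate(t):
    return s0 + slope * (t - t0)

# Monte Carlo: thickness varies in manufacturing; estimate P(stress > limit)
# using only the cheap surrogate, never re-running the "FEA".
random.seed(42)
limit = 27.0
samples = [random.gauss(5.0, 0.4) for _ in range(10_000)]
p_fail = sum(surrogate(t) > limit for t in samples) / len(samples)
print(p_fail)  # estimated failure probability under the surrogate
```

    The payoff is that the 10,000 Monte Carlo samples cost almost nothing once the surrogate is fitted, which is what makes component-level probabilistic analysis tractable.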

  19. Conceptual framework for a Danish human biomonitoring program

    Directory of Open Access Journals (Sweden)

    Fauser Patrik

    2008-01-01

    The aim of this paper is to present the conceptual framework for a Danish human biomonitoring (HBM) program. The EU and national science-policy interface, which is fundamental to realizing the national and European environment and human health strategies, is discussed, including the need for a structured and integrated environmental and human health surveillance program at the national level. In Denmark, the initiative to implement such activities has been taken. The proposed framework of the Danish monitoring program constitutes four scientific expert groups: i. prioritization of the strategy for the monitoring program, ii. collection of human samples, iii. analysis and data management, and iv. dissemination of results produced within the program. This paper presents the overall framework for data requirements and information flow in the integrated environment and health surveillance program. The added value of an HBM program, and in this respect the objectives of national and European HBM programs supporting environmental-health-integrated policy decisions and human-health-targeted policies, are discussed. In Denmark, environmental monitoring has been prioritized through extensive surveillance systems for pollution in oceans, lakes and soil, as well as ground and drinking water. Human biomonitoring has taken place only in research programs and in a few incidents of, e.g., lead contamination. However, an Arctic program for HBM has been in force for decades, and through the preparations for the EU pilot project on HBM, political interest in a Danish program has been growing.

  20. Conceptual framework for a Danish human biomonitoring program

    DEFF Research Database (Denmark)

    Thomsen, Marianne; Knudsen, Lisbeth E.; Vorkamp, Katrin

    2008-01-01

    The aim of this paper is to present the conceptual framework for a Danish human biomonitoring (HBM) program. The EU and national science-policy interface, that is fundamental for a realization of the national and European environment and human health strategies, is discussed, including the need... for the monitoring program, ii. Collection of human samples, iii. Analysis and data management and iv. Dissemination of results produced within the program. This paper presents the overall framework for data requirements and information flow in the integrated environment and health surveillance program. The added... of pollution in oceans, lakes and soil as well as ground and drinking water. Human biomonitoring has only taken place in research programs and few incidences of e.g. lead contamination. However an arctic program for HBM has been in force for decades and from the preparations of the EU-pilot project on HBM...

  21. SPANDE, Stress Analysis of General Space-frame and Pipework. SPATAM, Tilt Angle Calculation of Framework for Program SPANDE

    International Nuclear Information System (INIS)

    Davies, D.C.; Enderby, J.A.; Knowles, J.A.

    1984-01-01

    1 - Nature of physical problem solved: The programme is intended to analyse almost any type of space-frame. Members of the frame may be either straight or of constant curvature between nodes provided that, in the case of curved members, one of the principal axes of the cross-section of the member lies in the same plane as the member. Loading may comprise concentrated loads, distributed loads, thermal loads or may take the form of specified displacements. The programme calculates the forces and moments in all the members of the framework, the reactions at all external restraints and the displacements of all the nodes. For pipework problems, the maximum stress difference in the pipe, calculated in accordance with the code of practice, is also quoted. 2 - Method of solution: The framework is solved by displacement methods involving stiffness matrices, making it possible to analyse space-frames with virtually any number of redundancies. 3 - Restrictions on the complexity of the problem: For ICL 4/70, the framework is limited to 1000 nodes, 2000 members, 100 different member types or 200 specified nodal displacements
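
    The displacement (stiffness-matrix) method the record describes can be illustrated on the smallest possible frame: two axial members in series. Assemble the global stiffness matrix, restrain the support node, and solve K u = F for the nodal displacements (the stiffnesses and load are illustrative numbers, not from SPANDE):

```python
# Minimal illustration of the displacement method: two axial members in
# series (nodes 0-1-2, node 0 fixed). Assemble the global stiffness matrix,
# eliminate the restrained DOF, and solve K u = F.

k1, k2 = 1000.0, 2000.0    # member stiffnesses (force per unit length)
# Global 3x3 stiffness matrix for the three nodes:
K = [[ k1,     -k1,      0.0],
     [-k1,  k1 + k2,    -k2],
     [0.0,     -k2,      k2]]
F = [0.0, 0.0, 100.0]      # external load on the free end (node 2)

# Eliminate the restrained DOF (node 0) and solve the reduced 2x2 system
# by Cramer's rule.
a, b, c, d = K[1][1], K[1][2], K[2][1], K[2][2]
f1, f2 = F[1], F[2]
det = a * d - b * c
u1 = (d * f1 - b * f2) / det
u2 = (a * f2 - c * f1) / det
print(u1, u2)              # nodal displacements; reaction at node 0 = -k1*u1
```

    SPANDE solves exactly this kind of system, but with thousands of nodes, six degrees of freedom per node, and curved as well as straight members, which is why stiffness methods handle "virtually any number of redundancies".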

  22. Initial Multidisciplinary Design and Analysis Framework

    Science.gov (United States)

    Ozoroski, L. P.; Geiselhart, K. A.; Padula, S. L.; Li, W.; Olson, E. D.; Campbell, R. L.; Shields, E. W.; Berton, J. J.; Gray, J. S.; Jones, S. M.

    2010-01-01

    Within the Supersonics (SUP) Project of the Fundamental Aeronautics Program (FAP), an initial multidisciplinary design & analysis framework has been developed. A set of low- and intermediate-fidelity discipline design and analysis codes were integrated within a multidisciplinary design and analysis framework and demonstrated on two challenging test cases. The first test case demonstrates an initial capability to design for low boom and performance. The second test case demonstrates rapid assessment of a well-characterized design. The current system has been shown to greatly increase the design and analysis speed and capability, and many future areas for development were identified. This work has established a state-of-the-art capability for immediate use by supersonic concept designers and systems analysts at NASA, while also providing a strong base to build upon for future releases as more multifidelity capabilities are developed and integrated.

  23. A Framework for Analysis of Case Studies of Reading Lessons

    Science.gov (United States)

    Carlisle, Joanne F.; Kelcey, Ben; Rosaen, Cheryl; Phelps, Geoffrey; Vereb, Anita

    2013-01-01

    This paper focuses on the development and study of a framework to provide direction and guidance for practicing teachers in using a web-based case studies program for professional development in early reading; the program is called Case Studies Reading Lessons (CSRL). The framework directs and guides teachers' analysis of reading instruction by…

  24. Evolution of a multilevel framework for health program evaluation.

    Science.gov (United States)

    Masso, Malcolm; Quinsey, Karen; Fildes, Dave

    2017-07-01

    A well-conceived evaluation framework increases understanding of a program's goals and objectives, facilitates the identification of outcomes and can be used as a planning tool during program development. Herein we describe the origins and development of an evaluation framework that recognises that implementation is influenced by the setting in which it takes place, the individuals involved and the processes by which implementation is accomplished. The framework includes an evaluation hierarchy that focuses on outcomes for consumers, providers and the care delivery system, and is structured according to six domains: program delivery, impact, sustainability, capacity building, generalisability and dissemination. These components of the evaluation framework fit into a matrix structure, and cells within the matrix are supported by relevant evaluation tools. The development of the framework has been influenced by feedback from various stakeholders, existing knowledge of the evaluators and the literature on health promotion and implementation science. Over the years, the framework has matured and is generic enough to be useful in a wide variety of circumstances, yet specific enough to focus data collection, data analysis and the presentation of findings.

  5. A generalized disjunctive programming framework for the optimal synthesis and analysis of processes for ethanol production from corn stover.

    Science.gov (United States)

    Scott, Felipe; Aroca, Germán; Caballero, José Antonio; Conejeros, Raúl

    2017-07-01

    The aim of this study is to analyze the techno-economic performance of process configurations for ethanol production involving solid-liquid separators and reactors in the saccharification and fermentation stage, a family of process configurations for which few alternatives have been proposed. Since including these process alternatives creates a large number of possible process configurations, a framework for process synthesis and optimization is proposed. This approach is supported by kinetic models fitted to experimental data and a plant-wide techno-economic model. Among 150 process configurations, 40 show an improved minimum ethanol selling price (MESP) compared to a well-documented base case (BC), almost all include solid separators, and some show energy retrieved in products 32% higher than the BC. Moreover, 16 of them also show a lower capital investment per unit of ethanol produced per year. Several of the process configurations found in this work have not been reported in the literature. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    AFRL-AFOSR-JP-TR-2017-0018, Improvement of Binary Analysis Components in Automated Malware Analysis Framework, Keiji Takeda, Keio University. Final report; dates covered: 26 May 2015 to 25 Nov 2016. ...a system to analyze malicious software (malware) with minimum human interaction. The system autonomously analyzes malware samples by analyzing the malware binary program

  7. Luiza: Analysis Framework for GLORIA

    Directory of Open Access Journals (Sweden)

    Aleksander Filip Żarnecki

    2013-01-01

    Full Text Available The Luiza analysis framework for GLORIA is based on the Marlin package, which was originally developed for data analysis in the new High Energy Physics (HEP) project, the International Linear Collider (ILC). The HEP experiments have to deal with enormous amounts of data, and distributed data analysis is therefore essential. The Marlin framework concept seems to be well suited to the needs of GLORIA. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure, and the additional output is also added to that collection. The advantage of this modular approach is that it keeps things as simple as possible. Each step of the full analysis chain, e.g. from raw images to light curves, can be processed step by step, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
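The modular processor chain described above can be sketched in a few lines. The processor names below are hypothetical (not Luiza's actual API); the point is the pattern: each processor consumes entries from a shared event store and appends its own output, so every stage's result remains inspectable.

```python
# Illustrative sketch of a Marlin/Luiza-style modular analysis chain.
# Processor names are hypothetical, not Luiza's actual API.

class Event(dict):
    """Shared data structure passed through the processor chain."""

class Processor:
    def process(self, event):
        raise NotImplementedError

class PedestalSubtraction(Processor):
    def process(self, event):
        ped = event["pedestal"]
        event["corrected"] = [x - ped for x in event["raw_pixels"]]

class LightCurveExtraction(Processor):
    def process(self, event):
        # Toy "photometry": total corrected signal per event.
        event["flux"] = sum(event["corrected"])

def run_chain(processors, events):
    for event in events:
        for proc in processors:
            proc.process(event)
    return events

events = [Event(raw_pixels=[5, 7, 6], pedestal=4),
          Event(raw_pixels=[9, 8, 10], pedestal=4)]
run_chain([PedestalSubtraction(), LightCurveExtraction()], events)
print([e["flux"] for e in events])  # [6, 15]
```

Because each processor only reads from and writes to the event collection, stages can be reordered, replaced, or re-run individually without touching the rest of the chain.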

  8. X-framework: Space system failure analysis framework

    Science.gov (United States)

    Newman, John Steven

    Space program and space systems failures result in financial losses in the multi-hundred-million-dollar range every year. In addition to financial loss, space system failures may also represent the loss of opportunity, loss of critical scientific, commercial and/or national defense capabilities, as well as loss of public confidence. The need exists to improve learning and expand the scope of lessons documented and offered to the space industry project team. One of the barriers to incorporating lessons learned is the way in which space system failures are documented. Multiple classes of space system failure information are identified, ranging from "sound bite" summaries in space insurance compendia, to articles in journals, lengthy data-oriented (what happened) reports, and, in some rare cases, reports that treat not only the what but also the why. In addition, there are periodically published "corporate crisis" reports, typically issued after multiple or highly visible failures, that explore management roles in the failure, often within a politically oriented context. Given the general lack of consistency, it is clear that a good multi-level space system/program failure framework with analytical and predictive capability is needed. This research effort set out to develop such a model. The X-Framework (x-fw) is proposed as an innovative forensic failure analysis approach, providing a multi-level understanding of the space system failure event, beginning with the proximate cause, extending to the directly related work or operational processes, and upward through successive management layers. The x-fw focus is on capability and control at the process level and examines: (1) management accountability and control, (2) resource and requirement allocation, and (3) planning, analysis, and risk management at each level of management. The x-fw model provides an innovative failure analysis approach for acquiring a multi-level perspective, direct and indirect causation of

  9. COMP Superscalar, an interoperable programming framework

    Science.gov (United States)

    Badia, Rosa M.; Conejero, Javier; Diaz, Carlos; Ejarque, Jorge; Lezzi, Daniele; Lordan, Francesc; Ramon-Cortes, Cristian; Sirvent, Raul

    2015-12-01

    COMPSs is a programming framework that aims to facilitate the parallelization of existing applications written in Java, C/C++ and Python scripts. For that purpose, it offers a simple programming model based on sequential development in which the user is mainly responsible for (i) identifying the functions to be executed as asynchronous parallel tasks and (ii) annotating them with annotations or standard Python decorators. A runtime system is in charge of exploiting the inherent concurrency of the code, automatically detecting and enforcing the data dependencies between tasks and spawning these tasks to the available resources, which can be nodes in a cluster, clouds or grids. In cloud environments, COMPSs provides scalability and elasticity features allowing the dynamic provision of resources.
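The task-based model described above can be approximated with a short sketch. This is not the actual COMPSs/PyCOMPSs API, only an illustration of the idea: decorator-marked functions run as asynchronous tasks, and passing one task's result into another expresses a data dependency that the runtime must resolve before execution.

```python
# Illustrative sketch of a COMPSs-style task model (not the real
# PyCOMPSs API): functions marked with @task run asynchronously, and
# a Future passed as an argument acts as a data dependency.
from concurrent.futures import ThreadPoolExecutor, Future

_pool = ThreadPoolExecutor(max_workers=4)

def task(fn):
    """Decorator: submit calls as asynchronous tasks, resolving any
    Future arguments first (a crude stand-in for the dependency
    tracking a real runtime performs)."""
    def wrapper(*args):
        def run():
            resolved = [a.result() if isinstance(a, Future) else a
                        for a in args]
            return fn(*resolved)
        return _pool.submit(run)
    return wrapper

@task
def square(x):
    return x * x

@task
def add(a, b):
    return a + b

# add() depends on both square() tasks; the wrapper resolves their
# futures before executing, so the dependency order is respected.
total = add(square(3), square(4))
print(total.result())  # 25
```

In the real framework the runtime goes further: it builds a task dependency graph from the annotated parameters and schedules tasks onto cluster, grid or cloud resources rather than local threads.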

  10. COMP Superscalar, an interoperable programming framework

    Directory of Open Access Journals (Sweden)

    Rosa M. Badia

    2015-12-01

    Full Text Available COMPSs is a programming framework that aims to facilitate the parallelization of existing applications written in Java, C/C++ and Python scripts. For that purpose, it offers a simple programming model based on sequential development in which the user is mainly responsible for (i) identifying the functions to be executed as asynchronous parallel tasks and (ii) annotating them with annotations or standard Python decorators. A runtime system is in charge of exploiting the inherent concurrency of the code, automatically detecting and enforcing the data dependencies between tasks and spawning these tasks to the available resources, which can be nodes in a cluster, clouds or grids. In cloud environments, COMPSs provides scalability and elasticity features allowing the dynamic provision of resources.

  11. Static Analysis of Mobile Programs

    Science.gov (United States)

    2017-02-01

    and not allowed, to do. The second issue was that a fully static analysis was never a realistic possibility, because Java, the programming language...scale to large programs it had to handle essentially all of the features of Java and could also be used as a general-purpose analysis engine. The...static analysis of imperative languages. • A framework for adding specifications about the behavior of methods, including methods that were

  12. 76 FR 38602 - Bovine Tuberculosis and Brucellosis; Program Framework

    Science.gov (United States)

    2011-07-01

    ...] Bovine Tuberculosis and Brucellosis; Program Framework AGENCY: Animal and Plant Health Inspection Service... framework being developed for the bovine tuberculosis and brucellosis programs in the United States. This... proposed revisions to its programs regarding bovine tuberculosis (TB) and bovine brucellosis in the United...

  13. Framework for SEM contour analysis

    Science.gov (United States)

    Schneider, L.; Farys, V.; Serret, E.; Fenouillet-Beranger, C.

    2017-03-01

    SEM images provide valuable information about patterning capability. Geometrical properties such as Critical Dimension (CD) can be extracted from them and are used to calibrate OPC models, thus making OPC more robust and reliable. However, there is currently a shortage of appropriate metrology tools to inspect complex two-dimensional patterns in the same way as one would work with simple one-dimensional patterns. In this article we present a full framework for the analysis of SEM images. It has been proven to be fast, reliable and robust for every type of structure, and particularly for two-dimensional structures. To achieve this result, several innovative solutions have been developed and will be presented in the following pages. Firstly, we will present a new noise filter which is used to reduce noise on SEM images, followed by an efficient topography identifier, and finally we will describe the use of a topological skeleton as a measurement tool that can extend CD measurements on all kinds of patterns.

  14. Framework for an Effective Assessment and Accountability Program: The Philadelphia Example

    Science.gov (United States)

    Porter, Andrew C.; Chester, Mitchell D.; Schlesinger, Michael D.

    2004-01-01

    The purpose of this article is to put in the hands of researchers, practitioners, and policy makers a powerful framework for building and studying the effects of high-quality assessment and accountability programs. The framework is illustrated through a description and analysis of the assessment and accountability program in the School District of…

  15. Legal framework for a nuclear program

    International Nuclear Information System (INIS)

    Santos, A. de los; Corretjer, L.

    1977-01-01

    Introduction of a nuclear program requires the establishment of an adequate legal framework, as solutions to the problems posed by the use of nuclear energy are not included in Common Law. As far as Spain is concerned, legislation is capable of dealing with the main problems posed in this field. Spain is a Contracting Party in several International Conventions, participates in International Organizations related to this area, and takes their recommendations into account when revising its national legislation. Specific Spanish legislation is constituted by Law 25/1964, of April 29th, on Nuclear Energy, which outlines the legal system regarding nuclear energy and regulates all related aspects, from the competent organisms and authorities to the sanctions to be imposed for non-fulfilment of its provisions. In order to offer sufficient flexibility, so that the Law can be adapted to specific circumstances, its provisions are very broad, and their development by means of regulations is foreseen. So far, two Regulations have been published: the Regulation relating to Coverage of Risk of Nuclear Damage, which refers to Civil Responsibility and its Coverage; and the Regulation relating to Nuclear and Radioactive Installations, which refers to the authorization and license system. At the present time, the Regulation relating to Radiation Protection is being elaborated, and it will replace the present Radiation Protection Ordinances. In addition to the foregoing, reference is made to other provisions which, although not specifically ''nuclear'', include precepts related to this question, such as the Regulation regarding Nuisance, Unhealthy or Dangerous Industries and some Labor Law provisions [es

  16. Planning for Program Design and Assessment Using Value Creation Frameworks

    Science.gov (United States)

    Whisler, Laurel; Anderson, Rachel; Brown, Jenai

    2017-01-01

    This article explains a program design and planning process using the Value Creation Framework (VCF) developed by Wenger, Trayner, and de Laat (2011). The framework involves identifying types of value or benefit for those involved in the program, conditions and activities that support creation of that value, data that measure whether the value was…

  17. Static Program Analysis for Reliable, Trusted Apps

    Science.gov (United States)

    2017-02-01

    and prevent errors in their Java programs. The Checker Framework includes compiler plug-ins (“checkers”) that find bugs or verify their absence. It...versions of the Java language. 4.8 DATAFLOW FRAMEWORK The dataflow framework enables more accurate analysis of source code. (Despite their similar...names, the dataflow framework is independent of the (Information) Flow Checker of chapter 2.) In Java code, a given operation may be permitted or

  18. Ecosystem Analysis Program

    International Nuclear Information System (INIS)

    Burgess, R.L.

    1978-01-01

    Progress is reported on the following research programs: analysis and modeling of ecosystems; EDFB/IBP data center; biome analysis studies; land/water interaction studies; and computer programs for development of models

  19. Public health program capacity for sustainability: a new framework.

    Science.gov (United States)

    Schell, Sarah F; Luke, Douglas A; Schooley, Michael W; Elliott, Michael B; Herbers, Stephanie H; Mueller, Nancy B; Bunger, Alicia C

    2013-02-01

    Public health programs can only deliver benefits if they are able to sustain activities over time. There is a broad literature on program sustainability in public health, but it is fragmented and there is a lack of consensus on core constructs. The purpose of this paper is to present a new conceptual framework for program sustainability in public health. This developmental study uses a comprehensive literature review, input from an expert panel, and the results of concept mapping to identify the core domains of a conceptual framework for public health program capacity for sustainability. The concept-mapping process included three types of participants (scientists, funders, and practitioners) from several public health areas (e.g., tobacco control, heart disease and stroke, physical activity and nutrition, and injury prevention). The literature review identified 85 relevant studies focusing on program sustainability in public health. Most of the papers described empirical studies of prevention-oriented programs aimed at the community level. The concept-mapping process identified nine core domains that affect a program's capacity for sustainability: Political Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, Public Health Impacts, and Strategic Planning. Concept-mapping participants further identified 93 items across these domains that have strong face validity; 89% of the individual items composing the framework had specific support in the sustainability literature. The sustainability framework presented here suggests that a number of selected factors may be related to a program's ability to sustain its activities and benefits over time. These factors have been discussed in the literature, but this framework synthesizes and combines the factors and suggests how they may be interrelated with one another. The framework presents domains for public health decision makers to consider when developing

  20. CLARA: CLAS12 Reconstruction and Analysis Framework

    Energy Technology Data Exchange (ETDEWEB)

    Gyurjyan, Vardan [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)]; Matta, Sebastian Mancilla [Santa Maria U., Valparaiso, Chile]; Oyarzun, Ricardo [Santa Maria U., Valparaiso, Chile]

    2016-11-01

    In this paper we present the SOA-based CLAS12 event Reconstruction and Analysis (CLARA) framework. CLARA's design focuses on two main traits: real-time data stream processing, and service-oriented architecture (SOA) in a flow-based programming (FBP) paradigm. The data-driven and data-centric architecture of CLARA provides an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework presents solutions capable of processing large volumes of data interactively and substantially faster than batch systems.

  1. Evaluation Framework for NASA's Educational Outreach Programs

    Science.gov (United States)

    Berg, Rick; Booker, Angela; Linde, Charlotte; Preston, Connie

    1999-01-01

    The objective of the proposed work is to develop an evaluation framework for NASA's educational outreach efforts. We focus on public (rather than technical or scientific) dissemination efforts, specifically on Internet-based outreach sites for children. The outcome of this work is to propose both methods and criteria for evaluation, which would enable NASA to do a more analytic evaluation of its outreach efforts. The proposed framework is based on IRL's ethnographic and video-based observational methods, which allow us to analyze how these sites are actually used.

  2. Design and Analysis of Web Application Frameworks

    DEFF Research Database (Denmark)

    Schwarz, Mathias Romme

    -state manipulation vulnerabilities. The hypothesis of this dissertation is that we can design frameworks and static analyses that aid the programmer to avoid such errors. First, we present the JWIG web application framework for writing secure and maintainable web applications. We discuss how this framework solves...... some of the common errors through an API that is designed to be safe by default. Second, we present a novel technique for checking HTML validity for output that is generated by web applications. Through string analysis, we approximate the output of web applications as context-free grammars. We model......Numerous web application frameworks have been developed in recent years. These frameworks enable programmers to reuse common components and to avoid typical pitfalls in web application development. Although such frameworks help the programmer to avoid many common errors, we find

  3. XACC - eXtreme-scale Accelerator Programming Framework

    Energy Technology Data Exchange (ETDEWEB)

    2016-11-18

    Hybrid programming models for beyond-CMOS technologies will prove critical for integrating new computing technologies alongside our existing infrastructure. Unfortunately the software infrastructure required to enable this is lacking or not available. XACC is a programming framework for extreme-scale, post-exascale accelerator architectures that integrates alongside existing conventional applications. It is a pluggable framework for programming languages developed for next-gen computing hardware architectures like quantum and neuromorphic computing. It lets computational scientists efficiently off-load classically intractable work to attached accelerators through user-friendly Kernel definitions. XACC makes post-exascale hybrid programming approachable for domain computational scientists.

  4. A Program Management Framework for Facilities Managers

    Science.gov (United States)

    King, Dan

    2012-01-01

    The challenge faced by senior facility leaders is not how to execute a single project, but rather, how to successfully execute a large program consisting of hundreds of projects. Senior facilities officers at universities, school districts, hospitals, airports, and other organizations with extensive facility inventories, typically manage project…

  5. Communicative automata based programming. Society Framework

    Directory of Open Access Journals (Sweden)

    Andrei Micu

    2015-10-01

    Full Text Available One of the aims of this paper is to present a new programming paradigm based on paradigms intensively used in the IT industry. Implementation of these techniques can improve the quality of code through modularization, not only in terms of the entities used by a program, but also in terms of the states through which they pass. Another aspect addressed in this paper is that, in the development of software applications, the transition from design to source code is a very expensive step in terms of effort and time. Diagrams can hide very important details for the sake of simplicity, which can lead to incorrect or incomplete implementations. To improve this process, communicative-automaton-based programming introduces an intermediate step. We show how one proceeds from modeling diagrams to communicative automata and then to writing code for each of them, and how the transition from one step to the next becomes much easier and more intuitive.
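As an illustrative sketch (hypothetical names, not the paper's Society Framework API), two communicating automata can be modeled as state machines that exchange messages through inboxes; transitions fire on received messages and may emit messages to other automata:

```python
# Illustrative sketch of communicating automata (hypothetical design,
# not the paper's Society Framework): each automaton is a state
# machine whose transitions fire on messages read from its inbox.
from collections import deque

class Automaton:
    def __init__(self, name, transitions, start):
        self.name = name
        # (state, msg) -> (new_state, [(dest_name, out_msg), ...])
        self.transitions = transitions
        self.state = start
        self.inbox = deque()

    def step(self):
        """Consume one message; return outgoing (dest, msg) pairs."""
        if not self.inbox:
            return []
        msg = self.inbox.popleft()
        self.state, outgoing = self.transitions[(self.state, msg)]
        return outgoing

# A simple request/acknowledge protocol between two automata.
client = Automaton("client",
                   {("idle", "go"):    ("waiting", [("server", "request")]),
                    ("waiting", "ack"): ("done", [])},
                   start="idle")
server = Automaton("server",
                   {("ready", "request"): ("served", [("client", "ack")])},
                   start="ready")

society = {"client": client, "server": server}
client.inbox.append("go")
while any(a.inbox for a in society.values()):
    for automaton in society.values():
        for dest, msg in automaton.step():
            society[dest].inbox.append(msg)

print(client.state, server.state)  # done served
```

Because each automaton's states and transitions are declared as data, the mapping from a modeling diagram to code is close to mechanical, which is the intermediate step the paradigm advocates.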

  6. Framework for Developing a Multimodal Programming Interface Used on Industrial Robots

    Directory of Open Access Journals (Sweden)

    Bogdan Mocan

    2014-12-01

    Full Text Available The proposed approach within this paper shifts the focus from the coordinate-based programming of an industrial robot, which currently dominates the field, to an object-based programming scheme. The general framework proposed in this paper is designed to perform natural language understanding, gesture integration and semantic analysis, facilitating the development of a multimodal robot programming interface that enables intuitive programming.

  7. Utilizing the Theoretical Framework of Collective Identity to Understand Processes in Youth Programs

    Science.gov (United States)

    Futch, Valerie A.

    2016-01-01

    This article explores collective identity as a useful theoretical framework for understanding social and developmental processes that occur in youth programs. Through narrative analysis of past participant interviews (n = 21) from an after-school theater program, known as "The SOURCE", it was found that participants very clearly describe…

  8. LULU analysis program

    International Nuclear Information System (INIS)

    Crawford, H.J.; Lindstrom, P.J.

    1983-06-01

    Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday

  9. Event Reconstruction and Analysis in the R3BRoot Framework

    International Nuclear Information System (INIS)

    Kresan, Dmytro; Al-Turany, Mohammad; Bertini, Denis; Karabowicz, Radoslaw; Manafov, Anar; Rybalchenko, Alexey; Uhlig, Florian

    2014-01-01

    The R3B experiment (Reaction studies with Relativistic Radioactive Beams) will be built within the future FAIR/GSI (Facility for Antiproton and Ion Research) in Darmstadt, Germany. The international collaboration R3B has a scientific program devoted to the physics of stable and radioactive beams at energies between 150 MeV and 1.5 GeV per nucleon. In preparation for the experiment, the R3BRoot software framework is under development; it delivers detector simulation, reconstruction and data analysis. The basic functionalities of the framework are handled by the FairRoot framework, which is also used by the other FAIR experiments (CBM, PANDA, ASYEOS, etc.), while the R3B detector specifics and reconstruction code are implemented inside R3BRoot. In this contribution, first results of data analysis from the detector prototype test in November 2012 are reported; moreover, a comparison of the tracker performance with experimental data is presented.

  10. Biodiesel Emissions Analysis Program

    Science.gov (United States)

    Using existing data, the EPA's biodiesel emissions analysis program sought to quantify the air pollution emission effects of biodiesel for diesel engines that have not been specifically modified to operate on biodiesel.

  11. Choosing your IoT programming framework : architectural aspects

    NARCIS (Netherlands)

    Rahman, L.F.; Ozcelebi, T.; Lukkien, J.J.

    2016-01-01

    The Internet of Things (IoT) is moving into practice. To drive innovation, it is crucial that programmers have the means to develop IoT applications, in the form of IoT programming frameworks. These are toolkits for developing applications according to a certain style or method and that let developers

  12. The 7 th framework program of the EU

    International Nuclear Information System (INIS)

    Gonzalez, E. M.; Serrano, J. A.

    2007-01-01

    The Framework Program is the principal community initiative for fostering and supporting R and D in the European Union. Its main goal is to improve competitiveness, fundamentally by financing research, technological development, demonstration and innovation activities through transnational collaboration between research institutes and firms belonging to both the European Union countries and states affiliated as third countries. In addition, it provides financial support for the enhancement and coordination of European research infrastructures, promotion and training of research personnel, basic research and, particularly as of the current 7th Framework Program, coordination of national R and D programs and implementation of European technology platforms (PTEs), which have been conceived to promote strategic research agendas in key sectors with the cooperation of all the involved players. In the wake of the PTEs, different platforms have been implemented at the national level which are very active in different sectors. (Authors)

  13. Structural Analysis in a Conceptual Design Framework

    Science.gov (United States)

    Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.

    2012-01-01

    Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.

  14. Talking Cure Models: A Framework of Analysis

    Directory of Open Access Journals (Sweden)

    Christopher Marx

    2017-09-01

    Full Text Available Psychotherapy is commonly described as a "talking cure," a treatment method that operates through linguistic action and interaction. The operative specifics of therapeutic language use, however, are insufficiently understood, mainly due to a multitude of disparate approaches that advance different notions of what "talking" means and what "cure" implies in the respective context. Accordingly, a clarification of the basic theoretical structure of "talking cure models," i.e., models that describe therapeutic processes with a focus on language use, is a desideratum of language-oriented psychotherapy research. Against this background, the present paper suggests a theoretical framework of analysis which distinguishes four basic components of "talking cure models": (1) a foundational theory (which suggests how linguistic activity can affect and transform human experience), (2) an experiential problem state (which defines the problem or pathology of the patient), (3) a curative linguistic activity (which defines linguistic activities that are supposed to effectuate a curative transformation of the experiential problem state), and (4) a change mechanism (which defines the processes and effects involved in such transformations). The purpose of the framework is to establish a terminological foundation that allows for systematically reconstructing basic properties and operative mechanisms of "talking cure models." To demonstrate the applicability and utility of the framework, five distinct "talking cure models," which spell out the details of curative "talking" processes in terms of (1) catharsis, (2) symbolization, (3) narrative, (4) metaphor, and (5) neurocognitive inhibition, are introduced and discussed in terms of the framework components. In summary, we hope that our framework will prove useful for the objective of clarifying the theoretical underpinnings of language-oriented psychotherapy research and help to establish a more

  15. The Measurand Framework: Scaling Exploratory Data Analysis

    Science.gov (United States)

    Schneider, D.; MacLean, L. S.; Kappler, K. N.; Bleier, T.

    2017-12-01

    Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of earth's time varying magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. In order to analyze this sizable dataset, QF has developed an analytical framework to support processing the time series input data and hypothesis testing to evaluate the statistical significance of potential precursory signals. The framework was developed with a need to support legacy, in-house processing but with an eye towards big-data processing with Apache Spark and other modern big data technologies. In this presentation, we describe our framework, which supports rapid experimentation and iteration of candidate signal processing techniques via modular data transformation stages, tracking of provenance, and automatic re-computation of downstream data when upstream data is updated. Furthermore, we discuss how the processing modules can be ported to big data platforms like Apache Spark and demonstrate a migration path from local, in-house processing to cloud-friendly processing.
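The stage-based recomputation idea described above (modular transformation stages, provenance tracking, automatic recomputation of downstream data when upstream data changes) can be sketched as follows. The stage names and design are hypothetical, not QF's actual framework:

```python
# Illustrative sketch of a modular pipeline with automatic downstream
# recomputation (hypothetical design, not QF's actual framework):
# each stage records the upstream versions it last used and
# recomputes only when an upstream stage has produced new data.

class Stage:
    def __init__(self, name, fn=None, upstream=()):
        self.name, self.fn, self.upstream = name, fn, list(upstream)
        self.version = 0          # bumped whenever output changes
        self._seen = None         # upstream versions used last compute
        self.output = None

    def set_output(self, value):
        """For source stages: inject new data, bumping the version."""
        self.output = value
        self.version += 1

    def compute(self):
        if self.fn is None:       # source stage: data injected directly
            return self.output
        for parent in self.upstream:
            parent.compute()      # refresh parents first (provenance)
        seen = tuple(p.version for p in self.upstream)
        if seen != self._seen:    # upstream changed -> recompute
            self.output = self.fn(*[p.output for p in self.upstream])
            self._seen = seen
            self.version += 1
        return self.output

# Source stage holding raw samples; downstream stages derive from it.
raw = Stage("raw")
raw.set_output([1.0, 2.0, 3.0])
detrended = Stage("detrended",
                  lambda xs: [x - sum(xs) / len(xs) for x in xs],
                  upstream=[raw])
energy = Stage("energy", lambda xs: sum(x * x for x in xs),
               upstream=[detrended])

print(energy.compute())   # 2.0 (energy of the detrended [-1, 0, 1])
v = energy.version
energy.compute()          # no upstream change: cached, version stable
assert energy.version == v
raw.set_output([2.0, 4.0, 6.0])   # new data arrives
print(energy.compute())   # downstream recomputed automatically: 8.0
```

The same version-tracking idea also gives provenance for free: each stage's `_seen` tuple records exactly which upstream revisions produced its current output, which is what makes a candidate signal-processing experiment reproducible and cheap to iterate on.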

  16. A flexible framework for secure and efficient program obfuscation.

    Energy Technology Data Exchange (ETDEWEB)

    Solis, John Hector

    2013-03-01

    In this paper, we present a modular framework for constructing a secure and efficient program obfuscation scheme. Our approach, inspired by the obfuscation with respect to oracle machines model of [4], retains an interactive online protocol with an oracle, but relaxes the original computational and storage restrictions. We argue this is reasonable given the computational resources of modern personal devices. Furthermore, we relax the information-theoretic security requirement for computational security to utilize established cryptographic primitives. With this additional flexibility we are free to explore different cryptographic building blocks. Our approach combines authenticated encryption with private information retrieval to construct a secure program obfuscation framework. We give a formal specification of our framework, based on desired functionality and security properties, and provide an example instantiation. In particular, we implement AES in Galois/Counter Mode for authenticated encryption and the Gentry-Ramzan [13] constant communication-rate private information retrieval scheme. We present our implementation results and show that non-trivial sized programs can be realized, but scalability is quickly limited by computational overhead. Finally, we include a discussion on security considerations when instantiating specific modules.
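The abstract's building blocks include private information retrieval (PIR). As a much simpler stand-in for the Gentry-Ramzan scheme it cites, the toy two-server XOR PIR below illustrates what PIR achieves: each server answers a random-looking subset query and learns nothing about which record the client wants.

```python
import secrets

def pir_query(db_size, index):
    # Client: random bit-mask for server A; the same mask with the wanted
    # position flipped for server B. Each mask alone is uniformly random,
    # so neither server learns the queried index.
    mask_a = [secrets.randbits(1) for _ in range(db_size)]
    mask_b = mask_a.copy()
    mask_b[index] ^= 1
    return mask_a, mask_b

def pir_answer(db, mask):
    # Server: XOR of the selected records.
    out = 0
    for record, selected in zip(db, mask):
        if selected:
            out ^= record
    return out

db = [0x11, 0x22, 0x33, 0x44]
mask_a, mask_b = pir_query(len(db), 2)
wanted = pir_answer(db, mask_a) ^ pir_answer(db, mask_b)  # reconstructs db[2]
```

This toy scheme is information-theoretically private but needs two non-colluding servers and linear communication; the single-server Gentry-Ramzan scheme the paper uses trades that for computational assumptions and constant communication rate.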

  17. Digital Trade Infrastructures: A Framework for Analysis

    Directory of Open Access Journals (Sweden)

    Boriana Boriana

    2018-04-01

    Full Text Available In global supply chains, information about transactions resides in fragmented pockets within business and government systems. The lack of reliable, accurate and complete information makes it hard to detect risks (such as safety, security, compliance and commercial risks) and at the same time makes international trade inefficient. The introduction of digital infrastructures that transcend organizational and system domains is driven by the prospect of reducing the fragmentation of information, thereby enabling improved security and efficiency in the trading process. This article develops a digital trade infrastructure framework through an empirically grounded analysis of four digital infrastructures in the trade domain, using the conceptual lens of digital infrastructure.

  18. Developing an evaluation framework for clinical redesign programs: lessons learnt.

    Science.gov (United States)

    Samaranayake, Premaratne; Dadich, Ann; Fitzgerald, Anneke; Zeitz, Kathryn

    2016-09-19

    Purpose The purpose of this paper is to present lessons learnt through the development of an evaluation framework for a clinical redesign programme - the aim of which was to improve the patient journey through improved discharge practices within an Australian public hospital. Design/methodology/approach The development of the evaluation framework involved three stages - namely, the analysis of secondary data relating to the discharge planning pathway; the analysis of primary data including field-notes and interview transcripts on hospital processes; and the triangulation of these data sets to devise the framework. The evaluation framework ensured that resource use, process management, patient satisfaction, and staff well-being and productivity were each connected with measures, targets, and the aim of the clinical redesign programme. Findings The application of business process management and a balanced scorecard enabled a different way of framing the evaluation, ensuring measurable outcomes were connected to inputs and outputs. Lessons learnt include: first, the importance of mixed-methods research to devise the framework and evaluate the redesigned processes; second, the need for appropriate tools and resources to adequately capture change across the different domains of the redesign programme; and third, the value of developing and applying an evaluative framework progressively. Research limitations/implications The evaluation framework is limited by its retrospective application to a clinical process redesign programme. Originality/value This research supports benchmarking with national and international practices in relation to best practice healthcare redesign processes. Additionally, it provides a theoretical contribution on evaluating health services improvement and redesign initiatives.

  19. Aura: A Multi-Featured Programming Framework in Python

    Directory of Open Access Journals (Sweden)

    2010-09-01

    Full Text Available This paper puts forward the design, programming and application of innovative educational software, ‘Aura’, made using Python and PyQt Python bindings. The paper presents a new concept of using a single tool to relate the syntaxes of various programming languages and algorithms, which radically increases students' understanding and retention, since they can correlate between many programming languages. The software is an unorthodox attempt to help students who are having their first tryst with programming languages. The application is designed to help students understand how algorithms work and thus help them learn multiple programming languages on a single platform with an interactive graphical user interface. The paper elucidates how, using Python and PyQt bindings, a comprehensive, feature-rich application implementing an interactive algorithm-building technique, a web browser, a multiple-programming-language framework, a code generator and a real-time code-sharing hub can be embedded into a single interface. It also explains that, with Python as the building tool, the application requires much less coding than conventional feature-rich applications written in other programming languages, while not compromising on stability, interoperability or robustness.

  20. Integrated framework for dynamic safety analysis

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Karanki, Durga R.

    2012-01-01

    In the conventional PSA (Probabilistic Safety Assessment), detailed plant simulations by independent thermal hydraulic (TH) codes are used in the development of accident sequence models. Typical accidents in an NPP involve complex interactions among process, safety systems, and operator actions. As independent TH codes do not have models of operator actions and full safety systems, they cannot literally simulate the integrated and dynamic interactions of process, safety systems, and operator responses. Offline simulation with pre-decided states and time delays may not model the accident sequences properly. Moreover, when stochastic variability in responses of accident models is considered, defining all the combinations for simulations will be a cumbersome task. To overcome some of these limitations of the conventional safety analysis approach, TH models are coupled with the stochastic models in the dynamic event tree (DET) framework, which provides flexibility to model the integrated response due to better communication as all the accident elements are in the same model. The advantages of this framework also include: realistic modeling in dynamic scenarios, comprehensive results, an integrated approach (both deterministic and probabilistic models), and support for HRA (Human Reliability Analysis)
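The dynamic event tree idea, branching on stochastic safety-system outcomes while the physical process evolves in the same model, can be illustrated with a deliberately tiny sketch (the heating/cooling numbers, branch probabilities, and damage limit are all invented for illustration):

```python
def simulate_det(steps=3, p_ok=0.9, heat=50.0, cool=30.0, limit=120.0):
    """Toy dynamic event tree: a tank heats each step; a safety system may
    actuate (probability p_ok) and partially cool it. Each branch carries
    both the physical state (temperature) and its probability."""
    branches = [(0.0, 1.0)]  # (temperature, branch probability)
    for _ in range(steps):
        nxt = []
        for temp, prob in branches:
            nxt.append((temp + heat - cool, prob * p_ok))    # system actuates
            nxt.append((temp + heat, prob * (1.0 - p_ok)))   # system fails
        branches = nxt
    # Probability mass on branches whose process state exceeds the limit.
    return sum(prob for temp, prob in branches if temp > limit)

p_damage = simulate_det()
```

Because the process model and the stochastic branching live in one loop, the "integrated response" falls out directly; a real DET couples a full TH code at each branch point instead of the one-line temperature update used here.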

  1. High Speed Simulation Framework for Reliable Logic Programs

    International Nuclear Information System (INIS)

    Lee, Wan-Bok; Kim, Seog-Ju

    2006-01-01

    This paper shows a case study of designing a PLC logic simulator that was developed to simulate and verify PLC control programs for nuclear plant systems. Nuclear control systems are subject to stricter restrictions than normal process control systems, since they work with nuclear power plants requiring high reliability under severe environments. One restriction is the safeness of the control programs, which can be assured by severe testing. Another restriction is the simulation speed of the control programs, which should be fast enough to control multiple devices concurrently in real time. To cope with these restrictions, we devised a logic compiler which generates C-code programs from given PLC logic programs. Once a logic program was translated into C code, the program could be analyzed by conventional software analysis tools and could be used to construct a fast logic simulator after cross-compiling; in effect, this is a kind of compiled-code simulation
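The compiled-code approach, translating logic programs into C for speed, can be sketched as a toy source-to-source translator. The rung syntax and the generated `scan` function below are invented for illustration; the abstract does not show the paper's actual PLC language.

```python
def compile_to_c(rungs):
    """Translate (output_name, boolean expression) pairs into a C scan
    function as text; a real PLC compiler would parse IEC 61131-3 logic."""
    lines = ["#include <stdbool.h>", "",
             "void scan(const bool *in, bool *out) {"]
    for i, (target, expr) in enumerate(rungs):
        c_expr = (expr.replace("AND", "&&")
                      .replace("OR", "||")
                      .replace("NOT", "!"))
        lines.append(f"    out[{i}] = {c_expr}; /* {target} */")
    lines.append("}")
    return "\n".join(lines)

# Two invented rungs: a pump interlock and an alarm.
rungs = [("pump_on", "in[0] AND NOT in[1]"),
         ("alarm", "in[1] OR in[2]")]
c_source = compile_to_c(rungs)
```

Once the generated C is cross-compiled, each scan cycle is a single native function call, which is what makes compiled-code simulation fast enough for real-time multi-device testing.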

  2. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
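Two of the ingredients mentioned, probability-of-failure estimation and importance sampling, can be illustrated on a one-variable limit state. The limit-state function, densities, and sample sizes below are invented for illustration; NESSUS itself uses far more sophisticated algorithms such as the advanced mean value method.

```python
import math
import random

random.seed(1)

def g(x):
    # Limit state: failure when g(x) < 0 (capacity 3.0, standard-normal load x).
    return 3.0 - x

def mc_pof(n=100_000):
    # Plain Monte Carlo: fraction of samples falling in the failure region.
    return sum(g(random.gauss(0.0, 1.0)) < 0 for _ in range(n)) / n

def is_pof(n=20_000, shift=3.0):
    # Importance sampling: draw from N(shift, 1) centred near the failure
    # region and re-weight each failing sample by the density ratio p(x)/q(x).
    total = 0.0
    for _ in range(n):
        x = random.gauss(shift, 1.0)
        if g(x) < 0:
            total += math.exp(-0.5 * x * x) / math.exp(-0.5 * (x - shift) ** 2)
    return total / n

exact = 0.00134989803163  # P(N(0,1) > 3), for reference
pf_mc = mc_pof()
pf_is = is_pof()
```

With a rare failure event, the importance-sampled estimate is far tighter than plain Monte Carlo at a fifth of the samples, which is why adaptive importance sampling is a standard tool in probabilistic structural analysis.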

  3. The Event Coordination Notation: Execution Engine and Programming Framework

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2012-01-01

    ECNO (Event Coordination Notation) is a notation for modelling the behaviour of a software system on top of some object-oriented data model. ECNO has two main objectives: On the one hand, ECNO should allow modelling the behaviour of a system on the domain level; on the other hand, it should... be possible to completely generate code from ECNO and the underlying object-oriented domain models. Today, there are several approaches that would allow to do this. But, most of them would require that the data models and the behaviour models are using the same technology and the code is generated together... that was written manually. In this paper, we rephrase the main concepts of ECNO. The focus of this paper, however, is on the architecture of the ECNO execution engine and its programming framework. We will show how this framework allows us to integrate ECNO with object-oriented models, how it works without any...

  4. Evaluation and Policy Analysis: A Communicative Framework

    Directory of Open Access Journals (Sweden)

    Cynthia Wallat

    1997-07-01

    Full Text Available A major challenge for the next generation of students of human development is to help shape the paradigms by which we analyze and evaluate public policies for children and families. Advocates of building research and policy connections point to health care and stress experiences across home, school, and community as critical policy issues that expand the scope of contexts and outcomes studied. At a minimum, development researchers and practitioners will need to be well versed in available methods of inquiry; they will need to be "methodologically multilingual" when conducting evaluation and policy analysis, producing reports, and reporting their interpretations to consumer and policy audiences. This article suggests how traditional approaches to policy inquiry can be reconsidered in light of these research inquiry and communicative skills needed by all policy researchers. A fifteen-year review of both policy and discourse processes research is presented to suggest ways to conduct policy studies within a communicative framework.

  5. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Logical Framework Analysis (LFA): An Essential Tool for Designing Agricultural Project ... overview of the process and the structure of the Logical Framework Matrix or Logframe, derivable from it, ... System Approach to Managing The Project.

  6. An Analysis of Massachusetts Department of Elementary and Secondary Education Vocational Technical Education Framework for Culinary Arts and Its Effectiveness on Students Enrolled in Post-Secondary Culinary Programs

    Science.gov (United States)

    D'Addario, Albert S.

    2011-01-01

    This field-based action research practicum investigated how students who have completed culinary training programs in Massachusetts public secondary schools perform in post-secondary coursework. The Department of Elementary and Secondary Education has developed the Vocational Technical Education (VTE) Framework for Culinary Arts that outlines…

  7. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    Directory of Open Access Journals (Sweden)

    R. Schmidt

    2012-08-01

    Full Text Available The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of the different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high resolution DTM is showcased.
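One of the hazard-map products mentioned, a slope map derived from a DTM, can be sketched with central differences on a toy elevation grid. The grid values, cell size, and the 45° hazard threshold below are invented for illustration, not LandSAfe's actual parameters.

```python
import math

def slope_deg(dtm, cell=1.0):
    """Slope in degrees at each interior cell of a regular elevation grid,
    from central-difference gradients (borders are left at zero)."""
    rows, cols = len(dtm), len(dtm[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            dzdx = (dtm[r][c + 1] - dtm[r][c - 1]) / (2 * cell)
            dzdy = (dtm[r + 1][c] - dtm[r - 1][c]) / (2 * cell)
            out[r][c] = math.degrees(math.atan(math.hypot(dzdx, dzdy)))
    return out

# Tiny synthetic DTM (elevations in metres on a 1 m grid).
dtm = [[0, 0, 0, 0],
       [0, 1, 2, 2],
       [0, 2, 4, 4],
       [0, 2, 4, 4]]
slopes = slope_deg(dtm)
hazard = [[s > 45.0 for s in row] for row in slopes]   # toy hazard map
```

Thresholding the slope layer yields a boolean hazard map; a real pipeline would combine several such layers (roughness, shadow, boulders) into a composite landing-site score.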

  8. Imperial College near infrared spectroscopy neuroimaging analysis framework.

    Science.gov (United States)

    Orihuela-Espina, Felipe; Leff, Daniel R; James, David R C; Darzi, Ara W; Yang, Guang-Zhong

    2018-01-01

    This paper describes the Imperial College near infrared spectroscopy neuroimaging analysis (ICNNA) software tool for functional near infrared spectroscopy neuroimaging data. ICNNA is a MATLAB-based object-oriented framework encompassing an application programming interface and a graphical user interface. ICNNA incorporates reconstruction based on the modified Beer-Lambert law and basic processing and data validation capabilities. Emphasis is placed on the full experiment rather than individual neuroimages as the central element of analysis. The software offers three types of analyses including classical statistical methods based on comparison of changes in relative concentrations of hemoglobin between the task and baseline periods, graph theory-based metrics of connectivity and, distinctively, an analysis approach based on manifold embedding. This paper presents the different capabilities of ICNNA in its current version.
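The modified Beer-Lambert law reconstruction mentioned in the abstract amounts to inverting a small linear system relating optical-density changes at two wavelengths to HbO2/HHb concentration changes. The extinction coefficients and path length below are rough illustrative numbers, not the values ICNNA ships with.

```python
def mbll(d_od, ext, path_len):
    """Solve the 2x2 modified Beer-Lambert system by hand.
    d_od: [dOD at lambda1, dOD at lambda2]
    ext:  [[eHbO2, eHHb] per wavelength], path_len: effective optical path."""
    a, b = ext[0][0] * path_len, ext[0][1] * path_len
    c, d = ext[1][0] * path_len, ext[1][1] * path_len
    det = a * d - b * c
    d_hbo2 = (d * d_od[0] - b * d_od[1]) / det
    d_hhb = (-c * d_od[0] + a * d_od[1]) / det
    return d_hbo2, d_hhb

# Illustrative extinction coefficients (cm^-1 per mM) at ~690 and ~830 nm.
ext = [[0.35, 2.10],   # 690 nm: HbO2, HHb
       [1.05, 0.78]]   # 830 nm: HbO2, HHb
d_hbo2, d_hhb = mbll([0.01, 0.02], ext, path_len=6.0)
```

Choosing one wavelength where HHb dominates and one where HbO2 dominates keeps the matrix well conditioned, which is why fNIRS devices typically pair a red and a near-infrared source.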

  9. Mississippi Curriculum Framework for Welding and Cutting Programs (Program CIP: 48.0508--Welder/Welding Technologist). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the welding and cutting programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and…

  10. Mississippi Curriculum Framework for Banking & Finance Technology (Program CIP: 52.0803--Banking and Related Financial Programs, Other). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the banking and finance technology program. Presented in the introduction are a program description and suggested course sequence. Section I is a curriculum guide consisting of outlines for…

  11. A Hybrid Programming Framework for Modeling and Solving Constraint Satisfaction and Optimization Problems

    OpenAIRE

    Sitek, Paweł; Wikarek, Jarosław

    2016-01-01

    This paper proposes a hybrid programming framework for modeling and solving of constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with the original method of problem transformation, used in the framework as a presolving method. The transformation substantially reduces the feasible solution space. The framework a...
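The presolving idea, transforming the problem so the feasible solution space shrinks before the solver runs, can be illustrated with simple constraint propagation over variable domains. The toy model, constraints, and objective below are invented; the paper integrates full CLP and MP solvers rather than the brute-force search used here.

```python
from itertools import product

# Toy COP: maximise 3x + 5y subject to x + y == 30 and x >= 2y,
# with x, y each initially ranging over 0..50.
domains = {"x": range(0, 51), "y": range(0, 51)}

def presolve(domains):
    """Keep only domain values that can participate in some feasible pair."""
    xs = [x for x in domains["x"]
          if any(x + y == 30 and x >= 2 * y for y in domains["y"])]
    ys = [y for y in domains["y"]
          if any(x + y == 30 and x >= 2 * y for x in domains["x"])]
    return {"x": xs, "y": ys}

small = presolve(domains)
before = len(domains["x"]) * len(domains["y"])
after = len(small["x"]) * len(small["y"])

# The "MP" step: enumerate the (now much smaller) candidate space.
best = max(((x, y) for x, y in product(small["x"], small["y"])
            if x + y == 30 and x >= 2 * y),
           key=lambda p: 3 * p[0] + 5 * p[1])
```

Here propagation cuts 2601 candidate pairs down to 121 before the optimisation step ever runs, which is the essence of using CLP as a presolver for MP.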

  12. Using Framework Analysis in nursing research: a worked example.

    Science.gov (United States)

    Ward, Deborah J; Furber, Christine; Tierney, Stephanie; Swallow, Veronica

    2013-11-01

    To demonstrate Framework Analysis using a worked example and to illustrate how criticisms of qualitative data analysis including issues of clarity and transparency can be addressed. Critics of the analysis of qualitative data sometimes cite lack of clarity and transparency about analytical procedures; this can deter nurse researchers from undertaking qualitative studies. Framework Analysis is flexible, systematic, and rigorous, offering clarity, transparency, an audit trail, an option for theme-based and case-based analysis and for readily retrievable data. This paper offers further explanation of the process undertaken which is illustrated with a worked example. Data were collected from 31 nursing students in 2009 using semi-structured interviews. The data collected are not reported directly here but used as a worked example for the five steps of Framework Analysis. Suggestions are provided to guide researchers through essential steps in undertaking Framework Analysis. The benefits and limitations of Framework Analysis are discussed. Nurses increasingly use qualitative research methods and need to use an analysis approach that offers transparency and rigour which Framework Analysis can provide. Nurse researchers may find the detailed critique of Framework Analysis presented in this paper a useful resource when designing and conducting qualitative studies. Qualitative data analysis presents challenges in relation to the volume and complexity of data obtained and the need to present an 'audit trail' for those using the research findings. Framework Analysis is an appropriate, rigorous and systematic method for undertaking qualitative analysis. © 2013 Blackwell Publishing Ltd.
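The charting step of Framework Analysis, organising coded excerpts into a case-by-theme matrix that supports both case-based and theme-based reading, can be sketched with plain dictionaries. The participants, themes, and excerpts below are invented, not the study's data.

```python
from collections import defaultdict

# Coded data: (case, theme, excerpt) triples produced by earlier
# familiarisation and indexing steps.
coded = [
    ("student01", "hand hygiene", "I always gel before contact"),
    ("student01", "role models", "my mentor never skips it"),
    ("student02", "hand hygiene", "time pressure makes it hard"),
]

# Case-based view: one row per participant, one cell per theme.
matrix = defaultdict(lambda: defaultdict(list))
for case, theme, excerpt in coded:
    matrix[case][theme].append(excerpt)

# Theme-based view: all excerpts for a theme, tagged with their case,
# so the data stay readily retrievable for the audit trail.
theme_view = defaultdict(list)
for case, themes in matrix.items():
    for theme, excerpts in themes.items():
        theme_view[theme].extend((case, e) for e in excerpts)
```

Because every cell traces back to a case and a theme, the matrix itself is the audit trail the paper emphasises: any summarised finding can be followed back to the verbatim excerpts that support it.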

  13. The international framework for safeguarding peaceful nuclear energy programs

    International Nuclear Information System (INIS)

    Mazer, B.M.

    1980-01-01

    International law, in response to the need for safeguard assurances, has provided a framework which can be utilized by supplier and recipient states. Multilateral treaties have created the International Atomic Energy Agency which can serve a vital role in the establishment and supervision of safeguard agreements for nuclear energy programs. The Non-Proliferation Treaty has created definite obligations on nuclear-weapon and non-nuclear-weapon states to alleviate some possibilities of proliferation and has rejuvenated the function of the IAEA in providing safeguards, especially to non-nuclear-weapon states which are parties to the Non-Proliferation Treaty. States which are not parties to the Non-Proliferation Treaty may receive nuclear energy co-operation subject to IAEA safeguards. States like Canada have insisted through bilateral nuclear energy co-operation agreements that either individual or joint agreement be reached with the IAEA for the application of safeguards. Trilateral treaties among Canada, the recipient state and the IAEA have been employed and can provide the necessary assurances against the diversion of peaceful nuclear energy programs to military or non-peaceful uses. The advent of the Nuclear Suppliers Group and its guidelines has definitely advanced the cause of ensuring peaceful uses of nuclear energy. The ultimate objective should be the creation of an international structure incorporating the application of the most comprehensive safeguards which will be applied universally to all nuclear energy programs

  14. Environmental conditions analysis program

    International Nuclear Information System (INIS)

    Holten, J.

    1991-01-01

    The PC-based program discussed in this paper can determine the steady-state temperatures of environmental zones (rooms). A program overview will be provided along with examples of formula use. Required input and output from the program will also be discussed, as will the specific application of plant-monitored temperatures. The presentation will show how the program can project individual room temperature profiles without continual temperature monitoring of equipment. A discussion will also be provided on the application of the program-generated data, including evaluations of anticipated or planned plant modifications using the subject program

  15. A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications

    Science.gov (United States)

    Llinas, James

    This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all description, to include artificial intelligence processes and techniques could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.

  16. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2012-01-01

    This text provides an excellent bridge between principal theories and concepts and their practical implementation. Topics include convex programming, duality, generalized convexity, analysis of selected nonlinear programs, techniques for numerical solutions, and unconstrained optimization methods.

  17. A Simulation Modeling Framework to Optimize Programs Using Financial Incentives to Motivate Health Behavior Change.

    Science.gov (United States)

    Basu, Sanjay; Kiernan, Michaela

    2016-01-01

    While increasingly popular among mid- to large-size employers, using financial incentives to induce health behavior change among employees has been controversial, in part due to poor quality and generalizability of studies to date. Thus, fundamental questions have been left unanswered: To generate positive economic returns on investment, what level of incentive should be offered for any given type of incentive program and among which employees? We constructed a novel modeling framework that systematically identifies how to optimize marginal return on investment from programs incentivizing behavior change by integrating commonly collected data on health behaviors and associated costs. We integrated "demand curves" capturing individual differences in response to any given incentive with employee demographic and risk factor data. We also estimated the degree of self-selection that could be tolerated: that is, the maximum percentage of already-healthy employees who could enroll in a wellness program while still maintaining positive absolute return on investment. In a demonstration analysis, the modeling framework was applied to data from 3000 worksite physical activity programs across the nation. For physical activity programs, the incentive levels that would optimize marginal return on investment ($367/employee/year) were higher than average incentive levels currently offered ($143/employee/year). Yet a high degree of self-selection could undermine the economic benefits of the program; if more than 17% of participants came from the top 10% of the physical activity distribution, the cost of the program would be expected to always be greater than its benefits. Our generalizable framework integrates individual differences in behavior and risk to systematically estimate the incentive level that optimizes marginal return on investment. © The Author(s) 2015.
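The core of the framework, a demand curve mapping incentive level to enrolment and a search for the incentive level that maximises return, can be sketched as follows. The logistic curve parameters, cohort size, and benefit figure are invented for illustration, not the study's estimates.

```python
import math

def enrol_prob(incentive, midpoint=300.0, steepness=0.01):
    # Invented logistic "demand curve": probability an employee enrols
    # as a function of the annual incentive offered.
    return 1.0 / (1.0 + math.exp(-steepness * (incentive - midpoint)))

def net_return(incentive, n=1000, benefit=500.0):
    # Expected programme return for a cohort of n employees: each enrolee
    # generates `benefit` in health-cost savings but costs `incentive`.
    return n * enrol_prob(incentive) * (benefit - incentive)

# Scan candidate incentive levels for the best expected return.
best = max(range(0, 501, 10), key=net_return)
```

The tension the paper quantifies is visible even in this toy: raising the incentive recruits more employees but shrinks the margin on each, so the optimum sits strictly between zero and the per-enrolee benefit. Self-selection by already-healthy employees would lower `benefit` and shift the optimum further.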

  18. Analysis of legal narratives: a conceptual framework

    NARCIS (Netherlands)

    Sileno, G.; Boer, A.; van Engers, T.; Schäfer, B.

    2012-01-01

    This article presents a conceptual framework intended to describe and to abstract cases or scenarios of compliance and non-compliance. These scenarios are collected in order to be animated in an agent-based platform for purposes of design and validation of both new regulations and new

  19. Linux Incident Response Volatile Data Analysis Framework

    Science.gov (United States)

    McFadden, Matthew

    2013-01-01

    Cyber incident response is an emphasized subject area in cybersecurity in information technology with increased need for the protection of data. Due to ongoing threats, cybersecurity imposes many challenges and requires new investigative response techniques. In this study a Linux Incident Response Framework is designed for collecting volatile data…

  20. A multilevel evolutionary framework for sustainability analysis

    Directory of Open Access Journals (Sweden)

    Timothy M. Waring

    2015-06-01

    Full Text Available Sustainability theory can help achieve desirable social-ecological states by generalizing lessons across contexts and improving the design of sustainability interventions. To accomplish these goals, we argue that theory in sustainability science must (1) explain the emergence and persistence of social-ecological states, (2) account for endogenous cultural change, (3) incorporate cooperation dynamics, and (4) address the complexities of multilevel social-ecological interactions. We suggest that cultural evolutionary theory broadly, and cultural multilevel selection in particular, can improve on these fronts. We outline a multilevel evolutionary framework for describing social-ecological change and detail how multilevel cooperative dynamics can determine outcomes in environmental dilemmas. We show how this framework complements existing sustainability frameworks with a description of the emergence and persistence of sustainable institutions and behavior, a means to generalize causal patterns across social-ecological contexts, and a heuristic for designing and evaluating effective sustainability interventions. We support these assertions with case examples from developed and developing countries in which we track cooperative change at multiple levels of social organization as they impact social-ecological outcomes. Finally, we make suggestions for further theoretical development, empirical testing, and application.

  1. The SBIRT program matrix: a conceptual framework for program implementation and evaluation.

    Science.gov (United States)

    Del Boca, Frances K; McRee, Bonnie; Vendetti, Janice; Damon, Donna

    2017-02-01

    Screening, Brief Intervention and Referral to Treatment (SBIRT) is a comprehensive, integrated, public health approach to the delivery of services to those at risk for the adverse consequences of alcohol and other drug use, and for those with probable substance use disorders. Research on successful SBIRT implementation has lagged behind studies of efficacy and effectiveness. This paper (1) outlines a conceptual framework, the SBIRT Program Matrix, to guide implementation research and program evaluation and (2) specifies potential implementation outcomes. Overview and narrative description of the SBIRT Program Matrix. The SBIRT Program Matrix has five components, each of which includes multiple elements: SBIRT services; performance sites; provider attributes; patient/client populations; and management structure and activities. Implementation outcomes include program adoption, acceptability, appropriateness, feasibility, fidelity, costs, penetration, sustainability, service provision and grant compliance. The Screening, Brief Intervention and Referral to Treatment Program Matrix provides a template for identifying, classifying and organizing the naturally occurring commonalities and variations within and across SBIRT programs, and for investigating which variables are associated with implementation success and, ultimately, with treatment outcomes and other impacts. © 2017 Society for the Study of Addiction.

  2. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  3. VisRseq: R-based visual framework for analysis of sequencing data

    OpenAIRE

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven JM

    2015-01-01

    Background Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However the computational tool set for further analyses often requires significant computational expertise to use and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. Results We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for ...

  4. Cost-effectiveness analysis of the diarrhea alleviation through zinc and oral rehydration therapy (DAZT) program in rural Gujarat India: an application of the net-benefit regression framework.

    Science.gov (United States)

    Shillcutt, Samuel D; LeFevre, Amnesty E; Fischer-Walker, Christa L; Taneja, Sunita; Black, Robert E; Mazumder, Sarmila

    2017-01-01

    This study evaluates the cost-effectiveness of the DAZT program for scaling up treatment of acute child diarrhea in Gujarat India using a net-benefit regression framework. Costs were calculated from societal and caregivers' perspectives and effectiveness was assessed in terms of coverage of zinc and of both zinc and Oral Rehydration Salt. Regression models were tested as simple linear regression, with a specified set of covariates, and with a specified set of covariates and interaction terms; linear regression with endogenous treatment effects was used as the reference case. The DAZT program was cost-effective with over 95% certainty above $5.50 and $7.50 per appropriately treated child in the unadjusted and adjusted models respectively, with specifications including interaction terms being cost-effective with 85-97% certainty. Findings from this study should be combined with other evidence when considering decisions to scale up programs such as the DAZT program to promote the use of ORS and zinc to treat child diarrhea.

  5. Risk and train control : a framework for analysis

    Science.gov (United States)

    2001-01-01

    This report develops and demonstrates a framework for examining the effects of various train control strategies on some of the major risks of railroad operations. Analysis of a hypothetical 1200-mile corridor identified the main factors that increase r...

  6. An analysis of a national strategic framework to promote tourism ...

    African Journals Online (AJOL)

    An analysis of a national strategic framework to promote tourism, leisure, sport and ... is to highlight the extent to which selected macro policy components namely, ... tourism growth, tourism safety and security, environmental management and ...

  7. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). 
The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  8. Geologic Framework Model Analysis Model Report

    International Nuclear Information System (INIS)

    Clayton, R.

    2000-01-01

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). 
The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and

  9. Security Analysis of Parlay/OSA Framework

    NARCIS (Netherlands)

    Corin, R.J.; Di Caprio, G.; Etalle, Sandro; Gnesi, S.; Lenzini, Gabriele; Moiso, C.; Villain, B.

    2004-01-01

    This paper analyzes the security of the Trust and Security Management (TSM) protocol, an authentication protocol which is part of the Parlay/OSA Application Program Interfaces (APIs). Architectures based on Parlay/OSA APIs allow third party service providers to develop new services that can access,

  10. Security Analysis of Parlay/OSA Framework

    NARCIS (Netherlands)

    Corin, R.J.; Di Caprio, G.; Etalle, Sandro; Gnesi, S.; Lenzini, Gabriele; Moiso, C.

    This paper analyzes the security of the Trust and Security Management (TSM) protocol, an authentication protocol which is part of the Parlay/OSA Application Program Interfaces (APIs). Architectures based on Parlay/OSA APIs allow third party service providers to develop new services that can access,

  11. Systems theory as a framework for examining a college campus-based support program for the former foster youth.

    Science.gov (United States)

    Schelbe, Lisa; Randolph, Karen A; Yelick, Anna; Cheatham, Leah P; Groton, Danielle B

    2018-01-01

    Increased attention to former foster youth pursuing post-secondary education has resulted in the creation of college campus-based support programs to address their needs. However, limited empirical evidence and theoretical knowledge exist about these programs. This study describes the application of systems theory as a framework for examining a college campus-based support program for former foster youth. In-depth semi-structured interviews were conducted with 32 program stakeholders, including students, mentors, collaborative members, and independent living program staff. Using qualitative data analysis software, holistic coding techniques were employed to analyze interview transcripts; the data were then analyzed by applying principles of the extended case method through the lens of systems theory. Findings suggest systems theory serves as a framework for understanding the functioning of a college campus-based support program. The theory's concepts help delineate program components and the roles of stakeholders; outline boundaries between, and interactions among, stakeholders; and identify program strengths and weaknesses. Systems theory plays an important role in identifying intervention components and providing a structure through which to identify and understand program elements as part of the planning process. This study highlights the utility of systems theory as a framework for program planning and evaluation.

  12. A Formal Framework for Workflow Analysis

    Science.gov (United States)

    Cravo, Glória

    2010-09-01

    In this paper we provide a new formal framework to model and analyse workflows. A workflow is the formal definition of a business process that consists in the execution of tasks in order to achieve a certain objective. In our work we describe a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. Each task has an associated input/output logic operator, which can be the logical AND (•), the OR (⊗), or the XOR, exclusive-or (⊕). Moreover, we introduce algebraic concepts in order to completely describe the structure of workflows. We also introduce the concept of logical termination. Finally, we provide a necessary and sufficient condition for this property to hold.
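
    The workflow model described above (tasks as vertices, transitions as arcs, a logic operator per task) can be sketched as a small data structure. The termination check below is a deliberately simplified reachability test, not the paper's algebraic necessary-and-sufficient condition; class and method names are mine.

```python
# Minimal sketch of a workflow graph with per-task logic operators.
AND, OR, XOR = "AND", "OR", "XOR"

class Workflow:
    def __init__(self):
        self.tasks = {}   # task name -> (input_op, output_op)
        self.arcs = []    # (from_task, to_task) transitions

    def add_task(self, name, input_op=AND, output_op=AND):
        self.tasks[name] = (input_op, output_op)

    def add_arc(self, src, dst):
        self.arcs.append((src, dst))

    def terminates(self, start, end):
        """Naive check: is the final task reachable from the start?"""
        seen, stack = set(), [start]
        while stack:
            t = stack.pop()
            if t == end:
                return True
            if t in seen:
                continue
            seen.add(t)
            stack.extend(d for s, d in self.arcs if s == t)
        return False

wf = Workflow()
for name in "ABCD":
    wf.add_task(name)
wf.tasks["A"] = (AND, XOR)      # A branches exclusively to B or C
for arc in [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")]:
    wf.add_arc(*arc)
print(wf.terminates("A", "D"))  # True
```

    In the paper's terms, logical termination additionally requires that the operator semantics (e.g. the XOR split at A) always allow some enabled path to the final task, which the algebraic condition captures exactly.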

  13. SPATIAL ANALYSIS FRAMEWORK FOR MANGROVE FORESTS RESTORATION

    Directory of Open Access Journals (Sweden)

    Arimatéa de Carvalho Ximenes

    2016-09-01

    Full Text Available Mangroves are coastal ecosystems in transition between sea and land, located worldwide in tropical and subtropical regions. However, anthropogenic pressure in coastal areas has led to the conversion of many mangrove areas to other uses. Due to increased awareness of the importance of mangroves worldwide, restoration methods are being studied. Our aim is to develop a framework for selecting suitable sites for red mangrove planting using Geographic Information Systems (GIS). For this reason, the methodology is based on abiotic factors that influence the zonation (distribution and growth) of Rhizophora mangle. A total suitable area of 6.12 hectares was found, where 15,300 propagules could be planted.

  14. Programs for nuclear data analysis

    International Nuclear Information System (INIS)

    Bell, R.A.I.

    1975-01-01

    The following report details a number of programs and subroutines which are useful for analysis of data from nuclear physics experiments. Most of them are available from pool pack 005 on the IBM1800 computer. All of these programs are stored there as core loads, and the subroutines and functions in relocatable format. The nature and location of other programs are specified as appropriate. (author)

  15. GAP Analysis Program (GAP)

    Data.gov (United States)

    Kansas Data Access and Support Center — The Kansas GAP Analysis Land Cover database depicts 43 land cover classes for the state of Kansas. The database was generated using a two-stage hybrid classification...

  16. Liquid Effluents Program mission analysis

    International Nuclear Information System (INIS)

    Lowe, S.S.

    1994-01-01

    Systems engineering is being used to identify work to clean up the Hanford Site. The systems engineering process transforms an identified mission need into a set of performance parameters and a preferred system configuration. Mission analysis is the first step in the process. Mission analysis supports early decision-making by clearly defining the program objectives, and evaluating the feasibility and risks associated with achieving those objectives. The results of the mission analysis provide a consistent basis for subsequent systems engineering work. A mission analysis was performed earlier for the overall Hanford Site. This work was continued by a "capstone" team which developed a top-level functional analysis. Continuing in a top-down manner, systems engineering is now being applied at the program and project levels. A mission analysis was conducted for the Liquid Effluents Program. The results are described herein. This report identifies the initial conditions and acceptable final conditions, defines the programmatic and physical interfaces and sources of constraints, estimates the resources to carry out the mission, and establishes measures of success. The mission analysis reflects current program planning for the Liquid Effluents Program as described in the Liquid Effluents FY 1995 Multi-Year Program Plan.

  17. Static Analysis of Functional Programs

    NARCIS (Netherlands)

    van den Berg, Klaas; van den Broek, P.M.

    1994-01-01

    In this paper, the static analysis of programs in the functional programming language Miranda is described based on two graph models. A new control-flow graph model of Miranda definitions is presented, and a model with four classes of callgraphs. Standard software metrics are applicable to these

  18. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims to identify methods of syntax analysis that can be used for computer programming languages, while putting aside the computer devices which influence the choice of programming language and of methods of analysis and compilation. In a first part, the author proposes attempts at formalization of Chomsky grammar languages. In a second part, he studies analytical grammars, and then a compiler or analytic grammar for the Fortran language.

  19. Framework for the analysis of crystallization operations

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; Abdul Samad, Noor Asma Fazli Bin; Gernaey, Krist

    Crystallization is often applied in the production of salts and/or active pharmaceutical ingredients (API), and the crystallization step is an essential part of the manufacturing process for many chemicals-based products. In recent years the monitoring and analysis of crystallization operations has...

  20. An anomaly analysis framework for database systems

    NARCIS (Netherlands)

    Vavilis, S.; Egner, A.I.; Petkovic, M.; Zannone, N.

    2015-01-01

    Anomaly detection systems are usually employed to monitor database activities in order to detect security incidents. These systems raise an alert when anomalous activities are detected. The raised alerts have to be analyzed to timely respond to the security incidents. Their analysis, however, is

  1. XML Graphs in Program Analysis

    DEFF Research Database (Denmark)

    Møller, Anders; Schwartzbach, Michael I.

    2011-01-01

    XML graphs have shown to be a simple and effective formalism for representing sets of XML documents in program analysis. It has evolved through a six year period with variants tailored for a range of applications. We present a unified definition, outline the key properties including validation of XML graphs against different XML schema languages, and provide a software package that enables others to make use of these ideas. We also survey the use of XML graphs for program analysis with four very different languages: XACT (XML in Java), Java Servlets (Web application programming), XSugar...

  2. Fenix, A Fault Tolerant Programming Framework for MPI Applications

    Energy Technology Data Exchange (ETDEWEB)

    2016-10-05

    Fenix provides APIs to allow the users to add fault tolerance capability to MPI-based parallel programs in a transparent manner. Fenix-enabled programs can run through process failures during program execution using a pool of spare processes accommodated by Fenix.

  3. Establishing a framework for comparative analysis of genome sequences

    Energy Technology Data Exchange (ETDEWEB)

    Bansal, A.K.

    1995-06-01

    This paper describes a framework and a high-level language toolkit for comparative analysis of genome sequence alignments. The framework integrates the information derived from a multiple sequence alignment and a phylogenetic tree (hypothetical tree of evolution) to derive new properties about sequences. Multiple sequence alignments are treated as an abstract data type. Abstract operations have been described to manipulate a multiple sequence alignment and to derive mutation-related information from a phylogenetic tree by superimposing parsimonious analysis. The framework has been applied to protein alignments to derive constrained columns (in a multiple sequence alignment) that exhibit evolutionary pressure to preserve a common property in a column despite mutation. A Prolog toolkit based on the framework has been implemented and demonstrated on alignments containing 3000 sequences and 3904 columns.
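
    One operation from this framework, scanning an alignment for constrained columns, can be sketched directly. Here the preserved property is a coarse amino-acid class (hydrophobic vs. other); the toy alignment and function names are mine, and the real framework additionally superimposes phylogenetic (parsimony) information.

```python
# Find alignment columns that preserve a common property despite mutation.
HYDROPHOBIC = set("AVLIMFWC")

def column_classes(column):
    """Set of coarse residue classes present in one alignment column."""
    return {"hydrophobic" if aa in HYDROPHOBIC else "other" for aa in column}

def constrained_columns(alignment):
    """Indices of gap-free columns whose residues all share one class."""
    return [i for i, col in enumerate(zip(*alignment))
            if "-" not in col and len(column_classes(col)) == 1]

alignment = [
    "MKV-LA",
    "MRV-IA",
    "MKI-LG",
]
print(constrained_columns(alignment))  # → [0, 1, 2, 4]
```

    Column 1, for instance, mutates K to R across the sequences yet stays in the same class, which is exactly the "evolutionary pressure despite mutation" signal the framework looks for.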

  4. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    OpenAIRE

    Schmidt, Ralph; Bostelmann, Jonas; Cornet, Yves; Heipke, Christian; Philippe, Christian; Poncelet, Nadia; de Rosa, Diego; Vandeloise, Yannick

    2012-01-01

    The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk assoc...

  5. Seismic analysis program group: SSAP

    International Nuclear Information System (INIS)

    Uchida, Masaaki

    2002-05-01

    A group of programs, SSAP, has been developed, each member of which performs seismic calculations using a simple single-mass system model or a multi-mass system model. For the response of structures to a transverse S-wave, a single-mass model program calculating the response spectrum and a multi-mass model program are available. They perform calculations using the output of another program, which produces simulated earthquakes having the so-called Ohsaki-spectrum characteristic. Another program has been added, which calculates the response of one-dimensional multi-mass systems to vertical P-wave input. It places particular emphasis on the analysis of the phenomena observed in some shallow earthquakes in which stones jump off the ground. Through a series of test calculations using these programs, some interesting information has been derived concerning the validity of superimposing single-mass model calculations, and also the condition for stones to jump. (author)
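
    The single-mass response-spectrum calculation mentioned above can be sketched as follows: integrate the oscillator equation u'' + 2ζωu' + ω²u = -a_g(t) by the central-difference method and record the peak displacement for each natural period. The short sine pulse standing in for a simulated earthquake record, and the function name, are mine, not SSAP's.

```python
# Peak response of a damped single-degree-of-freedom oscillator to a
# ground-acceleration record, via central-difference time stepping.
import math

def peak_response(ag, dt, period, zeta=0.05):
    w = 2 * math.pi / period
    u_prev, u, peak = 0.0, 0.0, 0.0
    for a in ag:
        # Central difference applied to u'' + 2*z*w*u' + w^2*u = -a:
        u_next = (
            (-a - w * w * u) * dt * dt
            + 2 * u - u_prev
            + zeta * w * dt * u_prev
        ) / (1 + zeta * w * dt)
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

dt = 0.01
# 2 Hz sine pulse, 2 s long, as a stand-in for a simulated earthquake.
ag = [math.sin(2 * math.pi * 2.0 * i * dt) for i in range(200)]
spectrum = {T: peak_response(ag, dt, T) for T in (0.1, 0.5, 2.0)}
print(spectrum)
```

    As expected, the oscillator whose natural period matches the 2 Hz excitation (T = 0.5 s) shows the largest spectral ordinate; sweeping many periods in this way traces out the response spectrum.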

  6. A Framework for Collaborative Networked Learning in Higher Education: Design & Analysis

    Directory of Open Access Journals (Sweden)

    Ghassan F. Issa

    2014-06-01

    Full Text Available This paper presents a comprehensive framework for building collaborative learning networks within higher educational institutions. This framework focuses on system design and implementation issues, in addition to a complete set of evaluation and analysis tools. The objective of this project is to improve the standards of higher education in Jordan through the implementation of transparent, collaborative, innovative, and modern quality educational programs. The framework highlights the major steps required to plan, design, and implement collaborative learning systems. Several issues are discussed, such as the unification of courses and programs of study, the use of an appropriate learning management system, software development using the Agile methodology, infrastructure design, access issues, proprietary data storage, and social network analysis (SNA) techniques.

  7. Strategy analysis frameworks for strategy orientation and focus

    OpenAIRE

    Isoherranen, V. (Ville)

    2012-01-01

    Abstract The primary research target of this dissertation is to develop new strategy analysis frameworks, focusing on analysing changes in strategic position as a function of variations in life cycle s-curve/time/typology/market share/orientation. Research is constructive and qualitative by nature, with case study methodology being the adopted approach. The research work is carried out as a compilation dissertation containing four (4) journal articles. The theoretical framework of thi...

  8. A framework for intelligent reliability centered maintenance analysis

    International Nuclear Information System (INIS)

    Cheng Zhonghua; Jia Xisheng; Gao Ping; Wu Su; Wang Jianzhao

    2008-01-01

    To improve the efficiency of reliability-centered maintenance (RCM) analysis, case-based reasoning (CBR), a kind of artificial intelligence (AI) technology, was successfully introduced into the RCM analysis process, and a framework for intelligent RCM analysis (IRCMA) was studied. The idea behind IRCMA is that the historical records of RCM analysis on similar items can be referenced and used for the current RCM analysis of a new item. Because many common or similar items may exist in the analyzed equipment, the repeated tasks of RCM analysis can be considerably simplified or avoided by revising similar cases when conducting RCM analysis. Based on previous theoretical studies, an intelligent RCM analysis system (IRCMAS) prototype was developed. This research focuses on the definition, basic principles, and framework of IRCMA, and discusses critical techniques in IRCMA. Finally, the IRCMAS prototype is presented based on a case study.
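
    The case-retrieval step at the heart of such a CBR approach can be sketched simply: represent each analyzed item by its attributes, then reuse the RCM analysis attached to the most similar historical case. The attributes, cases, and names below are invented for illustration; a real IRCMA system would use richer similarity measures and case revision.

```python
# Nearest-case retrieval over a toy RCM case base.
def similarity(a, b):
    """Fraction of matching attributes between two cases."""
    return sum(a[k] == b[k] for k in a) / len(a)

case_base = [
    ({"type": "pump", "medium": "oil", "duty": "continuous"}, "analysis-P1"),
    ({"type": "pump", "medium": "water", "duty": "standby"}, "analysis-P2"),
    ({"type": "valve", "medium": "oil", "duty": "continuous"}, "analysis-V1"),
]

def retrieve(new_item):
    """Return the (features, analysis) pair most similar to the new item."""
    return max(case_base, key=lambda case: similarity(new_item, case[0]))

new_item = {"type": "pump", "medium": "oil", "duty": "standby"}
best_features, best_analysis = retrieve(new_item)
print(best_analysis)
```

    The retrieved analysis then becomes the starting point that the analyst revises for the new item, instead of performing the full RCM analysis from scratch.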

  9. Overview of the NRC/EPRI common cause analysis framework

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Worledge, D.H.; Mosleh, A.; Fleming, K.; Parry, G.W.; Paula, H.

    1988-01-01

    This paper presents an overview of a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures whose causes are not explicitly included in the logic model as basic events. The emphasis here is on providing guidelines for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework comprises four major stages: (1) Logic Model Development, (2) Identification of Common Cause Component Groups, (3) Common Cause Modeling and Data Analysis, and (4) Quantification and Interpretation of Results. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. 25 references
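
    The quantification stage of such a framework can be illustrated with the beta-factor model, one simple parametric common-cause model (the framework itself admits richer models): a fraction β of each component's failure probability is assumed to fail the whole common cause component group at once. The function name and numbers are mine.

```python
# Beta-factor quantification for a 1-out-of-2 redundant pair.
def one_of_two_unavailability(q_total, beta):
    """Failure probability of a redundant pair with common cause failure."""
    q_ind = (1 - beta) * q_total   # independent part of each component
    q_ccf = beta * q_total         # common-cause part, fails both at once
    # System fails if both fail independently, or the common cause hits.
    return q_ind ** 2 + q_ccf

q = 1e-3
print(one_of_two_unavailability(q, 0.0))   # independent failures only
print(one_of_two_unavailability(q, 0.1))   # with beta = 0.1
```

    Even a modest β dominates the result (here roughly 1e-4 versus 1e-6 without common cause), which is why the framework insists that common cause component groups be identified and quantified rather than left implicit in the logic model.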

  10. Program Analysis Scenarios in Rascal

    NARCIS (Netherlands)

    M.A. Hills (Mark); P. Klint (Paul); J.J. Vinju (Jurgen); F. Durán

    2012-01-01

    Rascal is a meta programming language focused on the implementation of domain-specific languages and on the rapid construction of tools for software analysis and software transformation. In this paper we focus on the use of Rascal for software analysis. We illustrate a range of scenarios

  11. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

    Full Text Available This paper shows how a systematic approach to software testing using the static code analysis method can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics which are based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe the method in terms of the C++ language, with the Qt platform currently being used. One of the most important metrics is the so-called software complexity. Applying software complexity calculations using both the McCabe and Halstead methods to the BCI framework, which consists of two important types of BCI, namely SSVEP and P300, we found that there are two classes in the framework which are very complex and prone to violation of the cohesion principle in OOP. The other metrics fit the criteria of the proposed framework aspects, such as: MPC is less than 20; average complexity is around 5; and the maximum depth is below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
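
    McCabe's cyclomatic complexity, named above, counts linearly independent paths: one plus the number of decision points. The study applies it to C++; as a language-neutral illustration, the same count can be sketched for Python source using its standard ast module (the choice of which node types count as decisions is a common convention, not the paper's exact rule).

```python
# Cyclomatic complexity = 1 + number of decision points in the code.
import ast

DECISIONS = (ast.If, ast.For, ast.While, ast.And, ast.Or,
             ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source):
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(tree))

src = """
def classify(x):
    if x < 0:
        return "negative"
    for d in range(2, x):
        if x % d == 0:
            return "composite"
    return "prime-ish"
"""
print(cyclomatic_complexity(src))  # 1 + if + for + if = 4
```

    A threshold on this number (the paper's "average complexity around 5") then flags classes or functions that are hard to test and candidates for refactoring.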

  12. A New Take on Program Planning: A Faculty Competencies Framework

    Science.gov (United States)

    Sanford, Rania; Kinch, Amy Fowler

    2016-01-01

    Building on previous studies on the changing nature of faculty work, this article presents a conceptual framework for faculty professional success. The authors report on the perceptions of 300 faculty development professionals regarding the importance of skills in nine competency domains: teaching; research; leadership; diversity, inclusion and…

  13. ANALYSIS FRAMEWORKS OF THE COLLABORATIVE INNOVATION PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Dan SERGHIE

    2014-12-01

    Full Text Available Time management is one of the resources by which we can achieve improved innovation performance. This perspective of resource management and process efficiency, by reducing the incubation time of ideas, selecting profitable innovations, and turning them into added value, relates to absolute time, a time specific to human existence. In this article I will try to prove that the main way to obtain high performance through inter-organizational innovation is by manipulating the context and manipulating knowledge outside the arbitrary concept of “time”. This article presents the results of research suggesting a sequential analysis and evaluation model of performance through a rational and refined process of selection of performance indicators, aiming at providing the shortest and most relevant list of criteria.

  14. Politics of energy and the NEP: a framework and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Toner, G B

    1984-01-01

    This dissertation examines the nature and evolution of Canadian energy politics, with a focus on the 1973-1983 period and on the oil and gas aspects of energy. The conceptual basis for the analysis is the development and application of an integrated framework for the study of energy politics in Canada. The introduction of the National Energy Program (NEP) by the federal Liberal government in October 1980 marked a significant conjuncture in the development of Canadian energy politics. The NEP was intended to be a signal of a revitalized central government as well as a bargaining stance in the ongoing price and revenue-sharing negotiations. Thus, the NEP must be understood as first and foremost a political act. This research suggests that energy politics must be understood as the outcome of conflict and consensus, within the government-industry and intergovernmental relationships of power, over the ability to influence and control energy developments. To attempt to explain energy politics as essentially the outcome of interaction between government and industry, with intergovernmental relations simply reflecting intra-industry competition, or conversely, to explain energy politics as merely the toing and froing of competing governments, is to present a fundamentally flawed portrayal of Canadian energy politics. That is, the dynamic force driving energy politics in Canada is a three-sided set of competitive relations between governments and the industry.

  15. Development of comprehensive and versatile framework for reactor analysis, MARBLE

    International Nuclear Information System (INIS)

    Yokoyama, Kenji; Hazama, Taira; Numata, Kazuyuki; Jin, Tomoyuki

    2014-01-01

    Highlights: • We have developed a neutronics code system for reactor analysis. • The new code system covers all five phases of the core design procedures. • All the functionalities are integrated and validated in the same framework. • The framework supports continuous improvement and extension. • We report results of validation and practical applications. - Abstract: A comprehensive and versatile reactor analysis code system, MARBLE, has been developed. MARBLE is designed as a software development framework for reactor analysis, which offers reusable and extendible functions and data models based on physical concepts, rather than a reactor analysis code system. From a viewpoint of the code system, it provides a set of functionalities utilized in a detailed reactor analysis scheme for fast criticality assemblies and power reactors, and nuclear data related uncertainty quantification such as cross-section adjustment. MARBLE includes five sub-systems named ECRIPSE, BIBLO, SCHEME, UNCERTAINTY and ORPHEUS, which are constructed of the shared functions and data models in the framework. By using these sub-systems, MARBLE covers all phases required in fast reactor core design prediction and improvement procedures, i.e. integral experiment database management, nuclear data processing, fast criticality assembly analysis, uncertainty quantification, and power reactor analysis. In the present paper, these functionalities are summarized and system validation results are described

  16. A framework for analysis of sentinel events in medical student education.

    Science.gov (United States)

    Cohen, Daniel M; Clinchot, Daniel M; Werman, Howard A

    2013-11-01

    Although previous studies have addressed student factors contributing to dismissal or withdrawal from medical school for academic reasons, little information is available regarding institutional factors that may hinder student progress. The authors describe the development and application of a framework for sentinel event (SE) root cause analysis to evaluate cases in which students are dismissed or withdraw because of failure to progress in the medical school curriculum. The SE in medical student education (MSE) framework was piloted at the Ohio State University College of Medicine (OSUCOM) during 2010-2012. Faculty presented cases using the framework during academic oversight committee discussions. Nine SEs in MSE were presented using the framework. Major institution-level findings included the need for improved communication, documentation of cognitive and noncognitive (e.g., mental health) issues, clarification of requirements for remediation and fitness for duty, and additional psychological services. Challenges related to alternative and combined programs were identified as well. The OSUCOM undertook system changes based on the action plans developed through the discussions of these SEs. An SE analysis process appears to be a useful method for making system changes in response to institutional issues identified in evaluation of cases in which students fail to progress in the medical school curriculum. The authors plan to continue to refine the SE in MSE framework and analysis process. Next steps include assessing whether analysis using this framework yields improved student outcomes with universal applications for other institutions.

  17. Translating policies into practice: a framework to prevent childhood obesity in afterschool programs.

    Science.gov (United States)

    Beets, Michael W; Webster, Collin; Saunders, Ruth; Huberty, Jennifer L

    2013-03-01

    Afterschool programs (3-6 p.m.) are positioned to play a critical role in combating childhood obesity. To this end, state and national organizations have developed policies related to promoting physical activity and guiding the nutritional quality of snacks served in afterschool programs. No conceptual frameworks, however, are available that describe the process of how afterschool programs will translate such policies into daily practice to reach eventual outcomes. Drawing from complex systems theory, this article describes the development of a framework that identifies critical modifiable levers within afterschool programs that can be altered and/or strengthened to reach policy goals. These include the policy environment at the national, state, and local levels; individual site, afterschool program leader, staff, and child characteristics; and existing outside organizational partnerships. Use of this framework and recognition of its constituent elements have the potential to lead to the successful and sustainable adoption and implementation of physical activity and nutrition policies in afterschool programs nationwide.

  18. Matlab programming for numerical analysis

    CERN Document Server

    Lopez, Cesar

    2014-01-01

    MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. Programming MATLAB for Numerical Analysis introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. You will first become

  19. R data analysis without programming

    CERN Document Server

    Gerbing, David W

    2013-01-01

    This book prepares readers to analyze data and interpret statistical results using R more quickly than other texts. R is a challenging program to learn because code must be created to get started. To alleviate that challenge, Professor Gerbing developed lessR. The lessR extensions remove the need to program. By introducing R through lessR, readers learn how to organize data for analysis, read the data into R, and produce output without performing numerous functions and programming exercises first. With lessR, readers can select the necessary procedure and change the relevant variables without pro

  20. Building Campus Communities Inclusive of International Students: A Framework for Program Development

    Science.gov (United States)

    Jameson, Helen Park; Goshit, Sunday

    2017-01-01

    This chapter provides readers with a practical, how-to approach and framework for developing inclusive, intercultural training programs for student affairs professionals on college campuses in the United States.

  1. A Framework for Sentiment Analysis Implementation of Indonesian Language Tweet on Twitter

    Science.gov (United States)

    Asniar; Aditya, B. R.

    2017-01-01

    Sentiment analysis is the process of understanding, extracting, and processing textual data automatically to obtain information. Sentiment analysis can be used to see opinion on an issue and to identify the response to something. Millions of items of digital data are still left unused for providing useful information, especially for government. Sentiment analysis in government is used to monitor government work programs, such as those of the Government of Bandung City, through social media data. The analysis can be used quickly as a tool to see the public response to the work programs, so that the next strategic steps can be taken. This paper adopts the Support Vector Machine as a supervised algorithm for sentiment analysis. It presents a framework for sentiment analysis implementation of Indonesian-language tweets on Twitter for the work programs of the Government of Bandung City. The results of this paper can serve as a reference for decision making in local government.
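
    The record names the Support Vector Machine as the supervised algorithm but gives no implementation details. As a minimal, self-contained illustration (not the authors' code), the sketch below trains a linear SVM by Pegasos-style stochastic subgradient descent on the hinge loss over bag-of-words features; the tiny labeled "tweets", the vocabulary, and the hyperparameters are all invented for the example.

```python
import random

# Toy labeled tweets (hypothetical examples): +1 = positive, -1 = negative.
DATA = [
    ("program kerja bagus sekali", 1),
    ("pelayanan cepat dan bagus", 1),
    ("terima kasih program hebat", 1),
    ("jalan rusak pelayanan buruk", -1),
    ("program lambat dan buruk", -1),
    ("kecewa pelayanan lambat", -1),
]

def featurize(text, vocab):
    # Bag-of-words count vector over a fixed vocabulary.
    toks = text.split()
    return [float(toks.count(w)) for w in vocab]

def train_linear_svm(data, vocab, lam=0.01, epochs=200, seed=0):
    # Pegasos-style stochastic subgradient descent on the hinge loss:
    # a minimal stand-in for the SVM training a library would provide.
    rng = random.Random(seed)
    w = [0.0] * len(vocab)
    t = 0
    for _ in range(epochs):
        for text, y in rng.sample(data, len(data)):
            t += 1
            eta = 1.0 / (lam * t)
            x = featurize(text, vocab)
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            w = [(1 - eta * lam) * wi for wi in w]       # regularization step
            if margin < 1:                               # hinge subgradient step
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

def predict(w, vocab, text):
    s = sum(wi * xi for wi, xi in zip(w, featurize(text, vocab)))
    return 1 if s >= 0 else -1

vocab = sorted({tok for text, _ in DATA for tok in text.split()})
w = train_linear_svm(DATA, vocab)
```

    A real deployment would replace the toy data with labeled tweets and the count features with, for example, TF-IDF weights, but the decision rule and training objective are the same.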

  2. Mississippi Curriculum Framework for Fashion Marketing Technology (Program CIP: 08.0101--Apparel and Accessories Mkt. Op., Gen.). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the fashion marketing technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies,…

  3. Mississippi Curriculum Framework for Veterinary Technology (Program CIP: 51.0808--Veterinarian Asst./Animal Health). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the veterinary technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and section II consists of…

  4. A Comparative Analysis of Competency Frameworks for Youth Workers in the Out-of-School Time Field

    OpenAIRE

    Vance, Femi

    2010-01-01

    Research suggests that the quality of out-of-school time (OST) programs is related to positive youth outcomes and skilled staff are a critical component of high quality programming. This descriptive case study of competency frameworks for youth workers in the OST field demonstrates how experts and practitioners characterize a skilled youth worker. A comparative analysis of 11 competency frameworks is conducted to identify a set of common core competencies. A set of 12 competency areas that ar...

  5. Framework for a National Testing and Evaluation Program ...

    Science.gov (United States)

    Abstract: The National STEPP Program seeks to improve water quality by accelerating the effective implementation and adoption of innovative stormwater management technologies. It will attempt to accomplish this by establishing highly reliable and cost-effective stormwater control measure (SCM) testing, evaluation, and verification services. The program will aim to remove barriers to innovation, minimize duplicative performance evaluation needs, increase confidence that regulatory requirements are met by creating consistency among testing and evaluation protocols, and establish equity between public-domain and proprietary SCM evaluation approaches. The Environmental Technology Verification Program, established by the U.S. Environmental Protection Agency (EPA) 18 years ago, was the only national program of its kind in the stormwater sector, but it is now defunct, leaving a national leadership void. The STEPP initiative was triggered in part by regulatory demands in the government and private sectors to fill this vacuum. A concerted focus and study of this matter led to the release of a Water Environment Federation (WEF) white paper entitled “Investigation into the Feasibility of a National Testing and Evaluation Program for Stormwater Products and Practices” in February 2014. During this second phase of the STEPP initiative, and with EPA support, five analogous technology evaluation programs related to both stormwater and non-stormwater were an

  6. On the non-proliferation framework of Japan's peaceful nuclear utilization program

    International Nuclear Information System (INIS)

    Kano, Takashi

    1996-01-01

    The Conference of the States Party to the Treaty on the Non-proliferation of Nuclear Weapons (hereinafter referred to as the NPT) convened in New York, from April 17 to May 12, 1995 and decided that the NPT shall continue in force indefinitely, after reviewing the operation and affirming some aspects of the NPT, while emphasizing the ''Decision on Strengthening the Review Process'' for the NPT and the ''Decision on Principles and Objectives for Nuclear Non-proliferation and Disarmament,'' also adopted by the Conference. In parallel, Japan made its basic non-proliferation policy clear in the ''Long-Term Program for Research, Development and Utilization of Nuclear Energy'' which was decided by the Atomic Energy Commission (chaired by Mikio Oomi, then Minister of the Science and Technology Agency of Japan) in June 1994. The Long-Term Program discusses various problems facing post-Cold-War international society and describes Japan's policy for establishing international confidence concerning non-proliferation. This paper summarizes Japan's non-proliferation policy as articulated in the Long-Term Program, and describes some results of an analysis comparing the Long-Term Program with the resolutions on the international non-proliferation frameworks adopted by the NPT conference

  7. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...
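
    HistFitter's own API is not reproduced here, but the control/signal-region idea the record describes can be illustrated with a minimal counting experiment: a signal region whose expected yield is mu*s + b and a control region that constrains the background b. The grid-scan maximum-likelihood fit below is a stdlib-only sketch, not HistFitter code; the observed counts, expected signal, and scan ranges are invented for the example.

```python
import math

def poisson_logpmf(n, lam):
    # Log of the Poisson probability mass function.
    return n * math.log(lam) - lam - math.lgamma(n + 1)

def fit_mu(n_sr, n_cr, s_exp, tau=1.0):
    """Grid-scan the joint likelihood of a signal region (SR) and a
    control region (CR) that constrains the background b.
    SR expectation: mu * s_exp + b; CR expectation: tau * b."""
    best = (-float("inf"), None, None)
    for i in range(401):            # signal strength mu in [0, 4]
        mu = i * 0.01
        for j in range(401):        # background b in [30, 70]
            b = 30.0 + j * 0.1
            ll = (poisson_logpmf(n_sr, mu * s_exp + b)
                  + poisson_logpmf(n_cr, tau * b))
            if ll > best[0]:
                best = (ll, mu, b)
    return best[1], best[2]

# 70 events observed in the SR, 50 in the CR, 10 expected signal events.
mu_hat, b_hat = fit_mu(n_sr=70, n_cr=50, s_exp=10.0)
```

    The analytic maximum here is mu = 2, b = 50; a real tool fits many regions and nuisance parameters simultaneously with a proper minimizer and profile-likelihood tests rather than a grid scan.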

  8. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSALOF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Full Text Available Content analysis (CA) is a popular approach among researchers from different areas, but incipient in project management (PM). However, the volume of usage apparently does not translate into application quality. The method receives constant criticism about the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigates the use of CA in PM research. To accomplish this goal, a systematic literature review is combined with a CA of 23 articles from the EBSCO database over the last 20 years (1996 – 2016). The findings showed that the proposed framework can help researchers apply CA better and suggest that the use of the method, in terms of quantity and quality, in PM research should be expanded. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM over the last 20 years.

  9. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    Deregulated electricity markets provide opportunities for Battery Systems (BS) to participate in energy arbitrage and ancillary services (regulation, operating reserves, contingency reserves, voltage regulation, power quality, etc.). To evaluate the economic viability of BS with different business opportunities, a generic modelling framework is proposed to handle this task. This framework outlines a set of building blocks which are necessary for carrying out the economic analysis of various BS applications. Further, special focus is given to describing how to use the rainflow cycle counting algorithm for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so…
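
    The rainflow cycle counting algorithm mentioned in this record can be sketched in a few lines. The following is a simplified, ASTM E1049-style implementation for illustration; a production battery-life model would then map the counted ranges to degradation through a cycle-life curve, which is not shown here.

```python
def turning_points(series):
    # Keep only local extrema; collapse monotone runs and repeated values.
    tp = [series[0]]
    for x in series[1:]:
        if x == tp[-1]:
            continue
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (x - tp[-1]) > 0:
            tp[-1] = x            # still moving in the same direction
        else:
            tp.append(x)
    return tp

def rainflow(series):
    """Simplified ASTM E1049-style rainflow counting.
    Returns a list of (range, count) pairs; count is 0.5 or 1.0."""
    cycles, stack = [], []
    for p in turning_points(series):
        stack.append(p)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])
            y = abs(stack[-2] - stack[-3])
            if x < y:
                break
            if len(stack) == 3:
                cycles.append((y, 0.5))   # half cycle at the start of the record
                stack.pop(0)
            else:
                cycles.append((y, 1.0))   # full closed cycle
                del stack[-3:-1]          # drop the two points that formed it
    # Any ranges left on the stack count as half cycles.
    cycles += [(abs(b - a), 0.5) for a, b in zip(stack, stack[1:])]
    return cycles
```

    Summing the counts, weighted by a depth-of-discharge stress function, yields the cycle-life consumption that feeds the economic analysis.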

  10. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  11. A Probabilistic Analysis Framework for Malicious Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Kammuller, Florian; Nemli, Ibrahim

    2015-01-01

    Malicious insider threats are difficult to detect and to mitigate. Many approaches for explaining behaviour exist, but there is little work to relate them to formal approaches to insider threat detection. In this work we present a general formal framework to perform analysis for malicious insider...

  12. A comparative analysis of protected area planning and management frameworks

    Science.gov (United States)

    Per Nilsen; Grant Tayler

    1997-01-01

    A comparative analysis of the Recreation Opportunity Spectrum (ROS), Limits of Acceptable Change (LAC), a Process for Visitor Impact Management (VIM), Visitor Experience and Resource Protection (VERP), and the Management Process for Visitor Activities (known as VAMP) decision frameworks examines their origins; methodology; use of factors, indicators, and standards;...

  13. Agricultural Value Chains in Developing Countries; a Framework for Analysis

    NARCIS (Netherlands)

    Trienekens, J.H.

    2011-01-01

    The paper presents a framework for developing country value chain analysis made up of three components. The first consists of identifying major constraints for value chain upgrading: market access restrictions, weak infrastructures, lacking resources and institutional voids. In the second component

  14. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  15. Generic Formal Framework for Compositional Analysis of Hierarchical Scheduling Systems

    DEFF Research Database (Denmark)

    Boudjadar, Jalil; Hyun Kim, Jin; Thi Xuan Phan, Linh

    We present a compositional framework for the specification and analysis of hierarchical scheduling systems (HSS). Firstly we provide a generic formal model, which can be used to describe any type of scheduling system. The concept of Job automata is introduced in order to model job instantiation...

  16. A framework for monitoring social process and outcomes in environmental programs.

    Science.gov (United States)

    Chapman, Sarah

    2014-12-01

    When environmental programs frame their activities as being in the service of human wellbeing, social variables need to be integrated into monitoring and evaluation (M&E) frameworks. This article draws upon ecosystem services theory to develop a framework to guide the M&E of collaborative environmental programs with anticipated social benefits. The framework has six components: program need, program activities, pathway process variables, moderating process variables, outcomes, and program value. Needs are defined in terms of ecosystem services, as well as other human needs that must be addressed to achieve outcomes. The pathway variable relates to the development of natural resource governance capacity in the target community. Moderating processes can be externalities such as the inherent capacity of the natural system to service ecosystem needs, local demand for natural resources, policy or socio-economic drivers. Internal program-specific processes relate to program service delivery, targeting and participant responsiveness. Ecological outcomes are expressed in terms of changes in landscape structure and function, which in turn influence ecosystem service provision. Social benefits derived from the program are expressed in terms of the value of the eco-social service to user-specified goals. The article provides suggestions from the literature for identifying indicators and measures for components and component variables, and concludes with an example of how the framework was used to inform the M&E of an adaptive co-management program in western Kenya. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Transactional Analysis: Conceptualizing a Framework for Illuminating Human Experience

    Directory of Open Access Journals (Sweden)

    Trevor Thomas Stewart PhD

    2011-09-01

    Full Text Available Myriad methods exist for analyzing qualitative data. It is, however, imperative for qualitative researchers to employ data analysis tools that are congruent with the theoretical frameworks underpinning their inquiries. In this paper, I have constructed a framework for analyzing data that could be useful for researchers interested in focusing on the transactional nature of language as they engage in Social Science research. Transactional Analysis (TA) is an inductive approach to data analysis that transcends constant comparative methods of exploring data. Drawing on elements of narrative and thematic analysis, TA uses the theories of Bakhtin and Rosenblatt to attend to the dynamic processes researchers identify as they generate themes in their data and seek to understand how their participants' worldviews are being shaped. This paper highlights the processes researchers can utilize to study the mutual shaping that occurs as participants read and enter into dialogue with the world around them.

  18. Designing Educational Games for Computer Programming: A Holistic Framework

    Science.gov (United States)

    Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios

    2014-01-01

    Computer science is continuously evolving during the past decades. This has also brought forth new knowledge that should be incorporated and new learning strategies must be adopted for the successful teaching of all sub-domains. For example, computer programming is a vital knowledge area within computer science with constantly changing curriculum…

  19. Programming-Languages as a Conceptual Framework for Teaching Mathematics

    Science.gov (United States)

    Feurzeig, Wallace; Papert, Seymour A.

    2011-01-01

    Formal mathematical methods remain, for most high school students, mysterious, artificial and not a part of their regular intuitive thinking. The authors develop some themes that could lead to a radically new approach. According to this thesis, the teaching of programming languages as a regular part of academic progress can contribute effectively…

  20. Planetary protection in the framework of the Aurora exploration program

    Science.gov (United States)

    Kminek, G.

    The Aurora Exploration Program will give ESA new responsibilities in the field of planetary protection. Until now, ESA had only limited exposure to planetary protection from its own missions. With the proposed ExoMars and MSR missions, however, ESA will enter the realm of the highest planetary protection categories. As a consequence, the Aurora Exploration Program has initiated a number of activities in the field of planetary protection. The first and most important step was to establish a Planetary Protection Working Group (PPWG) that is advising the Exploration Program Advisory Committee (EPAC) on all matters concerning planetary protection. The main task of the PPWG is to provide recommendations regarding: Planetary protection for robotic missions to Mars; Planetary protection for a potential human mission to Mars; Review/evaluate standards & procedures for planetary protection; Identify research needs in the field of planetary protection. As a result of the PPWG deliberations, a number of activities have been initiated: Evaluation of the Microbial Diversity in SC Facilities; Working paper on legal issues of planetary protection and astrobiology; Feasibility study on a Mars Sample Return Containment Facility; Research activities on sterilization procedures; Training course on planetary protection (May, 2004); Workshop on sterilization techniques (fall 2004). In parallel to the PPWG, the Aurora Exploration Program has established an Ethical Working Group (EWG). This working group will address ethical issues related to astrobiology, planetary protection, and manned interplanetary missions. The recommendations of the working groups and the results of the R&D activities form the basis for defining planetary protection specification for Aurora mission studies, and for proposing modification and new inputs to the COSPAR planetary protection policy. Close cooperation and free exchange of relevant information with the NASA planetary protection program is strongly

  1. Object-oriented data analysis framework for neutron scattering experiments

    International Nuclear Information System (INIS)

    Suzuki, Jiro; Nakatani, Takeshi; Ohhara, Takashi; Inamura, Yasuhiro; Yonemura, Masao; Morishima, Takahiro; Aoyagi, Tetsuo; Manabe, Atsushi; Otomo, Toshiya

    2009-01-01

    The Materials and Life Science Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC) is one of the facilities that provide the highest-intensity pulsed neutron and muon beams. The MLF computing environment design group organizes the computing environments of MLF and its instruments. It is important that the computing environment be provided by the facility side, because meta-data formats, analysis functions, and the overall data analysis strategy should be shared among the many instruments in MLF. The C++ class library named Manyo-lib is a framework for developing data reduction and analysis software. The framework is composed of the class library for data reduction and analysis operators, network-distributed data processing modules, and data containers. The class library is wrapped by a Python interface created with SWIG. All classes of the framework can be called from the Python language, and Manyo-lib cooperates with the data acquisition and data visualization components through the MLF platform, a user interface unified in MLF, which runs on the Python language. Raw data in the event-data format obtained by the data acquisition systems are converted into histogram-format data on Manyo-lib with high performance, and data reduction and analysis are performed with user application software developed on the basis of Manyo-lib. We enforce standardization of data containers with Manyo-lib, and many additional fundamental data containers in Manyo-lib have been designed and developed. Experimental and analysis data in the data containers can be converted into NeXus files. Manyo-lib is the standard framework for developing analysis software in MLF, and prototypes of data analysis software for each instrument are being developed by the instrument teams.
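
    Manyo-lib's actual containers and APIs are not reproduced here. As a language-neutral illustration of the event-to-histogram reduction step the record describes, the following sketch bins a stream of event-mode time values into a fixed-width histogram; the function name and arguments are invented for the example.

```python
def events_to_histogram(events, t_min, t_max, n_bins):
    """Bin event-mode data (a stream of time-of-flight values) into a
    fixed-width histogram. Returns (bin_edges, counts); events outside
    the half-open interval [t_min, t_max) are discarded."""
    width = (t_max - t_min) / n_bins
    counts = [0] * n_bins
    for t in events:
        if t_min <= t < t_max:
            counts[int((t - t_min) / width)] += 1
    edges = [t_min + i * width for i in range(n_bins + 1)]
    return edges, counts

edges, counts = events_to_histogram([0.5, 1.5, 1.6, 9.9, 10.0],
                                    t_min=0.0, t_max=10.0, n_bins=10)
```

    A facility framework adds to this the shared container formats, wiring to the acquisition system, and NeXus export that the abstract describes.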

  2. A framework for evaluating and designing citizen science programs for natural resources monitoring.

    Science.gov (United States)

    Chase, Sarah K; Levine, Arielle

    2016-06-01

    We present a framework of resource characteristics critical to the design and assessment of citizen science programs that monitor natural resources. To develop the framework we reviewed 52 citizen science programs that monitored a wide range of resources and provided insights into what resource characteristics are most conducive to developing citizen science programs and how resource characteristics may constrain the use or growth of these programs. We focused on 4 types of resource characteristics: biophysical and geographical, management and monitoring, public awareness and knowledge, and social and cultural characteristics. We applied the framework to 2 programs, the Tucson (U.S.A.) Bird Count and the Maui (U.S.A.) Great Whale Count. We found that resource characteristics such as accessibility, diverse institutional involvement in resource management, and social or cultural importance of the resource affected program endurance and success. However, the relative influence of each characteristic was in turn affected by goals of the citizen science programs. Although the goals of public engagement and education sometimes complemented the goal of collecting reliable data, in many cases trade-offs must be made between these 2 goals. Program goals and priorities ultimately dictate the design of citizen science programs, but for a program to endure and successfully meet its goals, program managers must consider the diverse ways that the nature of the resource being monitored influences public participation in monitoring. © 2016 Society for Conservation Biology.

  3. An Analysis of Programming Beginners' Source Programs

    Science.gov (United States)

    Matsuyama, Chieko; Nakashima, Toyoshiro; Ishii, Naohiro

    The production of animations was made the subject of a university programming course in order to make students understand the process of program creation and so that students could tackle programming with interest. In this paper, the formats and composition of the programs that students produced were investigated. As a result, it was found that there were many problems related to matters such as how to use indentation and how to apply comments and functions in the format and composition of the source code.

  4. MOOC Success Factors: Proposal of an Analysis Framework

    Directory of Open Access Journals (Sweden)

    Margarida M. Marques

    2017-10-01

    Full Text Available Aim/Purpose: From an idea of lifelong-learning-for-all to a phenomenon affecting higher education, Massive Open Online Courses (MOOCs) can be the next step to a truly universal education. Indeed, MOOC enrolment rates can be astoundingly high; still, their completion rates are frequently disappointingly low. Nevertheless, as courses, the participants’ enrolment and learning within the MOOCs must be considered when assessing their success. In this paper, the authors’ aim is to reflect on what makes a MOOC successful and to propose an analysis framework of MOOC success factors. Background: A literature review was conducted to identify reported MOOC success factors and to propose an analysis framework. Methodology: This literature-based framework was tested against data from a specific MOOC and refined within a qualitative interpretivist methodology. The data were collected from the ‘As alterações climáticas nos média escolares - Clima@EduMedia’ course, which was developed by the project Clima@EduMedia and was submitted to content analysis. This MOOC aimed to support science and school media teachers in the use of media to teach climate change. Contribution: By proposing a framework of MOOC success factors, the authors attempt to help fill a literature gap concerning the criteria for considering a specific MOOC successful. Findings: This work’s major finding is a literature-based and empirically refined framework for the analysis of MOOC success factors. Recommendations for Practitioners: The proposed framework is also a set of best practices relevant to MOOC developers, particularly when targeting teachers as potential participants. Recommendation for Researchers: This work’s relevance is also based on its contribution to increasing empirical research on MOOCs. Impact on Society: By providing a proposal of a framework on factors to make a MOOC successful, the authors hope to contribute to the quality of MOOCs. Future Research: Future

  5. Combinatorial-topological framework for the analysis of global dynamics

    Science.gov (United States)

    Bush, Justin; Gameiro, Marcio; Harker, Shaun; Kokubu, Hiroshi; Mischaikow, Konstantin; Obayashi, Ippei; Pilarczyk, Paweł

    2012-12-01

    We discuss an algorithmic framework based on efficient graph algorithms and algebraic-topological computational tools. The framework is aimed at automatic computation of a database of global dynamics of a given m-parameter semidynamical system with discrete time on a bounded subset of the n-dimensional phase space. We introduce the mathematical background, which is based upon Conley's topological approach to dynamics, describe the algorithms for the analysis of the dynamics using rectangular grids both in phase space and parameter space, and show two sample applications.
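
    As a toy illustration of the grid-based outer-approximation idea this record describes (not the authors' database software), the sketch below covers a 1-D phase space with boxes, builds the directed transition graph of a map, and returns its recurrent strongly connected components, which play the role of candidate Morse sets. The endpoint sampling and padding are a crude stand-in for the rigorous interval arithmetic such tools actually use.

```python
def strongly_connected(graph):
    # Tarjan's algorithm; keep only SCCs with more than one node or a
    # self-loop (the "recurrent" boxes).
    index, low, on_stack, stack = {}, {}, set(), []
    sccs, counter = [], [0]

    def visit(v):
        index[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); on_stack.add(v)
        for u in graph[v]:
            if u not in index:
                visit(u)
                low[v] = min(low[v], low[u])
            elif u in on_stack:
                low[v] = min(low[v], index[u])
        if low[v] == index[v]:
            comp = []
            while True:
                u = stack.pop(); on_stack.discard(u); comp.append(u)
                if u == v:
                    break
            if len(comp) > 1 or v in graph[v]:
                sccs.append(sorted(comp))

    for v in graph:
        if v not in index:
            visit(v)
    return sccs

def morse_sets(f, lo, hi, n_boxes):
    """Outer-approximate a 1-D map f on [lo, hi) with n_boxes equal boxes,
    build the directed transition graph, and return its recurrent
    strongly connected components (candidate Morse sets)."""
    w = (hi - lo) / n_boxes
    edges = {}
    for i in range(n_boxes):
        a, b = lo + i * w, lo + (i + 1) * w
        # Sampled endpoints plus half-box padding: a crude outer bound.
        ys = [f(a + k * (b - a) / 8) for k in range(9)]
        y0, y1 = min(ys) - w / 2, max(ys) + w / 2
        j0 = max(0, int((y0 - lo) / w))
        j1 = min(n_boxes - 1, int((y1 - lo) / w))
        edges[i] = list(range(j0, j1 + 1))
    return strongly_connected(edges)

# Contraction toward 0: recurrent boxes should cluster around the fixed point.
sccs = morse_sets(lambda x: 0.5 * x, -1.0, 1.0, 16)
```

    Refining the grid shrinks the recurrent components toward the true invariant set; the published framework does this rigorously in n dimensions and over parameter space, and attaches Conley-index information to each component.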

  7. A Decision Support Framework for Feasibility Analysis of International Space Station (ISS) Research Capability Enhancing Options

    Science.gov (United States)

    Ortiz, James N.; Scott,Kelly; Smith, Harold

    2004-01-01

    The assembly and operation of the ISS has generated significant challenges that have ultimately impacted resources available to the program's primary mission: research. To address this, program personnel routinely perform trade-off studies of alternative options to enhance research. The approach, level of analysis, and resulting outputs of these studies vary due to many factors, however, complicating the Program Manager's job of selecting the best option. In response, the program requested that a framework be developed to evaluate multiple research-enhancing options in a thorough, disciplined, and repeatable manner, and to identify the best option on the basis of cost, benefit, and risk. The resulting framework consisted of a systematic methodology and a decision-support toolset. The framework provides a quantifiable and repeatable means of ranking research-enhancing options for the complex, multiple-constraint domain of the space research laboratory. This paper describes the development, verification, and validation of this framework and provides observations on its operational use.
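
    The paper's toolset is not published, but the kind of cost/benefit/risk ranking it describes can be illustrated with a simple weighted-sum score. The options, scores, and weights below are entirely hypothetical; a real study would also document how each score was elicited.

```python
def score(option, weights):
    # Higher benefit raises the score; higher cost and risk lower it.
    return (weights["benefit"] * option["benefit"]
            - weights["cost"] * option["cost"]
            - weights["risk"] * option["risk"])

def rank_options(options, weights):
    # Best option first.
    return sorted(options, key=lambda o: score(o, weights), reverse=True)

# Hypothetical research-enhancing options, each scored 0-10 per criterion.
OPTIONS = [
    {"name": "add crew research hours", "benefit": 8, "cost": 6, "risk": 4},
    {"name": "new sample freezer",      "benefit": 6, "cost": 3, "risk": 2},
    {"name": "extra cargo flight",      "benefit": 9, "cost": 9, "risk": 7},
]
WEIGHTS = {"benefit": 0.5, "cost": 0.3, "risk": 0.2}

ranked = rank_options(OPTIONS, WEIGHTS)
```

    Varying the weights and re-ranking is a cheap sensitivity check, which is one reason weighted-sum scoring is a common backbone for this kind of decision support.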

  8. Framework for Interactive Parallel Dataset Analysis on the Grid

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, David A.; Ananthan, Balamurali; /Tech-X Corp.; Johnson, Tony; Serbo, Victor; /SLAC

    2007-01-10

    We present a framework for use at a typical Grid site to facilitate custom interactive parallel dataset analysis targeting terabyte-scale datasets of the type typically produced by large multi-institutional science experiments. We summarize the needs for interactive analysis and show a prototype solution that satisfies those needs. The solution consists of a desktop client tool and a set of Web Services that allow scientists to sign onto a Grid site, compose analysis script code to carry out physics analysis on datasets, distribute the code and datasets to worker nodes, collect the results back at the client, and construct professional-quality visualizations of the results.
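
    The Web Services themselves are site-specific, but the scatter/gather pattern they implement (partition a dataset, run the analysis code on workers, merge the partial results) can be sketched with Python's standard thread pool; the partitioning scheme and summary statistics here are illustrative only.

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_partition(partition):
    # Stand-in "analysis script": summarize one partition of the dataset.
    return {"events": len(partition), "sum": sum(partition)}

def run_analysis(dataset, n_workers=4):
    """Split the dataset into partitions, farm each partition out to a
    worker, and merge the partial results -- the scatter/gather pattern
    a Grid site applies across worker nodes."""
    partitions = [dataset[i::n_workers] for i in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = list(pool.map(analyze_partition, partitions))
    return {"events": sum(p["events"] for p in partials),
            "sum": sum(p["sum"] for p in partials)}

result = run_analysis(list(range(1, 101)))
```

    On a Grid the "pool" is a set of remote nodes and the merge step must also ship code and data, but the control flow the client sees is the same.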

  9. Mississippi Curriculum Framework for Dental Hygiene Technology (Program CIP: 51.0602--Dental Hygienist). Postsecondary Education.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the dental hygiene technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies. Section II…

  10. The Data-to-Action Framework: A Rapid Program Improvement Process

    Science.gov (United States)

    Zakocs, Ronda; Hill, Jessica A.; Brown, Pamela; Wheaton, Jocelyn; Freire, Kimberley E.

    2015-01-01

    Although health education programs may benefit from quality improvement methods, scant resources exist to help practitioners apply these methods for program improvement. The purpose of this article is to describe the Data-to-Action framework, a process that guides practitioners through rapid-feedback cycles in order to generate actionable data to…

  11. Detecting spatial patterns of rivermouth processes using a geostatistical framework for near-real-time analysis

    Science.gov (United States)

    Xu, Wenzhao; Collingsworth, Paris D.; Bailey, Barbara; Carlson Mazur, Martha L.; Schaeffer, Jeff; Minsker, Barbara

    2017-01-01

    This paper proposes a geospatial analysis framework and software to interpret water-quality sampling data from towed undulating vehicles in near-real time. The framework includes data quality assurance and quality control processes, automated kriging interpolation along undulating paths, and local hotspot and cluster analyses. These methods are implemented in an interactive Web application developed using the Shiny package in the R programming environment to support near-real time analysis along with 2- and 3-D visualizations. The approach is demonstrated using historical sampling data from an undulating vehicle deployed at three rivermouth sites in Lake Michigan during 2011. The normalized root-mean-square error (NRMSE) of the interpolation averages approximately 10% in 3-fold cross validation. The results show that the framework can be used to track river plume dynamics and provide insights on mixing, which could be related to wind and seiche events.
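The 3-fold cross-validated NRMSE quoted above can be illustrated with a small stand-alone sketch. Since the paper's kriging runs in R/Shiny, this uses plain inverse-distance weighting in place of kriging purely as a stand-in; the synthetic transect data and all parameter choices are assumptions:

```python
import numpy as np

def idw_predict(train_xy, train_z, query_xy, power=2):
    """Inverse-distance-weighted prediction (a stand-in for kriging)."""
    preds = []
    for q in query_xy:
        d = np.linalg.norm(train_xy - q, axis=1)
        d = np.maximum(d, 1e-9)              # avoid division by zero at a sample
        w = 1.0 / d**power
        preds.append(np.sum(w * train_z) / np.sum(w))
    return np.array(preds)

def nrmse_cv(xy, z, k=3, seed=0):
    """k-fold cross-validated NRMSE, normalised by the observed range."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(z))
    folds = np.array_split(idx, k)
    errors = []
    for f in folds:
        mask = np.ones(len(z), bool)
        mask[f] = False                      # hold out this fold
        pred = idw_predict(xy[mask], z[mask], xy[f])
        errors.append((pred - z[f])**2)
    rmse = np.sqrt(np.mean(np.concatenate(errors)))
    return rmse / (z.max() - z.min())

# Synthetic "sampling transect": a smooth water-quality field.
xy = np.random.default_rng(1).uniform(0, 10, size=(60, 2))
z = np.sin(xy[:, 0]) + 0.1 * xy[:, 1]
print(round(nrmse_cv(xy, z), 3))
```

Holding out each fold in turn and normalising the RMSE by the data range is the same validation scheme behind the ~10% figure reported in the abstract.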

  12. Coordination of Programs on Domestic Animal Genomics: The Federal Framework

    Science.gov (United States)

    2004-06-01

    programs, especially for dairy and beef cattle, layer and broiler chickens, and swine. As DNA-based technologies were developed in the 1970s and 1980s...Wellcome Trust in Great Britain, has created a map of genetic variation for three different strains of domestic chickens. The strains were a broiler strain...present the biggest problem, as the status of asymptomatic carriers may not be suspected until several litters have been produced. This includes diseases

  13. Analysis of Worldwide Regulatory Framework for On-Line Maintenance

    International Nuclear Information System (INIS)

    Ahn, Sang Kyu; Oh, Kyu Myung; Lee, Chang Ju

    2010-01-01

    With increasing economic pressure and the potential for shortened outage times under the deregulated electricity markets emerging worldwide, licensees are motivated to perform an increasing amount of on-line maintenance (OLM). OLM is planned maintenance of nuclear reactor facilities, including structures, systems, and components (SSCs), carried out during power operation. Korea faces a similar situation and therefore needs to establish a regulatory framework for OLM. A few years ago, the OLM practices of foreign countries were surveyed by the Working Group on Inspection Practices (WGIP) of the OECD/NEA/CNRA. This paper analyzes those survey results together with new information on each country's current status, both of which will be helpful in establishing our own regulatory framework for OLM. From the analysis, some points to be addressed in establishing a regulatory framework for OLM are suggested

  14. A Program Transformation for Backwards Analysis of Logic Programs

    DEFF Research Database (Denmark)

    Gallagher, John Patrick

    2003-01-01

    The input to backwards analysis is a program together with properties that are required to hold at given program points. The purpose of the analysis is to derive initial goals or pre-conditions that guarantee that, when the program is executed, the given properties hold. The solution for logic programs presented here is based on a transformation of the input program, which makes explicit the dependencies of the given program points on the initial goals. The transformation is derived from the resultants semantics of logic programs. The transformed program is then analysed using a standard...

  15. Framework for assessing causality in disease management programs: principles.

    Science.gov (United States)

    Wilson, Thomas; MacDowell, Martin

    2003-01-01

    To credibly state that a disease management (DM) program "caused" a specific outcome it is required that metrics observed in the DM population be compared with metrics that would have been expected in the absence of a DM intervention. That requirement can be very difficult to achieve, and epidemiologists and others have developed guiding principles of causality by which credible estimates of DM impact can be made. This paper introduces those key principles. First, DM program metrics must be compared with metrics from a "reference population." This population should be "equivalent" to the DM intervention population on all factors that could independently impact the outcome. In addition, the metrics used in both groups should use the same defining criteria (ie, they must be "comparable" to each other). The degree to which these populations fulfill the "equivalent" assumption and metrics fulfill the "comparability" assumption should be stated. Second, when "equivalence" or "comparability" is not achieved, the DM managers should acknowledge this fact and, where possible, "control" for those factors that may impact the outcome(s). Finally, it is highly unlikely that one study will provide definitive proof of any specific DM program value for all time; thus, we strongly recommend that studies be ongoing, at multiple points in time, and at multiple sites, and, when observational study designs are employed, that more than one type of study design be utilized. Methodologically sophisticated studies that follow these "principles of causality" will greatly enhance the reputation of the important and growing efforts in DM.

  16. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    Science.gov (United States)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. The tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems, as well as to develop system requirements and determine their effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also provide a common environment for capturing an increasingly complex operational architecture involving multiple spacecraft, ground stations, and communication networks. To address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions among spacecraft. This paper describes the development of this framework and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. The developed package demonstrates the capability of the modeling framework, including its modularity, its analysis capabilities, and its unification back to the overall system requirements and definition.

  17. A Hybrid Programming Framework for Modeling and Solving Constraint Satisfaction and Optimization Problems

    Directory of Open Access Journals (Sweden)

    Paweł Sitek

    2016-01-01

    Full Text Available This paper proposes a hybrid programming framework for modeling and solving constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with an original method of problem transformation, used in the framework as a presolving method; the transformation substantially reduces the feasible solution space. The framework automatically generates CSP and COP models based on current values of data instances, questions asked by a user, and the set of predicates and facts of the problem being modeled, which together constitute a knowledge base for the given problem. This dynamic generation of dedicated models, based on the knowledge base together with externally changing parameters such as the user's questions, is an implementation of the autonomous search concept. The models are solved using internal or external solvers integrated with the framework. The architecture of the framework and an outline of its implementation are also included in the paper. The effectiveness of the framework for modeling and solution search is assessed through illustrative examples relating to scheduling problems with additional constrained resources.
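A minimal sketch of what "modeling and solving a CSP" means in practice: a toy backtracking search over declaratively stated constraints. This is illustrative only and does not reflect the CLP/MP integration or the presolving transformation of the actual framework:

```python
def solve_csp(variables, domains, constraints, assignment=None):
    """Backtracking search: constraints are predicates over partial assignments."""
    if assignment is None:
        assignment = {}
    if len(assignment) == len(variables):
        return assignment                    # every variable bound: solution found
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        assignment[var] = value
        if all(c(assignment) for c in constraints):
            result = solve_csp(variables, domains, constraints, assignment)
            if result is not None:
                return result
        del assignment[var]                  # backtrack
    return None

# Toy scheduling CSP: three tasks in slots 1-3, no two tasks share a slot,
# and task "c" must run after task "a".
variables = ["a", "b", "c"]
domains = {v: [1, 2, 3] for v in variables}
constraints = [
    lambda asg: len(set(asg.values())) == len(asg),                    # all-different
    lambda asg: "a" not in asg or "c" not in asg or asg["c"] > asg["a"],
]
print(solve_csp(variables, domains, constraints))  # → {'a': 1, 'b': 2, 'c': 3}
```

The constraints are written so they hold vacuously on partial assignments, which lets the solver prune inconsistent branches as early as possible.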

  18. A programming framework for data streaming on the Xeon Phi

    Science.gov (United States)

    Chapeland, S.; ALICE Collaboration

    2017-10-01

    ALICE (A Large Ion Collider Experiment) is the dedicated heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). After the second long shut-down of the LHC, the ALICE detector will be upgraded to cope with an interaction rate of 50 kHz in Pb-Pb collisions, producing in the online computing system (O2) a sustained throughput of 3.4 TB/s. This data will be processed on the fly so that the stream to permanent storage does not exceed 90 GB/s peak, the raw data being discarded. In the context of assessing different computing platforms for the O2 system, we have developed a framework for the Intel Xeon Phi processors (MIC). It provides the components to build a processing pipeline streaming the data from the PC memory to a pool of permanent threads running on the MIC, and back to the host after processing. It is based on explicit offloading mechanisms (data transfer, asynchronous tasks) and basic building blocks (FIFOs, memory pools, C++11 threads). The user only needs to implement the processing method to be run on the MIC. We present in this paper the architecture, implementation, and performance of this system.
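The pipeline pattern described above (FIFOs feeding a pool of permanent worker threads, with results streamed back to the host) can be sketched as a host-only toy. Python threads stand in for the C++11 threads and explicit offloading mechanisms of the real MIC framework, and the "processing" step is a placeholder:

```python
import queue
import threading

def stream_pipeline(blocks, n_workers=4):
    """Host-to-worker-to-host pipeline sketch: an input FIFO feeds a pool of
    permanent worker threads; results are collected on an output FIFO."""
    in_fifo, out_fifo = queue.Queue(), queue.Queue()

    def worker():
        while True:
            item = in_fifo.get()
            if item is None:                 # sentinel: shut this worker down
                in_fifo.task_done()
                break
            idx, data = item
            out_fifo.put((idx, sum(data)))   # stand-in "processing" step
            in_fifo.task_done()

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for i, block in enumerate(blocks):       # "offload" the data blocks
        in_fifo.put((i, block))
    for _ in threads:                        # one sentinel per worker
        in_fifo.put(None)
    for t in threads:
        t.join()

    results = {}
    while not out_fifo.empty():              # gather results back on the host
        idx, value = out_fifo.get()
        results[idx] = value
    return [results[i] for i in range(len(blocks))]

print(stream_pipeline([[1, 2], [3, 4], [5, 6]]))  # → [3, 7, 11]
```

Tagging each block with its index lets the host reassemble results in order even though the workers complete them asynchronously.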

  19. A Synthetic Biology Framework for Programming Eukaryotic Transcription Functions

    Science.gov (United States)

    Khalil, Ahmad S.; Lu, Timothy K.; Bashor, Caleb J.; Ramirez, Cherie L.; Pyenson, Nora C.; Joung, J. Keith; Collins, James J.

    2013-01-01

    SUMMARY Eukaryotic transcription factors (TFs) perform complex and combinatorial functions within transcriptional networks. Here, we present a synthetic framework for systematically constructing eukaryotic transcription functions using artificial zinc fingers, modular DNA-binding domains found within many eukaryotic TFs. Utilizing this platform, we construct a library of orthogonal synthetic transcription factors (sTFs) and use these to wire synthetic transcriptional circuits in yeast. We engineer complex functions, such as tunable output strength and transcriptional cooperativity, by rationally adjusting a decomposed set of key component properties, e.g., DNA specificity, affinity, promoter design, protein-protein interactions. We show that subtle perturbations to these properties can transform an individual sTF between distinct roles (activator, cooperative factor, inhibitory factor) within a transcriptional complex, thus drastically altering the signal processing behavior of multi-input systems. This platform provides new genetic components for synthetic biology and enables bottom-up approaches to understanding the design principles of eukaryotic transcriptional complexes and networks. PMID:22863014

  20. Storing Clocked Programs Inside DNA A Simplifying Framework for Nanocomputing

    CERN Document Server

    Chang, Jessica

    2011-01-01

    In the history of modern computation, large mechanical calculators preceded computers. A person would sit there punching keys according to a procedure and a number would eventually appear. Once calculators became fast enough, it became obvious that the critical path was the punching rather than the calculation itself. That is what made the stored program concept vital to further progress. Once the instructions were stored in the machine, the entire computation could run at the speed of the machine. This book shows how to do the same thing for DNA computing. Rather than asking a robot or a pers

  1. A versatile Moessbauer analysis program

    International Nuclear Information System (INIS)

    Jernberg, P.; Sundqvist, T.

    1983-06-01

    MDA (Moessbauer Data Analysis) is a user-oriented computer program that simulates a Moessbauer transmission spectrum from a given set of parameters and compares it with experimental data. The calculation accounts for a number of experimental situations, and the comparison can be made via least-squares sums or by plotting the simulated and measured spectra together. A fitting routine that minimizes the least-squares sum can be used to find the parameters characterizing the measured spectrum. (author)
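The least-squares comparison of a simulated and a measured transmission spectrum, as MDA performs, can be illustrated with a single Lorentzian absorption line. The line-shape parameters and the simple grid-scan fitting strategy below are illustrative assumptions, not MDA's actual minimization routine:

```python
import numpy as np

def lorentzian(v, v0, gamma, depth, baseline):
    """Transmission spectrum with a single Lorentzian absorption dip."""
    return baseline - depth * gamma**2 / ((v - v0)**2 + gamma**2)

def fit_centroid(velocity, counts, gamma, depth, baseline):
    """Scan candidate centroids and return the one minimizing the
    least-squares sum between simulated and measured spectra."""
    candidates = np.linspace(velocity.min(), velocity.max(), 2001)
    sums = [np.sum((counts - lorentzian(velocity, v0, gamma, depth, baseline))**2)
            for v0 in candidates]
    return candidates[int(np.argmin(sums))]

# Synthetic "measured" spectrum with the line centred at v0 = 0.35 mm/s.
v = np.linspace(-4, 4, 256)
measured = lorentzian(v, 0.35, 0.2, 5000.0, 20000.0)
measured += np.random.default_rng(0).normal(0, 30, v.size)  # counting noise

print(round(fit_centroid(v, measured, 0.2, 5000.0, 20000.0), 2))
```

Even with noise added, the least-squares sum has a sharp minimum at the true centroid, which is what makes this comparison criterion effective for spectrum fitting.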

  2. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    Full Text Available A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing will be involved in this framework: multiple local distributed computing environments connected by local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters and connected together in a multi-level hierarchy and then coordinated over the internet. The software framework for supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to perform the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable to perform the simulation of the multi-scale structural analysis.

  3. The radiation protection research within the fourth Framework Program of the European Union (1994-1998)

    International Nuclear Information System (INIS)

    Siunaeve, J.; Mingot, F.; Arranz, L.; Cancio, D.

    1995-01-01

    The next research program on Radiation Protection within the Fourth Framework Program of the European Union was approved by the Council last December (O.J. No. L 361, 12/31/94). The program includes important changes in its structure as well as in the way it is implemented in Europe. The most important change is that the main activities concerning Nuclear Safety, Waste Management, and Radiation Protection have been included in a single program called Nuclear Fission Safety. The program also includes specific work with CIS countries on the management of the Chernobyl consequences, as well as of other significant contamination in other areas of the former Soviet Union. (Author)

  4. Stochastic programming framework for Lithuanian pension payout modelling

    Directory of Open Access Journals (Sweden)

    Audrius Kabašinskas

    2014-12-01

    Full Text Available The paper provides a scientific approach to the problem of selecting a pension fund, taking into account some specific characteristics of the Lithuanian Republic (LR) pension accumulation system. A decision-making model that can be used to plan the long-term pension accrual of LR citizens in an optimal way is presented. The model focuses on factors that influence the sustainability of the pension system selection under macroeconomic, social, and demographic uncertainty. It is formalized as a single-stage stochastic optimization problem in which the long-term optimal strategy is obtained from the scenarios generated for a particular participant. Stochastic programming methods make it possible to include the timing and direction of pension fund rebalancing, and to take into account possible changes in personal income, in society, and in the global financial market. The collection of methods used to generate scenario trees was found useful for solving strategic planning problems.
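A single-stage scenario-based formulation of this kind can be sketched as follows. The return model, the mean-variance utility, and the grid search over allocations are illustrative assumptions, much simpler than the scenario trees used in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Scenario generation: yearly returns for a risky fund and a bond fund
# under macroeconomic uncertainty (the numbers are purely illustrative).
n_scenarios, years = 1000, 20
risky = rng.normal(0.06, 0.15, (n_scenarios, years))
bonds = rng.normal(0.03, 0.03, (n_scenarios, years))

def terminal_wealth(alpha, contribution=1.0):
    """Accrued pension wealth per scenario for equity share alpha,
    with a fixed yearly contribution and annual rebalancing."""
    wealth = np.zeros(n_scenarios)
    for t in range(years):
        wealth = (wealth + contribution) * (1 + alpha * risky[:, t]
                                              + (1 - alpha) * bonds[:, t])
    return wealth

def objective(alpha, risk_aversion=0.05):
    """Expected terminal wealth penalised by its variance across scenarios."""
    w = terminal_wealth(alpha)
    return w.mean() - risk_aversion * w.var()

# Single-stage stochastic program: pick the best equity share on a grid.
alphas = np.linspace(0, 1, 21)
best = max(alphas, key=objective)
print(f"optimal equity share: {best:.2f}")
```

Because the decision is taken once against the whole scenario set, a simple grid search suffices here; the paper's multi-period rebalancing would instead require optimizing over a scenario tree.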

  5. Establishing a regulatory framework for a RCRA corrective action program

    International Nuclear Information System (INIS)

    Krueger, J.W.

    1989-01-01

    Recently, the environmental community has become keenly aware of problems associated with integrating the demanding regulations that apply to environmental restoration activities. One cannot attend an EPA-sponsored conference on Superfund without hearing questions concerning the Resource Conservation and Recovery Act (RCRA) and the applicability of the National Contingency Plan (NCP) to sites that do not qualify for the National Priorities List (NPL). In particular, the U.S. Department of Energy (DOE) has been greatly criticized for its inability to define a comprehensive approach for cleaning up its hazardous waste sites. This article presents two decision flowcharts designed to resolve some of this confusion for DOE. The RCRA/CERCLA integration diagram can help the environmental manager determine which law applies and under what conditions, and the RCRA corrective action decision flowchart can guide the manager in determining which specific sections of RCRA apply to a RCRA-lead environmental restoration program.

  6. Second generation CO2 FEP analysis: CASSIF, Carbon Sequestration Scenario Identification Framework

    NARCIS (Netherlands)

    Yavuz, F.T.; Tilburg, T. van; Pagnier, H.

    2008-01-01

    A novel scenario analysis framework has been created, called Carbon Sequestration Scenario Identification Framework (CASSIF). This framework addresses containment performance defined by the three major categories: well, fault and seal integrity. The relevant factors that influence the integrity are

  7. Developing a Framework for Evaluating Organizational Information Assurance Metrics Programs

    Science.gov (United States)

    2007-03-01

    of analysis, which is one of the stated goals for this research (Miles and Huberman, 1994, p. 10; Yin, 2003). Benbasat et al. (1987) found "case... [Electronic version]. Michalewicz, Z., and Fogel, D. B. (2002). How to Solve It: Modern Heuristics. Berlin: Springer-Verlag. Miles, M. B., and Huberman, A. M. (1994). Qualitative Data Analysis: An Expanded Sourcebook. Thousand Oaks: Sage Publications. Miller, R. L., and Hope, S. A

  8. Environmental risk analysis for nanomaterials: Review and evaluation of frameworks

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    2012-01-01

    to occupational settings with minor environmental considerations, and most have not been thoroughly tested on a wide range of NM. Care should also be taken when selecting the most appropriate risk analysis strategy for a given risk context. Given this, we recommend a multi-faceted approach to assess the environmental risks of NM, as well as increased applications and testing of the proposed frameworks for different NM.

  9. Subseabed-disposal program: systems-analysis program plan

    International Nuclear Information System (INIS)

    Klett, R.D.

    1981-03-01

    This report contains an overview of the Subseabed Nuclear Waste Disposal Program systems analysis program plan, and includes sensitivity, safety, optimization, and cost/benefit analyses. Details of the primary barrier sensitivity analysis and the data acquisition and modeling cost/benefit studies are given, as well as the schedule through the technical, environmental, and engineering feasibility phases of the program

  10. Defining Smart City. A Conceptual Framework Based on Keyword Analysis

    Directory of Open Access Journals (Sweden)

    Farnaz Mosannenzadeh

    2014-05-01

    Full Text Available “Smart city” is a concept that has been the subject of increasing attention in urban planning and governance during recent years. The first step in creating Smart Cities is to understand the concept itself. However, a brief review of the literature shows that the concept of Smart City is the subject of controversy. Thus, the main purpose of this paper is to provide a conceptual framework to define Smart City. To this aim, an extensive literature review was conducted. Then, a keyword analysis of the literature was performed against the main research questions (why, what, who, when, where, how) and based on the three main domains involved in the policy decision-making process and Smart City plan development: academic, industrial, and governmental. This resulted in a conceptual framework for Smart City. The result clarifies the definition of Smart City while providing a framework to define each of Smart City’s sub-systems. Moreover, urban authorities can apply this framework in Smart City initiatives in order to recognize their main goals, main components, and key stakeholders.

  11. NET-2 Network Analysis Program

    International Nuclear Information System (INIS)

    Malmberg, A.F.

    1974-01-01

    The NET-2 Network Analysis Program is a general purpose digital computer program which solves the nonlinear time domain response and the linearized small signal frequency domain response of an arbitrary network of interconnected components. NET-2 is capable of handling a variety of components and has been applied to problems in several engineering fields, including electronic circuit design and analysis, missile flight simulation, control systems, heat flow, fluid flow, mechanical systems, structural dynamics, digital logic, communications network design, solid state device physics, fluidic systems, and nuclear vulnerability due to blast, thermal, gamma radiation, neutron damage, and EMP effects. Network components may be selected from a repertoire of built-in models or they may be constructed by the user through appropriate combinations of mathematical, empirical, and topological functions. Higher-level components may be defined by subnetworks composed of any combination of user-defined components and built-in models. The program provides a modeling capability to represent and intermix system components on many levels, e.g., from hole and electron spatial charge distributions in solid state devices through discrete and integrated electronic components to functional system blocks. NET-2 is capable of simultaneous computation in both the time and frequency domain, and has statistical and optimization capability. Network topology may be controlled as a function of the network solution. (U.S.)
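The time-domain network solution a program like NET-2 performs can be illustrated on the smallest possible network: a series resistor charging a capacitor from a step source, integrated by backward Euler. This is a hand-rolled toy sketch, not NET-2's actual solver:

```python
import numpy as np

# Time-domain response of a small linear network (series R, shunt C),
# solved by backward Euler time stepping on the capacitor node equation.
R, C = 1e3, 1e-6          # 1 kOhm, 1 uF  ->  time constant tau = 1 ms
V_src = 5.0               # step input applied at t = 0
dt, steps = 1e-5, 500     # 10 us steps, 5 ms total (5 time constants)

v_c = 0.0                 # capacitor node voltage, initially discharged
trace = []
for _ in range(steps):
    # Backward Euler: C*(v' - v)/dt = (V_src - v')/R, solved for v'
    v_c = (v_c + dt * V_src / (R * C)) / (1 + dt / (R * C))
    trace.append(v_c)

# After 5 time constants the node voltage should sit near the source voltage.
print(round(trace[-1], 3))
```

The implicit (backward Euler) update is unconditionally stable, which is why circuit simulators favour implicit integration for stiff networks; an explicit scheme would constrain the step size.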

  12. A new kernel discriminant analysis framework for electronic nose recognition

    International Nuclear Information System (INIS)

    Zhang, Lei; Tian, Feng-Chun

    2014-01-01

    Graphical abstract: - Highlights: • This paper proposes a new discriminant analysis framework for feature extraction and recognition. • The principle of the proposed NDA is derived mathematically. • The NDA framework is coupled with kernel PCA for classification. • The proposed KNDA is compared with state-of-the-art e-Nose recognition methods. • The proposed KNDA shows the best performance in e-Nose experiments. - Abstract: Electronic nose (e-Nose) technology based on metal oxide semiconductor gas sensor arrays is widely studied for the detection of gas components. This paper proposes a new discriminant analysis framework (NDA) for dimension reduction and e-Nose recognition. In an NDA, the between-class and within-class Laplacian scatter matrices are designed from sample to sample to characterize the between-class separability and the within-class compactness, by seeking a discriminant matrix that simultaneously maximizes the between-class Laplacian scatter and minimizes the within-class Laplacian scatter. Exploiting the linear separability of the high-dimensional kernel mapping space and the dimension reduction of principal component analysis (PCA), an effective kernel PCA plus NDA method (KNDA) is proposed for rapid detection of gas mixture components by an e-Nose. The NDA framework is derived in this paper, as well as the specific implementations of the proposed KNDA method in the training and recognition process. The KNDA is examined on e-Nose datasets of six kinds of gas components and compared with state-of-the-art e-Nose classification methods. Experimental results demonstrate that the proposed KNDA method shows the best performance, with an average recognition rate of 94.14% and a total recognition rate of 95.06%, which leads to promising feature extraction and multi-class recognition in e-Nose applications.
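The two stages of a KNDA-style pipeline, a kernel mapping followed by a scatter-ratio discriminant, can be sketched with plain NumPy. This uses standard kernel PCA and a classical Fisher discriminant as stand-ins; the Laplacian scatter matrices and parameters of the actual NDA are not reproduced here:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix between two sample sets."""
    d2 = ((X[:, None, :] - Y[None, :, :])**2).sum(-1)
    return np.exp(-gamma * d2)

def kpca(X, n_components=2, gamma=0.5):
    """Kernel PCA projection (the kernel-mapping stage of the pipeline)."""
    K = rbf_kernel(X, X, gamma)
    n = len(X)
    One = np.full((n, n), 1.0 / n)
    Kc = K - One @ K - K @ One + One @ K @ One    # centre in feature space
    vals, vecs = np.linalg.eigh(Kc)
    order = np.argsort(vals)[::-1][:n_components]  # largest eigenvalues first
    return Kc @ vecs[:, order] / np.sqrt(vals[order])

def fisher_direction(Z, y):
    """Discriminant stage: maximise between-class over within-class scatter."""
    m0, m1 = Z[y == 0].mean(0), Z[y == 1].mean(0)
    Sw = np.cov(Z[y == 0].T) + np.cov(Z[y == 1].T)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Z.shape[1]), m1 - m0)
    return w / np.linalg.norm(w)

# Two synthetic "gas classes" that are not linearly separable in input space.
rng = np.random.default_rng(0)
r = np.r_[rng.normal(1, 0.1, 50), rng.normal(3, 0.1, 50)]   # two rings
theta = rng.uniform(0, 2 * np.pi, 100)
X = np.c_[r * np.cos(theta), r * np.sin(theta)]
y = np.r_[np.zeros(50), np.ones(50)].astype(int)

Z = kpca(X)                      # kernel mapping + dimension reduction
w = fisher_direction(Z, y)       # discriminant in the reduced space
scores = Z @ w
accuracy = ((scores > scores.mean()).astype(int) == y).mean()
print(accuracy)
```

The kernel stage is what makes the concentric classes separable at all; the discriminant stage then only has to find a single direction in the reduced space.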

  13. Interactive Safety Analysis Framework of Autonomous Intelligent Vehicles

    Directory of Open Access Journals (Sweden)

    Cui You Xiang

    2016-01-01

    Full Text Available More than 100,000 people were killed and around 2.6 million injured in road accidents in the People’s Republic of China (PRC), that is, four to eight times the rate of developed countries, equivalent to 6.2 deaths per 10,000 vehicles, the highest rate in the world. There are more than 1,700 fatalities and 840,000 injuries yearly due to vehicle crashes off public highways. In this paper, we propose an interactive safety situation and threat analysis framework based on driver behaviour and vehicle dynamics risk analysis, following ISO 26262…

  14. Flexible Human Behavior Analysis Framework for Video Surveillance Applications

    Directory of Open Access Journals (Sweden)

    Weilun Lao

    2010-01-01

    Full Text Available We study a flexible framework for semantic analysis of human motion from surveillance video. Successful trajectory estimation and human-body modeling facilitate the semantic analysis of human activities in video sequences. Although human motion is widely investigated, we have extended such research in three aspects. First, by adding a second camera, not only is more reliable behavior analysis possible, but the ongoing scene events can also be mapped onto a 3D setting to facilitate further semantic analysis. The second contribution is the introduction of a 3D reconstruction scheme for scene understanding. Thirdly, we present a fast scheme to detect different body parts and generate a fitting skeleton model, without using the explicit assumption of an upright body posture. The extension to multiple-view fusion improves the event-based semantic analysis by 15%–30%. Our proposed framework proves its effectiveness as it achieves near real-time performance (13–15 frames/second for monocular and 6–8 frames/second for two-view video sequences).

  15. Planetary Protection Bioburden Analysis Program

    Science.gov (United States)

    Beaudet, Robert A.

    2013-01-01

    This program is a Microsoft Access program that performs statistical analysis of the colony counts from assays performed on the Mars Science Laboratory (MSL) spacecraft to determine the bioburden density, 3-sigma biodensity, and the total bioburdens required for the MSL prelaunch reports. It also contains numerous tools that report the data in various ways to simplify the required reports. The program performs all the calculations directly in MS Access. Prior to this development, the data were exported to large Excel files that had to be cut and pasted to provide the desired results. The program contains a main menu and a number of submenus. Analyses can be performed using either all the assays, or only the accountable assays that will be used in the final analysis. There are three options on the first menu: calculate using (1) the old MER (Mars Exploration Rover) statistics, (2) the MSL statistics for all the assays, or …

    This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, such tools are unsuitable for use in shield design and preliminary analysis studies. The software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions, and accuracy in predicting experimental impact test results. The software …

  16. The PandaRoot framework for simulation, reconstruction and analysis

    International Nuclear Information System (INIS)

    Spataro, Stefano

    2011-01-01

    The PANDA experiment at the future facility FAIR will study anti-proton proton and anti-proton nucleus collisions in a beam momentum range from 2 GeV/c up to 15 GeV/c. The PandaRoot framework is part of the FairRoot project, a common software framework for the future FAIR experiments, and is currently used to simulate detector performance and to evaluate different detector concepts. It is based on the packages ROOT and Virtual MonteCarlo with Geant3 and Geant4. Different reconstruction algorithms for tracking and particle identification are under development and optimization in order to achieve the performance requirements of the experiment. In the central tracker a first track fit is performed using a conformal map transformation based on a helix assumption; the track is then used as input for a Kalman Filter (package genfit), using GEANE as track follower. The track is then correlated with the pid detectors (e.g. Cerenkov detectors, EM Calorimeter or Muon Chambers) to evaluate a global particle identification probability, using a Bayesian approach or multivariate methods. Further packages implemented in PandaRoot are: the analysis tools framework Rho, the kinematic fitter package for vertex and mass constraint fits, and a fast simulation code based upon parametrized detector responses. PandaRoot was also tested on an Alien-based GRID infrastructure. This contribution reports on the status of PandaRoot and shows some example results from the analysis of physics benchmark channels.

  17. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)
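The control/signal-region logic the abstract describes can be caricatured in a few lines. This is emphatically not the HistFitter or RooStats API, and every yield below is invented; it only illustrates how a background-dominated control region plus a transfer factor yields a data-driven background estimate against which the signal region is tested:

```python
from scipy.stats import poisson

# Toy control/signal-region estimate (all yields invented).
n_cr = 200        # observed events in the control region (assumed pure background)
n_sr = 16         # observed events in the signal region
transfer = 0.05   # expected SR/CR background ratio, e.g. from simulation

b_sr = n_cr * transfer  # data-driven background estimate in the SR
# Background-only p-value: chance of observing >= n_sr events from b_sr
p_value = poisson.sf(n_sr - 1, b_sr)
print(b_sr, round(p_value, 3))
```

HistFitter itself builds full RooFit likelihoods with systematic uncertainties and fits all regions simultaneously; the two-step estimate here is the simplest possible stand-in for that strategy.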

  18. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  19. Conducting a SWOT Analysis for Program Improvement

    Science.gov (United States)

    Orr, Betsy

    2013-01-01

    A SWOT (strengths, weaknesses, opportunities, and threats) analysis of a teacher education program, or any program, can be the driving force for implementing change. A SWOT analysis is used to assist faculty in initiating meaningful change in a program and to use the data for program improvement. This tool is useful in any undergraduate or degree…

  20. XML Graphs in Program Analysis

    DEFF Research Database (Denmark)

    Møller, Anders; Schwartzbach, Michael Ignatieff

    2007-01-01

    XML graphs have proven to be a simple and effective formalism for representing sets of XML documents in program analysis. The formalism has evolved through a six-year period with variants tailored for a range of applications. We present a unified definition, outline the key properties including validation of XML graphs against different XML schema languages, and provide a software package that enables others to make use of these ideas. We also survey four very different applications: XML in Java, Java Servlets and JSP, transformations between XML and non-XML data, and XSLT.

  1. Safeguard Vulnerability Analysis Program (SVAP)

    International Nuclear Information System (INIS)

    Gilman, F.M.; Dittmore, M.H.; Orvis, W.J.; Wahler, P.S.

    1980-01-01

    This report gives an overview of the Safeguard Vulnerability Analysis Program (SVAP) developed at Lawrence Livermore National Laboratory. SVAP was designed as an automated method of analyzing the safeguard systems at nuclear facilities for vulnerabilities relating to the theft or diversion of nuclear materials. SVAP addresses one class of safeguard threat: theft or diversion of nuclear materials by nonviolent insiders, acting individually or in collusion. SVAP is a user-oriented tool which uses an interactive input medium for preprocessing the large amounts of safeguards data. Its output includes concise summary data as well as detailed vulnerability information.

  2. An ovine in vivo framework for tracheobronchial stent analysis.

    Science.gov (United States)

    McGrath, Donnacha J; Thiebes, Anja Lena; Cornelissen, Christian G; O'Shea, Mary B; O'Brien, Barry; Jockenhoevel, Stefan; Bruzzi, Mark; McHugh, Peter E

    2017-10-01

    Tracheobronchial stents are most commonly used to restore patency to airways stenosed by tumour growth. Currently all tracheobronchial stents are associated with complications such as stent migration, granulation tissue formation, mucous plugging and stent strut fracture. The present work develops a computational framework to evaluate tracheobronchial stent designs in vivo. Pressurised computed tomography is used to create a biomechanical lung model which takes into account the in vivo stress state, global lung deformation and local loading from pressure variation. Stent interaction with the airway is then evaluated for a number of loading conditions including normal breathing, coughing and ventilation. Results of the analysis indicate that three of the major complications associated with tracheobronchial stents can potentially be analysed with this framework, which can be readily applied to the human case. Airway deformation caused by lung motion is shown to have a significant effect on stent mechanical performance, including implications for stent migration, granulation formation and stent fracture.

  3. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially, these computations become a bottleneck: the massive number of entities makes space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  4. Mississippi Curriculum Framework for Drafting and Design Technology (Program CIP: 48.0102--Architectural Drafting Technology) (Program CIP: 48.0101--General Drafting). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the two course sequences of the state's postsecondary-level drafting and design technology program: architectural drafting technology and drafting and design technology. Presented first are a program description and…

  5. Effects of donor proliferation in development aid for health on health program performance: A conceptual framework.

    Science.gov (United States)

    Pallas, Sarah Wood; Ruger, Jennifer Prah

    2017-02-01

    Development aid for health increased dramatically during the past two decades, raising concerns about inefficiency and lack of coherence among the growing number of global health donors. However, we lack a framework for how donor proliferation affects health program performance to inform theory-based evaluation of aid effectiveness policies. A review of academic and gray literature was conducted. Data were extracted from the literature sample on study design and evidence for hypothesized effects of donor proliferation on health program performance, which were iteratively grouped into categories and mapped into a new conceptual framework. In the framework, increases in the number of donors are hypothesized to increase inter-donor competition, transaction costs, donor poaching of recipient staff, recipient control over aid, and donor fragmentation, and to decrease donors' sense of accountability for overall development outcomes. There is mixed evidence on whether donor proliferation increases or decreases aid volume. These primary effects in turn affect donor innovation, information hoarding, and aid disbursement volatility, as well as recipient country health budget levels, human resource capacity, and corruption, and the determinants of health program performance. The net effect of donor proliferation on health will vary depending on the magnitude of the framework's competing effects in specific country settings. The conceptual framework provides a foundation for improving design of aid effectiveness practices to mitigate negative effects from donor proliferation while preserving its potential benefits. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air-scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated, so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components, i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components, and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
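The two-phase scheme described above — order components once by inlet dependency, then update them in that order at every time step — can be sketched as follows. The component names and the relaxation dynamics are invented for illustration; this is not PCTAP source code:

```python
from graphlib import TopologicalSorter

# Map each component to the components feeding its inlet (hypothetical loop).
inlet_of = {
    "pump": [],
    "cold_plate": ["pump"],
    "heat_exchanger": ["cold_plate"],
    "tank": ["heat_exchanger"],
}
# Build the solution vector once: components ordered by inlet dependency.
solution_vector = list(TopologicalSorter(inlet_of).static_order())
print(solution_vector)  # ['pump', 'cold_plate', 'heat_exchanger', 'tank']

state = {name: {"T": 20.0} for name in inlet_of}  # outlet temperature, degC

def step(dt):
    # Each component relaxes toward its upstream outlet temperature;
    # the pump holds a fixed supply temperature of 30 degC.
    for name in solution_vector:
        feeds = inlet_of[name]
        if not feeds:
            state[name]["T"] = 30.0
        else:
            t_in = state[feeds[0]]["T"]
            state[name]["T"] += 0.5 * (t_in - state[name]["T"]) * dt

for _ in range(100):
    step(0.5)
print(round(state["tank"]["T"], 1))  # 30.0: equilibrium with the pump supply
```

Ordering by inlet dependency guarantees that each component reads an already-updated upstream state within the same time step, which is the point of building the solution vector up front.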

  7. VisRseq: R-based visual framework for analysis of sequencing data.

    Science.gov (United States)

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2015-01-01

    Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise to use, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.

  8. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    Energy Technology Data Exchange (ETDEWEB)

    Alioli, Simone [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Nason, Paolo [INFN, Milano-Bicocca (Italy); Oleari, Carlo [INFN, Milano-Bicocca (Italy); Milano-Bicocca Univ. (Italy); Re, Emanuele [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology

    2010-02-15

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized, and a description of what a user should provide in order to use it. (orig.)

  9. 77 FR 11785 - Energy Conservation Program: Public Meeting and Availability of the Framework Document for High...

    Science.gov (United States)

    2012-02-28

    ... standards for high-intensity discharge (HID) lamps. Accordingly, DOE will hold a public meeting to discuss..._standards/commercial/high_intensity_discharge_lamps.html . DATES: The Department will hold a public meeting... Technologies Program, Mailstop EE-2J, Framework Document for High-Intensity Discharge Lamps, EERE-2010-BT-STD...

  10. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    International Nuclear Information System (INIS)

    Alioli, Simone; Nason, Paolo; Oleari, Carlo; Re, Emanuele

    2010-02-01

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized, and a description of what a user should provide in order to use it. (orig.)

  11. Introduction of blended learning in a master program: Developing an integrative mixed method evaluation framework.

    Science.gov (United States)

    Chmiel, Aviva S; Shaha, Maya; Schneider, Daniel K

    2017-01-01

    The aim of this research is to develop a comprehensive evaluation framework involving all actors in a higher education blended learning (BL) program. BL evaluation usually either focuses on students, faculty, technological or institutional aspects. Currently, no validated comprehensive monitoring tool exists that can support introduction and further implementation of BL in a higher education context. Starting from established evaluation principles and standards, concepts that were to be evaluated were firstly identified and grouped. In a second step, related BL evaluation tools referring to students, faculty and institutional level were selected. This allowed setting up and implementing an evaluation framework to monitor the introduction of BL during two succeeding recurrences of the program. The results of the evaluation allowed documenting strengths and weaknesses of the BL format in a comprehensive way, involving all actors. It has led to improvements at program, faculty and course level. The evaluation process and the reporting of the results proved to be demanding in time and personal resources. The evaluation framework allows measuring the most significant dimensions influencing the success of a BL implementation at program level. However, this comprehensive evaluation is resource intensive. Further steps will be to refine the framework towards a sustainable and transferable BL monitoring tool that finds a balance between comprehensiveness and efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Exploring intellectual capital through social network analysis: a conceptual framework

    Directory of Open Access Journals (Sweden)

    Ivana Tichá

    2011-01-01

    The purpose of this paper is to develop a framework to assess intellectual capital. Intellectual capital is a key element in an organization's future earning potential. Theoretical and empirical studies show that it is the unique combination of the different elements of intellectual capital and tangible investments that determines an enterprise's competitive advantage. Intellectual capital has been defined as the combination of an organization's human, organizational and relational resources and activities. It includes the knowledge, skills, experience and abilities of the employees, its R&D activities, organizational routines, procedures, systems, databases and its Intellectual Property Rights, as well as all the resources linked to its external relationships, such as those with its customers, suppliers, R&D partners, etc. This paper focuses on relational capital and attempts to suggest a conceptual framework to assess this part of intellectual capital by applying a social network analysis (SNA) approach. The SNA approach allows for mapping and measuring of relationships and flows between people, groups, organizations, computers, URLs, and other connected information/knowledge entities. The conceptual framework is developed for the assessment of collaborative networks in the Czech higher education sector as the representation of its relational capital. It also builds on previous work aiming at a proposal of methodology guiding efforts to report intellectual capital at Czech public universities.
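As a minimal sketch of how an SNA metric could quantify relational capital in such a framework (institutions and ties below are invented; a real analysis would also use betweenness, brokerage, and tie-strength measures):

```python
from collections import defaultdict

# Edges are collaborations, e.g. joint projects between a university
# and external partners (all names hypothetical).
edges = [
    ("UnivA", "FirmX"), ("UnivA", "FirmY"), ("UnivA", "AgencyZ"),
    ("UnivB", "FirmX"), ("UnivB", "UnivA"),
]
adj = defaultdict(set)
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

n = len(adj)
# Normalised degree centrality: the share of possible ties each actor holds,
# a simple proxy for the breadth of its relational capital.
centrality = {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}
for v, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(v, round(c, 2))
```

Here UnivA is tied to all four other actors (centrality 1.0), flagging it as the relational hub of this toy collaboration network.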

  13. A Framework for Security Analysis of Mobile Wireless Networks

    DEFF Research Database (Denmark)

    Nanz, Sebastian; Hankin, Chris

    2006-01-01

    We present a framework for specification and security analysis of communication protocols for mobile wireless networks. This setting introduces new challenges which are not addressed by classical protocol analysis techniques. The main complication stems from the fact that the actions of intermediate nodes and their connectivity can no longer be abstracted into a single unstructured adversarial environment, as they form an inherent part of the system's security. In order to model this scenario faithfully, we present a broadcast calculus which makes a clear distinction between the protocol processes and the network's connectivity graph, which may change independently from protocol actions. We identify a property characterising an important aspect of security in this setting and express it using behavioural equivalences of the calculus. We complement this approach with a control flow analysis.

  14. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
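The landscape-based cumulative effects modelling described above reduces, in its simplest form, to a multiple linear regression relating stressor gradients to a biological condition score, which is then evaluated under alternative land-use scenarios. A hedged sketch with entirely synthetic data (the stressor names, coefficients, and scenario values are invented):

```python
import numpy as np

# Synthetic watershed assessment data: two land-use stressors and a
# condition score generated from known coefficients plus noise.
rng = np.random.default_rng(0)
n = 40
mining = rng.uniform(0, 1, n)        # fraction of catchment mined
residential = rng.uniform(0, 1, n)   # fraction residential land use
condition = 80 - 30 * mining - 15 * residential + rng.normal(0, 2, n)

# Fit the cumulative effects model by ordinary least squares.
X = np.column_stack([np.ones(n), mining, residential])
coef, *_ = np.linalg.lstsq(X, condition, rcond=None)

# Scenario analysis: predicted condition if mining expands to 40% of the
# catchment while residential use stays at 10%.
scenario = np.array([1.0, 0.40, 0.10])
print(round(float(scenario @ coef), 1))
```

Comparing such predictions across candidate development scenarios is what lets a manager weigh permitting or mitigation options before any ground is broken.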

  15. A framework for sensitivity analysis of decision trees.

    Science.gov (United States)

    Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław

    2018-01-01

    In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
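The pessimistic-perturbation idea in point (2) can be illustrated on a minimal two-action tree (payoffs, probabilities and the uncertainty radius below are invented; the paper's framework handles full trees and interrelated probabilities):

```python
# Action A pays 100 with probability p, else 0; action B pays a certain 55.
def expected_value(action, p):
    return 100.0 * p if action == "A" else 55.0

p_hat, radius = 0.6, 0.1   # point estimate of p and its uncertainty radius

# Nominal analysis: maximise expected value at the point estimate.
best_nominal = max(("A", "B"), key=lambda a: expected_value(a, p_hat))

# Robust analysis: worst case over p in [p_hat - radius, p_hat + radius].
# EV is monotone in p here, so the worst case lies at an endpoint.
worst = {a: min(expected_value(a, p) for p in (p_hat - radius, p_hat + radius))
         for a in ("A", "B")}
best_robust = max(worst, key=worst.get)
print(best_nominal, best_robust)  # A B: A wins nominally, B robustly
```

The flip from A to B under a modest perturbation is exactly the kind of strategy instability the framework is designed to expose; for interrelated probabilities (case b in the abstract), the perturbation would be applied to the underlying structural parameters instead.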

  16. ASAP- ARTIFICIAL SATELLITE ANALYSIS PROGRAM

    Science.gov (United States)

    Kwok, J.

    1994-01-01

    The Artificial Satellite Analysis Program (ASAP) is a general orbit prediction program which incorporates sufficient orbit modeling accuracy for mission design, maneuver analysis, and mission planning. ASAP is suitable for studying planetary orbit missions with spacecraft trajectories of reconnaissance (flyby) and exploratory (mapping) nature. Sample data is included for a geosynchronous station drift cycle study, a Venus radar mapping strategy, a frozen orbit about Mars, and a repeat ground trace orbit. ASAP uses Cowell's method in the numerical integration of the equations of motion. The orbital mechanics calculation contains perturbations due to non-sphericity (up to a 40 X 40 field) of the planet, lunar and solar effects, and drag and solar radiation pressure. An 8th order Runge-Kutta integration scheme with variable step size control is used for efficient propagation. The input includes the classical osculating elements, orbital elements of the sun relative to the planet, reference time and dates, drag coefficient, gravitational constants, and planet radius, rotation rate, etc. The printed output contains Cartesian coordinates, velocity, equinoctial elements, and classical elements for each time step or event step. At each step, selected output is added to a plot file. The ASAP package includes a program for sorting this plot file. LOTUS 1-2-3 is used in the supplied examples to graph the results, but any graphics software package could be used to process the plot file. ASAP is not written to be mission-specific. Instead, it is intended to be used for most planetary orbiting missions. As a consequence, the user has to have some basic understanding of orbital mechanics to provide the correct input and interpret the subsequent output. ASAP is written in FORTRAN 77 for batch execution and has been implemented on an IBM PC compatible computer operating under MS-DOS. The ASAP package requires a math coprocessor and a minimum of 256K RAM. This program was last
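Cowell's method, as used by ASAP, integrates the full equations of motion directly in Cartesian coordinates. A stripped-down sketch (fixed-step RK4 instead of ASAP's variable-step 8th-order Runge-Kutta, planar motion, and no perturbations; the orbit values are illustrative):

```python
import math

mu = 398600.4418          # Earth's GM, km^3/s^2
r0 = 7000.0               # circular orbit radius, km

def deriv(state):
    # Two-body acceleration: d2r/dt2 = -mu * r / |r|^3
    x, y, vx, vy = state
    r3 = (x * x + y * y) ** 1.5
    return [vx, vy, -mu * x / r3, -mu * y / r3]

def rk4_step(state, h):
    k1 = deriv(state)
    k2 = deriv([s + 0.5 * h * k for s, k in zip(state, k1)])
    k3 = deriv([s + 0.5 * h * k for s, k in zip(state, k2)])
    k4 = deriv([s + h * k for s, k in zip(state, k3)])
    return [s + h * (a + 2 * b + 2 * c + d) / 6.0
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

# Propagate a circular orbit for exactly one period and check closure.
v_circ = math.sqrt(mu / r0)
period = 2.0 * math.pi * math.sqrt(r0 ** 3 / mu)
state = [r0, 0.0, 0.0, v_circ]
n_steps = 2000
for _ in range(n_steps):
    state = rk4_step(state, period / n_steps)
radius = math.hypot(state[0], state[1])
print(radius)  # close to 7000 km after one full revolution
```

ASAP adds the zonal/tesseral harmonic, third-body, drag, and radiation-pressure accelerations to the same right-hand side and controls the step size adaptively.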

  17. XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework

    Science.gov (United States)

    Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò

    2017-08-01

    We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems but also for developing and testing end-to-end analysis chains.
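A simulation of this kind draws event-level quantities from the instrument response; the distinctly polarimetric ingredient is the azimuthal modulation of the photo-electron direction. A pure-Python sketch of that one piece (this is not the XIMPOL API; all parameter values are invented):

```python
import math
import random

# Photo-electron azimuthal angles follow f(phi) ~ 1 + mu*P*cos(2*(phi - phi0)),
# where mu is the instrument's modulation factor and (P, phi0) the source
# polarisation degree and angle.
mu, P, phi0 = 0.5, 0.8, 0.3
random.seed(1)

def sample_phi():
    # Rejection sampling against the flat envelope 1 + mu*P
    while True:
        phi = random.uniform(0.0, 2.0 * math.pi)
        if random.uniform(0.0, 1.0 + mu * P) <= 1.0 + mu * P * math.cos(2.0 * (phi - phi0)):
            return phi

phis = [sample_phi() for _ in range(200000)]
# Recover mu*P and phi0 from Stokes-like averages of cos/sin(2*phi)
c = sum(math.cos(2.0 * p) for p in phis) / len(phis)
s = sum(math.sin(2.0 * p) for p in phis) / len(phis)
amp = 2.0 * math.hypot(c, s)      # estimator of mu*P (true value 0.4)
angle = 0.5 * math.atan2(s, c)    # estimator of phi0 (true value 0.3)
print(round(amp, 2), round(angle, 2))
```

Rejection sampling works here because the density is bounded by 1 + μP; a full framework would additionally fold in the energy-dependent modulation factor, the spectral model, and the other response functions listed in the abstract.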

  18. The image of psychology programs: the value of the instrumental-symbolic framework.

    Science.gov (United States)

    Van Hoye, Greet; Lievens, Filip; De Soete, Britt; Libbrecht, Nele; Schollaert, Eveline; Baligant, Dimphna

    2014-01-01

    As competition for funding and students intensifies, it becomes increasingly important for psychology programs to have an image that is attractive and makes them stand out from other programs. The current study uses the instrumental-symbolic framework from the marketing domain to determine the image of different master's programs in psychology and examines how these image dimensions relate to student attraction and competitor differentiation. The samples consist of both potential students (N = 114) and current students (N = 68) of three psychology programs at a Belgian university: industrial and organizational psychology, clinical psychology, and experimental psychology. The results demonstrate that both instrumental attributes (e.g., interpersonal activities) and symbolic trait inferences (e.g., sincerity) are key components of the image of psychology programs and predict attractiveness as well as differentiation. In addition, symbolic image dimensions seem more important for current students of psychology programs than for potential students.

  19. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    International Nuclear Information System (INIS)

    Hartwig, Zachary S.

    2016-01-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms. - Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  20. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    Energy Technology Data Exchange (ETDEWEB)

    Hartwig, Zachary S., E-mail: hartwig@mit.edu

    2016-04-11

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms. - Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  1. TomoPy: a framework for the analysis of synchrotron tomographic data

    International Nuclear Information System (INIS)

    Gürsoy, Doğa; De Carlo, Francesco; Xiao, Xianghui; Jacobsen, Chris

    2014-01-01

    A collaborative framework for the analysis of synchrotron tomographic data which has the potential to unify the effort of different facilities and beamlines performing similar tasks is described. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports procedural programming that many researchers prefer. Analysis of tomographic datasets at synchrotron light sources (including X-ray transmission tomography, X-ray fluorescence microscopy and X-ray diffraction tomography) is becoming progressively more challenging due to the increasing data acquisition rates that new technologies in X-ray sources and detectors enable. The next generation of synchrotron facilities that are currently under design or construction throughout the world will provide diffraction-limited X-ray sources and are expected to boost the current data rates by several orders of magnitude, stressing the need for the development and integration of efficient analysis tools. Here an attempt to provide a collaborative framework for the analysis of synchrotron tomographic data that has the potential to unify the effort of different facilities and beamlines performing similar tasks is described in detail. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports procedural programming that many researchers prefer. This collaborative platform could affect all major synchrotron facilities where new effort is now dedicated to developing new tools that can be deployed at the facility for real-time processing, as well as distributed to users for off-site data processing.

  2. TomoPy: a framework for the analysis of synchrotron tomographic data

    Energy Technology Data Exchange (ETDEWEB)

    Gürsoy, Doǧa, E-mail: dgursoy@aps.anl.gov; De Carlo, Francesco; Xiao, Xianghui; Jacobsen, Chris [Advanced Photon Source, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439-4837 (United States)

    2014-08-01

    A collaborative framework for the analysis of synchrotron tomographic data which has the potential to unify the effort of different facilities and beamlines performing similar tasks is described. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports procedural programming that many researchers prefer. Analysis of tomographic datasets at synchrotron light sources (including X-ray transmission tomography, X-ray fluorescence microscopy and X-ray diffraction tomography) is becoming progressively more challenging due to the increasing data acquisition rates that new technologies in X-ray sources and detectors enable. The next generation of synchrotron facilities that are currently under design or construction throughout the world will provide diffraction-limited X-ray sources and are expected to boost the current data rates by several orders of magnitude, stressing the need for the development and integration of efficient analysis tools. Here an attempt to provide a collaborative framework for the analysis of synchrotron tomographic data that has the potential to unify the effort of different facilities and beamlines performing similar tasks is described in detail. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports procedural programming that many researchers prefer. This collaborative platform could affect all major synchrotron facilities where new effort is now dedicated to developing new tools that can be deployed at the facility for real-time processing, as well as distributed to users for off-site data processing.

  3. User's manual for the Composite HTGR Analysis Program (CHAP-1)

    International Nuclear Information System (INIS)

    Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.; Wecksung, M.J.; Willcutt, G.J.E. Jr.

    1977-03-01

    CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.
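The module-linking design this record describes, a model-independent driver with pluggable component modules, can be sketched in Python. All class and method names below (`Module`, `Core`, `SteamGenerator`, `Driver`) are hypothetical illustrations, not CHAP code.

```python
# Sketch of a model-independent systems-analysis driver with pluggable
# component modules, loosely modeled on the CHAP design described above.

class Module:
    """One plant component; subclasses implement a single time step."""
    def step(self, t, dt, state):
        raise NotImplementedError

class Core(Module):
    def step(self, t, dt, state):
        state["power"] = state.get("power", 100.0) * 1.01  # toy kinetics
        return state

class SteamGenerator(Module):
    def step(self, t, dt, state):
        state["steam_temp"] = 0.5 * state.get("power", 0.0)  # toy heat balance
        return state

class Driver:
    """Model-independent driver: advances every registered module each step.
    Adding, replacing, or removing modules requires no change to the driver."""
    def __init__(self):
        self.modules = []
    def register(self, module):
        self.modules.append(module)
    def run(self, steps, dt=1.0):
        state, t = {}, 0.0
        for _ in range(steps):
            for m in self.modules:
                state = m.step(t, dt, state)
            t += dt
        return state

driver = Driver()
driver.register(Core())
driver.register(SteamGenerator())
final = driver.run(steps=3)
```

The point of the registry is that a new component model only has to implement `step`; the driver never needs to know which components exist.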

  4. Supporting Intrapersonal Development in Substance Use Disorder Programs: A Conceptual Framework for Client Assessment.

    Science.gov (United States)

    Turpin, Aaron; Shier, Micheal L

    2017-01-01

    Improvements to intrapersonal development of clients involved with substance use disorder treatment programs have widely been recognized as contributing to the intended goal of reducing substance misuse behaviors. This study sought to identify a broad framework of primary outcomes related to the intrapersonal development of clients in treatment for substance misuse. Using qualitative research methods, individual interviews were conducted with program participants (n = 41) at three treatment programs to identify the ways in which respondents experienced intrapersonal development through participation in treatment. The findings support the development of a conceptual model that captures the importance and manifestation of achieving improvements in the following outcomes: self-awareness, coping ability, self-worth, outlook, and self-determination. The findings provide a conceptual framework for client assessment that captures a broad range of the important intrapersonal development factors used as indicators of client development and recovery, which should be measured in tandem during assessment.

  5. Using the Five Senses of Success framework to understand the experiences of midwifery students enrolled in an undergraduate degree program.

    Science.gov (United States)

    Sidebotham, M; Fenwick, J; Carter, A; Gamble, J

    2015-01-01

    developing a student's sense of capability, purpose, resourcefulness, identity and connectedness (five senses of success) are key factors that may be important in predicting student satisfaction and progression within their university program. the study aimed to examine the expectations and experiences of second and third year midwifery students enrolled in a Bachelor of Midwifery program and identify barriers and enablers to success. a descriptive exploratory qualitative design was used. Fifty-six students enrolled in either year 2 or 3 of the Bachelor of Midwifery program in SE Queensland participated in an anonymous survey using open-ended questions. In addition, 16 students participated in two year-level focus groups. Template analysis, using the Five Senses Framework, was used to analyse the data set. early exposure to 'hands on' clinical midwifery practice as well as continuity of care experiences provided students with an opportunity to link theory to practice and increased their perception of capability as they transitioned through the program. Students' sense of identity, purpose, resourcefulness, and capability was strongly influenced by the program's embedded meta-values, including a 'woman centred' approach. In addition, a student's ability to form strong positive relationships with women, peers, lecturers and supportive clinicians was central to developing connections and ultimately a sense of success. A sense of connection not only fostered an ongoing belief that challenges could be overcome but that students themselves could initiate or influence change. the five senses framework provided a useful lens through which to analyse the student experience. Key factors in student satisfaction and retention within a Bachelor of Midwifery program include: a clearly articulated midwifery philosophy, strategies to promote student connectedness including the use of social media, and further development of clinicians' skills in preceptorship, clinical teaching and

  6. The chronic care model versus disease management programs: a transaction cost analysis approach.

    Science.gov (United States)

    Leeman, Jennifer; Mark, Barbara

    2006-01-01

    The present article applies transaction cost analysis as a framework for better understanding health plans' decisions to improve chronic illness management by using disease management programs versus redesigning care within physician practices.

  7. A Framework for Bioacoustic Vocalization Analysis Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Ebenezer Out-Nyarko

    2009-11-01

    Full Text Available Using Hidden Markov Models (HMMs) as a recognition framework for automatic classification of animal vocalizations has a number of benefits, including the ability to handle duration variability through nonlinear time alignment, the ability to incorporate complex language or recognition constraints, and easy extendibility to continuous recognition and detection domains. In this work, we apply HMMs to several different species and bioacoustic tasks using generalized spectral features that can be easily adjusted across species and HMM network topologies suited to each task. This experimental work includes a simple call type classification task using one HMM per vocalization for repertoire analysis of Asian elephants, a language-constrained song recognition task using syllable models as base units for ortolan bunting vocalizations, and a stress stimulus differentiation task in poultry vocalizations using a non-sequential model via a one-state HMM with Gaussian mixtures. Results show strong performance across all tasks and illustrate the flexibility of the HMM framework for a variety of species, vocalization types, and analysis tasks.
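The "one HMM per vocalization type" scheme in this record amounts to scoring an observation sequence against each candidate model and picking the likeliest. A minimal sketch with the forward algorithm on discrete observations; the two toy HMMs and the "call" sequence are invented, not the paper's models or features.

```python
import math

def forward_log_likelihood(obs, start, trans, emit):
    """log P(obs | HMM) via the forward algorithm over discrete symbols."""
    alpha = [start[s] * emit[s][obs[0]] for s in range(len(start))]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in range(len(alpha))) * emit[s][o]
                 for s in range(len(start))]
    return math.log(sum(alpha))

# Two 2-state HMMs over a 2-symbol alphabet (0 = low band, 1 = high band).
hmm_a = dict(start=[0.9, 0.1],
             trans=[[0.8, 0.2], [0.2, 0.8]],
             emit=[[0.9, 0.1], [0.1, 0.9]])   # prefers a run of 0s then 1s
hmm_b = dict(start=[0.5, 0.5],
             trans=[[0.5, 0.5], [0.5, 0.5]],
             emit=[[0.5, 0.5], [0.5, 0.5]])   # uninformative model

def classify(obs, models):
    """Name of the model assigning the highest likelihood to obs."""
    return max(models, key=lambda name: forward_log_likelihood(obs, **models[name]))

call = [0, 0, 0, 1, 1, 1]
label = classify(call, {"type_a": hmm_a, "type_b": hmm_b})
```

Real systems score continuous spectral features with Gaussian-mixture emissions instead of discrete symbols, but the classify-by-maximum-likelihood structure is the same.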

  8. FIND--a unified framework for neural data analysis.

    Science.gov (United States)

    Meier, Ralph; Egert, Ulrich; Aertsen, Ad; Nawrot, Martin P

    2008-10-01

    The complexity of neurophysiology data has increased tremendously over the last years, especially due to the widespread availability of multi-channel recording techniques. With adequate computing power the current limit for computational neuroscience is the effort and time it takes for scientists to translate their ideas into working code. Advanced analysis methods are complex and often lack reproducibility on the basis of published descriptions. To overcome this limitation we develop FIND (Finding Information in Neural Data) as a platform-independent, open source framework for the analysis of neuronal activity data based on Matlab (Mathworks). Here, we outline the structure of the FIND framework and describe its functionality, our measures of quality control, and the policies for developers and users. Within FIND we have developed a unified data import from various proprietary formats, simplifying standardized interfacing with tools for analysis and simulation. The toolbox FIND covers a steadily increasing number of tools. These analysis tools address various types of neural activity data, including discrete series of spike events, continuous time series and imaging data. Additionally, the toolbox provides solutions for the simulation of parallel stochastic point processes to model multi-channel spiking activity. We illustrate two examples of complex analyses with FIND tools: First, we present a time-resolved characterization of the spiking irregularity in an in vivo extracellular recording from a mushroom-body extrinsic neuron in the honeybee during odor stimulation. Second, we describe layer specific input dynamics in the rat primary visual cortex in vivo in response to visual flash stimulation on the basis of multi-channel spiking activity.
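The time-resolved spiking-irregularity characterization this record mentions can be illustrated with the coefficient of variation (CV = std/mean) of inter-spike intervals in a window; the spike trains below are synthetic, not FIND data or its API.

```python
def isi_cv(spike_times, t_start, t_stop):
    """CV of inter-spike intervals for spikes inside [t_start, t_stop).
    Returns None when the window holds too few spikes to estimate."""
    window = [t for t in spike_times if t_start <= t < t_stop]
    isis = [b - a for a, b in zip(window, window[1:])]
    if len(isis) < 2:
        return None
    mean = sum(isis) / len(isis)
    var = sum((x - mean) ** 2 for x in isis) / len(isis)
    return var ** 0.5 / mean

# A clock-like train has CV ~ 0; a Poisson train has CV ~ 1.
regular = [0.1 * i for i in range(100)]          # 10 Hz, perfectly regular
irregular = [0.0, 0.05, 0.5, 0.55, 1.5]          # bursty, highly irregular
cv_reg = isi_cv(regular, 0.0, 10.0)
cv_irr = isi_cv(irregular, 0.0, 2.0)
```

Sliding the `[t_start, t_stop)` window along the recording yields the time-resolved irregularity profile described for the honeybee recording.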

  9. A program for activation analysis data processing

    International Nuclear Information System (INIS)

    Janczyszyn, J.; Loska, L.; Taczanowski, S.

    1978-01-01

    An ALGOL program for activation analysis data handling is presented. The program may be used either for single channel spectrometry data or for multichannel spectrometry. The calculation of instrumental error and of analysis standard deviation is carried out. The outliers are tested, and the regression line diagram with the related observations are plotted by the program. (author)
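The regression-with-outlier-screening step this record describes can be sketched in Python rather than ALGOL: fit a least-squares calibration line, then flag points whose residual exceeds 2 standard deviations. The data and function names are invented for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def flag_outliers(xs, ys, k=2.0):
    """Indices of points whose residual exceeds k residual standard deviations."""
    a, b = fit_line(xs, ys)
    res = [y - (a * x + b) for x, y in zip(xs, ys)]
    sd = (sum(r * r for r in res) / len(res)) ** 0.5
    return [i for i, r in enumerate(res) if abs(r) > k * sd]

# Counts vs. concentration, with one corrupted measurement at index 3.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [2.1, 4.0, 6.2, 13.0, 9.9, 12.1]   # roughly y = 2x, except ys[3]
outliers = flag_outliers(xs, ys)
```

A production routine would refit after removing flagged points and iterate; this sketch shows only the single screening pass.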

  10. Mississippi Curriculum Framework for Welding (Program CIP: 48.0508--Welder/Welding Technologist). Secondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for welding I and II. Presented first are a program description and course…

  11. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management.

    Science.gov (United States)

    Convertino, Matteo; Valverde, L James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of
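The Pareto-frontier construction in the PDA framework above can be illustrated minimally: among candidate restoration portfolios scored on cost and benefit, keep those not dominated by any other (no alternative is at least as good on both criteria and strictly better on one). The portfolios and field names below are invented.

```python
def pareto_frontier(portfolios):
    """Portfolios not dominated on (cost to minimize, benefit to maximize)."""
    def dominated(p, q):
        return (q["cost"] <= p["cost"] and q["benefit"] >= p["benefit"]
                and (q["cost"] < p["cost"] or q["benefit"] > p["benefit"]))
    return [p for p in portfolios
            if not any(dominated(p, q) for q in portfolios if q is not p)]

plans = [
    {"name": "A", "cost": 10, "benefit": 4},
    {"name": "B", "cost": 20, "benefit": 9},
    {"name": "C", "cost": 20, "benefit": 6},   # dominated by B (same cost, less benefit)
    {"name": "D", "cost": 35, "benefit": 9},   # dominated by B (same benefit, more cost)
]
frontier = sorted(p["name"] for p in pareto_frontier(plans))
```

In the full framework the "benefit" score comes from the MCDA model aggregating multiple criteria; the dominance filter itself is unchanged.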

  12. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management.

    Directory of Open Access Journals (Sweden)

    Matteo Convertino

    Full Text Available Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. 
The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the

  13. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management

    Science.gov (United States)

    Convertino, Matteo; Valverde, L. James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of

  14. Can programming frameworks bring smartphones into the mainstream of psychological science?

    Directory of Open Access Journals (Sweden)

    Lukasz Piwek

    2016-08-01

    Full Text Available Smartphones continue to provide huge potential for psychological science and the advent of novel research frameworks brings new opportunities for researchers who have previously struggled to develop smartphone applications. However, despite this renewed promise, smartphones have failed to become a standard item within psychological research. Here we consider the key barriers that continue to limit smartphone adoption within psychological science and how these barriers might be diminishing in light of ResearchKit and other recent methodological developments. We conclude that while these programming frameworks are certainly a step in the right direction, it remains challenging to create usable research-orientated applications with current frameworks. Smartphones may only become an asset for psychology and social science as a whole when development software that is both easy to use and secure becomes freely available.

  15. Can Programming Frameworks Bring Smartphones into the Mainstream of Psychological Science?

    Science.gov (United States)

    Piwek, Lukasz; Ellis, David A

    2016-01-01

    Smartphones continue to provide huge potential for psychological science and the advent of novel research frameworks brings new opportunities for researchers who have previously struggled to develop smartphone applications. However, despite this renewed promise, smartphones have failed to become a standard item within psychological research. Here we consider the key issues that continue to limit smartphone adoption within psychological science and how these barriers might be diminishing in light of ResearchKit and other recent methodological developments. We conclude that while these programming frameworks are certainly a step in the right direction it remains challenging to create usable research-orientated applications with current frameworks. Smartphones may only become an asset for psychology and social science as a whole when development software that is both easy to use and secure becomes freely available.

  16. Academic Libraries and Quality: An Analysis and Evaluation Framework

    Science.gov (United States)

    Atkinson, Jeremy

    2017-01-01

    The paper proposes and describes a framework for academic library quality to be used by new and more experienced library practitioners and by others involved in considering the quality of academic libraries' services and provision. The framework consists of eight themes and a number of questions to examine within each theme. The framework was…

  17. Suicide Risk Assessment Training for Psychology Doctoral Programs: Core Competencies and a Framework for Training

    OpenAIRE

    Cramer, Robert J.; Johnson, Shara M.; McLaughlin, Jennifer; Rausch, Emilie M.; Conroy, Mary Alice

    2013-01-01

    Clinical and counseling psychology programs currently lack adequate evidence-based competency goals and training in suicide risk assessment. To begin to address this problem, this article proposes core competencies and an integrated training framework that can form the basis for training and research in this area. First, we evaluate the extent to which current training is effective in preparing trainees for suicide risk assessment. Within this discussion, sample and methodological issues are ...

  18. A framework for cognitive task analysis in systems design

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1985-08-01

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task. Such a cognitive task analysis will not aim at a description of the information processes suited for particular control situations. It will rather aim at an analysis in order to identify the requirements to be considered along various dimensions of the decision tasks, in order to give the user - i.e. a decision maker - the freedom to adapt his performance to system requirements in a way which matches his process resources and subjective preferences. To serve this purpose, a number of analyses at various levels are needed to relate the control requirements of the system to the information processes and to the processing resources offered by computers and humans. The paper discusses the cognitive task analysis in terms of the following domains: The problem domain, which is a representation of the functional properties of the system giving a consistent framework for identification of the control requirements of the system; the decision sequences required for typical situations; the mental strategies and heuristics which are effective and acceptable for the different decision functions; and the cognitive control mechanisms used, depending upon the level of skill which can/will be applied. Finally, the end-users' criteria for choice of mental strategies in the actual situation are considered, and the need for development of criteria for judging the ultimate user acceptance of computer support is

  19. Development of an Unused Method Detection Tool for PHP Program Code Built with the CodeIgniter Framework Using Call Graphs

    Directory of Open Access Journals (Sweden)

    Divi Galih Prasetyo Putri

    2014-03-01

    Full Text Available The evolution and maintenance of a system are critically important processes in software engineering, and web applications are no exception. During this process, most developers no longer adhere to the system design. This leads to the emergence of unused methods: parts of the program that are no longer used but remain in the system. This condition increases complexity and reduces the understandability of the system. Detecting unused methods in a program requires a code analysis technique. The static analysis technique used here relies on a call graph built from the program code to identify unused methods. The call graph is constructed from the calls between methods. This application detects unused methods in PHP program code built with the CodeIgniter framework. The input source code is parsed into an Abstract Syntax Tree (AST), which is then used to analyze the program code. This analysis produces a call graph, from which methods that cannot be reached are identified and classified as unused methods. The tool was tested on 5 PHP applications, achieving an average system precision of 0.749 and a recall of 1.
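The reachability idea behind this record (methods that cannot be traced from the call graph are unused) can be sketched in a few lines: walk the call graph from the framework's entry points and report what was never reached. The toy graph, method names, and entry point are invented, not output of the actual tool.

```python
def unused_methods(call_graph, entry_points):
    """Methods in the graph that are unreachable from any entry point.
    call_graph maps a method name to the list of methods it calls."""
    reachable, stack = set(), list(entry_points)
    while stack:
        m = stack.pop()
        if m in reachable:
            continue
        reachable.add(m)
        stack.extend(call_graph.get(m, []))  # depth-first traversal
    return sorted(set(call_graph) - reachable)

# Hypothetical graph extracted from a CodeIgniter-style app's AST.
call_graph = {
    "Blog::index":           ["Blog_model::get_posts", "helpers::format_date"],
    "Blog_model::get_posts": [],
    "helpers::format_date":  [],
    "helpers::old_slugify":  [],   # nothing calls this -> unused
}
dead = unused_methods(call_graph, entry_points=["Blog::index"])
```

In a CodeIgniter application the natural entry points are the public controller methods reachable from routing; everything unreachable from them is a candidate unused method (candidate, because dynamic dispatch can hide calls from static analysis).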

  20. Learner Analysis Framework for Globalized E-Learning: A Case Study

    Directory of Open Access Journals (Sweden)

    Mamta Saxena

    2011-06-01

    Full Text Available The shift to technology-mediated modes of instructional delivery and increased global connectivity has led to a rise in globalized e-learning programs. Educational institutions face multiple challenges as they seek to design effective, engaging, and culturally competent instruction for an increasingly diverse learner population. The purpose of this study was to explore strategies for expanding learner analysis within the instructional design process to better address cultural influences on learning. A case study approach leveraged the experience of practicing instructional designers to build a framework for culturally competent learner analysis. The study discussed the related challenges and recommended strategies to improve the effectiveness of cross-cultural learner analysis. Based on the findings, a framework for conducting cross-cultural learner analysis of diverse learners was proposed. The study identified the most critical factors in improving cross-cultural learner analysis as the judicious use of existing research on cross-cultural theories and joint deliberation on the part of all the participants from the management to the learners. Several strategies for guiding and improving the cultural inquiry process were summarized. Barriers and potential solutions are also discussed.

  1. Water Quality Analysis Simulation Program (WASP)

    Science.gov (United States)

    The Water Quality Analysis Simulation Program (WASP) model helps users interpret and predict water quality responses to natural phenomena and manmade pollution for various pollution management decisions.

  2. Structural Equation Models in a Redundancy Analysis Framework With Covariates.

    Science.gov (United States)

    Lovaglio, Pietro Giorgio; Vittadini, Giorgio

    2014-01-01

    A recent method to specify and fit structural equation models in the Redundancy Analysis framework, based on so-called Extended Redundancy Analysis (ERA), has been proposed in the literature. In this approach, the relationships between the observed exogenous variables and the observed endogenous variables are moderated by the presence of unobservable composites, estimated as linear combinations of exogenous variables. However, in the presence of direct effects linking exogenous and endogenous variables, or concomitant indicators, the composite scores are estimated by ignoring the presence of the specified direct effects. To fit structural equation models, we propose a new specification and estimation method, called Generalized Redundancy Analysis (GRA), allowing us to specify and fit a variety of relationships among composites, endogenous variables, and external covariates. The proposed methodology extends the ERA method, using a more suitable specification and estimation algorithm, by allowing for covariates that affect endogenous indicators indirectly through the composites and/or directly. To illustrate the advantages of GRA over ERA we present a simulation study with small samples. Moreover, we present an application aimed at estimating the impact of formal human capital on the initial earnings of graduates of an Italian university, utilizing a structural model consistent with well-established economic theory.

  3. A framework for automatic heart sound analysis without segmentation

    Directory of Open Access Journals (Sweden)

    Tungpimolrut Kanokvate

    2011-02-01

    Full Text Available Abstract Background A new framework for heart sound analysis is proposed. One of the most difficult processes in heart sound analysis is segmentation, due to interference from murmurs. Method Equal numbers of cardiac cycles were extracted from heart sounds with different heart rates using information from envelopes of autocorrelation functions, without the need to label individual fundamental heart sounds (FHS). The complete method consists of envelope detection, calculation of cardiac cycle lengths using auto-correlation of envelope signals, feature extraction using discrete wavelet transform, principal component analysis, and classification using neural network bagging predictors. Result The proposed method was tested on a set of heart sounds obtained from several on-line databases and recorded with an electronic stethoscope. Geometric mean was used as performance index. Average classification performance using ten-fold cross-validation was 0.92 for the noise free case, 0.90 under white noise with 10 dB signal-to-noise ratio (SNR), and 0.90 under impulse noise up to 0.3 s duration. Conclusion The proposed method showed promising results and high noise robustness to a wide range of heart sounds. However, more tests are needed to address any bias that may have been introduced by different sources of heart sounds in the current training set, and to concretely validate the method. Further work includes building a new training set recorded from actual patients, then further evaluating the method based on this new training set.
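The cycle-length step above rests on one observation: the autocorrelation of an envelope signal peaks at lags equal to the cardiac cycle length, so cycles can be delimited without labeling individual heart sounds. A sketch on a synthetic periodic "envelope" (a pulse train, not real stethoscope data):

```python
def autocorr(x):
    """Normalized autocorrelation of x for all lags 0..len(x)-1."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x)
    return [sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag)) / c0
            for lag in range(n)]

def cycle_length(envelope, min_lag):
    """Lag (in samples) of the highest autocorrelation peak past min_lag.
    min_lag skips the trivial zero-lag peak and within-pulse overlap."""
    ac = autocorr(envelope)
    return max(range(min_lag, len(ac) // 2), key=lambda lag: ac[lag])

# Synthetic envelope: a 4-sample pulse every 40 samples (one "cardiac cycle").
period = 40
envelope = [1.0 if i % period < 4 else 0.0 for i in range(400)]
est = cycle_length(envelope, min_lag=10)
```

On real recordings the envelope would come from an envelope detector over the heart sound, and the estimated lag sets how many samples to cut per cycle before feature extraction.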

  4. Program Analysis as Model Checking

    DEFF Research Database (Denmark)

    Olesen, Mads Chr.

    Software programs are proliferating throughout modern life, to a point where even the simplest appliances such as lightbulbs contain software, in addition to the software embedded in cars and airplanes. The correct functioning of these programs is therefore of the utmost importance, for the quality...

  5. Mississippi Curriculum Framework for General Drafting (Program CIP: 48.0101--Drafting, General). Secondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for two secondary-level courses in drafting: drafting I and II. Presented…

  6. Evolutionary squeaky wheel optimization: a new framework for analysis.

    Science.gov (United States)

    Li, Jingpeng; Parkes, Andrew J; Burke, Edmund K

    2011-01-01

    Squeaky wheel optimization (SWO) is a relatively new metaheuristic that has been shown to be effective for many real-world problems. At each iteration, SWO performs a complete construction of a solution starting from the empty assignment. Although the construction uses information from previous iterations, the complete rebuilding means that SWO is generally effective at diversification but can suffer from relatively weak intensification. Evolutionary SWO (ESWO) is a recent extension to SWO that is designed to improve the intensification by keeping the good components of solutions and only using SWO to reconstruct the other, poorer components of the solution. In such algorithms, a standard challenge is to understand how the various parameters affect the search process. In order to support the future study of such issues, we propose a formal framework for the analysis of ESWO. The framework is based on Markov chains, and the main novelty arises because ESWO moves through the space of partial assignments. This makes it significantly different from the analyses used for local search (such as simulated annealing), which only move through complete assignments. Generally, the exact details of ESWO depend on various heuristics, so we focus our approach on a variant of ESWO that we call ESWO-II, which has probabilistic as opposed to heuristic selection and construction operators. For ESWO-II, we study a simple problem instance and explicitly compute the stationary probability distribution over the states of the search space. We find interesting properties of the distribution. In particular, we find that the probabilities of states generally, but not always, increase with their fitness. This nonmonotonicity is quite different from the monotonicity expected in algorithms such as simulated annealing.
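The stationary-distribution computation that this kind of Markov-chain analysis relies on can be illustrated generically; the 3-state transition matrix below is made up and unrelated to ESWO-II's actual state space:

```python
# Generic stationary distribution of a finite Markov chain by power iteration.
def stationary(P, iterations=10_000):
    """Approximate pi with pi = pi P by repeated right-multiplication."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iterations):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# An invented irreducible, aperiodic 3-state chain (rows sum to 1):
P = [[0.5, 0.5, 0.0],
     [0.2, 0.5, 0.3],
     [0.0, 0.4, 0.6]]
pi = stationary(P)
print([round(p, 3) for p in pi])
```

For small chains one could instead solve the linear system pi (P - I) = 0 with the normalization sum(pi) = 1 directly; power iteration is just the shortest route to the same fixed point.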

  7. A computer program for activation analysis

    International Nuclear Information System (INIS)

    Rantanen, J.; Rosenberg, R.J.

    1983-01-01

    A computer program for calculating the results of activation analysis is described. The program comprises two gamma spectrum analysis programs, STOAV and SAMPO and one program for calculating elemental concentrations, KVANT. STOAV is based on a simple summation of channels and SAMPO is based on fitting of mathematical functions. The programs are tested by analyzing the IAEA G-1 test spectra. In the determination of peak location SAMPO is somewhat better than STOAV and in the determination of peak area SAMPO is more than twice as accurate as STOAV. On the other hand, SAMPO is three times as expensive as STOAV with the use of a Cyber 170 computer. (author)

  8. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Directory of Open Access Journals (Sweden)

    Ahmad Karim

    Full Text Available The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), dissemination of other malware, and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except that the target audience is specific to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adoption. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing artificial neural networks' back-propagation method. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy, even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, achieving 99.49% accuracy. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised, which will become a benchmark for future studies.
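A minimal sketch of the classification step the abstract singles out — plain logistic regression, here trained by gradient descent. The behavioural features and labels below are invented; real work would use features extracted from instrumented app runs:

```python
# Toy logistic regression classifier; not SMARTbot's actual feature pipeline.
import math

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Per-sample gradient descent on the logistic loss; returns (w, b)."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    return 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0

# Invented behavioural features, e.g. [network-call rate, SMS-send rate]:
X = [[0.1, 0.0], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
y = [0, 0, 1, 1]  # 0 = benign, 1 = botnet (labels invented)
w, b = train_logistic(X, y)
print([predict(w, b, xi) for xi in X])  # → [0, 0, 1, 1]
```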

  11. Short Run Profit Maximization in a Convex Analysis Framework

    Directory of Open Access Journals (Sweden)

    Ilko Vrankic

    2017-03-01

    Full Text Available In this article we analyse the short-run profit maximization problem in a convex analysis framework. The goal is to deductively apply the results of convex analysis, which suit the unique structure of microeconomic phenomena, to the well-known short-run profit maximization problem. In the primal optimization model, the short-run technology is represented by the short-run production function, and the normalized profit function, which expresses profit in output units, is derived. In this approach the choice variable is the quantity of labour. Alternatively, technology is represented by the real variable cost function, where costs are expressed in labour units, and the normalized profit function is derived, this time expressing profit in labour units. The choice variable in this approach is the quantity of production. The emphasis in these two perspectives of the primal approach is on the first-order necessary conditions of both models, which are the consequence of enveloping the closed convex set describing the technology with its tangents. The dual model starts from the normalized profit function and recovers the production function, and alternatively the real variable cost function. In the first perspective of the dual approach the choice variable is the real wage, and in the second it is the real product price expressed in labour units. It is shown that changing variables into parameters and parameters into variables leads to two optimization models which give the same system of labour demand and product supply functions and their inverses. By deductively applying the results of convex analysis, comparative statics results are derived describing the firm's behaviour in the short run.
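The first-order condition that the primal approach's tangency argument delivers can be stated compactly; the notation (output price p, wage w, short-run production function f, labour L) is ours, not necessarily the article's:

```latex
% Short-run profit maximization over labour, output price p, wage w:
\[
\max_{L \ge 0}\; \pi(L) = p\,f(L) - wL,
\qquad
\frac{d\pi}{dL} = p\,f'(L^{*}) - w = 0
\;\Longrightarrow\;
f'(L^{*}) = \frac{w}{p},
\]
% i.e. at the optimum the marginal product of labour equals the real wage,
% the slope of the tangent supporting the convex technology set.
```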

  12. SIDEKICK: Genomic data driven analysis and decision-making framework

    Directory of Open Access Journals (Sweden)

    Yoon Kihoon

    2010-12-01

    Full Text Available Abstract Background Scientists striving to unlock mysteries within complex biological systems face myriad barriers in effectively integrating available information to enhance their understanding. While experimental techniques and available data sources are rapidly evolving, useful information is dispersed across a variety of sources, and sources of the same information often do not use the same format or nomenclature. To harness these expanding resources, scientists need tools that bridge nomenclature differences and allow them to integrate, organize, and evaluate the quality of information without extensive computation. Results Sidekick, a genomic data driven analysis and decision-making framework, is a web-based tool that provides a user-friendly, intuitive solution to the problem of information inaccessibility. Sidekick enables scientists without training in computation and data management to pursue answers to research questions like "What are the mechanisms for disease X?" or "Does the set of genes associated with disease X also influence other diseases?" Sidekick supports the process of combining heterogeneous data, finding and maintaining the most up-to-date data, evaluating data sources, quantifying confidence in results based on evidence, and managing the multi-step research tasks needed to answer these questions. We demonstrate Sidekick's effectiveness by showing how to accomplish a complex published analysis in a fraction of the original time, with no computational effort, using Sidekick. Conclusions Sidekick is an easy-to-use web-based tool that organizes and facilitates complex genomic research, allowing scientists to explore genomic relationships and formulate hypotheses without computational effort. Possible analysis steps include gene list discovery, gene-pair list discovery, various enrichments for both types of lists, and convenient list manipulation. Further, Sidekick's ability to characterize pairs of genes offers new ways to…

  13. Economic impacts of climate change in Australia: framework and analysis

    International Nuclear Information System (INIS)

    Ford, Melanie

    2007-01-01

    Full text: There is growing interest in understanding the potential impacts of climate change in Australia, and especially the economic impacts of 'inaction'. In this study, a preliminary analysis of the possible economic impacts of future climate change in Australia is undertaken using ABARE's general equilibrium model of the global economy, GTEM. In order to understand the potential economy-wide impacts, the broad climatic trends that Australia is likely to experience over the next several decades are canvassed, and the potential economic and non-economic impacts on key risk areas, such as water resources, agriculture and forests, health, industry and human settlements, and ecosystems, are identified. A more detailed analysis of the economic impacts of climate change is undertaken by developing two case studies. In the first case study, the economic impact of climate change and reduced water availability on the agricultural sector of the Murray-Darling Basin is assessed. In the second case study, the sectoral economic impacts on the Australian resources sector of a projected decline in global economic activity due to climate change are analysed. The key areas of development required to more fully understand the economy-wide and sectoral impacts of climate change are also discussed, including issues associated with estimating both non-market and market impacts. Finally, an analytical framework for undertaking integrated assessment of climate change impacts, both domestically and globally, is developed.

  14. MetaJC++: A flexible and automatic program transformation technique using meta framework

    Science.gov (United States)

    Beevi, Nadera S.; Reghu, M.; Chitraprasad, D.; Vinodchandra, S. S.

    2014-09-01

    A compiler is a tool that translates abstract code containing natural-language terms into machine code. Meta-compilers are available that can compile more than one language. We have developed a meta framework that intends to combine two dissimilar programming languages, namely C++ and Java, to provide a flexible object-oriented programming platform for the user. Suitable constructs from both languages have been combined, thereby forming a new and stronger meta-language. The framework is developed using the compiler-writing tools Flex and Yacc to design the front end of the compiler. The lexer and parser have been developed to accommodate the complete keyword set and syntax set of both languages. Two intermediate representations are used in the translation of the source program to machine code. An abstract syntax tree is used as a high-level intermediate representation that preserves the hierarchical properties of the source program. A new machine-independent, stack-based byte-code has also been devised to act as a low-level intermediate representation. The byte-code is organised into an output class file that can be used to produce an interpreted output. The results, especially in the sphere of providing C++ concepts in Java, give an insight into the potentially strong features of the resultant meta-language.
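The two intermediate representations mentioned above can be illustrated with a toy pipeline from an expression AST to a small stack-based byte-code, then interpreted; the instruction set here is invented, not the paper's:

```python
# Toy AST -> stack byte-code compiler and interpreter for arithmetic.
def compile_expr(node):
    """AST: ('num', n) or ('add'|'mul', left, right) -> list of stack ops."""
    if node[0] == 'num':
        return [('PUSH', node[1])]
    op = {'add': 'ADD', 'mul': 'MUL'}[node[0]]
    return compile_expr(node[1]) + compile_expr(node[2]) + [(op, None)]

def run(code):
    """Interpret the stack byte-code and return the top of the stack."""
    stack = []
    for op, arg in code:
        if op == 'PUSH':
            stack.append(arg)
        elif op == 'ADD':
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == 'MUL':
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# (2 + 3) * 4
ast = ('mul', ('add', ('num', 2), ('num', 3)), ('num', 4))
print(run(compile_expr(ast)))  # → 20
```

The AST keeps the program's hierarchy (the parenthesised sum is a subtree), while the flat byte-code is what an interpreter or class-file writer would consume, mirroring the high-/low-level split the abstract describes.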

  15. A benchmarking program to reduce red blood cell outdating: implementation, evaluation, and a conceptual framework.

    Science.gov (United States)

    Barty, Rebecca L; Gagliardi, Kathleen; Owens, Wendy; Lauzon, Deborah; Scheuermann, Sheena; Liu, Yang; Wang, Grace; Pai, Menaka; Heddle, Nancy M

    2015-07-01

    Benchmarking is a quality improvement tool that compares an organization's performance to that of its peers for selected indicators, to improve practice. Processes to develop evidence-based benchmarks for red blood cell (RBC) outdating in Ontario hospitals, based on RBC hospital disposition data from Canadian Blood Services, have been previously reported. These benchmarks were implemented in 160 hospitals provincewide with a multifaceted approach, which included hospital education, inventory management tools and resources, summaries of best practice recommendations, recognition of high-performing sites, and audit tools on the Transfusion Ontario website (http://transfusionontario.org). In this study we describe the implementation process and the impact of the benchmarking program on RBC outdating. A conceptual framework for continuous quality improvement of a benchmarking program was also developed. The RBC outdating rate for all hospitals trended downward continuously from April 2006 to February 2012, irrespective of hospitals' transfusion rates or their distance from the blood supplier. The highest annual outdating rate was 2.82%, at the beginning of the observation period. Each year brought further reductions, with a nadir outdating rate of 1.02% achieved in 2011. The key elements of the successful benchmarking strategy included dynamic targets, a comprehensive and evidence-based implementation strategy, ongoing information sharing, and a robust data system to track information. The Ontario benchmarking program for RBC outdating resulted in continuous and sustained quality improvement. Our conceptual iterative framework for benchmarking provides a guide for institutions implementing a benchmarking program. © 2015 AABB.

  16. A hybrid Constraint Programming/Mixed Integer Programming framework for the preventive signaling maintenance crew scheduling problem

    DEFF Research Database (Denmark)

    Pour, Shahrzad M.; Drake, John H.; Ejlertsen, Lena Secher

    2017-01-01

    A railway signaling system is a complex and interdependent system which should ensure the safe operation of trains. We introduce and address a mixed integer optimisation model for the preventive signal maintenance crew scheduling problem in the Danish railway system. The problem contains many...... to feed as ‘warm start’ solutions to a Mixed Integer Programming (MIP) solver for further optimisation. We apply the CP/MIP framework to a section of the Danish rail network and benchmark our results against both direct application of a MIP solver and modelling the problem as a Constraint Optimisation...

  17. RAWS II: A MULTIPLE REGRESSION ANALYSIS PROGRAM,

    Science.gov (United States)

    This memorandum gives instructions for the use and operation of a revised version of RAWS, a multiple regression analysis program. The program...of preprocessed data, the directed retention of variables, listing of the matrix of the normal equations and its inverse, and the bypassing of the regression analysis to provide the input variable statistics only. (Author)
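What a normal-equations regression program computes can be sketched as follows; the data are invented and this is a generic least-squares illustration, not RAWS itself:

```python
# Least squares via the normal equations X'X beta = X'y, as classic
# regression programs formed them (the "matrix of the normal equations"
# and its inverse mentioned in the abstract).
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def regress(rows, y):
    """Coefficients [intercept, b1, b2, ...] minimizing squared error."""
    X = [[1.0] + list(r) for r in rows]
    k = len(X[0])
    XtX = [[sum(x[i] * x[j] for x in X) for j in range(k)] for i in range(k)]
    Xty = [sum(x[i] * yi for x, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

# Exact data: y = 1 + 2*x1 + 3*x2
rows = [[0, 0], [1, 0], [0, 1], [1, 1], [2, 1]]
y = [1 + 2 * a + 3 * b for a, b in rows]
print([round(c, 6) for c in regress(rows, y)])  # → [1.0, 2.0, 3.0]
```

Forming X'X explicitly is numerically fragile for ill-conditioned data; modern code would use a QR or SVD factorization instead, but the normal equations match the era of the memorandum.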

  18. Probabilistic Output Analysis by Program Manipulation

    DEFF Research Database (Denmark)

    Rosendahl, Mads; Kirkeby, Maja Hanne

    2015-01-01

    The aim of a probabilistic output analysis is to derive a probability distribution of possible output values for a program from a probability distribution of its input. We present a method for performing static output analysis, based on program transformation techniques. It generates a probability...
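The core idea — deriving a distribution over outputs from a distribution over inputs — can be shown in miniature. The example program and input distribution below are invented, and the paper's method works by static program transformation rather than the brute-force enumeration sketched here:

```python
# Push a finite probability distribution over inputs through a program,
# merging the mass of inputs that map to the same output.
from collections import defaultdict

def output_distribution(program, input_dist):
    """Map {input: probability} through `program` to {output: probability}."""
    out = defaultdict(float)
    for x, p in input_dist.items():
        out[program(x)] += p
    return dict(out)

def prog(x):
    """Example program: clamp its input to the range [0, 2]."""
    return max(0, min(2, x))

dist = {-1: 0.1, 0: 0.2, 1: 0.3, 2: 0.2, 3: 0.2}
print(output_distribution(prog, dist))  # mass piles up at the clamp bounds
```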

  19. Status of CHAP: composite HTGR analysis program

    International Nuclear Information System (INIS)

    Secker, P.A.; Gilbert, J.S.

    1975-12-01

    Development of an HTGR accident simulation program is in progress for the prediction of the overall HTGR plant transient response to various initiating events. The status of the digital computer program named CHAP (Composite HTGR Analysis Program) as of June 30, 1975, is given. The philosophy, structure, and capabilities of the CHAP code are discussed. Mathematical descriptions are given for those HTGR components that have been modeled. Component model validation and evaluation using auxiliary analysis codes are also discussed

  20. The Impact of a "Framework"-Aligned Science Professional Development Program on Literacy and Mathematics Achievement of K-3 Students

    Science.gov (United States)

    Paprzycki, Peter; Tuttle, Nicole; Czerniak, Charlene M.; Molitor, Scott; Kadervaek, Joan; Mendenhall, Robert

    2017-01-01

    This study investigates the effect of a Framework-aligned professional development program at the PreK-3 level. The NSF funded program integrated science with literacy and mathematics learning and provided teacher professional development, along with materials and programming for parents to encourage science investigations and discourse around…

  1. Mississippi Curriculum Framework for Medical Radiologic Technology (Radiography) (CIP: 51.0907--Medical Radiologic Technology). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the radiologic technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies for the program,…

  2. A Framework for Analysis of Music Similarity Measures

    DEFF Research Database (Denmark)

    Jensen, Jesper Højvang; Christensen, Mads G.; Jensen, Søren Holdt

    2007-01-01

    To analyze specific properties of music similarity measures that the commonly used genre classification evaluation procedure does not reveal, we introduce a MIDI based test framework for music similarity measures. We introduce the framework by example and thus outline an experiment to analyze the...

  3. A Framework for Formal Modeling and Analysis of Organizations

    NARCIS (Netherlands)

    Jonker, C.M.; Sharpanskykh, O.; Treur, J.; P., Yolum

    2007-01-01

    A new, formal, role-based, framework for modeling and analyzing both real world and artificial organizations is introduced. It exploits static and dynamic properties of the organizational model and includes the (frequently ignored) environment. The transition is described from a generic framework of

  4. SIX SIGMA FRAMEWORKS: AN ANALYSIS BASED ON ROGERS’ DIFFUSION OF INNOVATION THEORY

    Directory of Open Access Journals (Sweden)

    Kifayah Amar

    2012-06-01

    Full Text Available This paper attempts to analyze frameworks related to Six Sigma and Lean Six Sigma. The basis for the analysis of the frameworks is the diffusion of innovation theory. Several criteria were used to analyze the frameworks, e.g. relative advantage, compatibility, complexity, trialability, observability, communication channels, nature of the social system/culture, and extent of change agent. Based on the framework analysis, only one framework fits Rogers' theory on the diffusion of innovation. That framework is a Lean Six Sigma framework consisting of elements such as owner/manager commitment and involvement, employee involvement, training, culture change, and external support. Even though these elements are similar to those of other Six Sigma frameworks, they put more attention on culture change and external support. Generally speaking, culture change and external support are the most important elements in the implementation of Six Sigma or other soft approaches, particularly for small organizations.

  6. Inclusiveness program - a SWOT analysis

    Science.gov (United States)

    Dósa, M.; Szegő, K.

    2017-09-01

    The Inclusiveness Program was created with the aim of integrating currently under-represented countries into the mainstream of European planetary research. The main stages of the working plan include setting up a database containing all the research institutes and universities where astronomical or geophysical research is carried out. It is necessary to identify their problems and needs. A challenging part of the project is to find exact means that help their work in a sustainable way. Strengths, weaknesses, opportunities, and threats of the program were identified based on feedback from the inclusiveness community. Our conclusions and further suggestions are presented.

  7. Towards an intelligent framework for multimodal affective data analysis.

    Science.gov (United States)

    Poria, Soujanya; Cambria, Erik; Hussain, Amir; Huang, Guang-Bin

    2015-03-01

    An increasingly large amount of multimodal content is posted on social media websites such as YouTube and Facebook every day. In order to cope with the growth of so much multimodal data, there is an urgent need to develop an intelligent multimodal analysis framework that can effectively extract information from multiple modalities. In this paper, we propose a novel multimodal information extraction agent, which infers and aggregates the semantic and affective information associated with user-generated multimodal data in contexts such as e-learning, e-health, automatic video content tagging, and human-computer interaction. In particular, the developed intelligent agent adopts an ensemble feature extraction approach by exploiting the joint use of tri-modal (text, audio and video) features to enhance the multimodal information extraction process. In preliminary experiments using the eNTERFACE dataset, our proposed multimodal system is shown to achieve an accuracy of 87.95%, outperforming the best state-of-the-art system by more than 10%, or in relative terms, a 56% reduction in error rate. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Layers of protection analysis in the framework of possibility theory.

    Science.gov (United States)

    Ouazraoui, N; Nait-Said, R; Bourareche, M; Sellami, I

    2013-11-15

    An important issue faced by risk analysts is how to deal with the uncertainties associated with accident scenarios. In industry, single values derived from historical data or the literature are often used to estimate event probabilities or frequencies. However, both the dynamic environments of systems and the need to consider rare component failures may make this kind of data unrealistic. In this paper, uncertainty encountered in Layers of Protection Analysis (LOPA) is considered in the framework of possibility theory. Data provided by reliability databases and/or expert judgment are represented by fuzzy quantities (possibilities). The fuzzy outcome frequency is calculated by extended multiplication using the α-cuts method. The fuzzy outcome is compared to a scenario risk tolerance criterion, and the required risk reduction is obtained by solving a possibilistic decision-making problem under a necessity constraint. In order to validate the proposed model, a case study concerning the protection layers of an operational heater is carried out. Copyright © 2013 Elsevier B.V. All rights reserved.
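The extended multiplication by α-cuts can be sketched with triangular fuzzy numbers; all the frequencies and failure probabilities below are invented, and a real LOPA would use facility-specific data:

```python
# Alpha-cut interval arithmetic for the product of positive triangular
# fuzzy numbers (a, m, b): support [a, b], mode m.
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def fuzzy_product(tris, levels=5):
    """Alpha-cuts of the product; interval-wise multiplication is valid
    here because all bounds are positive."""
    cuts = []
    for i in range(levels + 1):
        alpha = i / levels
        lo, hi = 1.0, 1.0
        for tri in tris:
            l, h = alpha_cut(tri, alpha)
            lo, hi = lo * l, hi * h
        cuts.append((alpha, lo, hi))
    return cuts

# Invented LOPA-style chain: initiating event frequency times two
# protection-layer failure probabilities (PFDs):
event = (0.05, 0.1, 0.2)   # per year
pfd1 = (0.005, 0.01, 0.05)
pfd2 = (0.05, 0.1, 0.2)
for alpha, lo, hi in fuzzy_product([event, pfd1, pfd2]):
    print(f"alpha={alpha:.1f}  [{lo:.2e}, {hi:.2e}]")
```

At α = 1 the interval collapses to the product of the modes (here 1e-4 per year); lower α levels widen the interval toward the product of the supports.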

  9. Sustainability of ARV provision in developing countries: challenging a framework based on program history

    Directory of Open Access Journals (Sweden)

    Thiago Botelho Azeredo

    Full Text Available Abstract The provision of ARVs is central to HIV/AIDS programs because of its impact on the course of the disease and on quality of life. Although the cost of first-line treatments has declined, treatment-associated expenses grow steeper each year. Sustainability is therefore an important variable for the success of treatment programs. A conceptual framework on the sustainability of ARV provision was developed, followed by data collection instruments. The pilot study was undertaken in Brazil; Bolivia, Peru and Mozambique were then visited. Key informants were identified and interviewed. The investigation of sustainability related to ARV provision examined the implementation and routinization of provision schemes. Evidence of greater sustainability potential was observed in Peru, where provision is implemented and routinized by the National HIV/AIDS program and expenditures are met by the government. In Mozambique, provision is dependent on donations and external aid, but the country displays a great effort to incorporate ARV provision and care into routine healthcare activities. Bolivia, in addition to external dependence for financing and management of the drug supply, presents problems regarding implementation and routinization. The conceptual framework was useful in recognizing events that influence sustainable ARV provision in these countries.

  10. A framework for the probabilistic analysis of meteotsunamis

    Science.gov (United States)

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
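The Monte Carlo aggregation step can be sketched schematically; the Poisson rate, amplitude model and thresholds below are invented stand-ins, whereas the actual study runs a numerical hydrodynamic model for each sampled squall-line parameter set:

```python
# Schematic hazard curve: annualized rate of exceedance of each amplitude
# threshold, from a synthetic Poisson catalog of events.
import random

def hazard_curve(rate_per_year, amplitude_model, thresholds,
                 years=100_000, seed=1):
    """Annualized rate at which the event amplitude exceeds each threshold."""
    rng = random.Random(seed)
    counts = [0] * len(thresholds)
    # Expected rate_per_year * years events over the synthetic catalog:
    for _ in range(int(rate_per_year * years)):
        amp = amplitude_model(rng)
        for k, thr in enumerate(thresholds):
            if amp > thr:
                counts[k] += 1
    return [c / years for c in counts]

def model(rng):
    """Stand-in lognormal amplitude model (metres); purely illustrative."""
    return rng.lognormvariate(-1.0, 0.5)

rates = hazard_curve(4.0, model, thresholds=[0.1, 0.3, 0.5, 1.0])
print(rates)  # decreasing exceedance rate with increasing amplitude
```

Repeating this with resampled catalogs, as the paper does, would give the spread (mean and quantile curves) around each exceedance rate.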

  11. A software architectural framework specification for neutron activation analysis

    International Nuclear Information System (INIS)

    Preston, J.A.; Grant, C.N.

    2013-01-01

Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact, with existing or reducing budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is a lack of system architecture standards to establish acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers, sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with existing system components in a facility looking to automate its NAA operations. This limits the availability of automation to a few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA that provides the required set of functionalities. This paper describes such an 'architectural framework' (OpenNAA), and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRMs, can reduce the time required by over 80%. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current systems, while providing a solid path for development into the future. (author)

  12. High-Fidelity Aerothermal Engineering Analysis for Planetary Probes Using DOTNET Framework and OLAP Cubes Database

    Directory of Open Access Journals (Sweden)

    Prabhakar Subrahmanyam

    2009-01-01

Full Text Available This publication presents the architecture, integration, and implementation of the various modules in the Sparta framework. Sparta is a trajectory engine that is hooked to an Online Analytical Processing (OLAP) database for multi-dimensional analysis capability. The OLAP database has a comprehensive list of atmospheric entry probes and their vehicle dimensions, trajectory data, aero-thermal data, and material properties of carbon, silicon, and carbon-phenolic based ablators. An approach is presented for dynamic TPS design. OLAP has the capability to run several different trajectory conditions in one simulation; the output is stored back into the database and can be queried for the appropriate trajectory type. An OLAP simulation can be set up by spawning individual threads to run three types of trajectory: nominal, undershoot, and overshoot. The Sparta graphical user interface provides capabilities to choose from a list of flight vehicles or to enter trajectory and geometry information for a vehicle in design. The DOTNET framework acts as a middleware layer between the trajectory engine and the user interface, and also between the web user interface and the OLAP database. Trajectory output can be obtained in TecPlot format, Excel output, or KML (Keyhole Markup Language) format. The framework employs an API (application programming interface) to convert trajectory data into a formatted KML file that is used by Google Earth for simulating Earth-entry fly-by visualizations.
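The trajectory-to-KML conversion the record describes can be sketched as below. This is a minimal illustration, not Sparta's actual API: the function name, the sample coordinates, and the choice of a single `LineString` placemark are all assumptions; only the KML element names follow the OGC KML 2.2 schema.

```python
def trajectory_to_kml(name, points):
    """Hypothetical sketch: serialize (lon, lat, alt_m) trajectory samples
    as a KML LineString placemark for display in Google Earth."""
    coords = "\n".join(f"{lon},{lat},{alt}" for lon, lat, alt in points)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>{name}</name>
      <LineString>
        <altitudeMode>absolute</altitudeMode>
        <coordinates>
{coords}
        </coordinates>
      </LineString>
    </Placemark>
  </Document>
</kml>"""

# Illustrative entry trajectory: two (lon, lat, altitude in meters) samples
kml = trajectory_to_kml("entry-trajectory",
                        [(-80.6, 28.5, 120000.0), (-80.2, 28.9, 60000.0)])
```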

  13. iOS Game Development using SpriteKit Framework with Swift Programming Language

    OpenAIRE

    Gurung, Lal

    2016-01-01

iOS is a mobile operating system for Apple-manufactured phones and tablets. The mobile gaming industry is growing very fast, and compatibility with iOS is becoming very popular among game developers. The aim of this Bachelor’s thesis was to find the best available game development tools for the iOS platform. The 2D game named Lapland was developed using Apple’s own native framework, SpriteKit. The game was written in the Swift programming language. The combination of SpriteKit and Swift...

  14. Evaluation of capacity-building program of district health managers in India: a contextualized theoretical framework.

    Science.gov (United States)

    Prashanth, N S; Marchal, Bruno; Kegels, Guy; Criel, Bart

    2014-01-01

    Performance of local health services managers at district level is crucial to ensure that health services are of good quality and cater to the health needs of the population in the area. In many low- and middle-income countries, health services managers are poorly equipped with public health management capacities needed for planning and managing their local health system. In the south Indian Tumkur district, a consortium of five non-governmental organizations partnered with the state government to organize a capacity-building program for health managers. The program consisted of a mix of periodic contact classes, mentoring and assignments and was spread over 30 months. In this paper, we develop a theoretical framework in the form of a refined program theory to understand how such a capacity-building program could bring about organizational change. A well-formulated program theory enables an understanding of how interventions could bring about improvements and an evaluation of the intervention. In the refined program theory of the intervention, we identified various factors at individual, institutional, and environmental levels that could interact with the hypothesized mechanisms of organizational change, such as staff's perceived self-efficacy and commitment to their organizations. Based on this program theory, we formulated context-mechanism-outcome configurations that can be used to evaluate the intervention and, more specifically, to understand what worked, for whom and under what conditions. We discuss the application of program theory development in conducting a realist evaluation. Realist evaluation embraces principles of systems thinking by providing a method for understanding how elements of the system interact with one another in producing a given outcome.

  15. Analysis of higher education policy frameworks for open and distance education in Pakistan.

    Science.gov (United States)

    Ellahi, Abida; Zaka, Bilal

    2015-04-01

The constant rise in demand for higher education has become the biggest challenge for educational planners. This high demand has paved a way for distance education across the globe. This article analyzes the policy documentation of a major distance education initiative in Pakistan for validity, to identify the utility of policy linkages. The study adopted a qualitative research design that consisted of two steps. In the first step, a content analysis of the distance learning policy framework was made. For this purpose, two documents were accessed, titled "Framework for Launching Distance Learning Programs in HEIs of Pakistan" and "Guideline on Quality of Distance Education for External Students at the HEIs of Pakistan." In the second step, the policy guidelines mentioned in these two documents were evaluated at two levels. At the first level, the overall policy documents were assessed against a criterion proposed by Cheung, Mirzaei, and Leeder. At the second level, the proposed program of distance learning was assessed against a criterion set by Gellman-Danley and Fetzner and Berge. The distance education program initiative in Pakistan is promising in nature and needs to be assessed regularly. This study has made an initial attempt to assess the policy document against a criterion identified from the literature. The analysis shows that the current policy documents do offer some strengths at this initial level; however, they cannot be considered a comprehensive policy guide. The inclusion or correction of the missing or vague areas identified in this study would make this policy guideline document a valuable tool for the Higher Education Commission (HEC). For distance education policy makers, this distance education policy framework model recognizes several fundamental areas with which they should be concerned. The findings of this study in the light of two different policy framework measures highlight certain opportunities that can help strengthen the

  16. OpenMDAO: Framework for Flexible Multidisciplinary Design, Analysis and Optimization Methods

    Science.gov (United States)

    Heath, Christopher M.; Gray, Justin S.

    2012-01-01

The OpenMDAO project is underway at NASA to develop a framework which simplifies the implementation of state-of-the-art tools and methods for multidisciplinary design, analysis and optimization. Foremost, OpenMDAO has been designed to handle variable problem formulations, encourage reconfigurability, and promote model reuse. This work demonstrates the concept of iteration hierarchies in OpenMDAO to achieve a flexible environment for supporting advanced optimization methods which include adaptive sampling and surrogate modeling techniques. In this effort, two efficient global optimization methods were applied to solve a constrained, single-objective and a constrained, multiobjective version of a joint aircraft/engine sizing problem. The aircraft model, NASA's next-generation advanced single-aisle civil transport, is being studied as part of the Subsonic Fixed Wing project to help meet simultaneous program goals for reduced fuel burn, emissions, and noise. This analysis serves as a realistic test problem to demonstrate the flexibility and reconfigurability offered by OpenMDAO.

  17. Alternative Frameworks for Improving Government Organizational Performance: A Comparative Analysis

    National Research Council Canada - National Science Library

    Simon, Cary

    1997-01-01

    .... Six major frameworks emerging in the U.S. since 1980, applicable to the public sector, and designed to enhance organizational change toward improved performance are reviewed and analyzed: Total Quality; 'Excellence...

  18. A mathematical programming framework for early stage design of wastewater treatment plants

    DEFF Research Database (Denmark)

    Bozkurt, Hande; Quaglia, Alberto; Gernaey, Krist

    2015-01-01

The increasing number of alternative wastewater treatment technologies and stricter effluent requirements make the optimal treatment process selection for wastewater treatment plant design a complicated problem. This task, defined as wastewater treatment process synthesis, is currently based on expert decisions and previous experiences. This paper proposes a new approach based on mathematical programming to manage the complexity of the problem. The approach generates/identifies novel and optimal wastewater treatment process selections, and the interconnections between unit operations needed to create them; the design problem is formulated as a Mixed Integer (Non)linear Programming problem (MI(N)LP) and solved. A case study is formulated and solved to highlight the application of the framework. © 2014 Elsevier Ltd. All rights reserved.
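The flavor of such a superstructure selection problem can be illustrated with a deliberately tiny example. The sketch below brute-forces the combinatorial choice instead of solving an MI(N)LP, and the stage names, costs, and COD removal fractions are invented for illustration; they are not from the paper's case study.

```python
from itertools import product

# Hypothetical superstructure: candidate (name, cost, COD removal) per stage.
stages = {
    "primary":   [("settler", 10, 0.4), ("fine_screen", 6, 0.25)],
    "secondary": [("asm", 40, 0.90), ("mbbr", 55, 0.93)],
    "tertiary":  [("none", 0, 0.0), ("sand_filter", 15, 0.5)],
}

def best_train(influent_cod, limit_cod):
    """Brute-force stand-in for the MI(N)LP: pick one unit per stage,
    minimizing total cost subject to the effluent COD limit."""
    best = None
    for combo in product(*stages.values()):
        cod, cost = influent_cod, 0
        for _, unit_cost, removal in combo:
            cost += unit_cost
            cod *= (1.0 - removal)
        if cod <= limit_cod and (best is None or cost < best[1]):
            best = ([name for name, _, _ in combo], cost, cod)
    return best

train, cost, effluent = best_train(influent_cod=500.0, limit_cod=30.0)
```

A real process synthesis model would also carry continuous flow and design variables, which is what pushes the formulation from pure enumeration into MI(N)LP territory.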

  19. The PUMA test program and data analysis

    International Nuclear Information System (INIS)

    Han, J.T.; Morrison, D.L.

    1997-01-01

    The PUMA test program is sponsored by the U.S. Nuclear Regulatory Commission to provide data that are relevant to various Boiling Water Reactor phenomena. The author briefly describes the PUMA test program and facility, presents the objective of the program, provides data analysis for a large-break loss-of-coolant accident test, and compares the data with a RELAP5/MOD 3.1.2 calculation

  20. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic, and industrial use, as well as use by public and municipal authorities and street lighting) and we examine their relation with variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of prefecture. We visualize the results of the analysis and perform cluster and outlier analysis using the Anselin local Moran's I statistic, as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight into, and a better understanding of, the regional development model in Greece, and justifies the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
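The local Moran's I statistic used for the cluster and outlier analysis can be sketched for a toy dataset. The sketch below assumes row-standardized binary contiguity weights and standardized values; the four "regions" on a line and their demand figures are invented for illustration (a GIS workflow would instead build the weights from prefecture geometries).

```python
def local_morans_i(values, neighbors):
    """Sketch of Anselin's local Moran's I with row-standardized binary
    weights: I_i = z_i * (mean of z_j over the neighbors of i)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    z = [(v - mean) / var ** 0.5 for v in values]
    return [z[i] * sum(z[j] for j in neighbors[i]) / len(neighbors[i])
            for i in range(n)]

# Toy example: four regions on a line, a high-demand cluster at one end.
demand = [100.0, 90.0, 10.0, 5.0]
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
I = local_morans_i(demand, adj)
```

Positive I_i values flag high-high or low-low clusters, negative values flag spatial outliers; significance would normally be assessed by permutation.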

  1. A Conceptual Framework over Contextual Analysis of Concept Learning within Human-Machine Interplays

    DEFF Research Database (Denmark)

    Badie, Farshad

    2016-01-01

This research provides a contextual description concerning existential and structural analysis of ‘Relations’ between human beings and machines. Subsequently, it will focus on conceptual and epistemological analysis of (i) my own semantics-based framework [for human meaning construction] and of (ii) a well-structured machine concept learning framework. Accordingly, I will, semantically and epistemologically, focus on linking those two frameworks for logical analysis of concept learning in the context of human-machine interrelationships. It will be demonstrated that the proposed framework provides...

  2. Probabilistic Resource Analysis by Program Transformation

    DEFF Research Database (Denmark)

    Kirkeby, Maja Hanne; Rosendahl, Mads

    2016-01-01

The aim of a probabilistic resource analysis is to derive a probability distribution of possible resource usage for a program from a probability distribution of its input. We present an automated multi-phase rewriting-based method to analyze programs written in a subset of C. It generates...
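The core idea of pushing an input distribution through a cost model of the program can be shown in miniature. This is a conceptual sketch, not the paper's rewriting-based method: the finite input distribution and the linear loop cost model are assumptions made up for illustration.

```python
def resource_distribution(input_dist, cost_fn):
    """Sketch of probabilistic resource analysis: push a finite input
    distribution {value: probability} through a cost model of the program."""
    out = {}
    for value, p in input_dist.items():
        cost = cost_fn(value)
        out[cost] = out.get(cost, 0.0) + p
    return out

def loop_cost(n):
    # Hypothetical cost model of a loop that runs n times:
    # 2 units of setup plus 3 units per iteration.
    return 2 + 3 * n

# Input size n is 0, 1, or 2 with the given probabilities.
dist = resource_distribution({0: 0.25, 1: 0.5, 2: 0.25}, loop_cost)
```

The result maps each possible resource usage to its probability; the cited analysis derives such cost functions and distributions symbolically rather than by enumeration.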

  3. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  4. A decision framework for coordinating bioterrorism planning: lessons from the BioNet program.

    Science.gov (United States)

    Manley, Dawn K; Bravata, Dena M

    2009-01-01

    Effective disaster preparedness requires coordination across multiple organizations. This article describes a detailed framework developed through the BioNet program to facilitate coordination of bioterrorism preparedness planning among military and civilian decision makers. The authors and colleagues conducted a series of semistructured interviews with civilian and military decision makers from public health, emergency management, hazardous material response, law enforcement, and military health in the San Diego area. Decision makers used a software tool that simulated a hypothetical anthrax attack, which allowed them to assess the effects of a variety of response actions (eg, issuing warnings to the public, establishing prophylaxis distribution centers) on performance metrics. From these interviews, the authors characterized the information sources, technologies, plans, and communication channels that would be used for bioterrorism planning and responses. The authors used influence diagram notation to describe the key bioterrorism response decisions, the probabilistic factors affecting these decisions, and the response outcomes. The authors present an overview of the response framework and provide a detailed assessment of two key phases of the decision-making process: (1) pre-event planning and investment and (2) incident characterization and initial responsive measures. The framework enables planners to articulate current conditions; identify gaps in existing policies, technologies, information resources, and relationships with other response organizations; and explore the implications of potential system enhancements. Use of this framework could help decision makers execute a locally coordinated response by identifying the critical cues of a potential bioterrorism event, the information needed to make effective response decisions, and the potential effects of various decision alternatives.

  5. A Practical Framework for Evaluating Health Services Management Educational Program: The Application of The Mixed-Method Sequential Explanatory Design

    Directory of Open Access Journals (Sweden)

    Bazrafshan Azam

    2015-07-01

Full Text Available Introduction: Health services managers are responsible for improving the efficiency and quality in delivering healthcare services. In this regard, Health Services Management (HSM) programs have been widely established to provide health providers with skilled, professional managers to address those needs. It is therefore important to ascertain the quality of these programs. The purpose of this study was to synthesize and develop a framework to evaluate the quality of the Health Services Management (HSM) program at Kerman University of Medical Sciences. Methods: This study followed a mixed-method sequential explanatory approach in which data were collected through a CIPP survey and semi-structured interviews. In phase 1, participants included 10 faculty members, 64 students and 90 alumni. In phase 2, in-depth semi-structured interviews and purposeful sampling were conducted with 27 participants to better understand their perceptions of the HSM program. All interviews were audio-taped and transcribed verbatim. NVivo N8 was used to analyze the qualitative data and extract the themes. Results: The data analysis revealed both positive and negative attitudes toward the HSM program. According to the CIPP survey, program objectives (74%), curriculum content (59.5%) and graduate skills (79%) were the major sources of dissatisfaction. However, most respondents (n=48) reported that the classes are well equipped and learning resources are well prepared (n=41). Most respondents (n=41) reported that the students are actively involved in classroom activities. The majority of respondents (n=43) pointed out that the instructors implemented appropriate teaching strategies. Qualitative analysis of interviews revealed that a regular community needs assessment, content revision and directing attention to graduate skills and expertise are the key solutions to improve the program’s quality. Conclusion: This study revealed to what extent the HSM program objectives is being

  6. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
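A standard way to identify Koopman spectral properties directly from data, as the record describes, is dynamic mode decomposition (DMD). The sketch below is a deliberately one-dimensional toy version, fitting x[k+1] ≈ λ·x[k] by least squares; the data-generating model and its eigenvalue 0.9 are invented for illustration.

```python
def dmd_eigenvalue(series):
    """1-D sketch of dynamic mode decomposition: least-squares fit of
    x[k+1] = lam * x[k], estimating a Koopman eigenvalue from data alone."""
    num = sum(x1 * x0 for x0, x1 in zip(series, series[1:]))
    den = sum(x0 * x0 for x0 in series[:-1])
    return num / den

# Data produced by a hidden linear generative model x[k+1] = 0.9 * x[k]:
xs = [1.0]
for _ in range(20):
    xs.append(0.9 * xs[-1])

lam = dmd_eigenvalue(xs)
```

The estimator recovers the generative eigenvalue without being told the model; the full framework works with matrices of observables and uses the resulting spectra as features for classification, forecasting, and anomaly detection.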

  7. Decision Vulnerability Analysis (DVA) Program

    Science.gov (United States)

    2014-05-01

Graphical Representation of the Summary Judgments of the Effectiveness, Vulnerability, and Understanding of the Subsystems as Judged by... posed several challenges. Numerous organizational typologies have been suggested over the years (Robbins, 1994), and these typologies are often based... structure and functioning from a typology perspective (Robbins, 1994), excerpts from a task analysis that described how the analysts currently performed

  8. Policy analysis and advocacy in nursing education: the Nursing Education Council of British Columbia framework.

    Science.gov (United States)

    Duncan, Susan M; Thorne, Sally; Van Neste-Kenny, Jocelyne; Tate, Betty

    2012-05-01

Academic nursing leaders play a crucial role in the policy context for nursing education. Effectiveness in this role requires that they work together in presenting nursing education issues from a position of strength, informed by a critical analysis of policy pertaining to the delivery of quality nursing education and scholarship. We describe a collective process of dialog and critical analysis whereby nurse leaders in one Canadian province addressed pressing policy issues facing governments, nursing programs, faculty, and students. Consensus among academic nurse leaders, formalized through the development of a policy action framework, has enabled us to take a stand, at times highly contested, in the politicized arena of the nursing shortage. We present the components of a policy action framework for nursing education and share examples of how we have used a critical approach to analyze and frame policy issues in nursing education for inclusion on policy agendas. We believe our work has influenced provincial and national thinking about policy in nursing education, and it is the foundation of our conclusion that political presence and shared strategy among academic nursing leaders are undeniably critical in the global context of nursing today. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.

  9. Parametric design and analysis framework with integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2014-01-01

of building energy and indoor environment, are generally confined to late in the design process. Consequence based design is a framework intended for the early design stage. It involves interdisciplinary expertise that secures validity and quality assurance with a simulationist while sustaining autonomous control with the building designer. Consequence based design is defined by the specific use of integrated dynamic modeling, which includes the parametric capabilities of a scripting tool and the building simulation features of a building performance simulation tool. The framework can lead to enhanced

  10. Attack Pattern Analysis Framework for a Multiagent Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Krzysztof Juszczyszyn

    2008-08-01

Full Text Available The paper proposes the use of an attack pattern ontology and a formal framework for network traffic anomaly detection within a distributed multi-agent Intrusion Detection System architecture. Our framework assumes an ontology-based attack definition and a distributed processing scheme with exchange of messages between agents. The role of traffic anomaly detection is presented; it is then discussed how some specific values characterizing network communication can be used to detect network anomalies caused by security incidents (worm attacks, virus spreading). Finally, it is defined how to use the proposed techniques in a distributed IDS using the attack pattern ontology.

  11. Energy Analysis Program 1990 annual report

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

The Energy Analysis Program has played an active role in the analysis and discussion of energy and environmental issues at several levels: (1) at the international level, with programs such as developing scenarios for long-term energy demand in developing countries and organizing and leading an analytic effort, "Energy Efficiency, Developing Countries, and Eastern Europe," part of a major effort to increase support for energy efficiency programs worldwide; (2) at the national level, the Program has been responsible for assessing energy forecasts and policies affecting energy use (e.g., appliance standards, National Energy Strategy scenarios); and (3) at the state and utility levels, the Program has been a leader in promoting integrated utility resource planning; the collaborative process has led to agreement on a new generation of utility demand-side programs in California, providing an opportunity to use the knowledge and analytic techniques of the Program's researchers. We continue to place the highest priority on analyzing energy efficiency, with particular attention given to energy use in buildings. The Program continues its active analysis of international energy issues in Asia (including China), the Soviet Union, South America, and Western Europe. Analyzing the costs and benefits of different levels of standards for residential appliances continues to be the largest single area of research within the Program. The group has developed and applied techniques for forecasting energy demand (or constructing scenarios) for the United States. We have built a new model of industrial energy demand, are in the process of making major changes in our tools for forecasting residential energy demand, have built an extensive and documented energy conservation supply curve of residential energy use, and are beginning an analysis of energy-demand forecasting for commercial buildings.

  12. Polyglot programming in applications used for genetic data analysis.

    Science.gov (United States)

    Nowak, Robert M

    2014-01-01

    Applications used for the analysis of genetic data process large volumes of data with complex algorithms. High performance, flexibility, and a user interface with a web browser are required by these solutions, which can be achieved by using multiple programming languages. In this study, I developed a freely available framework for building software to analyze genetic data, which uses C++, Python, JavaScript, and several libraries. This system was used to build a number of genetic data processing applications and it reduced the time and costs of development.

  13. Building an Evaluation Framework for a Competency-Based Graduate Program at the University Of Southern Mississippi

    Science.gov (United States)

    Gaudet, Cyndi H.; Annulis, Heather M.; Kmiec, John J., Jr.

    2008-01-01

    This article describes an ongoing project to build a comprehensive evaluation framework for the competency-based Master of Science in Workforce Training and Development (MSWTD) program at The University of Southern Mississippi (USM). First, it discusses some trends and issues in evaluating the performance of higher education programs in the United…

  14. Mississippi Curriculum Framework for Diesel Equipment Technology (CIP: 47.0605--Diesel Engine Mechanic & Repairer). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the diesel equipment technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies,…

  15. Solving stochastic programs with integer recourse by enumeration : a framework using Gröbner basis reductions

    NARCIS (Netherlands)

    Schultz, R.; Stougie, L.; Vlerk, van der M.H.

    1998-01-01

    In this paper we present a framework for solving stochastic programs with complete integer recourse and discretely distributed right-hand side vector, using Gröbner basis methods from computational algebra to solve the numerous second-stage integer programs. Using structural properties of the

  16. Self-insurance and worksite alcohol programs: an econometric analysis.

    Science.gov (United States)

    Kenkel, D S

    1997-03-01

The worksite is an important point of access for alcohol treatment and prevention, but not all firms are likely to find offering alcohol programs profitable. This study attempts to identify at a conceptual and empirical level factors that are important determinants of the profitability of worksite alcohol programs. A central question considered in the empirical analysis is whether firms' decisions about worksite alcohol programs are related to how employee group health insurance is provided. The data used are from the 1992 National Survey of Worksite Health Promotion Activities (N = 1,389-1,412). The econometric analysis focuses on measures of whether the surveyed firms offer Employee Assistance Programs (EAPs), individual counseling, group classes and resource materials regarding alcohol and other substance abuse. Holding other factors constant, the probability that a self-insured firm offers an EAP is estimated to be 59%, compared to 51% for a firm that purchases market group health insurance for its employees. Unionized worksites and larger worksites are also found to be more likely to offer worksite alcohol programs, compared to nonunionized smaller worksites. Worksites with younger workforces are less likely than those with older employees to offer alcohol programs. The empirical results are consistent with the conceptual framework from labor economics, since self-insurance is expected to increase firms' demand for worksite alcohol programs while larger worksite size is expected to reduce the average program cost. The role of union status and workforce age suggests it is important to consider workers' preferences for the programs as fringe benefits. The results also suggest that the national trend towards self-insurance may be leading to more prevention and treatment of worker alcohol-related problems.

  17. Dynamic analysis program for frame structure

    International Nuclear Information System (INIS)

    Ando, Kozo; Chiba, Toshio

    1975-01-01

A general purpose computer program named ISTRAN/FD (I(HI) STRucture ANalysis/Frame structure, Dynamic analysis) has been developed for dynamic analysis of three-dimensional frame structures. This program has functions of free vibration analysis, seismic response analysis, graphic display by plotter and CRT, etc. This paper introduces ISTRAN/FD; examples of its application are shown with various problems: idealization of the cantilever, dynamic analysis of the main tower of the suspension bridge, three-dimensional vibration in the plate girder bridge, seismic response in the boiler steel structure, and dynamic properties of the underground LNG tank. In this last example, solid elements, in addition to beam elements, are especially used for the analysis. (auth.)

  18. Hierarchical Scheduling Framework Based on Compositional Analysis Using Uppaal

    DEFF Research Database (Denmark)

    Boudjadar, Jalil; David, Alexandre; Kim, Jin Hyun

    2014-01-01

    This paper introduces a reconfigurable compositional scheduling framework, in which the hierarchical structure, the scheduling policies, the concrete task behavior and the shared resources can all be reconfigured. The behavior of each periodic preemptive task is given as a list of timed actions, ...

  19. Toward Solving the Problem of Problem Solving: An Analysis Framework

    Science.gov (United States)

    Roesler, Rebecca A.

    2016-01-01

    Teaching is replete with problem solving. Problem solving as a skill, however, is seldom addressed directly within music teacher education curricula, and research in music education has not examined problem solving systematically. A framework detailing problem-solving component skills would provide a needed foundation. I observed problem solving…

  20. Mediation Analysis in a Latent Growth Curve Modeling Framework

    Science.gov (United States)

    von Soest, Tilmann; Hagtvet, Knut A.

    2011-01-01

    This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…

  1. Analysis of Idiom Variation in the Framework of Linguistic Subjectivity

    Science.gov (United States)

    Liu, Zhengyuan

    2012-01-01

    Idiom variation is a ubiquitous linguistic phenomenon which has raised a lot of research questions. The past approach was either formal or functional. Both of them did not pay much attention to cognitive factors of language users. By putting idiom variation in the framework of linguistic subjectivity, we have offered a new perspective in the…

  2. Comparative Analysis of Language Minorities: A Sociopolitical Framework.

    Science.gov (United States)

    Anderson, A. B.

    1990-01-01

    Synthesizes theoretical typologies in the fields of ethnic relations, ethnonationalism, and sociolinguistics into a sociopolitical framework for analyzing various types of ethnolinguistic minority situations. Particular reference is made to minority situations in Europe, North America, and developing countries. (35 references) (Author/CB)

  3. Design and Analysis of a Service Migration Framework

    DEFF Research Database (Denmark)

    Saeed, Aamir; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2013-01-01

    on another device. For such a need, an architecture is proposed to design and develop applications that migrate from one device to another and resume its operation. A simple application was constructed based on the proposed framework. Experiments were carried out to demonstrate its applicability...

  4. Tracking and Analysis Framework (TAF) model documentation and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Bloyd, C.; Camp, J.; Conzelmann, G. [and others]

    1996-12-01

    With passage of the 1990 Clean Air Act Amendments, the United States embarked on a policy for controlling acid deposition that has been estimated to cost at least $2 billion. Title IV of the Act created a major innovation in environmental regulation by introducing market-based incentives - specifically, by allowing electric utility companies to trade allowances to emit sulfur dioxide (SO₂). The National Acid Precipitation Assessment Program (NAPAP) has been tasked by Congress to assess what Senator Moynihan has termed this "grand experiment." Such a comprehensive assessment of the economic and environmental effects of this legislation has been a major challenge. To help NAPAP face this challenge, the U.S. Department of Energy (DOE) has sponsored development of an integrated assessment model, known as the Tracking and Analysis Framework (TAF). This section summarizes TAF's objectives and its overall design.

  5. The nuclear analysis program at MURR

    International Nuclear Information System (INIS)

    Glascock, M.D.

    1993-01-01

    The University of Missouri-Columbia (MU) has continually upgraded research facilities and programs at the MU research reactor (MURR) throughout its 26-yr history. The Nuclear Analysis Program (NAP) area has participated in these upgrades over the years. As one of the largest activation analysis laboratories on a university campus, the activities of the NAP are broadly representative of the diversity of applications for activation analysis and related nuclear science. This paper describes the MURR's NAP and several of the research, education, and service projects in which the laboratory is currently engaged

  6. Energy Analysis Program 1990 annual report

    International Nuclear Information System (INIS)

    1992-01-01

    The Energy Analysis Program has played an active role in the analysis and discussion of energy and environmental issues at several levels: (1) at the international level, with programs such as developing scenarios for long-term energy demand in developing countries and organizing and leading an analytic effort, "Energy Efficiency, Developing Countries, and Eastern Europe," part of a major effort to increase support for energy efficiency programs worldwide; (2) at the national level, the Program has been responsible for assessing energy forecasts and policies affecting energy use (e.g., appliance standards, National Energy Strategy scenarios); and (3) at the state and utility levels, the Program has been a leader in promoting utility integrated resource planning; the collaborative process has led to agreement on a new generation of utility demand-side programs in California, providing an opportunity to use the knowledge and analytic techniques of the Program's researchers. We continue to place the highest priority on analyzing energy efficiency, with particular attention given to energy use in buildings. The Program continues its active analysis of international energy issues in Asia (including China), the Soviet Union, South America, and Western Europe. Analyzing the costs and benefits of different levels of standards for residential appliances continues to be the largest single area of research within the Program. The group has developed and applied techniques for forecasting energy demand (or constructing scenarios) for the United States. We have built a new model of industrial energy demand, are in the process of making major changes in our tools for forecasting residential energy demand, have built an extensive and documented energy conservation supply curve of residential energy use, and are beginning an analysis of energy-demand forecasting for commercial buildings.

  7. Energy Analysis Program 1990 annual report

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    The Energy Analysis Program has played an active role in the analysis and discussion of energy and environmental issues at several levels: (1) at the international level, with programs such as developing scenarios for long-term energy demand in developing countries and organizing and leading an analytic effort, "Energy Efficiency, Developing Countries, and Eastern Europe," part of a major effort to increase support for energy efficiency programs worldwide; (2) at the national level, the Program has been responsible for assessing energy forecasts and policies affecting energy use (e.g., appliance standards, National Energy Strategy scenarios); and (3) at the state and utility levels, the Program has been a leader in promoting utility integrated resource planning; the collaborative process has led to agreement on a new generation of utility demand-side programs in California, providing an opportunity to use the knowledge and analytic techniques of the Program's researchers. We continue to place the highest priority on analyzing energy efficiency, with particular attention given to energy use in buildings. The Program continues its active analysis of international energy issues in Asia (including China), the Soviet Union, South America, and Western Europe. Analyzing the costs and benefits of different levels of standards for residential appliances continues to be the largest single area of research within the Program. The group has developed and applied techniques for forecasting energy demand (or constructing scenarios) for the United States. We have built a new model of industrial energy demand, are in the process of making major changes in our tools for forecasting residential energy demand, have built an extensive and documented energy conservation supply curve of residential energy use, and are beginning an analysis of energy-demand forecasting for commercial buildings.

  8. A model-based framework for the analysis of team communication in nuclear power plants

    International Nuclear Information System (INIS)

    Chung, Yun Hyung; Yoon, Wan Chul; Min, Daihwan

    2009-01-01

    Advanced human-machine interfaces are rapidly changing the interaction between humans and systems, with the level of abstraction of the presented information, the human task characteristics, and the modes of communication all affected. To accommodate the changes in the human/system co-working environment, an extended communication analysis framework is needed that can describe and relate the tasks, verbal exchanges, and information interface. This paper proposes an extended analytic framework, referred to as the H-H-S (human-human-system) communication analysis framework, which can model the changes in team communication that are emerging in these new working environments. The stage-specific decision-making model and analysis tool of the proposed framework make the analysis of team communication easier by providing visual clues. The usefulness of the proposed framework is demonstrated with an in-depth comparison of the characteristics of communication in the conventional and advanced main control rooms of nuclear power plants

  9. Implementation and Evaluation of Technology Mentoring Program Developed for Teacher Educators: A 6M-Framework

    Directory of Open Access Journals (Sweden)

    Selim Gunuc

    2015-06-01

    The purpose of this basic research is to determine the problems experienced in the Technology Mentoring Program (TMP), and the study discusses how these problems affect the process in general. The implementation was carried out with teacher educators in the education faculty. 8 doctorate students (mentors) provided technology mentoring implementation for one academic term to 9 teacher educators (mentees) employed in the Education Faculty. The data were collected via the mentee and the mentor interview form, mentor reflections and organization meeting reflections. As a result, the problems based on the mentor, on the mentee and on the organization/institution were determined. In order to carry out the TMP more effectively and successfully, a 6M-framework (Modifying, Meeting, Matching, Managing, Mentoring, Monitoring) was suggested within the scope of this study. It could be stated that fewer problems will be encountered and that the process will be carried out more effectively and successfully when the structure in this framework is taken into consideration.

  10. A methodological approach and framework for sustainability assessment in NGO-implemented primary health care programs.

    Science.gov (United States)

    Sarriot, Eric G; Winch, Peter J; Ryan, Leo J; Bowie, Janice; Kouletio, Michelle; Swedberg, Eric; LeBan, Karen; Edison, Jay; Welch, Rikki; Pacqué, Michel C

    2004-01-01

    An estimated 10.8 million children under 5 continue to die each year in developing countries from causes easily treatable or preventable. Non governmental organizations (NGOs) are frontline implementers of low-cost and effective child health interventions, but their progress toward sustainable child health gains is a challenge to evaluate. This paper presents the Child Survival Sustainability Assessment (CSSA) methodology--a framework and process--to map progress towards sustainable child health from the community level and upward. The CSSA was developed with NGOs through a participatory process of research and dialogue. Commitment to sustainability requires a systematic and systemic consideration of human, social and organizational processes beyond a purely biomedical perspective. The CSSA is organized around three interrelated dimensions of evaluation: (1) health and health services; (2) capacity and viability of local organizations; (3) capacity of the community in its social ecological context. The CSSA uses a participatory, action-planning process, engaging a 'local system' of stakeholders in the contextual definition of objectives and indicators. Improved conditions measured in the three dimensions correspond to progress toward a sustainable health situation for the population. This framework opens new opportunities for evaluation and research design and places sustainability at the center of primary health care programming.

  11. Assessing environmental assets for health promotion program planning: a practical framework for health promotion practitioners.

    Science.gov (United States)

    Springer, Andrew E; Evans, Alexandra E

    2016-01-01

    Conducting a health needs assessment is an important if not essential first step for health promotion planning. This paper explores how health needs assessments may be further strengthened for health promotion planning via an assessment of environmental assets rooted in the multiple environments (policy, information, social and physical environments) that shape health and behavior. Guided by a behavioral-ecological perspective (one that seeks to identify environmental assets that can influence health behavior) and an implementation science perspective (one that seeks to interweave health promotion strategies into existing environmental assets), we present a basic framework for assessing environmental assets and review examples from the literature to illustrate the incorporation of environmental assets into health program design. Health promotion practitioners and researchers implicitly identify and apply environmental assets in the design and implementation of health promotion interventions; this paper provides a foundation for greater intentionality in assessing environmental assets for health promotion planning.

  12. Tatool: a Java-based open-source programming framework for psychological studies.

    Science.gov (United States)

    von Bastian, Claudia C; Locher, André; Ruflin, Michael

    2013-03-01

    Tatool (Training and Testing Tool) was developed to assist researchers with programming training software, experiments, and questionnaires. Tatool is Java-based, and thus is a platform-independent and object-oriented framework. The architecture was designed to meet the requirements of experimental designs and provides a large number of predefined functions that are useful in psychological studies. Tatool comprises features crucial for training studies (e.g., configurable training schedules, adaptive training algorithms, and individual training statistics) and allows for running studies online via Java Web Start. The accompanying "Tatool Online" platform provides the possibility to manage studies and participants' data easily with a Web-based interface. Tatool is published open source under the GNU Lesser General Public License, and is available at www.tatool.ch.

  13. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    Science.gov (United States)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, the general one and the aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.

  14. A Statistical Framework for the Functional Analysis of Metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    Sharon, Itai; Pati, Amrita; Markowitz, Victor; Pinter, Ron Y.

    2008-10-01

    Metagenomic studies consider the genetic makeup of microbial communities as a whole, rather than their individual member organisms. The functional and metabolic potential of microbial communities can be analyzed by comparing the relative abundance of gene families in their collective genomic sequences (metagenome) under different conditions. Such comparisons require accurate estimation of gene family frequencies. They present a statistical framework for assessing these frequencies based on the Lander-Waterman theory developed originally for Whole Genome Shotgun (WGS) sequencing projects. They also provide a novel method for assessing the reliability of the estimations which can be used for removing seemingly unreliable measurements. They tested their method on a wide range of datasets, including simulated genomes and real WGS data from sequencing projects of whole genomes. Results suggest that their framework corrects inherent biases in accepted methods and provides a good approximation to the true statistics of gene families in WGS projects.
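    The framework above builds on Lander-Waterman statistics. The basic quantities that theory provides can be sketched with stdlib Python; this is the textbook shotgun-sequencing model, not the paper's estimator, and the reliability-assessment step the abstract describes is not reproduced here:

```python
import math

def coverage(n_reads, read_len, genome_len):
    """Mean per-base coverage c = N * L / G (Lander-Waterman)."""
    return n_reads * read_len / genome_len

def frac_covered(c):
    """Expected fraction of positions sequenced at least once: 1 - e^(-c)."""
    return 1.0 - math.exp(-c)

def expected_hits(n_reads, read_len, genome_len, target_len):
    """Expected number of reads overlapping a target region (e.g. a
    gene-family locus) of length l: a read of length L overlaps the
    target from any of (l + L - 1) start positions."""
    return n_reads * (target_len + read_len - 1) / genome_len

# One million 100-bp reads over a 100-Mb metagenome:
c = coverage(1_000_000, 100, 100_000_000)   # → 1.0
covered = frac_covered(c)                   # ≈ 0.632
```

    Comparing `expected_hits` against observed read counts for a gene family, under different conditions, is the flavor of abundance comparison the abstract describes; the paper's actual method adds a reliability estimate used to discard unreliable measurements.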

  15. Towards an oral healthcare framework and policy analysis for Swaziland

    OpenAIRE

    Mndzebele, Samuel

    2010-01-01

    Background and Rationale: A synopsis by the researcher suggested that caries was becoming a public health problem among the youth, hence there was a need for deeper investigations which would lead to possible oral health interventions. Purpose: The purpose of the study was to assess dental care practices and experiences among teenagers in the Northern region of Swaziland. Based on the outcomes and views from health professionals; develop a framework for oral healthcare delivery and ...

  16. WWW-based remote analysis framework for UniSampo and Shaman analysis software

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Ala-Heikkilae, J.J.; Routti, J.T.; Nikkinen, M.T.

    2005-01-01

    UniSampo and Shaman are well-established analytical tools for gamma-ray spectrum analysis and the subsequent radionuclide identification. These tools are normally run locally on a Unix or Linux workstation in interactive mode. However, it is also possible to run them in batch/non-interactive mode by starting them with the correct parameters. This is how they are used in the standard analysis pipeline operation. This functionality also makes it possible to use them for remote operation over the network. Framework for running UniSampo and Shaman analysis using the standard WWW-protocol has been developed. A WWW-server receives requests from the client WWW-browser and runs the analysis software via a set of CGI-scripts. Authentication, input data transfer, and output and display of the final analysis results is all carried out using standard WWW-mechanisms. This WWW-framework can be utilized, for example, by organizations that have radioactivity surveillance stations in a wide area. A computer with a standard internet/intranet connection suffices for on-site analyses. (author)

  17. Globalization and health: a framework for analysis and action.

    Science.gov (United States)

    Woodward, D.; Drager, N.; Beaglehole, R.; Lipson, D.

    2001-01-01

    Globalization is a key challenge to public health, especially in developing countries, but the linkages between globalization and health are complex. Although a growing amount of literature has appeared on the subject, it is piecemeal, and suffers from a lack of an agreed framework for assessing the direct and indirect health effects of different aspects of globalization. This paper presents a conceptual framework for the linkages between economic globalization and health, with the intention that it will serve as a basis for synthesizing existing relevant literature, identifying gaps in knowledge, and ultimately developing national and international policies more favourable to health. The framework encompasses both the indirect effects on health, operating through the national economy, household economies and health-related sectors such as water, sanitation and education, as well as more direct effects on population-level and individual risk factors for health and on the health care system. Proposed also is a set of broad objectives for a programme of action to optimize the health effects of economic globalization. The paper concludes by identifying priorities for research corresponding with the five linkages identified as critical to the effects of globalization on health. PMID:11584737

  18. The Tracking and Analysis Framework (TAF): A tool for the integrated assessment of acid deposition

    International Nuclear Information System (INIS)

    Bloyd, C.N.; Henrion, M.; Marnicio, R.J.

    1995-01-01

    A major challenge that has faced policy makers concerned with acid deposition is obtaining an integrated view of the underlying science related to acid deposition. In response to this challenge, the US Department of Energy is sponsoring the development of an integrated Tracking and Analysis Framework (TAF) which links together the key acid deposition components of emissions, air transport, atmospheric deposition, and aquatic effects in a single modeling structure. The goal of TAF is to integrate credible models of the scientific and technical issues into an assessment framework that can directly address key policy issues, and in doing so act as a bridge between science and policy. Key objectives of TAF are to support coordination and communication among scientific researchers; to support communications with policy makers and provide rapid response for analyzing newly emerging policy issues; and to provide guidance for prioritizing research programs. This paper briefly describes how TAF was formulated to meet those objectives and the underlying principles which form the basis for its development.

  19. Debugging Nondeterministic Failures in Linux Programs through Replay Analysis

    Directory of Open Access Journals (Sweden)

    Shakaiba Majeed

    2018-01-01

    Reproducing a failure is the first and most important step in debugging because it enables us to understand the failure and track down its source. However, many programs are susceptible to nondeterministic failures that are hard to reproduce, which makes debugging extremely difficult. We first address the reproducibility problem by proposing an OS-level replay system for a uniprocessor environment that can capture and replay nondeterministic events needed to reproduce a failure in Linux interactive and event-based programs. We then present an analysis method, called replay analysis, based on the proposed record and replay system to diagnose concurrency bugs in such programs. The replay analysis method uses a combination of static analysis, dynamic tracing during replay, and delta debugging to identify failure-inducing memory access patterns that lead to concurrency failure. The experimental results show that the presented record and replay system has low recording overhead and hence can be safely used in production systems to catch rarely occurring bugs. We also present a few concurrency bug case studies from real-world applications to prove the effectiveness of the proposed bug diagnosis framework.
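    The method combines record/replay with delta debugging. The delta-debugging component can be illustrated with a stdlib sketch of Zeller's ddmin algorithm; the `fails` predicate below is a hypothetical stand-in for "replay this subset of recorded events and check whether the failure reproduces", and this is the generic algorithm rather than the paper's implementation:

```python
def ddmin(items, fails):
    """Zeller's ddmin: shrink `items` to a small subset that still makes
    `fails` return True, by repeatedly deleting chunks (complement step)."""
    n = 2  # current granularity: number of chunks
    while len(items) >= 2:
        subset_len = len(items) // n
        reduced = False
        for start in range(0, len(items), subset_len):
            # remove one chunk, keep the complement
            complement = items[:start] + items[start + subset_len:]
            if complement and fails(complement):
                items = complement      # the bug persists without that chunk
                n = max(n - 1, 2)       # keep granularity roughly in step
                reduced = True
                break
        if not reduced:
            if n >= len(items):         # already at single-item granularity
                break
            n = min(n * 2, len(items))  # otherwise refine the granularity
    return items

# Hypothetical predicate: the failure needs events 1 and 5 to co-occur.
needs_both = lambda subset: 1 in subset and 5 in subset
minimal = ddmin(list(range(8)), needs_both)   # → [1, 5]
```

    In the paper's setting each `fails` call is a deterministic replay of the recorded execution with some events elided, which is what makes delta debugging applicable to otherwise nondeterministic failures.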

  20. Framework for Financial Ratio Analysis of Audited Federal Financial Reports

    National Research Council Canada - National Science Library

    Brady, Richard

    1999-01-01

    .... The disclosure of this type of information, it was believed, would enable decision-makers to understand the financial implications of budgetary, policy and program issues and provide an analytical...

  1. Protocol Analysis of Group Problem Solving in Mathematics: A Cognitive-Metacognitive Framework for Assessment.

    Science.gov (United States)

    Artzt, Alice F.; Armour-Thomas, Eleanor

    The roles of cognition and metacognition were examined in the mathematical problem-solving behaviors of students as they worked in small groups. As an outcome, a framework that links the literature of cognitive science and mathematical problem solving was developed for protocol analysis of mathematical problem solving. Within this framework, each…

  2. Integrating Poverty and Environmental Concerns into Value-Chain Analysis: A Strategic Framework and Practical Guide

    DEFF Research Database (Denmark)

    Riisgaard, Lone; Bolwig, Simon; Ponte, Stefano

    2010-01-01

    This article aims to guide the design and implementation of action-research projects in value-chain analysis by presenting a strategic framework focused on small producers and trading and processing firms in developing countries. Its stepwise approach – building on the conceptual framework set ou...... purpose of increasing the rewards and/or reducing the risks....

  3. Barriers to renewable energy penetration. A framework for analysis

    DEFF Research Database (Denmark)

    Painuly, Jyoti P.

    2001-01-01

    Renewable energy has the potential to play an important role in providing energy with sustainability to the vast populations in developing countries who as yet have no access to clean energy. Although economically viable for several applications, renewable energy has not been able to realise its potential due to several barriers to its penetration. A framework has been developed in this paper to identify the barriers to renewable energy penetration and to suggest measures to overcome them. (C) 2001 Elsevier Science Ltd. All rights reserved.

  4. iterClust: a statistical framework for iterative clustering analysis.

    Science.gov (United States)

    Ding, Hongxu; Wang, Wanxin; Califano, Andrea

    2018-03-22

    In a scenario where populations A, B1 and B2 (subpopulations of B) exist, pronounced differences between A and B may mask subtle differences between B1 and B2. Here we present iterClust, an iterative clustering framework, which can separate more pronounced differences (e.g. A and B) in starting iterations, followed by relatively subtle differences (e.g. B1 and B2), providing a comprehensive clustering trajectory. iterClust is implemented as a Bioconductor R package. andrea.califano@columbia.edu, hd2326@columbia.edu. Supplementary information is available at Bioinformatics online.
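    The abstract describes separating pronounced differences (A vs. B) in early iterations and subtler ones (B1 vs. B2) later. A loose 1-D stdlib sketch of that bisecting idea follows; this is not the iterClust algorithm itself (the package is an R/Bioconductor tool), and the `min_gap` stopping rule and all data values are hypothetical:

```python
def two_means(xs, iters=20):
    """Deterministic 1-D 2-means: centroids start at min and max of the data."""
    c1, c2 = min(xs), max(xs)
    for _ in range(iters):
        g1 = [x for x in xs if abs(x - c1) <= abs(x - c2)]
        g2 = [x for x in xs if abs(x - c1) > abs(x - c2)]
        if g1:
            c1 = sum(g1) / len(g1)
        if g2:
            c2 = sum(g2) / len(g2)
    return g1, g2

def iter_clust(xs, min_gap=1.0):
    """Split recursively: pronounced separations surface in early
    iterations, subtler ones later; stop when the centroid gap falls
    below `min_gap` or the group becomes too small to split further."""
    g1, g2 = two_means(xs)
    if not g1 or not g2 or len(xs) < 4:
        return [xs]
    gap = abs(sum(g1) / len(g1) - sum(g2) / len(g2))
    if gap < min_gap:
        return [xs]
    return iter_clust(g1, min_gap) + iter_clust(g2, min_gap)

data = [0.0, 0.1, 0.2, 5.0, 5.1, 9.0, 9.2, 9.1]
clusters = iter_clust(data)   # three groups: low, middle, high
```

    The first split separates the low group from everything else; the later recursion then distinguishes the middle and high groups, mirroring the A-then-B1/B2 trajectory the abstract describes.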

  5. Global/local methods research using a common structural analysis framework

    Science.gov (United States)

    Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.

    1991-01-01

    Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.

  6. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW methodology is a success-oriented system analysis technique, and is capable of evaluating a large system with complex operational sequences. Recently an integrated analysis framework of the GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism, Japanese Government. This paper describes (a) an overview of the GO-FLOW methodology, (b) the procedure for treating a phased mission problem, (c) common cause failure analysis, (d) uncertainty analysis, and (e) the integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)
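    GO-FLOW's actual operator set and signal-flow semantics are not reproduced in the abstract. As a generic illustration of what "success-oriented" evaluation means — propagating success probabilities through series and redundant (parallel) subsystems rather than enumerating failure combinations — here is a stdlib sketch with hypothetical probabilities, not GO-FLOW's own operators:

```python
def series_success(ps):
    """All subsystems must succeed (series logic): product of successes."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def parallel_success(ps):
    """Any one redundant subsystem suffices (parallel logic):
    1 minus the probability that every branch fails."""
    all_fail = 1.0
    for p in ps:
        all_fail *= (1.0 - p)
    return 1.0 - all_fail

# Hypothetical system: two redundant pumps (0.9 each) feeding one valve (0.95).
p_system = series_success([parallel_success([0.9, 0.9]), 0.95])
```

    GO-FLOW extends this basic idea with time points and operational sequences, which is what lets it handle phased mission problems.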

  7. Fit Analysis of Different Framework Fabrication Techniques for Implant-Supported Partial Prostheses.

    Science.gov (United States)

    Spazzin, Aloísio Oro; Bacchi, Atais; Trevisani, Alexandre; Farina, Ana Paula; Dos Santos, Mateus Bertolini

    2016-01-01

    This study evaluated the vertical misfit of implant-supported frameworks made using different techniques to obtain passive fit. Thirty three-unit fixed partial dentures were fabricated in cobalt-chromium alloy (n = 10) using three fabrication methods: one-piece casting, framework cemented on prepared abutments, and laser welding. The vertical misfit between the frameworks and the abutments was evaluated with an optical microscope using the single-screw test. Data were analyzed using one-way analysis of variance and the Tukey test (α = .05). The one-piece casted frameworks presented significantly higher vertical misfit values than those found for the framework cemented on prepared abutments and laser welding techniques (P < .05). Laser welding and framework cemented on prepared abutments are effective techniques to improve the adaptation of three-unit implant-supported prostheses. These techniques presented similar fit.
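    The omnibus step of the analysis reported above (one-way ANOVA before the Tukey post-hoc test) can be sketched with stdlib Python. The misfit values in the usage line are hypothetical, not the study's data, and the Tukey pairwise comparisons are omitted:

```python
def one_way_anova_F(groups):
    """Omnibus F statistic: MS_between / MS_within for a one-way layout."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical misfit readings for three fabrication techniques:
F = one_way_anova_F([[1, 2, 3], [2, 3, 4], [3, 4, 5]])   # → 3.0
```

    The F value is then compared against the F distribution with (k − 1, n − k) degrees of freedom at α = .05; a significant result licenses the pairwise Tukey comparisons the study used.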

  8. Framework for applying probabilistic safety analysis in nuclear regulation

    International Nuclear Information System (INIS)

    Dimitrijevic, V.B.

    1997-01-01

    The traditional regulatory framework has served well to assure the protection of public health and safety. It has been recognized, however, that in a few circumstances this deterministic framework has led to extensive expenditure on matters that have little to do with the safe and reliable operation of the plant. Development of plant-specific PSAs has offered a new and powerful analytical tool for evaluating the safety of the plant. Using PSA insights as an aid to decision making in the regulatory process is now known as 'risk-based' or 'risk-informed' regulation. Numerous activities in the U.S. nuclear industry are focusing on applying this new approach to modify regulatory requirements. In addition, other approaches to regulation are in the developmental phase and are being evaluated. One is based on performance monitoring and results and is known as performance-based regulation. The other, called the blended approach, combines traditional deterministic principles with PSA insights and performance results. (author)

  9. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a Python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
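    pyEMU's own API is not shown in the abstract, but the FOSM idea it implements can be sketched generically. The function and values below are hypothetical and assume a diagonal prior parameter covariance; pyEMU's real analyses operate on full Jacobians and covariance matrices:

```python
def fosm_forecast_variance(sensitivities, prior_variances):
    """First-order, second-moment (FOSM) propagation assuming a diagonal
    prior parameter covariance:
        var(s) = sum_i (ds/dp_i)^2 * var(p_i),
    the scalar form of sigma_s^2 = y^T C y, where y is the forecast
    sensitivity vector and C the prior parameter covariance."""
    return sum(j * j * v for j, v in zip(sensitivities, prior_variances))

# Hypothetical forecast sensitivities and prior variances for two parameters:
var_prior = fosm_forecast_variance([2.0, 0.5], [1.0, 4.0])     # → 5.0
# Crude "data worth" flavour: variance left if parameter 0 were known exactly:
var_known0 = fosm_forecast_variance([0.0, 0.5], [1.0, 4.0])    # → 1.0
```

    The drop from `var_prior` to `var_known0` is the kind of question a data worth analysis asks: which observations or parameters, if constrained, most reduce forecast uncertainty.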

  10. SIMS analysis: Development and evaluation program summary

    International Nuclear Information System (INIS)

    Groenewold, G.S.; Appelhans, A.D.; Ingram, J.C.; Delmore, J.E.; Dahl, D.A.

    1996-11-01

    This report provides an overview of the "SIMS Analysis: Development and Evaluation Program", which was executed at the Idaho National Engineering Laboratory from mid-FY-92 to the end of FY-96. It should be noted that prior to FY-1994 the name of the program was "In-Situ SIMS Analysis". This report will not go into exhaustive detail regarding program accomplishments, because this information is contained in the annual reports referenced herein. In summary, the program resulted in the design and construction of an ion trap secondary ion mass spectrometer (IT-SIMS), which is capable of the rapid analysis of environmental samples for adsorbed surface contaminants. This instrument achieves efficient secondary ion desorption by use of a massive molecular ReO4- primary ion. The instrument manages surface charge buildup using a self-discharging principle, which is compatible with the pulsed nature of the ion trap. The instrument can achieve high selectivity and sensitivity using its selective ion storage and MS/MS capability. The instrument was used for detection of tri-n-butyl phosphate, salt cake (tank cake) characterization, and toxic metal speciation studies (specifically mercury). Technology transfer was also an important component of this program. The approach taken toward technology transfer was that of component transfer. This resulted in the transfer of data acquisition and instrument control software in FY-94, and in ongoing efforts to transfer primary ion gun and detector technology to other manufacturers.

  11. A framework for understanding international medical graduate challenges during transition into fellowship programs.

    Science.gov (United States)

    Sockalingam, Sanjeev; Khan, Attia; Tan, Adrienne; Hawa, Raed; Abbey, Susan; Jackson, Timothy; Zaretsky, Ari; Okrainec, Allan

    2014-01-01

    Previous studies have highlighted unique needs of international medical graduates (IMG) during their transition into medical training programs; however, limited data exist on IMG needs specific to fellowship training. We conducted the following mixed-method study to determine IMG fellow training needs during the transition into fellowship training programs in psychiatry and surgery. The authors conducted a mixed-methods study consisting of an online survey of IMG fellows and their supervisors in psychiatry or surgery fellowship training programs and individual interviews of IMG fellows. The survey assessed (a) fellows' and supervisors' perceptions on IMG challenges in clinical communication, health systems, and education domains and (b) past orientation initiatives. In the second phase of the study, IMG fellows were interviewed during the latter half of their fellowship training, and perceptions regarding orientation and adaptation to fellowship in Canada were assessed. Survey data were analyzed using descriptive and Mann-Whitney U statistics. Qualitative interviews were analyzed using grounded theory methodology. The survey response rate was 76% (35/46) and 69% (35/51) for IMG fellows and supervisors, respectively. Fellows reported the greatest difficulty with adapting to the hospital system, medical documentation, and balancing one's professional and personal life. Supervisors believed that fellows had the greatest difficulty with managing language and slang in Canada, the healthcare system, and an interprofessional team. In Phase 2, fellows generated themes of disorientation, disconnection, interprofessional team challenges, a need for IMG fellow resources, and a benefit from training in a multicultural setting. Our study results highlight the need for IMG specific orientation resources for fellows and supervisors. Maslow's Hierarchy of Needs may be a useful framework for understanding IMG training needs.

  12. A conceptual framework for formulating a focused and cost-effective fire protection program based on analyses of risk and the dynamics of fire effects

    International Nuclear Information System (INIS)

    Dey, M.K.

    1999-01-01

    This paper proposes a conceptual framework for developing a fire protection program at nuclear power plants based on probabilistic risk analysis (PRA) of fire hazards and on modeling the dynamics of fire effects. The process for categorizing nuclear power plant fire areas based on risk is described, followed by a discussion of fire safety design methods that can be used for different areas of the plant, depending on the degree of threat to plant safety from the fire hazard. This alternative framework has the potential to make programs more cost-effective and comprehensive, since it will allow a more systematic and broader examination of fire risk and provide a means to distinguish between high- and low-risk fire contributors. (orig.)

  13. Studying creativity training programs: A methodological analysis

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

    Throughout decades of creativity research, a range of creativity training programs have been developed, tested, and analyzed. In 2004, Scott and colleagues published a meta-analysis of all creativity training programs to date, and the review presented here set out to identify and analyze studies published since that seminal 2004 review. Focusing on quantitative studies of creativity training programs for adults, our systematic review resulted in 22 publications. All studies were analyzed, but comparing the reported effectiveness of training across studies proved difficult due to methodological inconsistencies, variations in reporting of results, and the types of measures used. Thus a consensus for future studies is called for to answer the question: Which elements make one creativity training program more effective than another? This is a question of equal relevance to academia and industry.

  14. A Demonstrative Analysis of News Articles Using Fairclough’s Critical Discourse Analysis Framework

    Directory of Open Access Journals (Sweden)

    Roy Randy Y. Briones

    2017-07-01

    This paper attempts to demonstrate Norman Fairclough’s Critical Discourse Analysis (CDA) framework by conducting internal- and external-level analyses of two online news articles that report on the Moro Islamic Liberation Front’s (MILF) submission of its findings on the “Mamasapano Incident” that happened in the Philippines in 2015. In performing analyses using this framework, the social context and background for these texts, as well as the relationship between the internal discourse features and the external social practices and structures in which the texts were produced, are thoroughly examined. As a result, it can be noted that, through the texts’ internal discourse features, the news articles portray ideological and social distinctions among social actors such as the Philippine Senate, the SAF troopers, the MILF, the MILF fighters, and the civilians. Moreover, from the viewpoint of the texts as external social practices, the texts maintain institutional identities as news reports, but they also reveal some evaluative stance, as exemplified by the adjectival phrases that the writers employed. Having both the internal and external features examined, it can be said that the way these texts were written seems to portray power relations that exist between the Philippine government and the MILF. Key words: Critical Discourse Analysis, discourse analysis, news articles, social practices, social structures, power relations

  15. A unified framework of descent algorithms for nonlinear programs and variational inequalities

    International Nuclear Information System (INIS)

    Patriksson, M.

    1993-01-01

    We present a framework of algorithms for the solution of continuous optimization and variational inequality problems. In the general algorithm, a search direction is found by solving an auxiliary problem obtained by replacing the original cost function with an approximating monotone cost function. The proposed framework encompasses algorithm classes presented earlier by Cohen, Dafermos, Migdalas, and Tseng, and includes numerous descent and successive approximation type methods, such as Newton methods, Jacobi and Gauss-Seidel type decomposition methods for problems defined over Cartesian product sets, and proximal point methods, among others. The auxiliary problem of the general algorithm also induces equivalent optimization reformulations and descent methods for asymmetric variational inequalities. We study the convergence properties of the general algorithm when applied to unconstrained optimization, nondifferentiable optimization, constrained differentiable optimization, and variational inequalities; the emphasis of the convergence analyses is placed on basic convergence results, convergence using different line search strategies and truncated subproblem solutions, and convergence rate results. This analysis offers a unification of known results; moreover, it provides strengthenings of convergence results for many existing algorithms, and indicates possible improvements of their realizations. 482 refs.
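    The cost-approximation idea can be illustrated through its best-known special case: when the approximating cost is a linearization plus a proximal quadratic, the auxiliary problem has a closed-form solution that is exactly a gradient step. The names below are illustrative, not from the paper.

```python
# Cost-approximation scheme: at the iterate x_k, replace f by a simpler
# model phi_k and minimize phi_k to get the next iterate. With
#   phi_k(y) = f(x_k) + f'(x_k)*(y - x_k) + (y - x_k)**2 / (2*t),
# argmin_y phi_k(y) = x_k - t*f'(x_k), i.e. a gradient step.

def cost_approximation_descent(grad, x0, step=0.1, iters=200):
    x = x0
    for _ in range(iters):
        # solve the auxiliary problem in closed form (gradient step)
        x = x - step * grad(x)
    return x

# f(x) = (x - 3)^2 with f'(x) = 2*(x - 3); unique minimizer x* = 3
xstar = cost_approximation_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
assert abs(xstar - 3.0) < 1e-6
```

    Other choices of the approximating cost recover Newton, Jacobi/Gauss-Seidel decomposition, and proximal point methods as further special cases of the same scheme.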

  16. Damage analysis and fundamental studies program

    International Nuclear Information System (INIS)

    Doran, D.G.; Farrar, H. IV; Goland, A.N.

    1978-01-01

    The Damage Analysis and Fundamental Studies (DAFS) Task Group has been formed by the Office of Fusion Energy to develop procedures for applying data obtained in various irradiation test facilities to projected fusion environments. A long-range program plan has been prepared and implementation has begun. The plan and technical status are briefly described.

  17. Counter Trafficking System Development "Analysis Training Program"

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Dennis C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-12-01

    This document details the training curriculum for the Counter-Trafficking System Development (CTSD) Analysis Modules; the Lesson Plans are derived from United States Military and Department of Energy doctrine, and from the Lawrence Livermore National Laboratory (LLNL) Global Security (GS) S Program.

  18. Framework for generating expert systems to perform computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1985-01-01

    At Los Alamos we are developing a framework to generate knowledge-based expert systems for performing automated risk analyses upon a subject system. The expert system is a computer program that models experts' knowledge about a topic, including facts, assumptions, insights, and decision rationale. The subject system, defined as the collection of information, procedures, devices, and real property upon which the risk analysis is to be performed, is a member of the class of systems that have three identifying characteristics: a set of desirable assets (or targets), a set of adversaries (or threats) desiring to obtain or to do harm to the assets, and a set of protective mechanisms to safeguard the assets from the adversaries. Risk analysis evaluates both vulnerability to and the impact of successful threats against the targets by determining the overall effectiveness of the subject system safeguards, identifying vulnerabilities in that set of safeguards, and determining cost-effective improvements to the safeguards. As a testbed, we evaluate the inherent vulnerabilities and risks in a system of computer security safeguards. The method considers safeguards protecting four generic targets (physical plant of the computer installation, its hardware, its software, and its documents and displays) against three generic threats (natural hazards, direct human actions requiring the presence of the adversary, and indirect human actions wherein the adversary is not on the premises, perhaps using such access tools as wiretaps, dialup lines, and so forth). Our automated procedure to assess the effectiveness of computer security safeguards differs from traditional risk analysis methods.
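    The targets/threats/safeguards structure described above can be sketched as a toy risk-scoring calculation. All names, weights, and the scoring formula are invented for illustration and have no connection to the Los Alamos system.

```python
# Toy risk model over (target, threat) pairs: residual risk falls as the
# effectiveness of the safeguards covering that pair rises.

def residual_risk(impact, threat_likelihood, safeguard_effectiveness):
    """Residual risk = likelihood * impact * (1 - safeguard effectiveness)."""
    return threat_likelihood * impact * (1.0 - safeguard_effectiveness)

# hypothetical assessment of two (target, threat) pairs
pairs = {
    ("software", "indirect human action"): residual_risk(0.9, 0.6, 0.3),
    ("hardware", "natural hazard"):        residual_risk(0.7, 0.2, 0.8),
}

# the pair with the highest residual risk is the best candidate for
# cost-effective safeguard improvements
worst = max(pairs, key=pairs.get)
```

    An expert system built on this structure would replace the hand-assigned numbers with values inferred from its knowledge base of facts and decision rationale.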

  19. Learner Analysis Framework for Globalized E-Learning

    Science.gov (United States)

    Saxena, Mamta

    2010-01-01

    The digital shift to technology-mediated modes of instructional delivery and the increased global connectivity has led to the rise in globalized e-learning programs. Educational institutions face multiple challenges as they seek to design effective, engaging and culturally competent instruction for an increasingly diverse learner population. The…

  20. CIMS: A FRAMEWORK FOR INFRASTRUCTURE INTERDEPENDENCY MODELING AND ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Donald D. Dudenhoeffer; May R. Permann; Milos Manic

    2006-12-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks such as transportation, utilities, telecommunication, and even financial networks. While modeling and simulation tools have provided insight into the behavior of individual infrastructure networks, a far less understood area is that of the interrelationships among multiple infrastructure networks including the potential cascading effects that may result due to these interdependencies. This paper first describes infrastructure interdependencies as well as presenting a formalization of interdependency types. Next the paper describes a modeling and simulation framework called CIMS© and the work that is being conducted at the Idaho National Laboratory (INL) to model and simulate infrastructure interdependencies and the complex behaviors that can result.
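    A minimal sketch of the cascading-effect idea (illustrative only, not the CIMS implementation): model each infrastructure as a node, each dependency as an edge, and propagate a failure breadth-first through the dependency graph.

```python
# Cascading failure across interdependent infrastructure networks:
# an edge a -> b means b depends on a, so a failure at a propagates to b.
from collections import deque

DEPENDENTS = {
    "power":   ["telecom", "water"],
    "telecom": ["finance"],
    "water":   [],
    "finance": [],
}

def cascade(initial_failure):
    """Breadth-first propagation of a failure through the dependency graph."""
    failed, queue = {initial_failure}, deque([initial_failure])
    while queue:
        node = queue.popleft()
        for dependent in DEPENDENTS.get(node, []):
            if dependent not in failed:
                failed.add(dependent)
                queue.append(dependent)
    return failed

assert cascade("power") == {"power", "telecom", "water", "finance"}
```

    Even this toy graph shows the asymmetry the paper formalizes: a power failure reaches every network, while a telecom failure reaches only finance.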

  1. Expenditure Analysis of HIV Testing and Counseling Services Using the Cascade Framework in Vietnam.

    Directory of Open Access Journals (Sweden)

    Van Thu Nguyen

    Currently, HIV testing and counseling (HTC) services in Vietnam are primarily funded by international sources. However, international funders are now planning to withdraw their support, and the Government of Vietnam (GVN) is seeking to identify domestic funding and generate client fees to continue services. A clear understanding of the cost to sustain current HTC services is becoming increasingly important to facilitate planning that can lead to making HTC and other HIV services more affordable and sustainable in Vietnam. The objectives of this analysis were to provide a snapshot of current program costs to achieve key program outcomes, including (1) testing and identifying PLHIV unaware of their HIV status and (2) successfully enrolling HIV(+) clients in care. We reviewed expenditure data reported by 34 HTC sites in nine Vietnamese provinces over a one-year period from October 2012 to September 2013. Data on program outcomes were extracted from the HTC database of 42,390 client records. Analysis was carried out from the service providers' perspective. The mean expenditure for a single client provided HTC services (testing, receiving results, and referral for care/treatment) was US $7.6. The unit expenditure per PLHIV identified through these services varied widely from US $22.8 to $741.5 (median: $131.8). Excluding repeat tests, the range for expenditure to newly diagnose a PLHIV was even wider (from US $30.8 to $1483.0). The mean expenditure for one successfully referred HIV client to care services was US $466.6. Personnel costs contributed most to the total cost. Our analysis found a wide range of expenditures by site for achieving the same outcomes. Re-designing systems to provide services at the lowest feasible cost is essential to making HIV services more affordable and treatment for prevention programs feasible in Vietnam. The analysis also found that understanding the determinants and reasons for variance in service costs by site is an important

  2. Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets

    Science.gov (United States)

    Juric, Mario

    2011-01-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ("column groups"), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping "cells" by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce "kernels" that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 Grows of PanSTARRS+SDSS data (220GB) in less than 15 minutes on a dual CPU machine. In a cluster environment, we achieved bandwidths of 17Gbits/sec (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.
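    The horizontal partitioning and per-cell map/reduce model can be sketched in miniature. This is illustrative only, assuming a simple fixed-size grid; it is not LSD's actual API.

```python
# Rows are binned into spatial "cells" by (lon, lat); a map step computes
# per-cell partial aggregates, and a reduce step combines them globally.

CELL_SIZE_DEG = 10.0

def cell_of(lon, lat):
    """Index of the fixed-size grid cell containing (lon, lat)."""
    return (int(lon // CELL_SIZE_DEG), int(lat // CELL_SIZE_DEG))

def map_per_cell(rows):
    """Map step: per-cell partial aggregates (count, sum of magnitudes)."""
    cells = {}
    for lon, lat, mag in rows:
        cells.setdefault(cell_of(lon, lat), []).append(mag)
    return {c: (len(m), sum(m)) for c, m in cells.items()}

def reduce_cells(partials):
    """Reduce step: combine partial aggregates into a global mean."""
    n = sum(count for count, _ in partials.values())
    s = sum(total for _, total in partials.values())
    return s / n

rows = [(12.3, 45.6, 20.0), (12.9, 45.1, 22.0), (101.0, -3.0, 18.0)]
partials = map_per_cell(rows)        # the first two rows share cell (1, 4)
mean_mag = reduce_cells(partials)
```

    Because each cell's map step touches only its own rows, cells can be processed on different nodes with no shared state, which is what makes the "shared nothing" scaling possible.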

  3. Economic and Nonproliferation Analysis Framework for Assessing Reliable Nuclear Fuel Service Arrangements

    International Nuclear Information System (INIS)

    Phillips, Jon R.; Kreyling, Sean J.; Short, Steven M.; Weimar, Mark R.

    2010-01-01

    Nuclear power is now broadly recognized as an essential technology in national strategies to provide energy security while meeting carbon management goals. Yet a long-standing conundrum remains: how to enable rapid growth in the global nuclear power infrastructure while controlling the spread of sensitive enrichment and reprocessing technologies that lie at the heart of nuclear fuel supply and nuclear weapons programs. Reducing the latent proliferation risk posed by a broader horizontal spread of enrichment and reprocessing technology has been a primary goal of national nuclear supplier policies since the beginning of the nuclear power age. Attempts to control the spread of sensitive nuclear technology have been the subject of numerous initiatives in the intervening decades, sometimes taking the form of calls to develop fuel supply and service assurances that reduce the market pull for more states to acquire fuel cycle capabilities. A clear understanding of what characteristics of specific reliable nuclear fuel service (RNFS) and supply arrangements qualify them as 'attractive offers' is critical to the success of current and future efforts. At a minimum, RNFS arrangements should provide economic value to all participants and help reduce the latent proliferation risks posed by the global expansion of nuclear power. In order to inform the technical debate and the development of policy, Pacific Northwest National Laboratory has been developing an analytical framework to evaluate the economic and nonproliferation merits of alternative approaches to RNFS arrangements. This paper provides a brief overview of the economic analysis framework developed and applied to a model problem of current interest: full-service nuclear fuel leasing arrangements. Furthermore, this paper presents an extended outline of a proposed analysis approach to evaluate the nonproliferation merits of various RNFS alternatives.

  4. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.

  5. Using the RE-AIM framework to evaluate physical activity public health programs in México.

    Science.gov (United States)

    Jauregui, Edtna; Pacheco, Ann M; Soltero, Erica G; O'Connor, Teresia M; Castro, Cynthia M; Estabrooks, Paul A; McNeill, Lorna H; Lee, Rebecca E

    2015-02-19

    Physical activity (PA) public health programming has been widely used in Mexico; however, few studies have documented individual and organizational factors that might be used to evaluate their public health impact. The RE-AIM framework is an evaluation tool that examines individual and organizational factors of public health programs. The purpose of this study was to use the RE-AIM framework to determine the degree to which PA programs in Mexico reported individual and organizational factors and to investigate whether reporting differed by the program's funding source. Public health programs promoting PA were systematically identified during 2008-2013 and had to have an active program website. Initial searches produced 23 possible programs with 12 meeting inclusion criteria. A coding sheet was developed to capture behavioral, outcome and RE-AIM indicators from program websites. In addition to targeting PA, five (42%) programs also targeted dietary habits and the most commonly reported outcome was change in body composition (58%). Programs reported an average of 11.1 (±3.9) RE-AIM indicator items (out of 27 total). On average, 45% reported reach indicators, 34% reported efficacy/effectiveness indicators, 60% reported adoption indicators, 40% reported implementation indicators, and 35% reported maintenance indicators. The proportion of RE-AIM indicators reported did not differ significantly for programs that were government supported (M = 10, SD = 3.1) and programs that were partially or wholly privately or corporately supported (M = 12.0, SD = 4.4). While reach and adoption of these programs were most commonly reported, there is a need for stronger evaluation of behavioral and health outcomes before the public health impact of these programs can be established.

  6. An in-depth analysis of theoretical frameworks for the study of care coordination

    Directory of Open Access Journals (Sweden)

    Sabine Van Houdt

    2013-06-01

    Introduction: Complex chronic conditions often require long-term care from various healthcare professionals. Thus, maintaining quality care requires care coordination. Concepts for the study of care coordination require clarification to develop, study, and evaluate coordination strategies. In 2007, the Agency for Healthcare Research and Quality defined care coordination and proposed five theoretical frameworks for exploring care coordination. This study aimed to update current theoretical frameworks and clarify key concepts related to care coordination. Methods: We performed a literature review to update existing theoretical frameworks. An in-depth analysis of these theoretical frameworks was conducted to formulate key concepts related to care coordination. Results: Our literature review found seven previously unidentified theoretical frameworks for studying care coordination. The in-depth analysis identified fourteen key concepts that the theoretical frameworks addressed. These were ‘external factors’, ‘structure’, ‘task characteristics’, ‘cultural factors’, ‘knowledge and technology’, ‘need for coordination’, ‘administrative operational processes’, ‘exchange of information’, ‘goals’, ‘roles’, ‘quality of relationship’, ‘patient outcome’, ‘team outcome’, and ‘(inter)organizational outcome’. Conclusion: These 14 interrelated key concepts provide a base to develop or choose a framework for studying care coordination. The relational coordination theory and the multi-level framework are of particular interest, as they are the most comprehensive.

  7. A novel joint analysis framework improves identification of differentially expressed genes in cross disease transcriptomic analysis

    Directory of Open Access Journals (Sweden)

    Wenyi Qin

    2018-02-01

    Motivation: Detecting differentially expressed (DE) genes between a disease group and a normal control group is one of the most common analyses of genome-wide transcriptomic data. Since most studies have few samples, researchers have used meta-analysis to pool different datasets for the same disease. Even then, in many cases the statistical power is still insufficient. Given that many diseases share the same disease genes, it is desirable to design a statistical framework that can identify diseases' common and specific DE genes simultaneously to improve identification power. Results: We developed a novel empirical-Bayes-based mixture model to identify DE genes in a specific study by leveraging the shared information across multiple disease expression data sets. The effectiveness of joint analysis was demonstrated through comprehensive simulation studies and two real data applications. The simulation results showed that our method consistently outperformed single-data-set analysis and two other meta-analysis methods in identification power. In real data analysis, our method demonstrated better identification power in detecting DE genes and prioritized more disease-related genes and disease-related pathways than single-data-set analysis. Over 150% more disease-related genes are identified by our method in application to Huntington's disease. We expect that our method will provide researchers a new way of utilizing available data sets from different diseases when the sample size of the focused disease is limited.
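    A bare-bones version of the two-group empirical Bayes idea can be sketched on gene-level z-scores. This is a much-simplified stand-in for the paper's model, assuming a single data set, unit variances, and invented names: each z is null N(0,1) with probability 1-pi or DE N(mu,1) with probability pi, and EM estimates pi, mu, and the per-gene posterior probability of DE used to rank genes.

```python
# EM for a two-component mixture of z-scores: null N(0,1) vs DE N(mu,1).
import math

def norm_pdf(z, mu):
    return math.exp(-0.5 * (z - mu) ** 2) / math.sqrt(2 * math.pi)

def em_two_groups(zs, iters=100):
    pi, mu = 0.5, 1.0  # initial guesses for DE fraction and DE mean
    for _ in range(iters):
        # E-step: posterior probability that each z-score is DE
        post = [pi * norm_pdf(z, mu) /
                (pi * norm_pdf(z, mu) + (1 - pi) * norm_pdf(z, 0.0))
                for z in zs]
        # M-step: update mixing weight and DE mean
        pi = sum(post) / len(zs)
        mu = sum(p * z for p, z in zip(post, zs)) / max(sum(post), 1e-12)
    return pi, mu, post

zs = [0.1, -0.3, 0.2, 3.1, 2.8, -0.1, 3.3]
pi, mu, post = em_two_groups(zs)
```

    The paper's contribution is to share the DE component across multiple diseases' data sets, so that genes DE in several diseases borrow strength from one another; the single-data-set mixture above is only the starting point.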

  8. Static Analysis of Lockless Microcontroller C Programs

    Directory of Open Access Journals (Sweden)

    Eva Beckschulze

    2012-11-01

    Concurrently accessing shared data without locking is usually subject to race conditions that result in inconsistent or corrupted data. However, there are programs that operate correctly without locking by exploiting the atomicity of certain operations on specific hardware. In this paper, we describe how to precisely analyze lockless microcontroller C programs with interrupts by taking the hardware architecture into account. We evaluate this technique in an octagon-based value range analysis using access-based localization to increase efficiency.
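    Value-range analysis can be illustrated with the interval domain, a simpler relative of the octagon domain used in the paper: each variable is mapped to a conservative [lo, hi] range, and assignments transfer ranges instead of concrete values. The code below is a hypothetical sketch, not the authors' analyzer.

```python
# Miniature interval analysis for straight-line code.

def interval_add(a, b):
    """[a0, a1] + [b0, b1] = [a0 + b0, a1 + b1]."""
    return (a[0] + b[0], a[1] + b[1])

def interval_mul_const(a, c):
    """Multiply an interval by a constant (sign of c may flip the bounds)."""
    lo, hi = a[0] * c, a[1] * c
    return (min(lo, hi), max(lo, hi))

# toy program:   x = input in [0, 10];  y = 2*x;  z = y + x
env = {"x": (0, 10)}
env["y"] = interval_mul_const(env["x"], 2)
env["z"] = interval_add(env["y"], env["x"])
```

    An octagon domain additionally tracks relations of the form ±x ± y <= c between pairs of variables, which is what lets the paper's analysis stay precise across interrupt boundaries.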

  9. Value Frameworks in Oncology: Comparative Analysis and Implications to the Pharmaceutical Industry.

    Science.gov (United States)

    Slomiany, Mark; Madhavan, Priya; Kuehn, Michael; Richardson, Sasha

    2017-07-01

    As the cost of oncology care continues to rise, composite value models that variably capture the diverse concerns of patients, physicians, payers, policymakers, and the pharmaceutical industry have begun to take shape. To review the capabilities and limitations of 5 of the most notable value frameworks in oncology that have emerged in recent years and to compare their relative value and application among the intended stakeholders. We compared the methodology of the American Society of Clinical Oncology (ASCO) Value Framework (version 2.0), the National Comprehensive Cancer Network Evidence Blocks, Memorial Sloan Kettering Cancer Center DrugAbacus, the Institute for Clinical and Economic Review Value Assessment Framework, and the European Society for Medical Oncology Magnitude of Clinical Benefit Scale, using a side-by-side comparative approach in terms of the input, scoring methodology, and output of each framework. In addition, we gleaned stakeholder insights about these frameworks and their potential real-world applications through dialogues with physicians and payers, as well as through secondary research and an aggregate analysis of previously published survey results. The analysis identified several framework-specific themes in their respective focus on clinical trial elements, breadth of evidence, evidence weighting, scoring methodology, and value to stakeholders. Our dialogues with physicians and our aggregate analysis of previous surveys revealed a varying level of awareness of, and use of, each of the value frameworks in clinical practice. For example, although the ASCO Value Framework appears nascent in clinical practice, physicians believe that the frameworks will be more useful in practice in the future as they become more established and as their outputs are more widely accepted. 
Along with patients and payers, who bear the burden of treatment costs, physicians and policymakers have waded into the discussion of defining value in oncology care, as well

  10. The Use of the Data-to-Action Framework in the Evaluation of CDC's DELTA FOCUS Program.

    Science.gov (United States)

    Armstead, Theresa L; Kearns, Megan; Rambo, Kirsten; Estefan, Lianne Fuino; Dills, Jenny; Rivera, Moira S; El-Beshti, Rasha

    The Centers for Disease Control and Prevention's (CDC's) Domestic Violence Prevention Enhancements and Leadership Through Alliances, Focusing on Outcomes for Communities United with States (DELTA FOCUS) program is a 5-year cooperative agreement (2013-2018) funding 10 state domestic violence coalitions and local coordinated community response teams to engage in primary prevention of intimate partner violence. Grantees' prevention strategies were often developmental and emergent; therefore, CDC's approach to program oversight, administration, and support to grantees required flexibility. CDC staff adopted a Data-to-Action Framework for the DELTA FOCUS program evaluation that supported a culture of learning to meet dynamic and unexpected information needs. Briefly, a Data-to-Action Framework involves the collection and use of information in real time for program improvement. Utilizing this framework, the DELTA FOCUS data-to-action process yielded important insights into CDC's ongoing technical assistance, improved program accountability by providing useful materials and information for internal agency leadership, and helped build a learning community among grantees. CDC and other funders, as decision makers, can promote data-informed program improvements by incorporating internal processes supportive of ongoing data collection and review.

  11. CREATION OF IT-ORIENTED ONTOLOGICAL FRAMEWORK FOR THE PURPOSE OF MAKING EDUCATIONAL PROGRAMS ON THE BASE OF COMPETENCIES

    Directory of Open Access Journals (Sweden)

    G. M. Korotenko

    2017-08-01

    Full Text Available Purpose. Given the expanding scope of computing applications, there is a need to identify the links and features of the constantly emerging professional competencies in new sections of computing knowledge, in order to improve the process of forming new curricula. Methodology. The authors propose a new approach aimed at building specialized knowledge bases generated using artificial intelligence technology and focused on the use of multiple heterogeneous resources or data sources on specific educational topics. As the tool ensuring the formation of the base ontology, the Protégé 4.2 ontology editor is used. As one of the modules of the developed semantic analysis system, which provides access to the ontology and the possibility of its processing, the Apache Jena Java framework is used; it forms the software environment for working with data in RDF, RDFS and OWL formats, and supports querying ontologies in the SPARQL language. The peculiarity of this approach is the binding of information resources of the three-platform presentation of the disciplinary structure in the context of identifying the links between professional competencies. Findings. The model and structure of the IT-oriented ontological framework, designed to ensure the convergence of components of the university's three-platform information and communication environment, are developed. The structure of the ontology underlying the knowledge base, describing the main essence of the educational standards of the "Information Technologies" branch, is formed. Originality. Within the framework of designing the disciplinary structure of the "Information Technologies" knowledge sector in the context of the competence approach to education, the architecture of the competence descriptors of the semantic analysis system is proposed. It implements an algorithm for integrating the ontological and production models of knowledge representation about the subject domain.
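
The pipeline the abstract describes (an OWL ontology authored in Protégé, accessed through Apache Jena and queried in SPARQL) can be illustrated in miniature without the Java stack. The sketch below is a pure-Python triple store with a SPARQL-like pattern matcher; the competency and discipline names are invented for illustration and are not taken from the paper.

```python
# Minimal stand-in for an RDF triple store plus a SPARQL-like pattern
# query; all subject/predicate/object names here are hypothetical.
triples = {
    ("prog_fundamentals", "develops", "algorithmic_thinking"),
    ("prog_fundamentals", "partOf", "information_technologies"),
    ("data_structures", "develops", "algorithmic_thinking"),
    ("data_structures", "partOf", "information_technologies"),
    ("web_design", "develops", "ui_design"),
}

def query(pattern, store):
    """Return variable bindings for a single triple pattern.

    Variables are strings starting with '?', as in SPARQL."""
    s, p, o = pattern
    results = []
    for ts, tp, to in sorted(store):
        binding = {}
        ok = True
        for var, val in ((s, ts), (p, tp), (o, to)):
            if var.startswith("?"):
                binding[var] = val
            elif var != val:
                ok = False
                break
        if ok:
            results.append(binding)
    return results

# Which disciplines develop the competency 'algorithmic_thinking'?
rows = query(("?d", "develops", "algorithmic_thinking"), triples)
print(sorted(b["?d"] for b in rows))  # → ['data_structures', 'prog_fundamentals']
```

A real implementation would hand the same pattern to Jena's SPARQL engine over the RDF/OWL ontology; the toy matcher only mirrors the shape of such a query.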

  12. Using Campinha-Bacote's Framework to Examine Cultural Competence from an Interdisciplinary International Service Learning Program

    Science.gov (United States)

    Wall-Bassett, Elizabeth DeVane; Hegde, Archana Vasudeva; Craft, Katelyn; Oberlin, Amber Louise

    2018-01-01

    The purpose of this study was to investigate an interdisciplinary international service learning program and its impact on students' sense of cultural awareness and competence, using Campinha-Bacote's (2002) model of cultural competence. Seven undergraduate students and one graduate student from Human Development and Nutrition Science…

  13. Design and implementation of the reconstruction software for the photon multiplicity detector in object oriented programming framework

    International Nuclear Information System (INIS)

    Chattopadhayay, Subhasis; Ghosh, Premomoy; Gupta, R.; Mishra, D.; Phatak, S.C.; Sood, G.

    2002-01-01

    The high-granularity photon multiplicity detector (PMD) is scheduled to take data at the Relativistic Heavy Ion Collider (RHIC) this year. A detailed scheme has been designed and implemented in an object oriented programming framework using C++ for the monitoring and reconstruction of PMD data.

  14. A Framework for the Cognitive Task Analysis in Systems Design

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task.

  15. The Macroeconomic Framework of Support Analysis for Sustainable Businesses Development

    Directory of Open Access Journals (Sweden)

    Constantin Mitrut

    2015-08-01

    Full Text Available The state of satisfaction of an economy results from the quality of the economic products it produces and consumes, in agreement with assuring environmental protection, as a source of producing present and future economic goods, and with intensive use of human capital, as a source of innovation growth. Knowledge transfer happens in a sustainable economy, whose principles are the rational use of resources, the limiting of waste, and protection, so that future generations also have access to resources. The present research is based on a multifactorial linear regression model which outlines the direct correlation between the dependent variable, welfare, and the independent variables of wealth concentration, measured by the Gini coefficient, on the one hand, and the GDP level, on the other hand, for the year 2012. The aim of this research is to identify the correlation between the indicator of life satisfaction, or the welfare function, at the EU level in 2012, and the assurance of a macroeconomic framework for sustainable business development.
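
The multifactorial linear regression the abstract relies on (welfare regressed on wealth concentration and GDP) reduces to ordinary least squares with two regressors. The sketch below solves the normal equations in pure Python on invented data, since the paper's 2012 Eurostat inputs are not reproduced in the record.

```python
def ols(y, x1, x2):
    """OLS fit of y = b0 + b1*x1 + b2*x2 via the normal equations X'Xb = X'y."""
    n = len(y)
    X = [[1.0, x1[i], x2[i]] for i in range(n)]
    # Build X'X and X'y.
    xtx = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(3)] for r in range(3)]
    xty = [sum(X[i][r] * y[i] for i in range(n)) for r in range(3)]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, 3):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, 3):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    # Back substitution.
    beta = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c] for c in range(r + 1, 3))) / xtx[r][r]
    return beta

# Synthetic check: welfare = 5 - 2*gini + 0.5*gdp (exact, no noise).
gini = [0.25, 0.30, 0.28, 0.35, 0.27]
gdp = [30.0, 25.0, 40.0, 22.0, 35.0]
welfare = [5 - 2 * g + 0.5 * d for g, d in zip(gini, gdp)]
b0, b1, b2 = ols(welfare, gini, gdp)
print(round(b0, 3), round(b1, 3), round(b2, 3))  # → 5.0 -2.0 0.5
```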

  16. Drainage network extraction from a high-resolution DEM using parallel programming in the .NET Framework

    Science.gov (United States)

    Du, Chao; Ye, Aizhong; Gan, Yanjun; You, Jinjun; Duan, Qinyun; Ma, Feng; Hou, Jingwen

    2017-12-01

    High-resolution Digital Elevation Models (DEMs) can be used to extract high-accuracy drainage networks. A higher resolution, however, means a larger number of grid cells, and as the number of cells increases, flow direction determination requires substantial computer resources and computing time. Parallel computing is a feasible way to resolve this problem. In this paper, we propose a parallel programming method within the .NET Framework with a C# compiler in a Windows environment. The basin is divided into sub-basins, and the different sub-basins then calculate flow directions concurrently on multiple threads. The method was applied to calculate the flow directions of the Yellow River basin from the 3 arc-second resolution SRTM DEM. Drainage networks were extracted and compared with the HydroSHEDS river network to assess their accuracy. The results demonstrate that this method can calculate flow directions from high-resolution DEMs efficiently and extract high-precision continuous drainage networks.
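
The parallel scheme described (partition the basin, compute flow directions for each partition concurrently) can be sketched with Python's concurrent.futures standing in for the authors' C#/.NET threading. The D8 rule below (route each cell to its steepest downslope neighbor) is the standard one; splitting the grid into row blocks is a simplification of their sub-basin decomposition, and the tiny DEM is invented.

```python
from concurrent.futures import ThreadPoolExecutor

NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_cell(dem, r, c):
    """Steepest-descent (D8) direction for one cell, or None for a pit."""
    rows, cols = len(dem), len(dem[0])
    best, best_drop = None, 0.0
    for dr, dc in NEIGHBORS:
        nr, nc = r + dr, c + dc
        if 0 <= nr < rows and 0 <= nc < cols:
            drop = dem[r][c] - dem[nr][nc]
            if drop > best_drop:
                best, best_drop = (dr, dc), drop
    return best

def flow_directions(dem, n_workers=4):
    """Compute D8 directions, one block of rows per worker thread."""
    def block(rows):
        return [(r, [d8_cell(dem, r, c) for c in range(len(dem[0]))]) for r in rows]
    chunks = [list(range(len(dem)))[i::n_workers] for i in range(n_workers)]
    out = [None] * len(dem)
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        for part in ex.map(block, chunks):
            for r, dirs in part:
                out[r] = dirs
    return out

# Tiny DEM sloping down toward the bottom-right corner.
dem = [[9.0, 8.0, 7.0],
       [8.0, 6.0, 5.0],
       [7.0, 5.0, 3.0]]
dirs = flow_directions(dem)
print(dirs[0][0], dirs[2][2])  # cell (0,0) drains toward (1,1); (2,2) is a pit
```

Threads (rather than processes) mirror the paper's shared-memory design; each worker reads the whole DEM but writes only its own rows, so no locking is needed.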

  17. Putting program evaluation to work: a framework for creating actionable knowledge for suicide prevention practice.

    Science.gov (United States)

    Wilkins, Natalie; Thigpen, Sally; Lockman, Jennifer; Mackin, Juliette; Madden, Mary; Perkins, Tamara; Schut, James; Van Regenmorter, Christina; Williams, Lygia; Donovan, John

    2013-06-01

    The economic and human cost of suicidal behavior to individuals, families, communities, and society makes suicide a serious public health concern, both in the US and around the world. As research and evaluation continue to identify strategies that have the potential to reduce or ultimately prevent suicidal behavior, the need for translating these findings into practice grows. The development of actionable knowledge is an emerging process for translating important research and evaluation findings into action to benefit practice settings. In an effort to apply evaluation findings to strengthen suicide prevention practice, the Centers for Disease Control and Prevention (CDC) and the Substance Abuse and Mental Health Services Administration (SAMHSA) supported the development of three actionable knowledge products that make key findings and lessons learned from youth suicide prevention program evaluations accessible and useable for action. This paper describes the actionable knowledge framework (adapted from the knowledge transfer literature), the three products that resulted, and recommendations for further research into this emerging method for translating research and evaluation findings and bridging the knowledge-action gap.

  18. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  19. Framework for Financial Ratio Analysis of Audited Federal Financial Reports

    Science.gov (United States)

    1999-12-01

    ...this period were conducted on the statistical validity of the ratio method in financial analysis. McDonald and Morris conducted a study on the... franchising operations, allowing them to lower costs and share administrative support services with other agencies. [Ref. 60: sec. 402-403] The GMRA also... Press, Washington, D.C., 1955). 21. McDonald, Bill and Morris, Michael H., "The Statistical Validity of the Ratio Method in Financial Analysis: An...

  20. AMAZON HADOOP FRAMEWORK USED IN BUSINESS FOR BIG DATA ANALYSIS

    OpenAIRE

    Ankush Verma*, Dr Neelesh Jain

    2017-01-01

    Amazon Elastic MapReduce (EMR) builds on the MapReduce programming model, a simple and efficient way of performing distributed computation over large data sets on the web, especially for e-commerce. Amazon EMR works on a master/slave architecture, using map and reduce steps to process big data. Amazon EC2, a central part of Amazon's cloud computing platform, is a web service that provides resizable compute capacity in the cloud. Here we also discuss the benefits and limitations of using Amazon EMR. Amazon S3 use eas...
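
The MapReduce model behind EMR is easiest to see in its canonical word-count form. The following single-process Python sketch shows the map, shuffle, and reduce phases; it is not the Amazon EMR API, just the programming model that EMR executes at scale across a cluster.

```python
from collections import defaultdict

def map_phase(doc):
    # Map: emit a (word, 1) pair for every word in the document.
    return [(word.lower(), 1) for word in doc.split()]

def shuffle(pairs):
    # Shuffle: group intermediate values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word.
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data on the web", "big compute in the cloud"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"], counts["the"])  # → 2 2
```

On a real cluster the map calls run on different slave nodes, the shuffle moves pairs across the network, and the reducers run in parallel per key range; the data flow is identical.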

  1. Energy pathway analysis - a hydrogen fuel cycle framework for system studies

    International Nuclear Information System (INIS)

    Badin, J.S.; Tagore, S.

    1997-01-01

    An analytical framework has been developed that can be used to estimate a range of life-cycle costs and impacts that result from the incremental production, storage, transport, and use of different fuels or energy carriers, such as hydrogen, electricity, natural gas, and gasoline. This information is used in a comparative analysis of energy pathways. The pathways provide the U.S. Department of Energy (DOE) with an indication of near-, mid-, and long-term technologies that have the greatest potential for advancement and can meet the cost goals. The methodology and conceptual issues are discussed. Also presented are results for selected pathways from the E3 (Energy, Economics, Emissions) Pathway Analysis Model. This model will be expanded to consider networks of pathways and to be compatible with a linear programming optimization processor. Scenarios and sets of constraints (energy demands, sources, emissions) will be defined so the effects on energy transformation activities included in the solution and on the total optimized system cost can be investigated. This evaluation will be used as a guide to eliminate technically feasible pathways if they are not cost effective or do not meet the threshold requirements for the market acceptance. (Author)
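
The pathway comparison described, choosing energy transformation chains that meet demand within cost and emissions constraints, can be sketched as a toy selection problem. The exhaustive search below stands in for the linear programming processor the paper plans to attach; all pathway names and figures are invented, not values from the E3 model.

```python
from itertools import combinations

# Hypothetical energy pathways: (name, deliverable energy, cost, emissions).
# The numbers are illustrative only.
PATHWAYS = [
    ("electrolysis_h2", 40, 12.0, 1.0),
    ("steam_reforming_h2", 60, 8.0, 6.0),
    ("natural_gas", 80, 6.0, 9.0),
    ("grid_electricity", 50, 7.0, 5.0),
]

def cheapest_mix(demand, emissions_cap):
    """Min-cost set of pathways meeting demand under an emissions cap.

    Returns (cost, [pathway names]) or None if infeasible."""
    best = None
    for k in range(1, len(PATHWAYS) + 1):
        for mix in combinations(PATHWAYS, k):
            energy = sum(p[1] for p in mix)
            cost = sum(p[2] for p in mix)
            emissions = sum(p[3] for p in mix)
            if energy >= demand and emissions <= emissions_cap:
                if best is None or cost < best[0]:
                    best = (cost, [p[0] for p in mix])
    return best

print(cheapest_mix(demand=90, emissions_cap=7.0))
```

An LP formulation would additionally allow fractional use of each pathway; the brute-force version above only decides inclusion, which is enough to show the demand/emissions trade-off.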

  2. A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.

    Science.gov (United States)

    Morag, Ido; Luria, Gil

    2013-01-01

    Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. It showed that managerial leadership was fundamental to the successful implementation of the analysis, that all job holders should participate in analysing their own workplace, and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework that provides practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.

  3. Needs Analysis and Course Design; A Framework for Designing Exam Courses

    Directory of Open Access Journals (Sweden)

    Reza Eshtehardi

    2017-09-01

    Full Text Available This paper introduces a framework for designing exam courses and highlights the importance of needs analysis in designing exam courses. The main objectives of this paper are to highlight the key role of needs analysis in designing exam courses, to offer a framework for designing exam courses, to show the language needs of different students for IELTS (International English Language Testing System exam, to offer an analysis of those needs and to explain how they will be taken into account for the design of the course. First, I will concentrate on some distinguishing features in exam classes, which make them different from general English classes. Secondly, I will introduce a framework for needs analysis and diagnostic testing and highlight the importance of needs analysis for the design of syllabus and language courses. Thirdly, I will describe significant features of syllabus design, course assessment, and evaluation procedures.

  4. EU Science Diplomacy and Framework Programs as Instruments of STI Cooperation

    Directory of Open Access Journals (Sweden)

    К. А. Ibragimova

    2017-01-01

    Full Text Available This article examines the tools that the EU uses in interactions with third countries in the field of STI. The EU is a pioneer in the use of science and technology in the international arena, the creation of strategic bilateral agreements on science and technology and the conduct of political dialogues at the highest political level (at the country and regional levels). The EU actively uses its foreign policy instruments of influence, including the provision of access to its framework programs for researchers from third countries, as well as science diplomacy. The success of these programs and of science diplomacy shows the effectiveness of the EU as a global actor. In its foreign policy global innovation strategy, the EU proceeds from the premise that no state in the world today can cope on its own with modern global challenges such as climate change, migration and terrorism. Solving these issues therefore requires both expert evaluation by an independent world scientific community and the perseverance of diplomats and officials of the branch ministries of national states, capable of conveying the views of their governments in international negotiations and defending the national interests of their countries to find a solution that suits everyone. The EU has the resources to create a "cumulative effect" by developing and applying common norms on the territory of the Union, analyzing the innovation policies of member states and enabling the sharing of best practices. At the same time, the EU shares its vision of problems, values and priorities with partners and uses the tools of "soft power" (including its smart and normative power) and science diplomacy in the field of STI. The soft power of the EU in the field of STI lies in the attractiveness of the EU as a research area in which it is possible to conduct modern high-quality international research with the involvement of scientific teams from different countries in both physical

  5. Conceptual Framework for Gentrification Analysis of Iskandar Malaysia

    Directory of Open Access Journals (Sweden)

    Rabiyatul Adawiyah Abd Khalil

    2015-05-01

    Full Text Available Gentrification is generally defined as the transformation of a working-class population living in the central city into a middle-upper-class society. It has both positive and negative consequences. Gentrification causes the loss of affordable homes; however, it is also beneficial because it rejuvenates the tax base and stimulates mixed incomes. The question arises whether the characteristics of gentrification in developing countries appear to be the same as, or vary from, those in developed countries. Given this growth in research, a review of the body of literature related to the mutation of gentrification, i.e. types of gentrification and their characteristics, is believed necessary. This will serve as a basis for a conceptual framework to analyze what is happening in Iskandar Malaysia (IM). As a globalized urbanization area, IM offers a particularly interesting case, as there are already signs of gentrification due to its rapid urbanization. In the residential market, house prices in IM show a rapid and continuous increase. Many foreigners are attracted to the new residential areas in IM, promoted as exclusive while promising a quality lifestyle. The locals meanwhile face difficulties in owning a home because of the upward spiral of house prices. In certain areas, the local low-income people are displaced by middle- and upper-income groups. The identification of such characteristics and the associated attributes, which is the second phase of the study, will determine to what extent IM is in the process of gentrification. The paper finally concludes that the signs of gentrification in IM are similar to those in other developing countries.

  6. Architecture of collaborating frameworks simulation, visualisation, user interface and analysis

    CERN Document Server

    Pfeier, A; Ferrero-Merlino, B; Giannitrapani, R; Longo, F; Nieminen, P; Pia, M G; Santin, G

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  7. Strategic Port Graph Rewriting: An Interactive Modelling and Analysis Framework

    Directory of Open Access Journals (Sweden)

    Maribel Fernández

    2014-07-01

    Full Text Available We present strategic port graph rewriting as a basis for the implementation of visual modelling and analysis tools. The goal is to facilitate the specification, analysis and simulation of complex systems, using port graphs. A system is represented by an initial graph and a collection of graph rewriting rules, together with a user-defined strategy to control the application of rules. The strategy language includes constructs to deal with graph traversal and management of rewriting positions in the graph. We give a small-step operational semantics for the language, and describe its implementation in the graph transformation and visualisation tool PORGY.
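
The setup described (an initial graph, a collection of rewrite rules, and a strategy controlling where rules apply) can be sketched far below PORGY's level of generality. The toy below treats a graph as a set of labeled edges, a rule as a match/replace pair of functions, and implements a single 'repeat' strategy; the edge labels are invented and ports are omitted entirely.

```python
def rewrite_once(edges, rule):
    """Apply `rule` (match_fn, replace_fn) at the first matching edge, if any."""
    match, replace = rule
    for edge in sorted(edges):
        if match(edge):
            return (edges - {edge}) | set(replace(edge)), True
    return edges, False

def repeat(edges, rule):
    """Strategy 'repeat': apply the rule until no position matches."""
    changed = True
    while changed:
        edges, changed = rewrite_once(edges, rule)
    return edges

# Toy system: promote every 'draft' edge to a 'final' edge.
graph = {("a", "draft", "b"), ("b", "draft", "c"), ("a", "final", "c")}
rule = (lambda e: e[1] == "draft",
        lambda e: [(e[0], "final", e[2])])
print(sorted(repeat(graph, rule)))
```

Each application removes one 'draft' edge, so the 'repeat' strategy terminates; in PORGY the strategy language additionally controls traversal order and the set of admissible rewriting positions.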

  8. The social impacts of dams: A new framework for scholarly analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kirchherr, Julian, E-mail: julian.kirchherr@sant.ox.ac.uk; Charles, Katrina J., E-mail: katrina.charles@ouce.ox.ac.uk

    2016-09-15

    No commonly used framework exists in the scholarly study of the social impacts of dams. This hinders comparisons of analyses and thus the accumulation of knowledge. The aim of this paper is to unify scholarly understanding of dams' social impacts via the analysis and aggregation of the various frameworks currently used in the scholarly literature. For this purpose, we have systematically analyzed and aggregated 27 frameworks employed by academics analyzing dams' social impacts (found in a set of 217 articles). A key finding of the analysis is that currently used frameworks are often not specific to dams and thus omit key impacts associated with them. The result of our analysis and aggregation is a new framework for scholarly analysis (which we call ‘matrix framework’) specifically on dams' social impacts, with space, time and value as its key dimensions as well as infrastructure, community and livelihood as its key components. Building on the scholarly understanding of this topic enables us to conceptualize the inherently complex and multidimensional issues of dams' social impacts in a holistic manner. If commonly employed in academia (and possibly in practice), this framework would enable more transparent assessment and comparison of projects.

  9. The social impacts of dams: A new framework for scholarly analysis

    International Nuclear Information System (INIS)

    Kirchherr, Julian; Charles, Katrina J.

    2016-01-01

    No commonly used framework exists in the scholarly study of the social impacts of dams. This hinders comparisons of analyses and thus the accumulation of knowledge. The aim of this paper is to unify scholarly understanding of dams' social impacts via the analysis and aggregation of the various frameworks currently used in the scholarly literature. For this purpose, we have systematically analyzed and aggregated 27 frameworks employed by academics analyzing dams' social impacts (found in a set of 217 articles). A key finding of the analysis is that currently used frameworks are often not specific to dams and thus omit key impacts associated with them. The result of our analysis and aggregation is a new framework for scholarly analysis (which we call ‘matrix framework’) specifically on dams' social impacts, with space, time and value as its key dimensions as well as infrastructure, community and livelihood as its key components. Building on the scholarly understanding of this topic enables us to conceptualize the inherently complex and multidimensional issues of dams' social impacts in a holistic manner. If commonly employed in academia (and possibly in practice), this framework would enable more transparent assessment and comparison of projects.

  10. Developing a meaningful QA trend analysis program

    International Nuclear Information System (INIS)

    Sternberg, A.

    1987-01-01

    A trend analysis program is being developed by the nuclear quality assurance (NQA) department at Public Service Electric and Gas Company, adapted from the principles advocated by W. Edwards Deming using statistical process control methods. It deals with identifying performance indicators that monitor the activities of a process, considering both inputs and outputs; determining whether the process is stable or unstable; taking actions accordingly; and continuing to monitor the process with the objective of continual improvement of quality.

  11. Framework for adaptive multiscale analysis of nonhomogeneous point processes.

    Science.gov (United States)

    Helgason, Hannes; Bartroff, Jay; Abry, Patrice

    2011-01-01

    We develop the methodology for hypothesis testing and model selection in nonhomogeneous Poisson processes, with an eye toward the application of modeling and variability detection in heart beat data. Modeling the process's non-constant rate function using templates of simple basis functions, we develop the generalized likelihood ratio statistic for a given template and a multiple testing scheme to model-select from a family of templates. A dynamic programming algorithm inspired by network flows is used to compute the maximum likelihood template in a multiscale manner. In a numerical example, the proposed procedure is nearly as powerful as the super-optimal procedures that know the true template size and true partition, respectively. Extensions to general history-dependent point processes are discussed.
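
For the simplest template, a two-segment piecewise-constant rate tested against a constant-rate null, the generalized likelihood ratio statistic the abstract mentions can be written out directly. The sketch below works on event counts in unit-time bins; the bin values are invented, and the full method's multiscale template search and multiple-testing correction are omitted.

```python
from math import log

def poisson_ll(counts, rate):
    """Poisson log-likelihood over unit-time bins, dropping the log(k!)
    term, which cancels in likelihood ratios."""
    if rate == 0:
        return 0.0 if sum(counts) == 0 else float("-inf")
    return sum(k * log(rate) for k in counts) - rate * len(counts)

def glr_two_segment(counts, split):
    """2 * log-likelihood ratio: a two-rate template with a break at
    `split` versus a single constant rate, each at its MLE."""
    left, right = counts[:split], counts[split:]
    lam0 = sum(counts) / len(counts)
    lam1 = sum(left) / len(left)
    lam2 = sum(right) / len(right)
    ll_alt = poisson_ll(left, lam1) + poisson_ll(right, lam2)
    ll_null = poisson_ll(counts, lam0)
    return 2.0 * (ll_alt - ll_null)

# Rate roughly doubles halfway through: the statistic is clearly positive.
counts = [2, 3, 2, 3, 6, 5, 7, 6]
print(round(glr_two_segment(counts, 4), 2))
```

Under the null, this statistic is asymptotically chi-squared with one degree of freedom, which is what the paper's multiple-testing scheme calibrates against across a family of templates.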

  12. A FRAMEWORK ANALYSIS OF EUROPEAN LABOUR MARKET POLICIES

    Directory of Open Access Journals (Sweden)

    Graţiela Georgiana Carica

    2011-03-01

    Full Text Available The purpose of the paper is to analyse European labour market policies and their integrated guidelines, by highlighting various measures that need to be adopted in order to increase labour productivity, with positive effects on long-term economic development. The paper systematizes the main conditions to be met by structural reforms in order to encourage employment, and the policies that frame a more efficient unemployment insurance system, crucial for increasing security while encouraging the unemployed to look for and accept a job offer, as well as flexicurity policies. We found that employment rates are generally associated with large expenditures on labour market policies and with an increased number of participants in programs developed within these types of policies. The degree of influence and the strong dependence between outcomes and labour market policies are illustrated in various ways and discussed within the paper.

  13. A framework for biodynamic feedthrough analysis--part I: theoretical foundations.

    Science.gov (United States)

    Venrooij, Joost; van Paassen, Marinus M; Mulder, Mark; Abbink, David A; Mulder, Max; van der Helm, Frans C T; Bulthoff, Heinrich H

    2014-09-01

    Biodynamic feedthrough (BDFT) is a complex phenomenon, which has been studied for several decades. However, there is little consensus on how to approach the BDFT problem in terms of definitions, nomenclature, and mathematical descriptions. In this paper, a framework for biodynamic feedthrough analysis is presented. The goal of this framework is two-fold. First, it provides some common ground between the seemingly large range of different approaches existing in the BDFT literature. Second, the framework itself allows for gaining new insights into BDFT phenomena. It will be shown how relevant signals can be obtained from measurement, how different BDFT dynamics can be derived from them, and how these different dynamics are related. Using the framework, BDFT can be dissected into several dynamical relationships, each relevant in understanding BDFT phenomena in more detail. The presentation of the BDFT framework is divided into two parts. This paper, Part I, addresses the theoretical foundations of the framework. Part II, which is also published in this issue, addresses the validation of the framework. The work is presented in two separate papers to allow for a detailed discussion of both the framework's theoretical background and its validation.

  14. Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis

    Science.gov (United States)

    Young, Cristobal; Holsteen, Katherine

    2017-01-01

    Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
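
The "modeling distribution of estimates" idea can be sketched by re-fitting the focal regression under every subset of control variables and collecting the focal coefficient. The pure-Python OLS and the data below are illustrative stand-ins for the authors' statistical software, not its actual interface.

```python
from itertools import combinations

def ols(y, cols):
    """OLS coefficients for y on an intercept plus the given regressor columns."""
    n, k = len(y), len(cols) + 1
    X = [[1.0] + [col[i] for col in cols] for i in range(n)]
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)] for r in range(k)]
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(k)]
    for col in range(k):  # Gaussian elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

def multimodel(y, focal, controls):
    """Focal coefficient under every subset of the control variables."""
    estimates = []
    for k in range(len(controls) + 1):
        for subset in combinations(controls, k):
            estimates.append(ols(y, [focal] + list(subset))[1])
    return estimates

# Invented data: y depends on the focal regressor and on control c1 only.
focal = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
c1 = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
c2 = [2.0, 1.0, 4.0, 3.0, 7.0, 5.0]
y = [2 * f + 1.5 * a for f, a in zip(focal, c1)]
est = multimodel(y, focal, [c1, c2])
print([round(e, 2) for e in est])  # 4 models: {}, {c1}, {c2}, {c1, c2}
```

The spread of `est` across specifications is exactly the robustness question the abstract poses: here the estimate is stable near 2 whenever the relevant control c1 is included, and drifts when it is omitted.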

  15. Value Chain Analysis: A Framework for Management of Distance Education.

    Science.gov (United States)

    Woudstra, Andrew; Powell, Richard

    1989-01-01

    Discussion of the benefits of value chain analysis in the management of distance education organizations focuses on an example at Athabasca University. The effects of policies and decisions on the organization and its value system are considered, cost drivers for activities are described, and a future-oriented perspective is emphasized. (14…

  16. Continuous quality improvement in a Maltese hospital using logical framework analysis.

    Science.gov (United States)

    Buttigieg, Sandra C; Gauci, Dorothy; Dey, Prasanta

    2016-10-10

    Purpose The purpose of this paper is to present the application of logical framework analysis (LFA) for implementing continuous quality improvement (CQI) across multiple settings in a tertiary care hospital. Design/methodology/approach This study adopts a multiple case study approach. LFA is implemented within three diverse settings, namely, intensive care unit, surgical ward, and acute in-patient psychiatric ward. First, problem trees are developed in order to determine the root causes of quality issues, specific to the three settings. Second, objective trees are formed suggesting solutions to the quality issues. Third, project plan template using logical framework (LOGFRAME) is created for each setting. Findings This study shows substantial improvement in quality across the three settings. LFA proved to be effective to analyse quality issues and suggest improvement measures objectively. Research limitations/implications This paper applies LFA in specific, albeit, diverse settings in one hospital. For validation purposes, it would be ideal to analyse in other settings within the same hospital, as well as in several hospitals. It also adopts a bottom-up approach when this can be triangulated with other sources of data. Practical implications LFA enables top management to obtain an integrated view of performance. It also provides a basis for further quantitative research on quality management through the identification of key performance indicators and facilitates the development of a business case for improvement. Originality/value LFA is a novel approach for the implementation of CQI programs. Although LFA has been used extensively for project development to source funds from development banks, its application in quality improvement within healthcare projects is scant.

  17. A framework for smartphone-enabled, patient-generated health data analysis

    Directory of Open Access Journals (Sweden)

    Shreya S. Gollamudi

    2016-08-01

    Full Text Available Background: Digital medicine and smartphone-enabled health technologies provide a novel source of human health and human biology data. However, in part due to its intricacies, few methods have been established to analyze and interpret data in this domain. We previously conducted a six-month interventional trial examining the efficacy of a comprehensive smartphone-based health monitoring program for individuals with chronic disease. This included 38 individuals with hypertension who recorded 6,290 blood pressure readings over the trial. Methods: In the present study, we provide a hypothesis testing framework for unstructured time series data, typical of patient-generated mobile device data. We used a mixed model approach for unequally spaced repeated measures using autoregressive and generalized autoregressive models, and applied this to the blood pressure data generated in this trial. Results: We were able to detect, roughly, a 2 mmHg decrease in both systolic and diastolic blood pressure over the course of the trial despite considerable intra- and inter-individual variation. Furthermore, by supplementing this finding by using a sequential analysis approach, we observed this result over three months prior to the official study end—highlighting the effectiveness of leveraging the digital nature of this data source to form timely conclusions. Conclusions: Health data generated through the use of smartphones and other mobile devices allow individuals the opportunity to make informed health decisions, and provide researchers the opportunity to address innovative health and biology questions. The hypothesis testing framework we present can be applied in future studies utilizing digital medicine technology or implemented in the technology itself to support the quantified self.
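
Setting aside the autoregressive mixed model, the headline quantity (roughly a 2 mmHg decline over the trial) is at bottom a trend over unequally spaced timestamps. The sketch below fits a plain least-squares slope to invented readings for a single subject; it deliberately ignores the within-subject correlation the paper's mixed model accounts for.

```python
def trend_slope(times, values):
    """Least-squares slope of values against (unequally spaced) times."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    sxx = sum((t - mt) ** 2 for t in times)
    sxy = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    return sxy / sxx

# Invented readings: systolic BP drifting down ~2 mmHg over 180 days,
# measured at irregular intervals (days since enrollment).
days = [0, 3, 11, 25, 40, 70, 95, 130, 160, 180]
systolic = [138.0, 137.5, 138.2, 137.0, 137.4, 136.8, 136.9, 136.2, 136.4, 136.0]
slope = trend_slope(days, systolic)
print(round(slope * 180, 1))  # total change over the 180-day trial, in mmHg → -1.8
```

Because the readings are unequally spaced and autocorrelated, the paper's mixed-model standard errors would be wider than the naive OLS ones; the point estimate of the trend is nonetheless of this simple form.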

  18. Structural Analysis of Kufasat Using Ansys Program

    Science.gov (United States)

    Al-Maliky, Firas T.; AlBermani, Mohamed J.

    2018-03-01

The current work focuses on vibration and modal analysis of the KufaSat structure using the ANSYS 16 program. Three aluminum alloys (5052-H32, 6061-T6, and 7075-T6) were selected for investigation of the structure under design loads. Finite element analysis (FEA) at a design static load of 51 g was performed. The natural frequencies for five modes were estimated using modal analysis. In order to ensure that KufaSat could withstand various conditions during launch, the margin of safety was calculated. Deformation and von Mises stress results for a linear buckling analysis were also obtained. The data were compared in order to select the optimum material for the KufaSat structure.

  19. Muon g-2 Reconstruction and Analysis Framework for the Muon Anomalous Precession Frequency

    Energy Technology Data Exchange (ETDEWEB)

    Khaw, Kim Siang [Washington U., Seattle

    2017-10-21

The Muon g-2 experiment at Fermilab, which aims to measure the muon anomalous magnetic moment to an unprecedented level of 140 ppb, started beam and detector commissioning in summer 2017. To deal with incoming data projected to reach tens of petabytes, a robust data reconstruction and analysis chain based on Fermilab's art event-processing framework has been developed. Herein, I report the current status of the framework, together with its novel features, such as multi-threaded algorithms for the online data quality monitor (DQM) and fast-turnaround operation (nearline). Performance of the framework during the commissioning run is also discussed.

  20. 75 FR 24824 - Energy Efficiency Program for Consumer Products: Public Meeting and Availability of the Framework...

    Science.gov (United States)

    2010-05-06

... refrigeration equipment: Ice-cream freezers; self-contained commercial refrigerators, freezers, and... published a Rulemaking Framework for Commercial Refrigeration Equipment Including Ice-Cream Freezers; Self...

  1. Social Entrepreneurship: Framework for feasibility analysis of social business concepts

    OpenAIRE

    Groth, Ida Eikvåg; Magnussen, Line

    2011-01-01

ABSTRACT: PURPOSE: With the increased interest in social entrepreneurship demonstrated within business schools and academic environments, the adaptation of existing academic entrepreneurial constructs for social entrepreneurship applications becomes relevant. The purpose of this thesis is to develop additional tools for the traditional feasibility analysis. The tools will be specifically directed at technology-based concepts, due to the increased employment of technology-based products to solve soci...

  2. Using program logic model analysis to evaluate and better deliver what works

    International Nuclear Information System (INIS)

    Megdal, Lori; Engle, Victoria; Pakenas, Larry; Albert, Scott; Peters, Jane; Jordan, Gretchen

    2005-01-01

There is a rich history of using program theories and logic models (PT/LM) for evaluation, monitoring, and program refinement in a variety of fields, such as health care, social, and education programs. The use of these tools to evaluate and improve energy efficiency programs has been growing over the last 5-7 years. This paper provides an overview of the state-of-the-art methods of logic model development, with analysis that significantly contributed to: Assessing the logic behind how the program expects to meet its ultimate goals, including the 'who', the 'how', and through what mechanism; in doing so, gaps and questions that still need to be addressed can be identified. Identifying and prioritizing the indicators that should be measured to evaluate the program and program theory. Determining key researchable questions that need to be answered by evaluation/research to assess whether the mechanism assumed to cause the changes in actions, attitudes, behaviours, and business practices is workable and efficient, as well as the validity of the program logic and the likelihood that the program can accomplish its ultimate goals. Incorporating analysis of similar prior programs and social science theories in a framework to identify opportunities for potential program refinements. The paper provides an overview of the tools, techniques, and references, and uses as an example the energy efficiency program analysis conducted for the New York State Energy Research and Development Authority's (NYSERDA) New York ENERGY $MART SM programs

  3. Network analysis: An innovative framework for understanding eating disorder psychopathology.

    Science.gov (United States)

    Smith, Kathryn E; Crosby, Ross D; Wonderlich, Stephen A; Forbush, Kelsie T; Mason, Tyler B; Moessner, Markus

    2018-03-01

    Network theory and analysis is an emerging approach in psychopathology research that has received increasing attention across fields of study. In contrast to medical models or latent variable approaches, network theory suggests that psychiatric syndromes result from systems of causal and reciprocal symptom relationships. Despite the promise of this approach to elucidate key mechanisms contributing to the development and maintenance of eating disorders (EDs), thus far, few applications of network analysis have been tested in ED samples. We first present an overview of network theory, review the existing findings in the ED literature, and discuss the limitations of this literature to date. In particular, the reliance on cross-sectional designs, use of single-item self-reports of symptoms, and instability of results have raised concern about the inferences that can be made from network analyses. We outline several areas to address in future ED network analytic research, which include the use of prospective designs and adoption of multimodal assessment methods. Doing so will provide a clearer understanding of whether network analysis can enhance our current understanding of ED psychopathology and inform clinical interventions. © 2018 Wiley Periodicals, Inc.
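As a concrete illustration of the network approach described above, symptom networks are often estimated as Gaussian graphical models, where edges are partial correlations obtained from the inverse covariance (precision) matrix. The three-"symptom" covariance below is a constructed toy example (a causal chain A → B → C), not data from any ED study:

```python
import numpy as np

# Covariance for a symptom chain A -> B -> C (e.g., restriction -> hunger -> binge
# urge, hypothetically): A and C are marginally correlated, but only through B.
Sigma = np.array([[1., 1., 1.],
                  [1., 2., 2.],
                  [1., 2., 3.]])

P = np.linalg.inv(Sigma)             # precision matrix
d = np.sqrt(np.diag(P))
pcorr = -P / np.outer(d, d)          # partial correlations = network edge weights
np.fill_diagonal(pcorr, 1.0)

print(np.round(pcorr, 3))
# The A-C edge vanishes: conditioning on B fully explains their association.
```

This is exactly the property network theorists exploit: a marginal correlation matrix would show all three symptoms connected, while the partial-correlation network recovers the sparser conditional structure.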

  4. Ovis: A framework for visual analysis of ocean forecast ensembles

    KAUST Repository

    Hollt, Thomas; Magdy, Ahmed; Zhan, Peng; Chen, Guoning; Gopalakrishnan, Ganesh; Hoteit, Ibrahim; Hansen, Charles D.; Hadwiger, Markus

    2014-01-01

We present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations of the sea surface height that is used in ocean forecasting. The position of eddies can be derived directly from the sea surface height and our visualization approach enables their interactive exploration and analysis. The behavior of eddies is important in different application settings, of which we present two in this paper. First, we show an application for interactive planning of placement as well as operation of off-shore structures using real-world ensemble simulation data of the Gulf of Mexico. Off-shore structures, such as those used for oil exploration, are vulnerable to hazards caused by eddies, and the oil and gas industry relies on ocean forecasts for efficient operations. We enable analysis of the spatial domain, as well as the temporal evolution, for planning the placement and operation of structures. Eddies are also important for marine life. They transport water over large distances and with it also heat and other physical properties as well as biological organisms. In the second application we present the usefulness of our tool, which could be used for planning the paths of autonomous underwater vehicles, so-called gliders, for marine scientists to study simulation data of the largely unexplored Red Sea. © 1995-2012 IEEE.

  5. Ovis: A Framework for Visual Analysis of Ocean Forecast Ensembles.

    Science.gov (United States)

    Höllt, Thomas; Magdy, Ahmed; Zhan, Peng; Chen, Guoning; Gopalakrishnan, Ganesh; Hoteit, Ibrahim; Hansen, Charles D; Hadwiger, Markus

    2014-08-01

We present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations of the sea surface height that is used in ocean forecasting. The position of eddies can be derived directly from the sea surface height and our visualization approach enables their interactive exploration and analysis. The behavior of eddies is important in different application settings, of which we present two in this paper. First, we show an application for interactive planning of placement as well as operation of off-shore structures using real-world ensemble simulation data of the Gulf of Mexico. Off-shore structures, such as those used for oil exploration, are vulnerable to hazards caused by eddies, and the oil and gas industry relies on ocean forecasts for efficient operations. We enable analysis of the spatial domain, as well as the temporal evolution, for planning the placement and operation of structures. Eddies are also important for marine life. They transport water over large distances and with it also heat and other physical properties as well as biological organisms. In the second application we present the usefulness of our tool, which could be used for planning the paths of autonomous underwater vehicles, so-called gliders, for marine scientists to study simulation data of the largely unexplored Red Sea.

  6. Ovis: A framework for visual analysis of ocean forecast ensembles

    KAUST Repository

    Hollt, Thomas

    2014-08-01

We present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations of the sea surface height that is used in ocean forecasting. The position of eddies can be derived directly from the sea surface height and our visualization approach enables their interactive exploration and analysis. The behavior of eddies is important in different application settings, of which we present two in this paper. First, we show an application for interactive planning of placement as well as operation of off-shore structures using real-world ensemble simulation data of the Gulf of Mexico. Off-shore structures, such as those used for oil exploration, are vulnerable to hazards caused by eddies, and the oil and gas industry relies on ocean forecasts for efficient operations. We enable analysis of the spatial domain, as well as the temporal evolution, for planning the placement and operation of structures. Eddies are also important for marine life. They transport water over large distances and with it also heat and other physical properties as well as biological organisms. In the second application we present the usefulness of our tool, which could be used for planning the paths of autonomous underwater vehicles, so-called gliders, for marine scientists to study simulation data of the largely unexplored Red Sea. © 1995-2012 IEEE.

  7. Use of probabilistic methods for analysis of cost and duration uncertainties in a decision analysis framework

    International Nuclear Information System (INIS)

    Boak, D.M.; Painton, L.

    1995-01-01

    Probabilistic forecasting techniques have been used in many risk assessment and performance assessment applications on radioactive waste disposal projects such as Yucca Mountain and the Waste Isolation Pilot Plant (WIPP). Probabilistic techniques such as Monte Carlo and Latin Hypercube sampling methods are routinely used to treat uncertainties in physical parameters important in simulating radionuclide transport in a coupled geohydrologic system and assessing the ability of that system to comply with regulatory release limits. However, the use of probabilistic techniques in the treatment of uncertainties in the cost and duration of programmatic alternatives on risk and performance assessment projects is less common. Where significant uncertainties exist and where programmatic decisions must be made despite existing uncertainties, probabilistic techniques may yield important insights into decision options, especially when used in a decision analysis framework and when properly balanced with deterministic analyses. For relatively simple evaluations, these types of probabilistic evaluations can be made using personal computer-based software

  8. Accuracy of an efficient framework for structural analysis of wind turbine blades

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert D.; Fedorov, Vladimir

    2016-01-01

This paper presents a novel framework for the structural design and analysis of wind turbine blades and establishes its accuracy. The framework is based on a beam model composed of two parts: a 2D finite element-based cross-section analysis tool and a 3D beam finite element model. The cross-section analysis tool is able to capture the effects stemming from material anisotropy and inhomogeneity for sections of arbitrary geometry. The proposed framework is very efficient and therefore ideally suited for integration within wind turbine aeroelastic design and analysis tools. A number of benchmark examples are presented comparing the results from the proposed beam model to 3D shell and solid finite element models. The examples considered include a square prismatic beam, an entire wind turbine rotor blade and a detailed wind turbine blade cross section. Phenomena at both the blade length scale

  9. Proposed framework for the Western Area Power Administration Environmental Risk Management Program

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, C.S.; DiMassa, F.V.; Pelto, P.J.; Brothers, A.J. [Pacific Northwest Lab., Richland, WA (United States); Roybal, A.L. [Western Area Power Administration, Golden, CO (United States)

    1994-12-01

The Western Area Power Administration (Western) views environmental protection and compliance as a top priority as it manages the construction, operation, and maintenance of its vast network of transmission lines, substations, and other facilities. A recent Department of Energy audit of Western's environmental management activities recommends that Western adopt a formal environmental risk program. To accomplish this goal, Western, in conjunction with Pacific Northwest Laboratory, is in the process of developing a centrally coordinated environmental risk program. This report presents the results of this design effort, and indicates the direction in which Western's environmental risk program is heading. Western's environmental risk program will consist of three main components: risk communication, risk assessment, and risk management/decision making. Risk communication is defined as an exchange of information on the potential for threats to human health, public safety, or the environment. This information exchange provides a mechanism for public involvement, and also for the participation in the risk assessment and management process by diverse groups or offices within Western. The objective of risk assessment is to evaluate and rank the relative magnitude of risks associated with specific environmental issues that are facing Western. The evaluation and ranking is based on the best available scientific information and judgment and serves as input to the risk management process. Risk management takes risk information and combines it with relevant non-risk factors (e.g., legal mandates, public opinion, costs) to generate risk management options. A risk management tool, such as decision analysis, can be used to help make risk management choices.

  10. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  11. A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures

    Science.gov (United States)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2012-01-01

    A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.
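The homogenization step described above — relating constituent properties to an effective structural-scale property — can be illustrated, in a much simplified linear-elastic form, by the classical Voigt and Reuss bounds. The constituent moduli below are representative textbook values for a SiC fiber / Ti matrix system, not outputs of the ImMAC suite:

```python
# Illustrative constituent Young's moduli (GPa) and fiber volume fraction.
E_f, E_m = 400.0, 110.0
V_f = 0.35

E_voigt = V_f * E_f + (1 - V_f) * E_m           # isostrain (upper) bound
E_reuss = 1.0 / (V_f / E_f + (1 - V_f) / E_m)   # isostress (lower) bound

print(E_reuss, E_voigt)   # the true homogenized modulus lies between the bounds
```

Micromechanics codes like the one described refine this idea with detailed unit-cell geometry and nonlinear constituent behavior, but the principle — volume-average the constituents to get an effective stiffness, then hand it up to the structural model — is the same.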

  12. A threat analysis framework as applied to critical infrastructures in the Energy Sector.

    Energy Technology Data Exchange (ETDEWEB)

    Michalski, John T.; Duggan, David Patrick

    2007-09-01

    The need to protect national critical infrastructure has led to the development of a threat analysis framework. The threat analysis framework can be used to identify the elements required to quantify threats against critical infrastructure assets and provide a means of distributing actionable threat information to critical infrastructure entities for the protection of infrastructure assets. This document identifies and describes five key elements needed to perform a comprehensive analysis of threat: the identification of an adversary, the development of generic threat profiles, the identification of generic attack paths, the discovery of adversary intent, and the identification of mitigation strategies.

  13. The accident analysis in the framework of emergency provisions

    International Nuclear Information System (INIS)

    Tietze, A.

    1981-03-01

The first part of the report describes the demands on and bases of a reactor emergency plan and outlines the technical characteristics of a nuclear power plant with a light-water-moderated pressurized-water reactor, with special regard to reactor safety. In the second part, the failure and risk potentials of a pressurized-water plant are described and discussed. The third part presents the analytical method proper, according to the current state of technology. Finally, the current effectiveness of the reactor accident analysis method is critically discussed and perspectives for future development are pointed out. (orig.) [de

  14. Hanford Site Composite Analysis Technical Approach Description: Integrated Computational Framework.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, K. J. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-09-14

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.

  15. Optimizing an estuarine water quality monitoring program through an entropy-based hierarchical spatiotemporal Bayesian framework

    Science.gov (United States)

    Alameddine, Ibrahim; Karmakar, Subhankar; Qian, Song S.; Paerl, Hans W.; Reckhow, Kenneth H.

    2013-10-01

    The total maximum daily load program aims to monitor more than 40,000 standard violations in around 20,000 impaired water bodies across the United States. Given resource limitations, future monitoring efforts have to be hedged against the uncertainties in the monitored system, while taking into account existing knowledge. In that respect, we have developed a hierarchical spatiotemporal Bayesian model that can be used to optimize an existing monitoring network by retaining stations that provide the maximum amount of information, while identifying locations that would benefit from the addition of new stations. The model assumes the water quality parameters are adequately described by a joint matrix normal distribution. The adopted approach allows for a reduction in redundancies, while emphasizing information richness rather than data richness. The developed approach incorporates the concept of entropy to account for the associated uncertainties. Three different entropy-based criteria are adopted: total system entropy, chlorophyll-a standard violation entropy, and dissolved oxygen standard violation entropy. A multiple attribute decision making framework is adopted to integrate the competing design criteria and to generate a single optimal design. The approach is implemented on the water quality monitoring system of the Neuse River Estuary in North Carolina, USA. The model results indicate that the high priority monitoring areas identified by the total system entropy and the dissolved oxygen violation entropy criteria are largely coincident. The monitoring design based on the chlorophyll-a standard violation entropy proved to be less informative, given the low probabilities of violating the water quality standard in the estuary.
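The entropy-based station-selection idea can be sketched as greedy subset selection on a multivariate normal model, where each step keeps the station that adds the most joint (differential) entropy — so a station highly correlated with one already retained adds little and is dropped as redundant. The four-station covariance below is a constructed toy example, not Neuse River Estuary data:

```python
import numpy as np

# Toy covariance for four monitoring stations; stations 0 and 2 are nearly
# redundant (highly correlated), station 1 is independent and informative.
Sigma = np.array([[4.0, 0.0, 3.8, 0.0],
                  [0.0, 1.0, 0.0, 0.0],
                  [3.8, 0.0, 3.9, 0.0],
                  [0.0, 0.0, 0.0, 0.5]])

def joint_entropy(cov):
    """Differential entropy of a multivariate normal: 0.5 * ln((2*pi*e)^k |cov|)."""
    k = cov.shape[0]
    return 0.5 * (k * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

def greedy_design(cov, keep):
    """Retain `keep` stations, each step adding the one that adds most entropy."""
    chosen = []
    while len(chosen) < keep:
        rest = [i for i in range(cov.shape[0]) if i not in chosen]
        best = max(rest, key=lambda i: joint_entropy(cov[np.ix_(chosen + [i],
                                                                chosen + [i])]))
        chosen.append(best)
    return chosen

print(greedy_design(Sigma, 2))   # picks 0 then 1: station 2 is redundant given 0
```

This captures "information richness rather than data richness": station 2 has the second-largest variance, but conditioning on station 0 leaves it almost nothing new to say, so the design prefers the weaker but independent station 1.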

  16. Leveraging Data Analysis for Domain Experts: An Embeddable Framework for Basic Data Science Tasks

    Science.gov (United States)

    Lohrer, Johannes-Y.; Kaltenthaler, Daniel; Kröger, Peer

    2016-01-01

In this paper, we describe a framework for data analysis that can be embedded into a base application. Since it is important to analyze data directly inside the application where it is entered, a tool that allows scientists to easily work with their data supports and motivates further analysis of that data, which…

  17. A Comparative Analysis of PISA Scientific Literacy Framework in Finnish and Thai Science Curricula

    Science.gov (United States)

    Sothayapetch, Pavinee; Lavonen, Jari; Juuti, Kalle

    2013-01-01

    A curriculum is a master plan that regulates teaching and learning. This paper compares Finnish and Thai primary school level science curricula to the PISA 2006 Scientific Literacy Framework. Curriculum comparison was made following the procedure of deductive content analysis. In the analysis, there were four main categories adopted from PISA…

  18. Using a Mixed-Methods RE-AIM Framework to Evaluate Community Health Programs for Older Latinas.

    Science.gov (United States)

    Schwingel, Andiara; Gálvez, Patricia; Linares, Deborah; Sebastião, Emerson

    2017-06-01

    This study used the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework to evaluate a promotora-led community health program designed for Latinas ages 50 and older that sought to improve physical activity, nutrition, and stress management. A mixed-methods evaluation approach was administered at participant and organizational levels with a focus on the efficacy, adoption, implementation, and maintenance components of the RE-AIM theoretical model. The program was shown to be effective at improving participants' eating behaviors, increasing their physical activity levels, and lowering their depressive symptoms. Promotoras felt motivated and sufficiently prepared to deliver the program. Some implementation challenges were reported. More child care opportunities and an increased focus on mental well-being were suggested. The promotora delivery model has promise for program sustainability with both promotoras and participants alike expressing interest in leading future programs.

  19. Energy Analysis Program. 1992 Annual report

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

The Program became deeply involved in establishing a Washington, D.C., project office during the last few months of fiscal year 1992. This project office, which reports to the Energy & Environment Division, will receive the majority of its support from the Energy Analysis Program. We anticipate having two staff scientists and support personnel in offices within a few blocks of DOE. Our expectation is that this office will carry out a series of projects that are better managed closer to DOE. We also anticipate that our representation in Washington will improve, and we hope to expand the Program, its activities, and its impact in policy-relevant analyses. In spite of the growth that we have achieved, the Program continues to emphasize (1) energy efficiency of buildings, (2) appliance energy efficiency standards, (3) energy demand forecasting, (4) utility policy studies, especially integrated resource planning issues, and (5) international energy studies, with considerable emphasis on developing countries and economies in transition. These continuing interests are reflected in the articles that appear in this report.

  20. Hybrid Information Flow Analysis for Programs with Arrays

    Directory of Open Access Journals (Sweden)

    Gergö Barany

    2016-07-01

Full Text Available Information flow analysis checks whether certain pieces of (confidential) data may affect the results of computations in unwanted ways and thus leak information. Dynamic information flow analysis adds instrumentation code to the target software to track flows at run time and raise alarms if a flow policy is violated; hybrid analyses combine this with preliminary static analysis. Using a subset of C as the target language, we extend previous work on hybrid information flow analysis that handled pointers to scalars. Our extended formulation handles arrays, pointers to array elements, and pointer arithmetic. Information flow through arrays of pointers is tracked precisely while arrays of non-pointer types are summarized efficiently. A prototype of our approach is implemented using the Frama-C program analysis and transformation framework. Work on a full machine-checked proof of the correctness of our approach using Isabelle/HOL is well underway; we present the existing parts and sketch the rest of the correctness argument.
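The trade-off the abstract describes for arrays of non-pointer types — one "summary" taint for the whole array instead of a label per element — can be sketched in a few lines. The Python wrappers below are purely illustrative (the actual analysis instruments C code via Frama-C); they show how a weak update to the summary makes every subsequent read conservatively tainted:

```python
# A minimal dynamic taint-tracking sketch (illustrative only; names and
# structure are invented, not taken from the paper's formalization).

class Tainted:
    """A scalar value carrying a set of taint labels."""
    def __init__(self, value, labels=frozenset()):
        self.value, self.labels = value, frozenset(labels)

    def __add__(self, other):
        o_val = getattr(other, "value", other)
        o_lab = getattr(other, "labels", frozenset())
        return Tainted(self.value + o_val, self.labels | o_lab)  # labels propagate

class SummarizedArray:
    """Array of non-pointer values: one taint summary for the whole array."""
    def __init__(self, values):
        self.values = list(values)
        self.summary = frozenset()

    def __setitem__(self, i, v):
        self.values[i] = getattr(v, "value", v)
        self.summary |= getattr(v, "labels", frozenset())  # weak update

    def __getitem__(self, i):
        return Tainted(self.values[i], self.summary)       # every read is tainted

secret = Tainted(42, {"secret"})
arr = SummarizedArray([0, 0, 0])
arr[1] = secret + 1
leak = arr[0]          # index 0 was never written with secret data, but the
print(leak.labels)     # summary conservatively reports the 'secret' label
```

The summary is sound but imprecise: it may raise false alarms, which is exactly why the paper tracks arrays of pointers precisely while reserving summarization for the cheap scalar case.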

  1. Environmental Education Organizations and Programs in Texas: Identifying Patterns through a Database and Survey Approach for Establishing Frameworks for Assessment and Progress

    Science.gov (United States)

    Lloyd-Strovas, Jenny D.; Arsuffi, Thomas L.

    2016-01-01

    We examined the diversity of environmental education (EE) in Texas, USA, by developing a framework to assess EE organizations and programs at a large scale: the Environmental Education Database of Organizations and Programs (EEDOP). This framework consisted of the following characteristics: organization/visitor demographics, pedagogy/curriculum,…

  2. Toward an Evaluation Framework for Doctoral Education in Social Work: A 10-Year Retrospective of One PhD Program's Assessment Experiences

    Science.gov (United States)

    Bentley, Kia J.

    2013-01-01

    This article presents a framework for evaluation in social work doctoral education and details 10 years of successes and challenges in one PhD program's use of the framework, including planning and implementing specific assessment activities around student learning outcomes and larger program goals. The article argues that a range of…

  3. A Framework for RFID Survivability Requirement Analysis and Specification

    Science.gov (United States)

    Zuo, Yanjun; Pimple, Malvika; Lande, Suhas

Many industries are becoming dependent on Radio Frequency Identification (RFID) technology for inventory management and asset tracking. The data collected about tagged objects through RFID is used in various high-level business operations. The RFID system should hence be highly available, reliable, dependable, and secure. In addition, this system should be able to resist attacks and perform recovery in case of security incidents. Together these requirements give rise to the notion of a survivable RFID system. The main goal of this paper is to analyze and specify the requirements for an RFID system to become survivable. These requirements, if utilized, can assist the system in resisting devastating attacks and recovering quickly from damage. This paper proposes techniques and approaches for RFID survivability requirements analysis and specification. From the perspective of system acquisition and engineering, survivability requirements analysis is the important first step in survivability specification, compliance formulation, and proof verification.

  4. Reliability analysis framework for computer-assisted medical decision systems

    International Nuclear Information System (INIS)

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-01-01

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. 
The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
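The neighborhood-based reliability measure described above can be sketched directly: find the k known cases nearest the query in feature space and report the classifier's accuracy on just those cases. The one-dimensional features, labels, and stored predictions below are invented for illustration:

```python
import numpy as np

# Known (training) cases: a 1-D feature, ground-truth labels, and a
# classifier's stored predictions on those same cases (all made up).
feats = np.array([0.0, 0.1, 0.2, 0.9, 1.0, 1.1])
truth = np.array([0,   0,   0,   1,   1,   1  ])
preds = np.array([0,   0,   0,   1,   1,   0  ])   # one mistake, near 1.1

def reliability(query, k=3):
    """Local accuracy of the classifier on the k known cases nearest the query."""
    nearest = np.argsort(np.abs(feats - query))[:k]
    return np.mean(preds[nearest] == truth[nearest])

print(reliability(0.05))   # neighborhood {0.0, 0.1, 0.2}: all correct -> 1.0
print(reliability(0.95))   # neighborhood {0.9, 1.0, 1.1}: accuracy 2/3
```

As in the paper, the measure is classifier-agnostic: nothing about `reliability` depends on whether `preds` came from a neural network, an SVM, or anything else.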

  5. A Generalized Framework for Non-Stationary Extreme Value Analysis

    Science.gov (United States)

    Ragno, E.; Cheng, L.; Sadegh, M.; AghaKouchak, A.

    2017-12-01

    Empirical trends in climate variables including precipitation, temperature, snow-water equivalent at regional to continental scales are evidence of changes in climate over time. The evolving climate conditions and human activity-related factors such as urbanization and population growth can exert further changes in weather and climate extremes. As a result, the scientific community faces an increasing demand for updated appraisal of the time-varying climate extremes. The purpose of this study is to offer a robust and flexible statistical tool for non-stationary extreme value analysis which can better characterize the severity and likelihood of extreme climatic variables. This is critical to ensure a more resilient environment in a changing climate. Following the positive feedback on the first version of Non-Stationary Extreme Value Analysis (NEVA) Toolbox by Cheng at al. 2014, we present an improved version, i.e. NEVA2.0. The upgraded version herein builds upon a newly-developed hybrid evolution Markov Chain Monte Carlo (MCMC) approach for numerical parameters estimation and uncertainty assessment. This addition leads to a more robust uncertainty estimates of return levels, return periods, and risks of climatic extremes under both stationary and non-stationary assumptions. Moreover, NEVA2.0 is flexible in incorporating any user-specified covariate other than the default time-covariate (e.g., CO2 emissions, large scale climatic oscillation patterns). The new feature will allow users to examine non-stationarity of extremes induced by physical conditions that underlie the extreme events (e.g. antecedent soil moisture deficit, large-scale climatic teleconnections, urbanization). In addition, the new version offers an option to generate stationary and/or non-stationary rainfall Intensity - Duration - Frequency (IDF) curves that are widely used for risk assessment and infrastructure design. 
Finally, a Graphical User Interface (GUI) of the package is provided, making NEVA
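
    A minimal illustration of the kind of quantity NEVA estimates (a sketch under stated assumptions, not NEVA's implementation or API): the return level of a Generalized Extreme Value (GEV) distribution, with non-stationarity expressed as a hypothetical linear trend in the location parameter.

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """Return level z_T of a GEV(mu, sigma, xi) distribution: the value
    exceeded on average once every T blocks (e.g. years)."""
    y = -math.log(1.0 - 1.0 / T)        # reduced variate
    if abs(xi) < 1e-9:                  # Gumbel limit as xi -> 0
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

def nonstationary_return_level(mu0, trend, sigma, xi, T, t):
    """Illustrative non-stationary form: the location parameter drifts
    linearly with a covariate t (e.g. time)."""
    return gev_return_level(mu0 + trend * t, sigma, xi, T)
```

    With a positive trend, the 100-year return level grows with the covariate, which is exactly the effect a non-stationary analysis is meant to capture.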

  6. Sediment Analysis Using a Structured Programming Approach

    Directory of Open Access Journals (Sweden)

    Daniela Arias-Madrid

    2012-12-01

    Full Text Available This paper presents an algorithm designed for the analysis of a sedimentary sample of unconsolidated material; it seeks to identify very quickly the main features of a sediment and thus classify it fast and efficiently. The user enters the weight of each particle-size class into the program, which applies the method of moments, based on four equations representing the mean, standard deviation, skewness and kurtosis, to find the attributes of the sample in a few seconds. With the program these calculations are performed effectively and more accurately, and explanations are also given for features such as grain size, sorting, symmetry and origin, which helps to improve the study of sediments and, in general, the study of sedimentary rocks.
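
    The four moment statistics named above can be computed directly from the class weights. This is a generic weighted-moments sketch (not the published program's code); size-class midpoints in phi units and weights per class are assumed inputs.

```python
def weighted_moments(sizes_phi, weights):
    """Moment statistics of a grain-size distribution.
    sizes_phi - midpoint of each size class (phi units)
    weights   - weight retained in each class (grams or %)"""
    total = sum(weights)
    mean = sum(w * x for x, w in zip(sizes_phi, weights)) / total
    m2 = sum(w * (x - mean) ** 2 for x, w in zip(sizes_phi, weights)) / total
    m3 = sum(w * (x - mean) ** 3 for x, w in zip(sizes_phi, weights)) / total
    m4 = sum(w * (x - mean) ** 4 for x, w in zip(sizes_phi, weights)) / total
    std = m2 ** 0.5
    skewness = m3 / std ** 3   # 0 for a symmetric distribution
    kurtosis = m4 / std ** 4
    return mean, std, skewness, kurtosis
```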

  7. Analysis of the Education Program Approval Process: A Program Evaluation.

    Science.gov (United States)

    Fountaine, Charles A.; And Others

    A study of the education program approval process involving the Veterans Administration (VA) and the State Approving Agencies (SAAs) had the following objectives: to describe the present education program approval process; to determine time and costs associated with the education program approval process; to describe the approval process at…

  8. Big data analysis framework for healthcare and social sectors in Korea.

    Science.gov (United States)

    Song, Tae-Min; Ryu, Seewon

    2015-01-01

    We reviewed applications of big data analysis of healthcare and social services in developed countries, and subsequently devised a framework for such an analysis in Korea. We reviewed the status of implementing big data analysis of health care and social services in developed countries, and strategies used by the Ministry of Health and Welfare of Korea (Government 3.0). We formulated a conceptual framework of big data in the healthcare and social service sectors at the national level. As a specific case, we designed a process and method of social big data analysis on suicide buzz. Developed countries (e.g., the United States, the UK, Singapore, Australia, and even OECD and EU) are emphasizing the potential of big data, and using it as a tool to solve their long-standing problems. Big data strategies for the healthcare and social service sectors were formulated based on an ICT-based policy of current government and the strategic goals of the Ministry of Health and Welfare. We suggest a framework of big data analysis in the healthcare and welfare service sectors separately and assigned them tentative names: 'health risk analysis center' and 'integrated social welfare service network'. A framework of social big data analysis is presented by applying it to the prevention and proactive detection of suicide in Korea. There are some concerns with the utilization of big data in the healthcare and social welfare sectors. Thus, research on these issues must be conducted so that sophisticated and practical solutions can be reached.

  9. A statistical framework for differential network analysis from microarray data

    Directory of Open Access Journals (Sweden)

    Datta Somnath

    2010-02-01

    Full Text Available Abstract Background It has long been known that genes do not act alone; rather, groups of genes act in concert during a biological process. Consequently, the expression levels of genes are dependent on each other. Experimental techniques to detect such interacting pairs of genes have been in place for quite some time. With the advent of microarray technology, newer computational techniques to detect such interaction or association between gene expressions are being proposed, which lead to an association network. While most microarray analyses look for genes that are differentially expressed, it is of potentially greater significance to identify how entire association network structures change between two or more biological settings, say normal versus diseased cell types. Results We provide a recipe for conducting a differential analysis of networks constructed from microarray data under two experimental settings. At the core of our approach lies a connectivity score that represents the strength of genetic association or interaction between two genes. We use this score to propose formal statistical tests for each of the following queries: (i) whether the overall modular structures of the two networks are different, (ii) whether the connectivity of a particular set of "interesting genes" has changed between the two networks, and (iii) whether the connectivity of a given single gene has changed between the two networks. A number of examples of this score are provided. We carried out our method on two types of simulated data: Gaussian networks and networks based on differential equations. We show that, for appropriate choices of the connectivity scores and tuning parameters, our method works well on simulated data. We also analyze a real data set involving normal versus heavy mice and identify an interesting set of genes that may play key roles in obesity. Conclusions Examining changes in network structure can provide valuable information about the
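
    A toy version of the connectivity-score idea, assuming a plain Pearson correlation as the measure of association (the paper's score is more general): how strongly two genes co-express in each condition, and how much that changes between conditions.

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length expression vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def connectivity_change(expr_a, expr_b, g1, g2):
    """Absolute change in co-expression of genes g1 and g2 between
    condition A and condition B (dicts: gene -> expression vector)."""
    return abs(pearson(expr_a[g1], expr_a[g2]) - pearson(expr_b[g1], expr_b[g2]))
```

    A pair that flips from perfectly correlated to perfectly anti-correlated scores the maximum change of 2, even though neither gene need be differentially expressed on its own.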

  10. An intersectionality-based policy analysis framework: critical reflections on a methodology for advancing equity.

    Science.gov (United States)

    Hankivsky, Olena; Grace, Daniel; Hunting, Gemma; Giesbrecht, Melissa; Fridkin, Alycia; Rudrum, Sarah; Ferlatte, Olivier; Clark, Natalie

    2014-12-10

    In the field of health, numerous frameworks have emerged that advance understandings of the differential impacts of health policies to produce inclusive and socially just health outcomes. In this paper, we present the development of an important contribution to these efforts - an Intersectionality-Based Policy Analysis (IBPA) Framework. Developed over the course of two years in consultation with key stakeholders and drawing on best and promising practices of other equity-informed approaches, this participatory and iterative IBPA Framework provides guidance and direction for researchers, civil society, public health professionals and policy actors seeking to address the challenges of health inequities across diverse populations. Importantly, we present the application of the IBPA Framework in seven priority health-related policy case studies. The analysis of each case study is focused on explaining how IBPA: 1) provides an innovative structure for critical policy analysis; 2) captures the different dimensions of policy contexts including history, politics, everyday lived experiences, diverse knowledges and intersecting social locations; and 3) generates transformative insights, knowledge, policy solutions and actions that cannot be gleaned from other equity-focused policy frameworks. The aim of this paper is to inspire a range of policy actors to recognize the potential of IBPA to foreground the complex contexts of health and social problems, and ultimately to transform how policy analysis is undertaken.

  11. SAFE: A Sentiment Analysis Framework for E-Learning

    Directory of Open Access Journals (Sweden)

    Francesco Colace

    2014-12-01

    Full Text Available The spread of social networks allows users to share opinions on different aspects of life, and millions of messages appear on the web daily. This textual information can be a rich source of data for opinion mining and sentiment analysis: the computational study of opinions, sentiments and emotions expressed in text. Its main aim is the identification of agreement or disagreement statements that deal with positive or negative feelings in comments or reviews. In this paper, we investigate the adoption, in the field of e-learning, of a probabilistic approach based on Latent Dirichlet Allocation (LDA) as a sentiment grabber. With this approach, for a set of documents belonging to the same knowledge domain, a graph, the Mixed Graph of Terms, can be automatically extracted. The paper shows how this graph contains a set of weighted word pairs, which are discriminative for sentiment classification. In this way, the system can detect how students feel about certain topics, and teachers can better tune their teaching approach. The proposed method has been tested on datasets coming from e-learning platforms. A preliminary experimental campaign shows that the proposed approach is effective and satisfactory.
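
    A crude stand-in for the weighted-word-pairs idea, using plain co-occurrence counts rather than the paper's LDA-derived Mixed Graph of Terms (the function name and weighting are illustrative assumptions):

```python
from collections import Counter
from itertools import combinations

def weighted_word_pairs(documents, top_n=3):
    """Count how often each unordered word pair co-occurs in a document;
    the counts act as edge weights of a crude graph of terms."""
    pair_counts = Counter()
    for doc in documents:
        words = sorted(set(doc.lower().split()))
        pair_counts.update(combinations(words, 2))
    return pair_counts.most_common(top_n)
```

    In a real pipeline the most discriminative pairs would then feed a sentiment classifier; here the weights are raw counts purely for illustration.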

  12. Spatially-Distributed Cost-Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution.

    Directory of Open Access Journals (Sweden)

    Runzhe Geng

    Full Text Available Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, quantifying the benefits of BMP implementation for receiving water quality at multiple spatial scales remains an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., a phosphorus (P) index), model simulation techniques (Hydrological Simulation Program-FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and to estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed, located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreased P loss from the watershed. Remedial strategies where BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies where BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement in particular watersheds is the most cost-effective scenario to decrease P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales from field to farm, to watersheds, to regions. 
Further, model estimates showed targeting at multiple scales is necessary to optimize program
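
    One simple way to build the kind of cost-effectiveness ranking described above is a greedy selection under a budget. This sketch assumes hypothetical per-site costs and P reductions; it is not the paper's optimization framework.

```python
def select_bmps(candidates, budget):
    """Greedy BMP targeting: pick sites in descending order of
    P reduction per unit cost until the budget is exhausted.
    candidates: list of (site, cost, p_reduction_kg)."""
    ranked = sorted(candidates, key=lambda c: c[2] / c[1], reverse=True)
    chosen, total_cost, total_reduction = [], 0.0, 0.0
    for site, cost, reduction in ranked:
        if total_cost + cost <= budget:
            chosen.append(site)
            total_cost += cost
            total_reduction += reduction
    return chosen, total_reduction
```

    Greedy ranking mirrors the intuition of "target the highest-risk areas first"; a full framework would also model interactions between BMPs within a watershed.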

  13. Insight into dynamic genome imaging: Canonical framework identification and high-throughput analysis.

    Science.gov (United States)

    Ronquist, Scott; Meixner, Walter; Rajapakse, Indika; Snyder, John

    2017-07-01

    The human genome is dynamic in structure, complicating researchers' attempts at fully understanding it. Time-series fluorescence in situ hybridization (FISH) imaging has increased our ability to observe genome structure, but due to cell-type and experimental variability these data are often noisy and difficult to analyze. Furthermore, computational analysis techniques are needed for homolog discrimination and canonical framework detection in the case of time-series images. In this paper we introduce novel ideas for nucleus imaging analysis, present findings extracted using dynamic genome imaging, and propose an objective algorithm for high-throughput, time-series FISH imaging. While a canonical framework could not be detected beyond statistical significance in the analyzed dataset, a mathematical framework for detection has been outlined, with extension to 3D image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Informing the NCA: EPA's Climate Change Impact and Risk Analysis Framework

    Science.gov (United States)

    Sarofim, M. C.; Martinich, J.; Kolian, M.; Crimmins, A. R.

    2017-12-01

    The Climate Change Impact and Risk Analysis (CIRA) framework is designed to quantify the physical impacts and economic damages in the United States under future climate change scenarios. To date, the framework has been applied to 25 sectors, using scenarios and projections developed for the Fourth National Climate Assessment. The strength of this framework has been in the use of consistent climatic, socioeconomic, and technological assumptions and inputs across the impact sectors to maximize the ease of cross-sector comparison. The results of the underlying CIRA sectoral analyses are informing the sustained assessment process by helping to address key gaps related to economic valuation and risk. Advancing capacity and scientific literature in this area has created opportunity to consider future applications and strengthening of the framework. This presentation will describe the CIRA framework, present results for various sectors such as heat mortality, air & water quality, winter recreation, and sea level rise, and introduce potential enhancements that can improve the utility of the framework for decision analysis.

  15. A 3-month jump-landing training program: a feasibility study using the RE-AIM framework.

    Science.gov (United States)

    Aerts, Inne; Cumps, Elke; Verhagen, Evert; Mathieu, Niels; Van Schuerbeeck, Sander; Meeusen, Romain

    2013-01-01

    Evaluating the translatability and feasibility of an intervention program has become as important as determining the effectiveness of the intervention. To evaluate the applicability of a 3-month jump-landing training program in basketball players, using the RE-AIM (reach, effectiveness, adoption, implementation, and maintenance) framework. Randomized controlled trial. National and regional basketball teams. Twenty-four teams of the second highest national division and regional basketball divisions in Flanders, Belgium, were randomly assigned (1:1) to a control group and intervention group. A total of 243 athletes (control group = 129, intervention group = 114), ages 15 to 41 years, volunteered. All exercises in the intervention program followed a progressive development, emphasizing lower extremity alignment during jump-landing activities. The results of the process evaluation of the intervention program were based on the 5 dimensions of the RE-AIM framework. The injury incidence density, hazard ratios, and 95% confidence intervals were determined. The participation rate of the total sample was 100% (reach). The hazard ratio was different between the intervention group and the control group (0.40 [95% confidence interval = 0.16, 0.99]; effectiveness). Of the 12 teams in the intervention group, 8 teams (66.7%) agreed to participate in the study (adoption). Eight of the participating coaches (66.7%) felt positively about the intervention program and stated that they had implemented the training sessions of the program as intended (implementation). All coaches except 1 (87.5%) intended to continue the intervention program the next season (maintenance). Compliance of the coaches in this coach-supervised jump-landing training program was high. In addition, the program was effective in preventing lower extremity injuries.
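
    The effectiveness statistic reported above is a hazard ratio; a closely related and simpler quantity, the incidence rate ratio with a log-normal 95% confidence interval, can be computed as follows (an illustrative sketch, not the study's actual survival analysis):

```python
import math

def incidence_rate_ratio(cases_i, time_i, cases_c, time_c):
    """Incidence rate ratio (intervention vs control) with a 95% CI
    from the usual log-normal approximation: SE(ln IRR) = sqrt(1/a + 1/b)."""
    irr = (cases_i / time_i) / (cases_c / time_c)
    se = math.sqrt(1.0 / cases_i + 1.0 / cases_c)
    lo = math.exp(math.log(irr) - 1.96 * se)
    hi = math.exp(math.log(irr) + 1.96 * se)
    return irr, lo, hi
```

    As in the study, an interval whose upper bound stays below 1 indicates a statistically significant protective effect.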

  16. Interactive Programming and Analysis Aids (IPAA)

    Science.gov (United States)

    1978-06-01


  17. Analysis of an image quality assurance program

    International Nuclear Information System (INIS)

    Goethlin, J.H.; Alders, B.

    1985-01-01

    Reject film analysis before and after the introduction of a quality assurance program showed a 45% decrease in rejected films. The main changes in equipment and routines were: 1. Increased control of film processors and X-ray generators. 2. New film cassettes and screens. 3. Decreased number of film sizes. 4. Information to and supervision of radiographic personnel. Savings in costs and increased income from an increased number of out-patients corresponded to about 4.5% of the total cost of operating and maintaining the department. (orig.)

  18. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper

  19. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of the sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform the model calibration, uncertainty propagation, Chi-square linearity test, and sensitivity analysis for both linear and nonlinear problems. The PAPIRUS was developed by implementing multiple packages of methodologies, and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in the PAPIRUS with multiple computing resources and proper communications between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description about the PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity test, and sensitivity analysis implemented in the toolkit with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.
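
    A minimal sketch of the two ideas PAPIRUS combines, parallel dispatch of simulation runs and one-at-a-time sensitivity analysis, using a hypothetical stand-in model; none of this reflects PAPIRUS's actual interfaces.

```python
from concurrent.futures import ThreadPoolExecutor

def model(params):
    """Stand-in for an expensive engineering simulation (hypothetical)."""
    a, b = params
    return a ** 2 + 3.0 * b

def one_at_a_time_sensitivity(base, delta=1e-3):
    """Finite-difference sensitivity of the model to each parameter,
    with the base and perturbed runs dispatched in parallel."""
    runs = [list(base)]
    for i in range(len(base)):
        perturbed = list(base)
        perturbed[i] += delta
        runs.append(perturbed)
    with ThreadPoolExecutor() as pool:
        outputs = list(pool.map(model, runs))   # runs execute concurrently
    f0 = outputs[0]
    return [(f - f0) / delta for f in outputs[1:]]
```

    A real framework would replace the thread pool with distributed workers and the stand-in model with the simulation code, but the server/client pattern is the same.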

  20. Two Inseparable Facets of Technology Integration Programs: Technology and Theoretical Framework

    Science.gov (United States)

    Demir, Servet

    2011-01-01

    This paper considers the process of program development aiming at technology integration for teachers. For this consideration, the paper focused on an integration program which was recently developed as part of a larger project. The participants of this program were 45 in-service teachers. The program continued four weeks and the conduct of the…

  1. Framework for analysis of solar energy systems in the built environment from an exergy perspective

    OpenAIRE

    Torio, H.; Schmidt, D.

    2010-01-01

    Exergy analysis is a more powerful tool than mere energy analysis for showing the improvement potential of energy systems. Direct use of solar radiation instead of degrading other high quality energy resources found in nature is advantageous. Yet, due to physical inconsistencies present in the exergy analysis framework for assessing direct-solar systems commonly found in literature, high exergy losses arise in the conversion process of solar radiation in direct-solar systems. However, these l...

  2. A complexity science-based framework for global joint operations analysis to support force projection: LDRD Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). System Sustainment & Readiness Technologies Dept.

    2015-01-01

    The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold-war-era adversaries. Techniques such as traditional large-scale, joint-services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents a significant opportunity for Sandia to support the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering systems of systems (SoS) and Complex Adaptive Systems of Systems (CASoS), substantial fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior-level decision makers to better understand their enterprise and required future investments.

  3. A Triple Aim Framework For the Performance Assessment of Disease Management Programs

    NARCIS (Netherlands)

    Verbeek, N.A.; M.G. Franken (Margreet); M.A. Koopmanschap (Marc); M.P.M.H. Rutten-van Mölken (Maureen)

    2015-01-01

    Objectives: A structured and comprehensive assessment of disease management implementations is not straightforward due to the broadness of the interventions and the various evaluation possibilities. The aim of this study was to develop a comprehensive framework for outcome

  4. From fatalism to mitigation: a conceptual framework for mitigating fetal programming of chronic disease by maternal obesity

    OpenAIRE

    Boone-Heinonen, Janne; Messer, Lynne C.; Fortmann, Stephen P.; Wallack, Lawrence; Thornburg, Kent L.

    2015-01-01

    Prenatal development is recognized as a critical period in the etiology of obesity and cardiometabolic disease. Potential strategies to reduce maternal obesity-induced risk later in life have been largely overlooked. In this paper, we first propose a conceptual framework for the role of public health and preventive medicine in mitigating the effects of fetal programming. Second, we review a small but growing body of research (through August 2015) that examines interactive effects of maternal ...

  5. iOS game development : Mobile game development with Swift programming language and SceneKit framework

    OpenAIRE

    Koskenseppä, Juuso

    2016-01-01

    The purpose of the thesis was to create an iOS game that could be deemed complete enough to be published in Apple's App Store. This meant fulfilling different guidelines specified by Apple. The project was carried out using Apple's new Swift programming language and the SceneKit framework, with the intention of seeing how they work for iOS game development. The immaturity of the Swift programming language led to several code rewrites, every time a newer Swift version was released. T...

  6. A metric and frameworks for resilience analysis of engineered and infrastructure systems

    International Nuclear Information System (INIS)

    Francis, Royce; Bekera, Behailu

    2014-01-01

    In this paper, we have reviewed various approaches to defining resilience and the assessment of resilience. We have seen that while resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. In this paper, we have proposed a resilience analysis framework and a metric for measuring resilience. Our analysis framework consists of system identification, resilience objective setting, vulnerability analysis, and stakeholder engagement. The implementation of this framework is focused on the achievement of three resilience capacities: adaptive capacity, absorptive capacity, and recoverability. These three capacities also form the basis of our proposed resilience factor and uncertainty-weighted resilience metric. We have also identified two important unresolved discussions emerging in the literature: the idea of resilience as an epistemological versus inherent property of the system, and design for ecological versus engineered resilience in socio-technical systems. While we have not resolved this tension, we have shown that our framework and metric promote the development of methodologies for investigating “deep” uncertainties in resilience assessment while retaining the use of probability for expressing uncertainties about highly uncertain, unforeseeable, or unknowable hazards in design and management activities. - Highlights: • While resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. • We proposed a resilience analysis framework whose implementation is encapsulated within resilience metric incorporating absorptive, adaptive, and restorative capacities. • We have shown that our framework and metric can support the investigation of “deep” uncertainties in resilience assessment or analysis. • We have discussed the role of quantitative metrics in design for ecological versus engineered resilience in socio-technical systems. • Our resilience metric supports
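
    As a toy illustration of capacity-based resilience metrics (not the authors' uncertainty-weighted metric), absorptive and restorative capacities can be read off a post-disruption performance curve:

```python
def resilience_capacities(performance, baseline=1.0):
    """Capacities from a post-disruption performance time series:
    absorptive  - worst retained performance as a fraction of baseline
    restorative - fraction of the lost performance eventually recovered"""
    worst = min(performance)
    final = performance[-1]
    absorptive = worst / baseline
    lost = baseline - worst
    restorative = 1.0 if lost == 0 else (final - worst) / lost
    return absorptive, restorative

def resilience_index(performance, baseline=1.0):
    """Naive composite: product of the two capacities. A published metric
    would also weight adaptive capacity and uncertainty."""
    a, r = resilience_capacities(performance, baseline)
    return a * r
```

    A system that drops to 40% of baseline but recovers fully scores absorptive 0.4 and restorative 1.0; one that never recovers scores restorative 0.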

  7. Using the Knowledge to Action Framework in practice: a citation analysis and systematic review.

    Science.gov (United States)

    Field, Becky; Booth, Andrew; Ilott, Irene; Gerrish, Kate

    2014-11-23

    Conceptual frameworks are recommended as a way of applying theory to enhance implementation efforts. The Knowledge to Action (KTA) Framework was developed in Canada by Graham and colleagues in the 2000s, following a review of 31 planned action theories. The framework has two components: Knowledge Creation and an Action Cycle, each of which comprises multiple phases. This review sought to answer two questions: 'Is the KTA Framework used in practice? And if so, how?' This study is a citation analysis and systematic review. The index citation for the original paper was identified on three databases-Web of Science, Scopus and Google Scholar-with the facility for citation searching. Limitations of English language and year of publication 2006-June 2013 were set. A taxonomy categorising the continuum of usage was developed. Only studies applying the framework to implementation projects were included. Data were extracted and mapped against each phase of the framework for studies where it was integral to the implementation project. The citation search yielded 1,787 records. A total of 1,057 titles and abstracts were screened. One hundred and forty-six studies described usage to varying degrees, ranging from referenced to integrated. In ten studies, the KTA Framework was integral to the design, delivery and evaluation of the implementation activities. All ten described using the Action Cycle and seven referred to Knowledge Creation. The KTA Framework was enacted in different health care and academic settings with projects targeted at patients, the public, and nursing and allied health professionals. The KTA Framework is being used in practice with varying degrees of completeness. It is frequently cited, with usage ranging from simple attribution via a reference, through informing planning, to making an intellectual contribution. When the framework was integral to knowledge translation, it guided action in idiosyncratic ways and there was theory fidelity. 
Prevailing wisdom

  8. Disease Management, Case Management, Care Management, and Care Coordination: A Framework and a Brief Manual for Care Programs and Staff.

    Science.gov (United States)

    Ahmed, Osman I

    2016-01-01

    With the changing landscape of health care delivery in the United States since the passage of the Patient Protection and Affordable Care Act in 2010, health care organizations have struggled to keep pace with the evolving paradigm, particularly as it pertains to population health management. New nomenclature emerged to describe components of the new environment, and familiar words were put to use in an entirely different context. This article proposes a working framework for activities performed in case management, disease management, care management, and care coordination. The author offers standard working definitions for some of the most frequently used words in the health care industry with the goal of increasing consistency in their use, especially against the backdrop of the Centers for Medicare & Medicaid Services offering a "chronic care management fee" to primary care providers for managing the sickest, high-cost Medicare patients. Health care organizations performing case management, care management, disease management, and care coordination. Road map for consistency among users, in reporting, comparison, and for success of care management/coordination programs. This article offers a working framework for disease managers, case and care managers, and care coordinators. It suggests standard definitions to use for disease management, case management, care management, and care coordination. Moreover, the use of clear terminology will facilitate comparing, contrasting, and evaluating all care programs and increase consistency. The article can improve understanding of care program components and success factors, estimate program value and effectiveness, heighten awareness of consumer engagement tools, recognize current state and challenges for care programs, understand the role of health information technology solutions in care programs, and use information and knowledge gained to assess and improve care programs to design the "next generation" of programs.

  9. RIPOSTE: a framework for improving the design and analysis of laboratory-based research.

    Science.gov (United States)

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn

    2015-05-07

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  10. ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization

    Science.gov (United States)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; Ganis, G.; Gheata, A.; Maline, D. Gonzalez; Goto, M.; Iwaszkiewicz, J.; Kreshuk, A.; Segura, D. Marcos; Maunder, R.; Moneta, L.; Naumann, A.; Offermann, E.; Onuchin, V.; Panacek, S.; Rademakers, F.; Russo, P.; Tadel, M.

    2011-06-01

    A new stable version ("production version") v5.28.00 of ROOT [1] has been published [2]. It features several major improvements in many areas, most noteworthy data storage performance as well as statistics and graphics features. Some of these improvements have already been predicted in the original publication Antcheva et al. (2009) [3]. This version will be maintained for at least 6 months; new minor revisions ("patch releases") will be published [4] to solve problems reported with this version. New version program summary. Program title: ROOT. Catalogue identifier: AEFA_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFA_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU Lesser Public License v.2.1. No. of lines in distributed program, including test data, etc.: 2 934 693. No. of bytes in distributed program, including test data, etc.: 1009. Distribution format: tar.gz. Programming language: C++. Computer: Intel i386, Intel x86-64, Motorola PPC, Sun Sparc, HP PA-RISC. Operating system: GNU/Linux, Windows XP/Vista/7, Mac OS X, FreeBSD, OpenBSD, Solaris, HP-UX, AIX. Has the code been vectorized or parallelized?: Yes. RAM: > 55 Mbytes. Classification: 4, 9, 11.9, 14. Catalogue identifier of previous version: AEFA_v1_0. Journal reference of previous version: Comput. Phys. Commun. 180 (2009) 2499. Does the new version supersede the previous version?: Yes. Nature of problem: Storage, analysis and visualization of scientific data. Solution method: Object store, wide range of analysis algorithms and visualization methods. Reasons for new version: Added features and corrections of deficiencies. Summary of revisions: The release notes at http://root.cern.ch/root/v528/Version528.news.html give a module-oriented overview of the changes in v5.28.00. Highlights include: File format — reading of TTrees has been improved dramatically with respect to CPU time (30%) and notably with respect to disk space. Histograms…

  11. Analysis of System-Wide Investment in the National Airspace System: A Portfolio Analytical Framework and an Example

    Science.gov (United States)

    Bhadra, Dipasis; Morser, Frederick R.

    2006-01-01

    In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework for undertaking projects that may address some of the noted deficiencies. By drawing upon well-developed theories from corporate finance, an analytical framework is offered that can be used for choosing the FAA's investments taking into account risk, expected returns, and inherent dependencies across NAS programs. The framework can be expanded to take in multiple assets and realistic parameter values in drawing an efficient risk-return frontier for the FAA's entire investment program.

  12. Teaching and Learning Numerical Analysis and Optimization: A Didactic Framework and Applications of Inquiry-Based Learning

    Science.gov (United States)

    Lappas, Pantelis Z.; Kritikos, Manolis N.

    2018-01-01

    The main objective of this paper is to propose a didactic framework for teaching Applied Mathematics in higher education. After describing the structure of the framework, several applications of inquiry-based learning in teaching numerical analysis and optimization are provided to illustrate the potential of the proposed framework. The framework…

  13. Sustainability principles in strategic environmental assessment: A framework for analysis and examples from Italian urban planning

    Energy Technology Data Exchange (ETDEWEB)

    Lamorgese, Lydia, E-mail: lydial@tin.it; Geneletti, Davide, E-mail: davide.geneletti@unitn.it

    2013-09-15

    This paper presents a framework for analysing the degree of consideration of sustainability principles in Strategic Environmental Assessment (SEA), and demonstrates its application to a sample of SEAs of Italian urban plans. The framework is based on Gibson's (2006) sustainability principles, which are linked to a number of guidance criteria and eventually to review questions, resulting from an extensive literature review. A total of 71 questions are included in the framework, which gives particular emphasis to key concepts such as intragenerational and intergenerational equity. The framework was applied to review the Environmental Reports of the urban plans of 15 major Italian cities. The results of this review show that, even if sustainability is commonly considered a pivotal concept, there is still work to be done to effectively integrate sustainability principles into SEA. In particular, most of the attention is given to mitigation and compensation measures, rather than to actual attempts to propose more sustainable planning decisions in the first place. Concerning the proposed framework of analysis, further research is required to clarify equity concerns and particularly to identify suitable indicators for operationalizing the concepts of intra/inter-generational equity in decision-making. Highlights: a framework was developed to evaluate planning against sustainability criteria; the framework was applied to analyse how sustainability principles are addressed in 15 Italian SEA reports; over 85% of the reports addressed, to some extent, at least 40% of the framework questions; criteria explicitly linked to intra- and inter-generational equity are rarely addressed.

  14. The Soldier-Cyborg Transformation: A Framework for Analysis of Social and Ethical Issues of Future Warfare

    Science.gov (United States)

    1998-05-26

    USAWC Strategy Research Project. Author: Donald A. Gagliano, M.D. Title: The Soldier-Cyborg Transformation: A Framework for Analysis of Social and Ethical Issues of Future Warfare.

  15. Mississippi Curriculum Framework for Automotive Mechanics (Program CIP: 47.0604--Auto/Automotive Mechanic/Tech). Secondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for automotive mechanics I and II. Presented first are a program description…

  16. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments.

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F; Bell, Scott

    2016-03-01

    This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the needs for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change was assessed based on a suite of indicators that were subsequently rolled up into a single, multi-dimensional index that is easy to understand and communicate, to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.

  17. Understanding Universities in Ontario, Canada: An Industry Analysis Using Porter's Five Forces Framework

    Science.gov (United States)

    Pringle, James; Huisman, Jeroen

    2011-01-01

    In analyses of higher education systems, many models and frameworks are based on governance, steering, or coordination models. Although much can be gained by such analyses, we argue that the language used in the present-day policy documents (knowledge economy, competitive position, etc.) calls for an analysis of higher education as an industry. In…

  18. Using a Strategic Planning Tool as a Framework for Case Analysis

    Science.gov (United States)

    Lai, Christine A.; Rivera, Julio C., Jr.

    2006-01-01

    In this article, the authors describe how they use a strategic planning tool known as SWOT as a framework for case analysis, using it to analyze the strengths, weaknesses, opportunities, and threats of a public works project intended to enhance regional economic development in Tempe, Arizona. Students consider the project in light of a variety of…

  19. Modeling Phase-transitions Using a High-performance, Isogeometric Analysis Framework

    KAUST Repository

    Vignal, Philippe; Dalcin, Lisandro; Collier, Nathan; Calo, Victor M.

    2014-01-01

    In this paper, we present a high-performance framework for solving partial differential equations using Isogeometric Analysis, called PetIGA, and show how it can be used to solve phase-field problems. We specifically chose the Cahn-Hilliard equation

  20. The SAFE FOODS Risk Analysis Framework suitable for GMOs? A case study

    NARCIS (Netherlands)

    Kuiper, H.A.; Davies, H.V.

    2010-01-01

    This paper describes the current EU regulatory framework for risk analysis of genetically modified (GM) crop cultivation and market introduction of derived food/feed. Furthermore the risk assessment strategies for GM crops and derived food/feed as designed by the European Food Safety Authority

  1. Complexity and Intensionality in a Type-1 Framework for Computable Analysis

    DEFF Research Database (Denmark)

    Lambov, Branimir Zdravkov

    2005-01-01

    This paper describes a type-1 framework for computable analysis designed to facilitate efficient implementations and discusses properties that have not been well studied before for type-1 approaches: the introduction of complexity measures for type-1 representations of real functions, and ways...

  2. Cost-effectiveness analysis for the implementation of the EU Water Framework Directive

    NARCIS (Netherlands)

    van Engelen, D.M.; Seidelin, Christian; van der Veeren, Rob; Barton, David N.; Queb, Kabir

    2008-01-01

    The EU Water Framework Directive (WFD) prescribes cost-effectiveness analysis (CEA) as an economic tool for the minimisation of costs when formulating programmes of measures to be implemented in the European river basins by the year 2009. The WFD does not specify, however, which approach to CEA has

  3. Automated Analysis of ARM Binaries using the Low-Level Virtual Machine Compiler Framework

    Science.gov (United States)

    2011-03-01

    Thesis, Jeffrey B. Scott: Automated Analysis of ARM Binaries Using the Low-Level Virtual Machine Compiler Framework. ABACAS offers a level of flexibility in software development that would be very useful later in the software engineering life cycle.

  4. Utility green pricing programs: a statistical analysis of program effectiveness

    International Nuclear Information System (INIS)

    Wiser, R.; Olson, S.; Bird, L.; Swezey, B.

    2005-01-01

    Utility green pricing programs represent one way in which consumers can voluntarily support the development of renewable energy. The design features and effectiveness of these programs varies considerably. Based on a survey of utility program managers in the United States, this article provides insight into which program features might help maximize both customer participation in green pricing programs and the amount of renewable energy purchased by customers in those programs. We find that program length has a substantial impact on customer participation and purchases; to achieve higher levels of success, utilities will need to remain committed to their product offering for some time. Our findings also suggest that utilities should consider higher renewable energy purchase thresholds for residential customers in order to maximize renewable energy sales. Smaller utilities are found to be more successful than larger utilities, and we find some evidence that providing private benefits to nonresidential participants can enhance success. Interestingly, we find little evidence that the cost of the green pricing product greatly impacts customer participation and renewable energy sales, at least over the narrow range of premiums embedded in our data set, and for the initial set of green power purchasers. (author)

  5. Model-based Computer Aided Framework for Design of Process Monitoring and Analysis Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    In the manufacturing industry, for example the pharmaceutical industry, a thorough understanding of the process is necessary in addition to a properly designed monitoring and analysis system (PAT system) to consistently obtain the desired end-product properties. A model-based computer-aided framework, including the methods and tools through which the design of monitoring and analysis systems for product quality control can be generated, analyzed and/or validated, has been developed. Two important supporting tools developed as part of the framework are a knowledge base and a model library. The knowledge base provides the necessary information/data during the design of the PAT system, while the model library generates additional or missing data needed for design and analysis. Optimization of the PAT system design is achieved in terms of product data analysis time and/or cost of monitoring equipment.

  6. A human-centered framework for innovation in conservation incentive programs.

    Science.gov (United States)

    Sorice, Michael G; Donlan, C Josh

    2015-12-01

    The promise of environmental conservation incentive programs that provide direct payments in exchange for conservation outcomes is that they enhance the value of engaging in stewardship behaviors. An insidious but important concern is that a narrow focus on optimizing payment levels can ultimately suppress program participation and subvert participants' internal motivation to engage in long-term conservation behaviors. Increasing participation and engendering stewardship can be achieved by recognizing that participation is not simply a function of the payment; it is a function of the overall structure and administration of the program. Key to creating innovative and more sustainable programs is fitting them within the existing needs and values of target participants. By focusing on empathy for participants, co-designing program approaches, and learning from the rapid prototyping of program concepts, a human-centered approach to conservation incentive program design enhances the propensity for discovery of novel and innovative solutions to pressing conservation issues.

  7. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    Science.gov (United States)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
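The inlier/outlier segmentation described above can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the authors' implementation: it fits a trivial (mean, standard deviation) model to fixed-length subsequences of a synthetic feature series, builds an affinity matrix from pairwise model distances, and flags as outliers the segments with small weight in the principal eigenvector. The window size, the 0.5 threshold, and all signal values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "audio feature" series: quiet background plus two short
# bursts standing in for sparse "highlight" events.
series = rng.normal(0.0, 1.0, 600)
series[200:220] += 8.0   # event 1 (covers segment 10)
series[450:470] += 8.0   # event 2 (straddles segments 22 and 23)

# 1. Split into fixed-length subsequences and fit a trivial statistical
#    model (mean, std) to each one.
win = 20
segments = series.reshape(-1, win)
models = np.column_stack([segments.mean(axis=1), segments.std(axis=1)])

# 2. Affinity matrix from pairwise distances between segment models.
d = np.linalg.norm(models[:, None, :] - models[None, :, :], axis=2)
sigma = np.median(d) + 1e-12
A = np.exp(-((d / sigma) ** 2))

# 3. The principal eigenvector of the affinity matrix puts large weight
#    on the dominant (background) cluster; segments with small weight
#    are outliers.
vals, vecs = np.linalg.eigh(A)
v = np.abs(vecs[:, -1])               # eigenvector of the largest eigenvalue
outliers = np.where(v < 0.5 * v.max())[0]

# 4. Rank outliers by departure from the background (smallest weight first).
ranking = sorted(outliers, key=lambda i: v[i])
print("outlier segments:", sorted(outliers))
```

In the paper's setting the per-segment models would be richer statistical models of low/mid-level audio features, and the eigenvector weights would feed the confidence measure used to rank the detected outliers.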

  8. A framework for the economic analysis of data collection methods for vital statistics.

    Science.gov (United States)

    Jimenez-Soto, Eliana; Hodge, Andrew; Nguyen, Kim-Huong; Dettrick, Zoe; Lopez, Alan D

    2014-01-01

    Over recent years there has been a strong movement towards the improvement of vital statistics and other types of health data that inform evidence-based policies. Collecting such data is not cost free. To date there is no systematic framework to guide investment decisions on methods of data collection for vital statistics or health information in general. We developed a framework to systematically assess the comparative costs and outcomes/benefits of the various data methods for collecting vital statistics. The proposed framework is four-pronged and utilises two major economic approaches to systematically assess the available data collection methods: cost-effectiveness analysis and efficiency analysis. We built a stylised example of a hypothetical low-income country to perform a simulation exercise in order to illustrate an application of the framework. Using simulated data, the results from the stylised example show that the rankings of the data collection methods are not affected by the use of either cost-effectiveness or efficiency analysis. However, the rankings are affected by how quantities are measured. There have been several calls for global improvements in collecting useable data, including vital statistics, from health information systems to inform public health policies. Ours is the first study that proposes a systematic framework to assist countries undertake an economic evaluation of DCMs. Despite numerous challenges, we demonstrate that a systematic assessment of outputs and costs of DCMs is not only necessary, but also feasible. The proposed framework is general enough to be easily extended to other areas of health information.
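As a toy illustration of the cost-effectiveness leg of such a framework, the sketch below ranks data collection methods (DCMs) by cost per captured vital event. The method names and all figures are invented for the example and do not come from the article's simulation.

```python
# Hypothetical DCMs for vital statistics with illustrative annual costs
# and outputs; the numbers only show the mechanics of the ranking.
dcms = {
    "civil_registration":  {"cost": 1_200_000, "events_captured": 400_000},
    "sample_registration": {"cost":   300_000, "events_captured":  80_000},
    "household_survey":    {"cost":   500_000, "events_captured": 100_000},
}

def cost_effectiveness(d):
    """Cost per captured vital event: lower is better."""
    return d["cost"] / d["events_captured"]

# Rank methods from most to least cost-effective.
ranked = sorted(dcms, key=lambda name: cost_effectiveness(dcms[name]))
for name in ranked:
    print(f"{name}: {cost_effectiveness(dcms[name]):.2f} per event")
```

As the article notes, such a ranking can change with how quantities (costs and outputs) are measured, which is exactly what a systematic framework forces the analyst to make explicit.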

  9. ROOT: A C++ framework for petabyte data storage, statistical analysis and visualization

    International Nuclear Information System (INIS)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Couet, O.; Franco, L.; Canal, Ph.; Casadei, D.; Fine, V.

    2009-01-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools is the set of histogram classes which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like Postscript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which will take care of optimally
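The "vertical data storage" idea behind TTree can be illustrated outside ROOT: store one array per branch, so a selection touches only the branches it actually reads. The snippet below is a plain NumPy sketch of that access pattern, not ROOT code; the branch names and distributions are made up.

```python
import numpy as np

# Columnar ("vertical") event store: one array per "branch". An analysis
# that cuts on pt and eta reads only those two arrays, which is the
# access pattern that makes TTree-style reads fast.
rng = np.random.default_rng(7)
n_events = 10_000
tree = {
    "pt":     rng.exponential(20.0, n_events),
    "eta":    rng.uniform(-2.5, 2.5, n_events),
    "charge": rng.choice([-1, 1], n_events),   # never touched below
}

# Selection and histogramming use only the branches they need.
mask = (tree["pt"] > 25.0) & (np.abs(tree["eta"]) < 2.0)
counts, edges = np.histogram(tree["pt"][mask], bins=50, range=(0.0, 200.0))

print(mask.sum(), "events pass;", counts.sum(), "entries in histogram")
```

In a row-oriented store the same cut would have to deserialize every field of every event; the columnar layout reads two arrays and skips the rest, and compresses better because each column is homogeneous.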

  10. Performance Analysis of Untraceability Protocols for Mobile Agents Using an Adaptable Framework

    OpenAIRE

    LESZCZYNA RAFAL; GORSKI Janusz Kazimierz

    2006-01-01

    Recently we had proposed two untraceability protocols for mobile agents and began investigating their quality. We believe that quality evaluation of security protocols should extend a sole validation of their security and cover other quality aspects, primarily their efficiency. Thus after conducting a security analysis, we wanted to complement it with a performance analysis. For this purpose we developed a performance evaluation framework, which, as we realised, with certain adjustments, can ...

  11. An Integrated Strategy Framework (ISF) for Combining Porter's 5-Forces, Diamond, PESTEL, and SWOT Analysis

    OpenAIRE

    Anton, Roman

    2015-01-01

    INTRODUCTION Porter's Five-Forces, Porter's Diamond, PESTEL, the 6th-Forths, and Humphrey's SWOT analysis are among the most important and popular concepts taught in business schools around the world. A new integrated strategy framework (ISF) combines all major concepts.

  12. Along the way to developing a theory of the program: a re-examination of the conceptual framework as an organizing strategy.

    Science.gov (United States)

    Helitzer, Deborah L; Sussman, Andrew L; Hoffman, Richard M; Getrich, Christina M; Warner, Teddy D; Rhyne, Robert L

    2014-08-01

    Conceptual frameworks (CF) have historically been used to develop program theory. We re-examine the literature about the role of CF in this context, specifically how they can be used to create descriptive and prescriptive theories, as building blocks for a program theory. Using a case example of colorectal cancer screening intervention development, we describe the process of developing our initial CF, the methods used to explore the constructs in the framework and revise the framework for intervention development. We present seven steps that guided the development of our CF: (1) assemble the "right" research team, (2) incorporate existing literature into the emerging CF, (3) construct the conceptual framework, (4) diagram the framework, (5) operationalize the framework: develop the research design and measures, (6) conduct the research, and (7) revise the framework. A revised conceptual framework depicted more complicated inter-relationships of the different predisposing, enabling, reinforcing, and system-based factors. The updated framework led us to generate program theory and serves as the basis for designing future intervention studies and outcome evaluations. A CF can build a foundation for program theory. We provide a set of concrete steps and lessons learned to assist practitioners in developing a CF. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. A program wide framework for evaluating data driven teaching and learning - earth analytics approaches, results and lessons learned

    Science.gov (United States)

    Wasser, L. A.; Gold, A. U.

    2017-12-01

    There is a deluge of earth systems data available to address cutting-edge science problems, yet specific skills are required to work with these data. The Earth Analytics education program, a core component of Earth Lab at the University of Colorado Boulder, is building a data-intensive program that provides training in realms including (1) interdisciplinary communication and collaboration, (2) earth science domain knowledge including geospatial science and remote sensing, and (3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate- and graduate-level courses, and a professional certificate/degree program. All programs share the goal of preparing a STEM workforce for successful earth-analytics-driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data-intensive instruction combined with domain science learning, to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous, and synchronous learning. We are using targeted search engine optimization (SEO) to increase visibility and in turn program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present results from evaluation of both an interdisciplinary undergraduate/graduate-level earth analytics course and an undergraduate internship. Early results suggest that a blended approach to learning and teaching, which includes both synchronous in-person teaching and active hands-on classroom learning combined with asynchronous learning in the form of online materials, leads to student success. Further, we will present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO on online content reach and program visibility.

  14. Computer programs simplify optical system analysis

    Science.gov (United States)

    1965-01-01

    The optical ray-trace computer program performs geometrical ray tracing. The energy-trace program calculates the relative monochromatic flux density on a specific target area. This program uses the ray-trace program as a subroutine to generate a representation of the optical system.
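A minimal example of the geometrical ray tracing such a program performs is the propagation of a ray to a plane. The function below is an illustrative sketch, not the NASA program itself; the geometry and numbers are invented.

```python
import numpy as np

def intersect_plane(origin, direction, plane_point, plane_normal):
    """Return the point where the ray origin + t*direction (t > 0) meets
    the plane through plane_point with normal plane_normal, or None."""
    direction = direction / np.linalg.norm(direction)
    denom = direction @ plane_normal
    if abs(denom) < 1e-12:
        return None                      # ray is parallel to the plane
    t = ((plane_point - origin) @ plane_normal) / denom
    return origin + t * direction if t > 0 else None

# A ray from the origin aimed at 45 degrees hits the plane z = 1.
hit = intersect_plane(np.array([0.0, 0.0, 0.0]),
                      np.array([1.0, 0.0, 1.0]),
                      np.array([0.0, 0.0, 1.0]),
                      np.array([0.0, 0.0, 1.0]))
print(hit)
```

An energy-trace step would then call such an intersection routine as a subroutine for many sampled rays and tally the hits on the target area to estimate relative monochromatic flux density, mirroring how the energy-trace program uses the ray-trace program.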

  15. Supportive supervision and constructive relationships with healthcare workers support CHW performance: Use of a qualitative framework to evaluate CHW programming in Uganda.

    Science.gov (United States)

    Ludwick, Teralynn; Turyakira, Eleanor; Kyomuhangi, Teddy; Manalili, Kimberly; Robinson, Sheila; Brenner, Jennifer L

    2018-02-13

    While evidence supports community health worker (CHW) capacity to improve maternal and newborn health in less-resourced countries, key implementation gaps remain. Tools for assessing CHW performance and evidence on what programmatic components affect performance are lacking. This study developed and tested a qualitative evaluative framework and tool to assess CHW team performance in a district program in rural Uganda. A new assessment framework was developed to collect and analyze qualitative evidence based on CHW perspectives on seven program components associated with effectiveness (selection; training; community embeddedness; peer support; supportive supervision; relationship with other healthcare workers; retention and incentive structures). Focus groups were conducted with four high/medium-performing CHW teams and four low-performing CHW teams selected through random, stratified sampling. Content analysis involved organizing focus group transcripts according to the seven program effectiveness components, and assigning scores to each component per focus group. Four components, 'supportive supervision', 'good relationships with other healthcare workers', 'peer support', and 'retention and incentive structures' received the lowest overall scores. Variances in scores between 'high'/'medium'- and 'low'-performing CHW teams were largest for 'supportive supervision' and 'good relationships with other healthcare workers.' Our analysis suggests that in the Bushenyi intervention context, CHW team performance is highly correlated with the quality of supervision and relationships with other healthcare workers. CHWs identified key performance-related issues of absentee supervisors, referral system challenges, and lack of engagement/respect by health workers. Other less-correlated program components warrant further study and may have been impacted by relatively consistent program implementation within our limited study area. 
Applying process-oriented measurement tools are

  16. A Stochastic Hybrid Systems framework for analysis of Markov reward models

    International Nuclear Information System (INIS)

    Dhople, S.V.; DeVille, L.; Domínguez-García, A.D.

    2014-01-01

    In this paper, we propose a framework to analyze Markov reward models, which are commonly used in system performability analysis. The framework builds on a set of analytical tools developed for a class of stochastic processes referred to as Stochastic Hybrid Systems (SHS). The state space of an SHS comprises: (i) a discrete state that describes the possible configurations/modes that a system can adopt, which includes the nominal (non-faulty) operational mode, but also those operational modes that arise due to component faults, and (ii) a continuous state that describes the reward. Discrete state transitions are stochastic, and governed by transition rates that are (in general) a function of time and the value of the continuous state. The evolution of the continuous state is described by a stochastic differential equation, and reward measures are defined as functions of the continuous state. Additionally, each transition is associated with a reset map that defines the mapping between the pre- and post-transition values of the discrete and continuous states; these mappings enable the definition of impulses and losses in the reward. The proposed SHS-based framework unifies the analysis of a variety of previously studied reward models. We illustrate the application of the framework to performability analysis via analytical and numerical examples.
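The SHS construction described above (mode-dependent reward accrual, stochastic mode transitions, and reset maps that apply impulse losses) can be illustrated with a minimal two-mode Monte Carlo sketch. The mode names, rates, and impulse loss below are illustrative assumptions, not values from the paper:

```python
import random

def simulate_reward(t_end, lam=0.5, mu=2.0, accrual=None, impulse=0.3, seed=1):
    """Sample path of a two-mode Markov reward model via SHS-style simulation.

    Mode 0 = nominal, mode 1 = faulty.  The continuous state (the reward)
    grows at a mode-dependent rate; the failure transition applies a reset
    map that subtracts an impulse loss from the reward.
    """
    accrual = accrual or {0: 1.0, 1: 0.2}        # reward rate per mode (assumed)
    rng = random.Random(seed)
    t, mode, reward = 0.0, 0, 0.0
    while t < t_end:
        rate = lam if mode == 0 else mu          # exponential holding-time rate
        dwell = min(rng.expovariate(rate), t_end - t)
        reward += accrual[mode] * dwell          # continuous reward accrual
        t += dwell
        if t < t_end:                            # discrete transition: apply reset map
            if mode == 0:
                reward -= impulse                # impulse loss on component failure
                mode = 1
            else:
                mode = 0                         # repair restores the nominal mode
    return reward

# Expected accumulated reward, estimated over many sample paths
est = sum(simulate_reward(10.0, seed=s) for s in range(2000)) / 2000
```

Averaging many sample paths approximates the reward measure that the paper derives analytically from the SHS moment equations.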

  17. Critical analysis of e-health readiness assessment frameworks: suitability for application in developing countries.

    Science.gov (United States)

    Mauco, Kabelo Leonard; Scott, Richard E; Mars, Maurice

    2018-02-01

    Introduction e-Health is an innovative way to make health services more effective and efficient, and its application is increasing worldwide. e-Health represents a substantial ICT investment, and its failure usually results in substantial losses of time, money (including opportunity costs) and effort. It is therefore important to assess e-health readiness prior to implementation. Several frameworks for e-health readiness assessment have been published, developed under various circumstances and in various geographical regions of the world. However, their utility for the developing world is unknown. Methods A literature review and analysis of published e-health readiness assessment frameworks or models was performed to determine whether any are appropriate for broad assessment of e-health readiness in the developing world. A total of 13 papers described e-health readiness in different settings. Results and Discussion Eight types of e-health readiness were identified, and no paper directly addressed all of them. The frameworks were based upon varying assumptions and perspectives, with no unifying theory underpinning them. Few assessed government and societal readiness, and none cultural readiness; all are important in the developing world. While the shortcomings of existing frameworks have been highlighted, most contain aspects that are relevant and can be drawn on when developing a framework and assessment tools for the developing world. What emerged is the need to develop different assessment tools for the various stakeholder sectors. This is an area that needs further research before attempting to develop a more generic framework for the developing world.

  18. PageRank, HITS and a unified framework for link analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Chris; He, Xiaofeng; Husbands, Parry; Zha, Hongyuan; Simon, Horst

    2001-10-01

    Two popular webpage ranking algorithms are HITS and PageRank. HITS emphasizes mutual reinforcement between authority and hub webpages, while PageRank emphasizes hyperlink weight normalization and web surfing based on random walk models. We systematically generalize/combine these concepts into a unified framework. The ranking framework contains a large algorithm space; HITS and PageRank are two extreme ends in this space. We study several normalized ranking algorithms which are intermediate between HITS and PageRank, and obtain closed-form solutions. We show that, to first order approximation, all ranking algorithms in this framework, including PageRank and HITS, lead to the same ranking, which is highly correlated with ranking by indegree. These results support the notion that in web resource ranking, indegree and outdegree are of fundamental importance. Rankings of webgraphs of different sizes and queries are presented to illustrate our analysis.
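The indegree correlation noted above is easy to reproduce on a toy webgraph. A minimal sketch of both algorithms follows (the graph, damping factor, and iteration count are illustrative choices, not from the paper):

```python
def pagerank(adj, d=0.85, iters=100):
    """Power iteration for PageRank on an adjacency dict {node: [out-links]}."""
    nodes = list(adj)
    n = len(nodes)
    pr = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1 - d) / n for u in nodes}
        for u in nodes:
            out = adj[u]
            if out:
                share = d * pr[u] / len(out)     # weight normalized by outdegree
                for v in out:
                    new[v] += share
            else:                                # dangling node: spread uniformly
                for v in nodes:
                    new[v] += d * pr[u] / n
        pr = new
    return pr

def hits(adj, iters=100):
    """Mutual reinforcement: authority = sum of in-linking hubs, hub = sum of out-linked authorities."""
    nodes = list(adj)
    auth = {u: 1.0 for u in nodes}
    hub = {u: 1.0 for u in nodes}
    for _ in range(iters):
        auth = {u: sum(hub[v] for v in nodes if u in adj[v]) for u in nodes}
        norm = sum(a * a for a in auth.values()) ** 0.5 or 1.0
        auth = {u: a / norm for u, a in auth.items()}
        hub = {u: sum(auth[v] for v in adj[u]) for u in nodes}
        norm = sum(h * h for h in hub.values()) ** 0.5 or 1.0
        hub = {u: h / norm for u, h in hub.items()}
    return auth, hub

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
pr = pagerank(web)
auth, hub = hits(web)
```

On this graph, node "c" has the highest indegree, and both PageRank and HITS authority scores rank it first, consistent with the paper's first-order observation.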

  19. Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    7.1.7 Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials. Khara D. Grieger1, Igor Linkov2, Steffen Foss Hansen1, Anders Baun1. 1Technical University of Denmark, Kgs. Lyngby, Denmark; 2Environmental Laboratory, U.S. Army Corps of Engineers, Brookline, USA. Email: kdg@env.dtu.dk Scientists, organizations, governments, and policy-makers are currently involved in reviewing, adapting, and formulating risk assessment frameworks and strategies to understand and assess the potential environmental risks of engineered nanomaterials (NM). It is becoming... and approaches which have been developed or proposed by large organizations or regulatory bodies for NM. These frameworks and approaches were evaluated and assessed based on a select number of criteria which have been previously proposed as important parameters for inclusion in successful risk assessment...

  20. Using the framework method for the analysis of qualitative data in multi-disciplinary health research.

    Science.gov (United States)

    Gale, Nicola K; Heath, Gemma; Cameron, Elaine; Rashid, Sabina; Redwood, Sabi

    2013-09-18

    The Framework Method is becoming an increasingly popular approach to the management and analysis of qualitative data in health research. However, there is confusion about its potential application and limitations. The article discusses when it is appropriate to adopt the Framework Method and explains the procedure for using it in multi-disciplinary health research teams, or those that involve clinicians, patients and lay people. The stages of the method are illustrated using examples from a published study. Used effectively, with the leadership of an experienced qualitative researcher, the Framework Method is a systematic and flexible approach to analysing qualitative data and is appropriate for use in research teams even where not all members have previous experience of conducting qualitative research.

  1. A unified framework for risk and vulnerability analysis covering both safety and security

    International Nuclear Information System (INIS)

    Aven, Terje

    2007-01-01

    Recently, we have seen several attempts to establish adequate risk and vulnerability analyses tools and related management frameworks dealing not only with accidental events but also security problems. These attempts have been based on different analysis approaches and using alternative building blocks. In this paper, we discuss some of these and show how a unified framework for such analyses and management tasks can be developed. The framework is based on the use of probability as a measure of uncertainty, as seen through the eyes of the assessor, and define risk as the combination of possible consequences and related uncertainties. Risk and vulnerability characterizations are introduced incorporating ideas both from vulnerability analyses literature as well as from the risk classification scheme introduced by Renn and Klinke

  2. Vital analysis: field validation of a framework for annotating biological signals of first responders in action.

    Science.gov (United States)

    Gomes, P; Lopes, B; Coimbra, M

    2012-01-01

    First responders are professionals who are exposed to extreme stress and fatigue during extended periods of time. It is therefore necessary to research and develop technological solutions based on wearable sensors that can continuously monitor the health of these professionals in action, namely their stress and fatigue levels. In this paper we present the Vital Analysis smartphone-based framework, integrated into the broader Vital Responder project, which allows the annotation and contextualization of the signals collected during real action. After a contextual study, we implemented and deployed this framework in a five-member firefighter team, from which we collected over 3300 hours of annotations during 174 days, covering 382 different events. Results are analysed and discussed, validating the framework as a useful and usable tool for annotating biological signals of first responders in action.

  3. Operationalizing the 21st Century Learning Skills Framework for the NASA Mission to Mars Program

    Science.gov (United States)

    Smith, Burgess; Research, MSI; Evaluation Team; Interactive Videoconferences Team, MSI

    2013-06-01

    Internal evaluators working with the NASA Mission to Mars program, an out-of-school collaborative videoconferencing program at the Museum of Science and Industry Chicago (MSI), developed an observation protocol to collect evidence about the collaborative learning opportunities offered by the program’s unique technology. Details about the protocol’s development are discussed, along with results of the pilot observations of the program.

  4. Shader programming for computational arts and design: A comparison between creative coding frameworks

    OpenAIRE

    Gomez, Andres Felipe; Colubri, Andres; Charalambos, Jean Pierre

    2016-01-01

    We describe an Application Program Interface (API) that facilitates the use of GLSL shaders in computational design, interactive arts, and data visualization. This API was first introduced in the version 2.0 of Processing, a programming language and environment widely used for teaching and production in the context of media arts and design, and has been recently completed in the 3.0 release. It aims to incorporate low-level shading programming into code-based design, by int...

  5. Innovation and entrepreneurship programs in US medical education: a landscape review and thematic analysis.

    Science.gov (United States)

    Niccum, Blake A; Sarker, Arnab; Wolf, Stephen J; Trowbridge, Matthew J

    2017-01-01

    Training in innovation and entrepreneurship (I&E) in medical education has become increasingly prevalent among medical schools to train students in complex problem solving and solution design. We aim to characterize I&E education in US allopathic medical schools to provide insight into the features and objectives of this growing field. I&E programs were identified in 2016 via structured searches of 158 US allopathic medical school websites. Program characteristics were identified from public program resources and structured phone interviews with program directors. Curricular themes were identified via thematic analysis of program resources, and themes referenced by >50% of programs were analyzed. Thirteen programs were identified. Programs had a median age of four years, and contained a median of 13 students. Programs were led by faculty from diverse professional backgrounds, and all awarded formal recognition to graduates. Nine programs spanned all four years of medical school and ten programs required a capstone project. Thematic analysis revealed seven educational themes (innovation, entrepreneurship, technology, leadership, healthcare systems, business of medicine, and enhanced adaptability) and two teaching method themes (active learning, interdisciplinary teaching) referenced by >50% of programs. The landscape of medical school I&E programs is rapidly expanding to address newfound skills needed by physicians due to ongoing changes in healthcare, but programs remain relatively few and small compared to class size. This landscape analysis is the first review of I&E in medical education and may contribute to development of a formal educational framework or competency model for current or future programs. AAMC: American Association of Medical Colleges; AMA: American Medical Association; I&E: Innovation and entrepreneurship.

  6. Multi-Year Program under Budget Constraints Using Multi-Criteria Analysis

    Directory of Open Access Journals (Sweden)

    Surya Adiguna

    2017-09-01

    Full Text Available Road investment appraisal requires joint consideration of multiple criteria related to engineering, economic, social and environmental impacts. Investment decisions could be based on economic analysis alone; however, some factors, such as environmental, social, and political impacts, are difficult to quantify in monetary terms. Multi-criteria analysis is an alternative tool that accommodates these requirements. This research, based on 102 class D and class E paved road sections in Kenya, optimizes road network investment under budget constraints by applying a multi-criteria analysis (MCA) method and compares it with conventional economic analysis. The MCA is developed from a hierarchy structure that serves as the analytical framework, based on selected criteria and weights assigned from Kenyan road policy. The HDM-4 software is applied as the decision-making tool to obtain the best investment alternatives and road work programs from both the MCA and the economic analysis, so that the resulting work programs can be compared. The MCA results show that 51 road sections need periodic work (overlay or resealing), while 51 others need rehabilitation or reconstruction. The five-year road work program based on economic analysis costs almost 130 billion Kenyan Shillings (KES) to maintain the class D and E paved roads in Kenya, whereas the MCA requires only KES 59.5 billion for the five-year program. This large margin suggests that the MCA yields a more efficient work program than the economic analysis.
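The weighted-sum scoring that MCA methods of this kind build on can be sketched briefly. The criteria, weights, and section scores below are invented for illustration; real weights would come from the policy-derived hierarchy:

```python
# Hypothetical criteria weights (would be derived from the road-policy hierarchy).
WEIGHTS = {"economic": 0.4, "social": 0.25, "environmental": 0.15, "condition": 0.2}

def mca_score(scores):
    """Weighted-sum multi-criteria score; criterion scores pre-normalised to 0-1."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

def rank_sections(sections):
    """Rank candidate road sections by descending MCA score."""
    return sorted(sections, key=lambda s: mca_score(s["scores"]), reverse=True)

# Invented example sections (IDs and scores are illustrative only)
sections = [
    {"id": "D-101", "scores": {"economic": 0.9, "social": 0.4, "environmental": 0.6, "condition": 0.3}},
    {"id": "E-207", "scores": {"economic": 0.5, "social": 0.8, "environmental": 0.7, "condition": 0.9}},
    {"id": "D-318", "scores": {"economic": 0.2, "social": 0.3, "environmental": 0.4, "condition": 0.5}},
]
ranking = [s["id"] for s in rank_sections(sections)]
```

Under a budget constraint, sections would then be funded in ranking order until the budget is exhausted, which is the basic mechanism by which the MCA and economic rankings can diverge.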

  7. A framework for the analysis of cognitive reliability in complex systems: a recovery centred approach

    International Nuclear Information System (INIS)

    Kontogiannis, Tom

    1997-01-01

    Managing complex industrial systems requires reliable performance of cognitive tasks undertaken by operating crews. The infrequent practice of cognitive skills and the reliance on operator performance for novel situations raised cognitive reliability into an urgent and essential aspect in system design and risk analysis. The aim of this article is to contribute to the development of methods for the analysis of cognitive tasks in complex man-machine interactions. A practical framework is proposed for analysing cognitive errors and enhancing error recovery through interface design. Cognitive errors are viewed as failures in problem solving which are difficult to recover under the task constrains imposed by complex systems. In this sense, the interaction between context and cognition, on the one hand, and the process of error recovery, on the other hand, become the focal points of the proposed framework which is illustrated in an analysis of a simulated emergency

  8. Using an intervention mapping framework to develop an online mental health continuing education program for pharmacy staff.

    Science.gov (United States)

    Wheeler, Amanda; Fowler, Jane; Hattingh, Laetitia

    2013-01-01

    Current mental health policy in Australia recognizes that ongoing mental health workforce development is crucial to mental health care reform. Community pharmacy staff are well placed to assist people with mental illness living in the community; however, staff require the knowledge and skills to do this competently and effectively. This article presents the systematic planning and development process and content of an education and training program for community pharmacy staff, using a program planning approach called intervention mapping. The intervention mapping framework was used to guide development of an online continuing education program. Interviews with mental health consumers and carers (n = 285) and key stakeholders (n = 15), and a survey of pharmacy staff (n = 504) informed the needs assessment. Program objectives were identified specifying required attitudes, knowledge, skills, and confidence. These objectives were aligned with an education technique and delivery strategy. This was followed by development of an education program and comprehensive evaluation plan. The program was piloted face to face with 24 participants and then translated into an online program comprising eight 30-minute modules for pharmacists, 4 of which were also used for support staff. The evaluation plan provided for online participants (n ≅ 500) to be randomized into intervention (immediate access) or control groups (delayed training access). It included pre- and posttraining questionnaires and a reflective learning questionnaire for pharmacy staff and telephone interviews post pharmacy visit for consumers and carers. An online education program was developed to address mental health knowledge, attitudes, confidence, and skills required by pharmacy staff to work effectively with mental health consumers and carers. Intervention mapping provides a systematic and rigorous approach that can be used to develop a quality continuing education program for the health workforce

  9. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    International Nuclear Information System (INIS)

    Agostini, M; Pandola, L; Zavarise, P; Volynets, O

    2011-01-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.
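The multi-level, modular flow described above (raw traces passed through user-configurable modules down to condensed analysis parameters) can be illustrated with a small sketch. GELATIO itself is implemented in C++ on ROOT; the Python module names and the baseline/amplitude logic here are illustrative assumptions, not its actual API:

```python
class Module:
    """One step in a modular analysis chain: consumes and enriches an event dict."""
    def process(self, event):
        raise NotImplementedError

class Baseline(Module):
    def process(self, event):
        trace = event["trace"]
        n = len(trace) // 4                      # estimate baseline from leading samples
        event["baseline"] = sum(trace[:n]) / n
        return event

class Amplitude(Module):
    def process(self, event):
        event["amplitude"] = max(event["trace"]) - event["baseline"]
        return event

class Pipeline:
    """Runs raw traces through configurable modules, yielding condensed parameters."""
    def __init__(self, modules):
        self.modules = modules
    def run(self, events):
        for event in events:
            for m in self.modules:
                event = m.process(event)
            # drop the raw trace, keeping only the condensed analysis parameters
            yield {k: v for k, v in event.items() if k != "trace"}

pipeline = Pipeline([Baseline(), Amplitude()])
results = list(pipeline.run([{"trace": [0.1, 0.1, 0.1, 0.1, 5.0, 3.0, 1.0, 0.2]}]))
```

Because each module only reads and writes keys of the event record, modules can be reordered or swapped per channel, which is the flexibility the framework description emphasizes.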

  10. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    Science.gov (United States)

    Agostini, M.; Pandola, L.; Zavarise, P.; Volynets, O.

    2011-08-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.

  11. Constraint Solver Techniques for Implementing Precise and Scalable Static Program Analysis

    DEFF Research Database (Denmark)

    Zhang, Ye

    ...developers to build reliable software systems more quickly and with fewer bugs or security defects. While designing and implementing a program analysis remains hard work, making it both scalable and precise is even more challenging. In this dissertation, we show that with a general inclusion constraint solver using unification we could make a program analysis easier to design and implement, much more scalable, and still as precise as expected. We present an inclusion constraint language with explicit equality constructs for specifying program analysis problems, and a parameterized framework... Using data flow analyses for the C language, we demonstrate that a large number of equivalences can be detected by off-line analyses, which can then be used by a constraint solver to significantly improve the scalability of an analysis without sacrificing any precision.

  12. A linear programming computational framework integrates phosphor-proteomics and prior knowledge to predict drug efficacy.

    Science.gov (United States)

    Ji, Zhiwei; Wang, Bing; Yan, Ke; Dong, Ligang; Meng, Guanmin; Shi, Lei

    2017-12-21

    In recent years, the integration of 'omics' technologies, high performance computation, and mathematical modeling of biological processes marks that systems biology has started to fundamentally impact the way we approach drug discovery. The LINCS public data warehouse provides detailed information about cell responses to various genetic and environmental stressors. It can be greatly helpful in developing new drugs and therapeutics, and in addressing the lack of effective drugs, drug resistance, and relapse in cancer therapies. In this study, we developed a Ternary status based Integer Linear Programming (TILP) method to infer cell-specific signaling pathway networks and predict compounds' treatment efficacy. The novelty of our study is that phosphor-proteomic data and prior knowledge are combined for modeling and optimizing the signaling network. To test the power of our approach, a generic pathway network was constructed for the human breast cancer cell line MCF7, and the TILP model was used to infer MCF7-specific pathways with a set of phosphor-proteomic data collected from ten representative small molecule chemical compounds (most of them studied in breast cancer treatment). Cross-validation indicated that the MCF7-specific pathway network inferred by TILP was reliable in predicting a compound's efficacy. Finally, we applied TILP to re-optimize the inferred cell-specific pathways and predict the outcomes of five small compounds (carmustine, doxorubicin, GW-8510, daunorubicin, and verapamil), which are rarely used in the clinic for breast cancer. In the simulation, the proposed approach enables us to identify a compound's treatment efficacy qualitatively and quantitatively, and the cross-validation analysis indicated good accuracy in predicting the effects of the five compounds. In summary, the TILP model is useful for discovering new drugs for clinical use, and for elucidating the potential mechanisms of a compound on its targets.
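TILP optimizes ternary (down/unchanged/up) node states over a signed pathway network so that they best agree with edge constraints and measurements. A minimal stand-in that maximizes the same kind of consistency objective by exhaustive search is sketched below; the toy pathway, edge signs, and observations are invented assumptions, not the MCF7 network or the actual ILP formulation:

```python
from itertools import product

# Toy signed pathway: (source, target, sign); +1 activates, -1 inhibits (assumed).
EDGES = [("EGFR", "ERK", +1), ("ERK", "MYC", +1), ("DRUG", "EGFR", -1)]
NODES = ["EGFR", "ERK", "MYC", "DRUG"]

def consistency(states, observed):
    """Count satisfied edge constraints plus agreement with observed ternary data."""
    score = 0
    for src, dst, sign in EDGES:
        if states[dst] == sign * states[src]:    # edge explained by the assignment
            score += 1
    for node, value in observed.items():
        if states[node] == value:                # state matches the measurement
            score += 1
    return score

def best_assignment(observed, fixed):
    """Exhaustive stand-in for the ILP: maximise consistency over ternary states."""
    free = [n for n in NODES if n not in fixed]
    best, best_score = None, -1
    for combo in product((-1, 0, 1), repeat=len(free)):
        states = dict(zip(free, combo), **fixed)
        s = consistency(states, observed)
        if s > best_score:
            best, best_score = dict(states), s
    return best, best_score

# Drug applied (state +1); phospho-ERK observed down-regulated (-1).
states, score = best_assignment({"ERK": -1}, {"DRUG": 1})
```

A real TILP instance replaces the exhaustive loop with integer linear constraints solved at scale, but the objective, maximizing agreement between ternary states, signed edges, and phospho measurements, is the same in spirit.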

  13. Establishing a framework for a physician assistant/bioethics dual degree program.

    Science.gov (United States)

    Carr, Mark F; Bergman, Brett A

    2014-01-01

    Numerous medical schools currently offer a master of arts (MA) in bioethics dual degree for physicians. A degree in bioethics enhances the care physicians provide to patients and prepares physicians to serve on ethics committees and consult services. Additionally, they may work on institutional and public policy issues related to ethics. Several physician assistant (PA) programs currently offer a master of public health (MPH) dual degree for PAs. A degree in public health prepares PAs for leadership roles in meeting community health needs. With the success of PA/MPH dual degree programs, we argue here that a PA/bioethics dual degree would be another opportunity to advance the PA profession and consider how such a program might be implemented. The article includes the individual perspectives of the authors, one of whom completed a graduate-level certificate in bioethics concurrently with his 2-year PA program, while the other served as a bioethics program director.

  14. Intrasystem Electromagnetic Compatibility Analysis Program. Volume III. Computer Program Documentation

    Science.gov (United States)

    1974-12-01


  15. Does the knowledge-to-action (KTA) framework facilitate physical demands analysis development for firefighter injury management and return-to-work planning?

    Science.gov (United States)

    Sinden, Kathryn; MacDermid, Joy C

    2014-03-01

    Employers are tasked with developing injury management and return-to-work (RTW) programs in response to occupational health and safety policies. Physical demands analyses (PDAs) are the cornerstone of injury management and RTW development. Synthesizing and contextualizing policy knowledge for use in occupational program development, including PDAs, is challenging due to multiple stakeholder involvement. Few studies have used a knowledge translation theoretical framework to facilitate policy-based interventions in occupational contexts. The primary aim of this case study was to identify how constructs of the knowledge-to-action (KTA) framework were reflected in employer stakeholder-researcher collaborations during development of a firefighter PDA. Four stakeholder meetings were conducted with employee participants who had experience using PDAs in their occupational role. Directed content analysis informed analyses of meeting minutes, stakeholder views and personal reflections recorded throughout the case. Existing knowledge sources including local data, stakeholder experiences, policies and priorities were synthesized and tailored to develop a PDA in response to the barriers and facilitators identified by the firefighters. The flexibility of the KTA framework and the synthesis of multiple knowledge sources were identified as strengths. The KTA Action cycle was useful in directing the overall process but insufficient for directing the specific aspects of PDA development. Integration of specific PDA guidelines into the process provided explicit direction on best practices in tailoring the PDA and knowledge synthesis. Although the themes of the KTA framework were confirmed in our analysis, modification of the order of the KTA components was required. Despite a complex context with divergent perspectives, successful implementation of a draft PDA was achieved. The KTA framework facilitated knowledge synthesis and PDA development, but specific standards and modifications to the KTA framework were required.

  16. A finite element framework for multiscale/multiphysics analysis of structures with complex microstructures

    Science.gov (United States)

    Varghese, Julian

    This research work has contributed in various ways to help develop a better understanding of textile composites and materials with complex microstructures in general. An instrumental part of this work was the development of an object-oriented framework that made it convenient to perform multiscale/multiphysics analyses of advanced materials with complex microstructures such as textile composites. In addition to the studies conducted in this work, this framework lays the groundwork for continued research of these materials. This framework enabled a detailed multiscale stress analysis of a woven DCB specimen that revealed the effect of the complex microstructure on the stress and strain energy release rate distribution along the crack front. In addition to implementing an oxidation model, the framework was also used to implement strategies that expedited the simulation of oxidation in textile composites so that it would take only a few hours. The simulation showed that the tow architecture played a significant role in the oxidation behavior in textile composites. Finally, a coupled diffusion/oxidation and damage progression analysis was implemented that was used to study the mechanical behavior of textile composites under mechanical loading as well as oxidation. A parametric study was performed to determine the effect of material properties and the number of plies in the laminate on its mechanical behavior. The analyses indicated a significant effect of the tow architecture and other parameters on the damage progression in the laminates.

  17. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive querying, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.
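Unsupervised, clustering-based outlier detection of the kind described can be approximated with a simple density test in the spirit of DBSCAN's core-point rule. The radius and neighbor threshold below are illustrative choices; the record does not specify the actual algorithm's parameters:

```python
def outliers(points, eps=1.5, min_neighbors=2):
    """Flag points with fewer than min_neighbors others within radius eps.

    This is the core-point test behind density-based clustering (DBSCAN):
    isolated observations in feature space are treated as outliers, with
    no prior knowledge of the expected normal pattern required.
    """
    flagged = []
    for i, p in enumerate(points):
        count = sum(
            1 for j, q in enumerate(points)
            if i != j and sum((a - b) ** 2 for a, b in zip(p, q)) <= eps ** 2
        )
        if count < min_neighbors:
            flagged.append(i)
    return flagged

# Dense cluster of normal observations plus two isolated anomalies
pts = [(0, 0), (0.5, 0.2), (0.3, 0.8), (1.0, 0.5), (10, 10), (-8, 7)]
idx = outliers(pts)
```

Grouping nearby flagged points in space and time is then what turns individual outliers into the "anomalous events" that the framework ranks by interestingness.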

  18. Can Programming Frameworks Bring Smartphones into the Mainstream of Psychological Science?

    OpenAIRE

    Piwek, Lukasz; Ellis, David A.

    2016-01-01

    Smartphones continue to provide huge potential for psychological science and the advent of novel research frameworks brings new opportunities for researchers who have previously struggled to develop smartphone applications. However, despite this renewed promise, smartphones have failed to become a standard item within psychological research. Here we consider the key issues that continue to limit smartphone adoption within psychological science and how these barriers might be diminishing in li...

  19. Mississippi Curriculum Framework for Computer Information Systems Technology. Computer Information Systems Technology (Program CIP: 52.1201--Management Information Systems & Business Data). Computer Programming (Program CIP: 52.1201). Network Support (Program CIP: 52.1290--Computer Network Support Technology). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…

  20. RAMPAC: a Program for Analysis of Complicated Raman Spectra

    NARCIS (Netherlands)

    de Mul, F.F.M.; Greve, Jan

    1993-01-01

    A computer program for the analysis of complicated (e.g. multi-line) Raman spectra is described. The program includes automatic peak search, various procedures for background determination, peak fitting, spectrum deconvolution, and extensive spectrum-handling procedures.
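An automatic peak search of the kind the abstract mentions can be sketched as below. RAMPAC's actual procedures (background determination, fitting, deconvolution) are more elaborate; this is only an illustrative stand-in that uses the median intensity as a flat background estimate and a signal-to-noise threshold.

```python
# Illustrative automatic peak search (not RAMPAC's algorithm):
# estimate a flat background as the median intensity, then report
# local maxima that rise above it by a signal-to-noise margin.

def find_peaks(spectrum, snr=3.0):
    s = sorted(spectrum)
    background = s[len(s) // 2]            # median as a crude background
    noise = sum(abs(v - background) for v in spectrum) / len(spectrum)
    peaks = []
    for i in range(1, len(spectrum) - 1):
        v = spectrum[i]
        if v > spectrum[i - 1] and v >= spectrum[i + 1] \
           and v - background > snr * noise:
            peaks.append(i)
    return peaks

spectrum = [1, 1, 2, 9, 2, 1, 1, 8, 1, 1]
peaks = find_peaks(spectrum)
```

For the toy spectrum above, the two clear lines at indices 3 and 7 are reported; a fitting stage would then refine each peak's position, width, and area.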

  1. Mississippi Curriculum Framework for Diesel Equipment Repair & Service (Program CIP: 47.0605--Diesel Engine Mechanic & Repairer). Secondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for diesel engine mechanics I and II. Presented first are a program…

  2. Mississippi Curriculum Framework for Marketing and Fashion Merchandising (Program CIP: 08.0705--General Retailing Operations). Secondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for marketing I-II and fashion merchandising. Presented first are a program…

  3. Recent advances in metal-organic frameworks and covalent organic frameworks for sample preparation and chromatographic analysis.

    Science.gov (United States)

    Wang, Xuan; Ye, Nengsheng

    2017-12-01

    In the field of analytical chemistry, sample preparation and chromatographic separation are two core procedures. The means by which to improve the sensitivity, selectivity and detection limit of a method have become a topic of great interest. Recently, porous organic frameworks, such as metal-organic frameworks (MOFs) and covalent organic frameworks (COFs), have been widely used in this research area because of their special features, and different methods have been developed. This review summarizes the applications of MOFs and COFs in sample preparation and chromatographic stationary phases. The MOF- or COF-based solid-phase extraction (SPE), solid-phase microextraction (SPME), gas chromatography (GC), high-performance liquid chromatography (HPLC) and capillary electrochromatography (CEC) methods are described. The excellent properties of MOFs and COFs have resulted in intense interest in exploring their performance and mechanisms for sample preparation and chromatographic separation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Programming Social Applications Building Viral Experiences with OpenSocial, OAuth, OpenID, and Distributed Web Frameworks

    CERN Document Server

    LeBlanc, Jonathan

    2011-01-01

    Social networking has made one thing clear: websites and applications need to provide users with experiences tailored to their preferences. This in-depth guide shows you how to build rich social frameworks, using open source technologies and specifications. You'll learn how to create third-party applications for existing sites, build engaging social graphs, and develop products to host your own socialized experience. Programming Social Apps focuses on the OpenSocial platform, along with Apache Shindig, OAuth, OpenID, and other tools, demonstrating how they work together to help you solve pra

  5. Development of an Artificial Intelligence Programming Course and Unity3d Based Framework to Motivate Learning in Artistic Minded Students

    DEFF Research Database (Denmark)

    Reng, Lars

    2012-01-01

    between technical and artistic minded students is, however, increased once the students reach the sixth semester. The complex algorithms of the artificial intelligence course seemed to demotivate the artistic minded students even before the course began. This paper will present the extensive changes made...... to the sixth semester artificial intelligence programming course, in order to provide highly motivating direct visual feedback, and thereby remove the steep initial learning curve for artistic minded students. The framework was developed in close dialogue with both the game industry and experienced master...

  6. UNC-Utah NA-MIC framework for DTI fiber tract analysis.

    Science.gov (United States)

    Verde, Audrey R; Budin, Francois; Berger, Jean-Baptiste; Gupta, Aditya; Farzinfar, Mahshid; Kaiser, Adrien; Ahn, Mihye; Johnson, Hans; Matsui, Joy; Hazlett, Heather C; Sharma, Anuja; Goodlett, Casey; Shi, Yundi; Gouttard, Sylvain; Vachet, Clement; Piven, Joseph; Zhu, Hongtu; Gerig, Guido; Styner, Martin

    2014-01-01

    Diffusion tensor imaging has become an important modality in the field of neuroimaging to capture changes in micro-organization and to assess white matter integrity or development. While a number of tractography toolsets exist, they usually lack tools for preprocessing or for analyzing diffusion properties along the fiber tracts. Currently, the field is in critical need of a coherent end-to-end toolset for performing an along-fiber tract analysis, accessible to non-technical neuroimaging researchers. The UNC-Utah NA-MIC DTI framework represents a coherent, open source, end-to-end toolset for atlas fiber tract based DTI analysis encompassing DICOM data conversion, quality control, atlas building, fiber tractography, fiber parameterization, and statistical analysis of diffusion properties. Most steps utilize graphical user interfaces (GUIs) to simplify interaction and provide an extensive DTI analysis framework for non-technical researchers/investigators. We illustrate the use of our framework on a small-sample, cross-sectional neuroimaging study of eight healthy 1-year-old children from the Infant Brain Imaging Study (IBIS) Network. In this limited test study, we illustrate the power of our method by quantifying the diffusion properties at 1 year of age on the genu and splenium fiber tracts.
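Structurally, the end-to-end toolset described here is an ordered pipeline (DICOM conversion through statistics). A minimal sketch, with hypothetical placeholder stage implementations that merely tag the data as it flows through:

```python
# Generic ordered-pipeline sketch of the abstract's workflow.
# Stage names follow the abstract; the lambdas are placeholders,
# not the framework's actual tools.

def run_pipeline(data, stages):
    log = []
    for name, fn in stages:
        data = fn(data)        # each stage consumes the previous output
        log.append(name)
    return data, log

stages = [
    ("dicom_conversion", lambda d: d + ["nrrd"]),
    ("quality_control",  lambda d: d + ["qc"]),
    ("atlas_building",   lambda d: d + ["atlas"]),
    ("tractography",     lambda d: d + ["tracts"]),
    ("parameterization", lambda d: d + ["profiles"]),
    ("statistics",       lambda d: d + ["stats"]),
]

result, log = run_pipeline([], stages)
```

The point of such a chain is that each stage's output is the next stage's input, so quality control failures surface before the expensive atlas-building and tractography steps.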

  7. CHEMICAL AND BIOLOGICAL DEFENSE: Program Planning and Evaluation Should Follow Results Act Framework

    National Research Council Canada - National Science Library

    1999-01-01

    As you requested, we examined the extent to which DOD has applied the Results Act's outcome-oriented principles to the CB Defense Program, focusing in particular on research, development, testing, and evaluation (RDT&E...

  8. Work program analysis - defining the capability/risk plan

    International Nuclear Information System (INIS)

    Hrinivich, W.A.

    2004-01-01

    Bruce Power has developed and implemented an analysis methodology (Work Program Analysis) to assess and address corporate business risk associated with work group capability. Work Program Analysis is proving to be an excellent tool for identifying and supporting key business decisions facing the line and senior management at Bruce Power. The following describes the methodology, its application and the results achieved. (author)

  9. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.

    2010-12-01

    In this paper, we address the problem of reducing the occurrence of Human Errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21% of interruptions were caused by human error. We focus on Change Management, the process with the largest risk of human error, and identify the main instances of human errors as the 4 Wrongs: request, time, configuration item, and command. Analysis of change records revealed that human-error prevention by partial automation is highly relevant. We propose the HEP Framework, a framework for execution of IT Service Delivery operations that reduces human error by addressing the 4 Wrongs using content integration, contextualization of operation patterns, partial automation of command execution, and controlled access to resources.
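The "4 Wrongs" classification can be made concrete with a toy check of an executed change against its planned record; the field names and record shape here are hypothetical, not the paper's data model.

```python
# Toy illustration of the 4 Wrongs (wrong request, time, configuration
# item, command): compare an executed change against the planned one
# and report which fields diverged. Field names are hypothetical.

def four_wrongs(planned, executed):
    wrongs = []
    for field in ("request", "time", "config_item", "command"):
        if planned[field] != executed[field]:
            wrongs.append(field)
    return wrongs

planned = {"request": "CHG-1", "time": "02:00",
           "config_item": "db01", "command": "restart db01"}
executed = {"request": "CHG-1", "time": "02:00",
            "config_item": "db02", "command": "restart db02"}

wrongs = four_wrongs(planned, executed)
```

In this example the operator acted on the wrong configuration item with a correspondingly wrong command, the kind of divergence the paper's partial automation of command execution is meant to prevent.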

  10. HTC Experimental Program: Validation and Calculational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fernex, F.; Ivanova, T.; Bernard, F.; Letang, E. [Inst Radioprotect and Surete Nucl, F-92262 Fontenay Aux Roses (France); Fouillaud, P. [CEA Valduc, Serv Rech Neutron and Critcite, 21 - Is-sur-Tille (France); Thro, J. F. [AREVA NC, F-78000 Versailles (France)

    2009-05-15

    In the 1980s, a series of Haut Taux de Combustion (HTC) critical experiments with fuel pins in a water-moderated lattice was conducted at the Apparatus B experimental facility in Valduc (Commissariat à l'Énergie Atomique, France) with the support of the Institut de Radioprotection et de Sûreté Nucléaire and AREVA NC. Four series of experiments were designed to assess the benefit associated with actinide-only burnup credit in the criticality safety evaluation for fuel handling, pool storage, and spent-fuel cask conditions. The HTC rods, specifically fabricated for the experiments, simulated typical pressurized water reactor uranium oxide spent fuel that had an initial enrichment of 4.5 wt% ²³⁵U and was burned to 37.5 GWd/tonne U. The configurations have been modeled with the CRISTAL criticality package and the SCALE 5.1 code system. Sensitivity/uncertainty analysis has been employed to evaluate the HTC experiments and to study their applicability for validation of burnup credit calculations. This paper presents the experimental program, the principal results of the experiment evaluation, and modeling. The HTC data applicability to burnup credit validation is demonstrated with an example of spent-fuel storage models. (authors)

  11. Dissimilar weld failure analysis and development program

    International Nuclear Information System (INIS)

    Holko, K.H.; Li, C.C.

    1982-01-01

    The problem of dissimilar weld cracking and failure is examined. This problem occurs in boiler superheater and reheater sections as well as main steam piping. Typically, a dissimilar weld joins low-alloy steel tubing such as Fe-2-1/4 Cr-1Mo to stainless steel tubing such as 321H and 304H. Cracking and failure occur in the low-alloy steel heat-affected zone very close to the weld interface. The 309 stainless steel filler previously used has been replaced with nickel-base fillers such as Inconel 132, Inconel 182, and Incoweld A. This change has extended the time to cracking and failure, but has not solved the problem. To illustrate and define the problem, the metallography of damaged and failed dissimilar welds is described. Results of mechanical tests of dissimilar welds removed from service are presented, and factors believed to be influential in causing damage and failure are discussed. In addition, the importance of dissimilar weldment service history is demonstrated, and the Dissimilar Weld Failure Analysis and Development Program is described. 15 figures

  12. A generic framework for the description and analysis of energy security in an energy system

    International Nuclear Information System (INIS)

    Hughes, Larry

    2012-01-01

    While many energy security indicators and models have been developed for specific jurisdictions or types of energy, few can be considered sufficiently generic to be applicable to any energy system. This paper presents a framework that attempts to meet this objective by combining the International Energy Agency's definition of energy security with structured systems analysis techniques to create three energy security indicators and a process-flow energy systems model. The framework is applicable to those energy systems which can be described in terms of processes converting or transporting flows of energy to meet the energy–demand flows from downstream processes. Each process affects the environment and is subject to jurisdictional policies. The framework can be employed to capture the evolution of energy security in an energy system by analyzing the results of indicator-specific metrics applied to the energy, demand, and environment flows associated with the system's constituent processes. Energy security policies are treated as flows to processes and classified into one of three actions affecting the process's energy demand or the process or its energy input, or both; the outcome is determined by monitoring changes to the indicators. The paper includes a detailed example of an application of the framework. - Highlights: ► The IEA's definition of energy security is parsed into three energy security indicators: availability, affordability, and acceptability. ► Data flow diagrams and other systems analysis tools can represent an energy system and its processes, flows, and chains. ► Indicator-specific metrics applied to a process's flow determine the state of energy security in an energy system, an energy chain, or process. ► Energy policy is considered as a flow and policy outcomes are obtained by measuring flows with indicator-specific metrics. ► The framework is applicable to most jurisdictions and energy types.
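A minimal sketch of the indicator idea: each process in the energy system is scored on availability, affordability, and acceptability. The metric formulas below, and the rule that a process is only as secure as its weakest indicator, are illustrative choices, not the paper's definitions.

```python
# Illustrative metrics for the three indicators parsed from the IEA
# definition of energy security. Formulas and the min-aggregation are
# placeholders, not the framework's actual metrics.

def availability(supply, demand):
    return min(1.0, supply / demand)       # fraction of demand met

def affordability(price, budget_cap):
    return 1.0 if price <= budget_cap else budget_cap / price

def acceptability(emissions, emissions_cap):
    return 1.0 if emissions <= emissions_cap else emissions_cap / emissions

def process_security(proc):
    # one possible aggregation: a process is only as secure
    # as its weakest indicator
    return min(availability(proc["supply"], proc["demand"]),
               affordability(proc["price"], proc["budget_cap"]),
               acceptability(proc["emissions"], proc["emissions_cap"]))

refinery = {"supply": 90.0, "demand": 100.0, "price": 50.0,
            "budget_cap": 60.0, "emissions": 120.0, "emissions_cap": 100.0}
score = process_security(refinery)
```

A policy flow in the framework's sense would then act on one of these inputs (e.g., lowering a process's demand or emissions) and its outcome would be observed as a change in the indicator scores.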

  13. Solving nonlinear, High-order partial differential equations using a high-performance isogeometric analysis framework

    KAUST Repository

    Cortes, Adriano Mauricio; Vignal, Philippe; Sarmiento, Adel; Garcí a, Daniel O.; Collier, Nathan; Dalcin, Lisandro; Calo, Victor M.

    2014-01-01

    In this paper we present PetIGA, a high-performance implementation of Isogeometric Analysis built on top of PETSc. We show its use in solving nonlinear and time-dependent problems, such as phase-field models, by taking advantage of the high-continuity of the basis functions granted by the isogeometric framework. In this work, we focus on the Cahn-Hilliard equation and the phase-field crystal equation.
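For reference, the Cahn-Hilliard equation named above is ∂c/∂t = Δ(c³ − c − γΔc). The sketch below advances it by one explicit finite-difference step on a 1-D periodic grid (deliberately not isogeometric, and far from PetIGA's high-continuity spline discretization), just to make the model concrete.

```python
# One explicit Euler step of the 1-D Cahn-Hilliard equation
#   dc/dt = Laplacian(c**3 - c - gamma * Laplacian(c))
# on a periodic grid. Illustrative only; PetIGA uses high-continuity
# isogeometric discretizations, not finite differences.

def laplacian(u, dx):
    n = len(u)
    return [(u[(i - 1) % n] - 2 * u[i] + u[(i + 1) % n]) / dx ** 2
            for i in range(n)]

def cahn_hilliard_step(c, dt=1e-4, dx=1.0, gamma=1.0):
    # chemical potential mu = c^3 - c - gamma * Laplacian(c)
    mu = [ci ** 3 - ci - gamma * l for ci, l in zip(c, laplacian(c, dx))]
    # c evolves by the Laplacian of the chemical potential
    return [ci + dt * l for ci, l in zip(c, laplacian(mu, dx))]

c0 = [0.5, -0.5, 0.3, -0.3, 0.1, -0.1]
c1 = cahn_hilliard_step(c0)
```

A quick sanity check for any Cahn-Hilliard discretization is mass conservation: the periodic Laplacian sums to zero, so the total of c is preserved at every step.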

  14. A framework for adaptive e-learning for continuum mechanics and structural analysis

    OpenAIRE

    Mosquera Feijoo, Juan Carlos; Plaza Beltrán, Luis Francisco; González Rodrigo, Beatriz

    2015-01-01

    This paper presents a project for providing the students of Structural Engineering with the flexibility to learn outside classroom schedules. The goal is a framework for adaptive E-learning based on a repository of open educational courseware with a set of basic Structural Engineering concepts and fundamentals. These are paramount for students to expand their technical knowledge and skills in structural analysis and design of tall buildings, arch-type structures as well as bridges. Thus, conc...

  15. Modeling Phase-transitions Using a High-performance, Isogeometric Analysis Framework

    KAUST Repository

    Vignal, Philippe

    2014-06-06

    In this paper, we present a high-performance framework for solving partial differential equations using Isogeometric Analysis, called PetIGA, and show how it can be used to solve phase-field problems. We specifically chose the Cahn-Hilliard equation, and the phase-field crystal equation as test cases. These two models allow us to highlight some of the main advantages that we have access to while using PetIGA for scientific computing.

  16. The Mini-Grid Framework: Application Programming Support for Ad hoc Volunteer Grids

    DEFF Research Database (Denmark)

    Venkataraman, Neela Narayanan

    2013-01-01

    To harvest idle, unused computational resources in networked environments, researchers have proposed different architectures for desktop grid infrastructure. However, most of the existing research focuses on centralized approaches. In this thesis, we present the development and deployment of one......, and the performance of the framework in a real grid environment. The main contributions of this thesis are: i) modeling entities such as resources and applications using their context, ii) the context-based auction strategy for dynamic task distribution, iii) scheduling through application specific quality parameters...

  17. Integrated predictive maintenance program vibration and lube oil analysis: Part I - history and the vibration program

    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, H.

    1996-12-01

    This paper is the first of two papers which describe the Predictive Maintenance Program for rotating machines at the Palo Verde Nuclear Generating Station. The organization has recently been restructured and significant benefits have been realized by the interaction, or "synergy", between the Vibration Program and the Lube Oil Analysis Program. This paper starts with the oldest part of the program - the Vibration Program - and discusses the evolution of the program to its current state. The "Vibration" view of the combined program is then presented.

  18. Integrated predictive maintenance program vibration and lube oil analysis: Part I - history and the vibration program

    International Nuclear Information System (INIS)

    Maxwell, H.

    1996-01-01

    This paper is the first of two papers which describe the Predictive Maintenance Program for rotating machines at the Palo Verde Nuclear Generating Station. The organization has recently been restructured and significant benefits have been realized by the interaction, or "synergy", between the Vibration Program and the Lube Oil Analysis Program. This paper starts with the oldest part of the program - the Vibration Program - and discusses the evolution of the program to its current state. The "Vibration" view of the combined program is then presented.

  19. A framework of analysis for field experiments with alternative materials in road construction.

    Science.gov (United States)

    François, D; Jullien, A

    2009-01-01

    In France, a wide variety of alternative materials is produced or exists in the form of stockpiles built up over time. Such materials are distributed over various regions of the territory depending on local industrial development and urbanisation trends. The use of alternative materials at a national scale implies sharing local knowledge and experience. Building a national database on alternative materials for road construction is useful in gathering and sharing information. An analysis of feedback from onsite experiences (back analysis) is essential to improve knowledge on alternative material use in road construction. Back analysis of field studies has to be conducted in accordance with a single common framework. This could enable drawing comparisons between alternative materials and between road applications. A framework for the identification and classification of data used in back analyses is proposed. Since the road structure is an open system, this framework has been based on a stress-response approach at both the material and structural levels and includes a description of external factors applying during the road service life. The proposal has been shaped from a review of the essential characteristics of road materials and structures, as well as from the state of knowledge specific to alternative material characterisation.

  20. UNC-Utah NA-MIC Framework for DTI Fiber Tract Analysis

    Directory of Open Access Journals (Sweden)

    Audrey Rose Verde

    2014-01-01

    Full Text Available Diffusion tensor imaging has become an important modality in the field of neuroimaging to capture changes in micro-organization and to assess white matter integrity or development. While a number of tractography toolsets exist, they usually lack tools for preprocessing or for analyzing diffusion properties along the fiber tracts. Currently, the field is in critical need of a coherent end-to-end toolset for performing an along-fiber tract analysis, accessible to non-technical neuroimaging researchers. The UNC-Utah NA-MIC DTI framework represents a coherent, open source, end-to-end toolset for atlas fiber tract based DTI analysis encompassing DICOM data conversion, quality control, atlas building, fiber tractography, fiber parameterization, and statistical analysis of diffusion properties. Most steps utilize graphical user interfaces (GUIs) to simplify interaction and provide an extensive DTI analysis framework for non-technical researchers/investigators. We illustrate the use of our framework on a small-sample, cross-sectional neuroimaging study of 8 healthy 1-year-old children from the Infant Brain Imaging Study (IBIS) Network. In this limited test study, we illustrate the power of our method by quantifying the diffusion properties at 1 year of age on the genu and splenium fiber tracts.

  1. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. It revealed the relationship between water quality in the mainstream and its tributaries, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
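The granulation step can be sketched as follows: each fixed-length window is summarized by a (mean, standard deviation) pair, loosely standing in for the expectation and entropy of a normal cloud, and a similarity matrix is then built on those descriptors. The Gaussian-kernel similarity is an illustrative choice, not the paper's measure.

```python
import math

# Illustrative granulation of a time series into (mean, std) descriptors
# (a crude stand-in for two-dimensional normal clouds) and a similarity
# matrix over the granules. The kernel choice is not the paper's.

def granulate(series, window):
    grains = []
    for start in range(0, len(series) - window + 1, window):
        w = series[start:start + window]
        ex = sum(w) / len(w)                                   # expectation
        en = math.sqrt(sum((v - ex) ** 2 for v in w) / len(w)) # spread
        grains.append((ex, en))
    return grains

def similarity(g1, g2):
    # Gaussian kernel on the descriptor distance -- illustrative only
    return math.exp(-math.hypot(g1[0] - g2[0], g1[1] - g2[1]))

def similarity_matrix(grains):
    return [[similarity(a, b) for b in grains] for a in grains]

grains = granulate([1, 1, 1, 1, 5, 5, 5, 5], window=4)
matrix = similarity_matrix(grains)
```

On top of such a matrix, similarity search is a row lookup, anomaly detection flags granules with uniformly low similarity to the rest, and pattern discovery groups granules with mutually high similarity.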

  2. A Novel Framework for Interactive Visualization and Analysis of Hyperspectral Image Data

    Directory of Open Access Journals (Sweden)

    Johannes Jordan

    2016-01-01

    Full Text Available Multispectral and hyperspectral images are well established in various fields of application like remote sensing, astronomy, and microscopic spectroscopy. In recent years, the availability of new sensor designs, more powerful processors, and high-capacity storage further opened this imaging modality to a wider array of applications like medical diagnosis, agriculture, and cultural heritage. This necessitates new tools that allow general analysis of the image data and are intuitive to users who are new to hyperspectral imaging. We introduce a novel framework that bundles new interactive visualization techniques with powerful algorithms and is accessible through an efficient and intuitive graphical user interface. We visualize the spectral distribution of an image via parallel coordinates with a strong link to traditional visualization techniques, enabling new paradigms in hyperspectral image analysis that focus on interactive raw data exploration. We combine novel methods for supervised segmentation, global clustering, and nonlinear false-color coding to assist in the visual inspection. Our framework, coined Gerbil, is open source and highly modular, building on established methods and being easily extensible for application-specific needs. It satisfies the need for a general, consistent software framework that tightly integrates analysis algorithms with an intuitive, modern interface to the raw image data and algorithmic results. Gerbil is in worldwide use in academia and industry alike, with several thousand downloads originating from 45 countries.

  3. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent from the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and planning and decision making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision making methodologies and facilitates scenario based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  4. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanism of actions, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. 
Our integrative analytical approach provides novel means for a systematic integrative

  5. A decision analysis framework to support long-term planning for nuclear fuel cycle technology research, development, demonstration and deployment

    International Nuclear Information System (INIS)

    Sowder, A.G.; Machiels, A.J.; Dykes, A.A.; Johnson, D.H.

    2013-01-01

    To address challenges and gaps in nuclear fuel cycle option assessment and to support research, development and demonstration programs oriented toward commercial deployment, EPRI (Electric Power Research Institute) is seeking to develop and maintain an independent analysis and assessment capability by building a suite of assessment tools based on a platform of software, simplified relationships, and explicit decision-making and evaluation guidelines. As a demonstration of the decision-support framework, EPRI examines a relatively near-term fuel cycle option, i.e., the use of reactor-grade mixed-oxide fuel (MOX) in U.S. light water reactors. The results appear as a list of significant concerns (like cooling of spent fuels, criticality risk...) that have to be taken into account in the final decision.

  6. Crisis Reliability Indicators Supporting Emergency Services (CRISES): A Framework for Developing Performance Measures for Behavioral Health Crisis and Psychiatric Emergency Programs.

    Science.gov (United States)

    Balfour, Margaret E; Tanner, Kathleen; Jurica, Paul J; Rhoads, Richard; Carson, Chris A

    2016-01-01

    Crisis and emergency psychiatric services are an integral part of the healthcare system, yet there are no standardized measures for programs providing these services. We developed the Crisis Reliability Indicators Supporting Emergency Services (CRISES) framework to create measures that inform internal performance improvement initiatives and allow comparison across programs. The framework consists of two components-the CRISES domains (timely, safe, accessible, least-restrictive, effective, consumer/family centered, and partnership) and the measures supporting each domain. The CRISES framework provides a foundation for development of standardized measures for the crisis field. This will become increasingly important as pay-for-performance initiatives expand with healthcare reform.

  7. Development of an Analysis and Design Optimization Framework for Marine Propellers

    Science.gov (United States)

    Tamhane, Ashish C.

    In this thesis, a framework for the analysis and design optimization of ship propellers is developed. This framework can be utilized as an efficient synthesis tool in order to determine the main geometric characteristics of the propeller, but also to provide the designer with the capability to optimize the shape of the blade sections based on their specific criteria. A hybrid lifting-line method with lifting-surface corrections to account for three-dimensional flow effects has been developed. The prediction of the correction factors is achieved using Artificial Neural Networks and Support Vector Regression. This approach results in increased approximation accuracy compared to existing methods and allows for extrapolation of the correction factor values. The effect of viscosity is implemented in the framework via the coupling of the lifting-line method with the open-source RANSE solver OpenFOAM for the calculation of lift, drag and pressure distribution on the blade sections using a transition k-ω SST turbulence model. Case studies of benchmark high-speed propulsors are utilized in order to validate the proposed framework for propeller operation in open-water conditions but also in a ship's wake.

  8. The Policy Formation Process: A Conceptual Framework for Analysis. Ph.D. Thesis

    Science.gov (United States)

    Fuchs, E. F.

    1972-01-01

    A conceptual framework for analysis which is intended to assist both the policy analyst and the policy researcher in their empirical investigations into policy phenomena is developed. It is meant to facilitate understanding of the policy formation process by focusing attention on the basic forces shaping the main features of policy formation as a dynamic social-political-organizational process. The primary contribution of the framework lies in its capability to suggest useful ways of looking at policy formation reality. It provides the analyst and the researcher with a group of indicators which suggest where to look and what to look for when attempting to analyze and understand the mix of forces which energize, maintain, and direct the operation of strategic level policy systems. The framework also highlights interconnections, linkage, and relational patterns between and among important variables. The framework offers an integrated set of conceptual tools which facilitate understanding of and research on the complex and dynamic set of variables which interact in any major strategic level policy formation process.

  9. Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.

    Science.gov (United States)

    Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark

    2017-12-01

    A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (alone or together with non-'omics data): (1) hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework into practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
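    The quantitative core of steps 3-6 can be sketched in a few lines of Python. The LoE names, LoE weights, and the quality-times-relevance combination rule below are hypothetical illustrations; the framework itself only fixes the 0-4 scoring scale and the weighting of lines of evidence.

```python
# Hypothetical scored studies: (quality, relevance), each on the 0-4 scale
lines_of_evidence = {
    "transcriptomics":  {"weight": 0.6, "studies": [(4, 3), (3, 3), (2, 4)]},
    "apical_endpoints": {"weight": 0.4, "studies": [(3, 2), (4, 4)]},
}

def loe_strength(studies):
    # Steps 3-5: combine per-study quality and relevance scores,
    # normalised to [0, 1] (the maximum product is 4 * 4 = 16)
    return sum(q * r for q, r in studies) / (16 * len(studies))

# Step 6: integrate LoE strengths with the weights fixed in step 1
overall = sum(loe["weight"] * loe_strength(loe["studies"])
              for loe in lines_of_evidence.values())
```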

  10. Validation of a Framework for Measuring Hospital Disaster Resilience Using Factor Analysis

    Directory of Open Access Journals (Sweden)

    Shuang Zhong

    2014-06-01

    Hospital disaster resilience can be defined as “the ability of hospitals to resist, absorb, and respond to the shock of disasters while maintaining and surging essential health services, and then to recover to its original state or adapt to a new one.” This article aims to provide a framework which can be used to comprehensively measure hospital disaster resilience. An evaluation framework for assessing hospital resilience was initially proposed through a systematic literature review and Modified-Delphi consultation. Eight key domains were identified: hospital safety; command, communication, and cooperation system; disaster plan; resource stockpile; staff capability; disaster training and drills; emergency services and surge capability; and recovery and adaptation. The data for this study were collected from 41 tertiary hospitals in Shandong Province in China, using a specially designed questionnaire. Factor analysis was conducted to determine the underpinning structure of the framework. It identified a four-factor structure of hospital resilience, namely, emergency medical response capability (F1), disaster management mechanisms (F2), hospital infrastructural safety (F3), and disaster resources (F4). These factors displayed good internal consistency. The overall level of hospital disaster resilience (F) was calculated using the scoring model: F = 0.615F1 + 0.202F2 + 0.103F3 + 0.080F4. This validated framework provides a new way to operationalise the concept of hospital resilience, and it is also a foundation for the further development of the measurement instrument in future studies.
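    The article's scoring model translates directly into code. A minimal sketch (the weights are the ones reported above; the factor scores passed in are hypothetical):

```python
def resilience_score(f1, f2, f3, f4):
    """Overall hospital disaster resilience F from the four factor scores.

    f1: emergency medical response capability
    f2: disaster management mechanisms
    f3: hospital infrastructural safety
    f4: disaster resources
    """
    return 0.615 * f1 + 0.202 * f2 + 0.103 * f3 + 0.080 * f4

# Example with hypothetical factor scores on a 0-1 scale
f = resilience_score(0.8, 0.6, 0.9, 0.5)
```

    Because the four weights sum to 1, F stays on the same scale as the factor scores.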

  11. A framework for understanding outcomes of integrated care programs for the hospitalized elderly

    Directory of Open Access Journals (Sweden)

    Jacqueline M. Hartgerink

    2013-11-01

    Introduction: Integrated care has emerged as a new strategy to enhance the quality of care for hospitalised elderly. Current models do not provide insight into the mechanisms underlying integrated care delivery. Therefore, we developed a framework to identify the underlying mechanisms of integrated care delivery. We should understand how they operate and interact, so that integrated care programmes can enhance the quality of care and eventually patient outcomes. Theory and methods: Interprofessional collaboration among professionals is considered to be critical in integrated care delivery due to many interdependent work requirements. A review of integrated care components brings to light a distinction between the cognitive and behavioural components of interprofessional collaboration. Results: Effective integrated care programmes combine the interacting components of care delivery. These components affect professionals’ cognitions and behaviour, which in turn affect quality of care. Insight is gained into how these components alter the way care is delivered through mechanisms such as combining individual knowledge and actively seeking new information. Conclusion: We expect that insight into the cognitive and behavioural mechanisms will contribute to the understanding of integrated care programmes. The framework can be used to identify the underlying mechanisms of integrated care responsible for producing favourable outcomes, allowing comparisons across programmes.

  12. Comparing, optimizing, and benchmarking quantum-control algorithms in a unifying programming framework

    International Nuclear Information System (INIS)

    Machnes, S.; Sander, U.; Glaser, S. J.; Schulte-Herbrueggen, T.; Fouquieres, P. de; Gruslys, A.; Schirmer, S.

    2011-01-01

    For paving the way to novel applications in quantum simulation, computation, and technology, increasingly large quantum systems have to be steered with high precision. It is a typical task amenable to numerical optimal control to turn the time course of pulses, i.e., piecewise constant control amplitudes, iteratively into an optimized shape. Here, we present a comparative study of optimal-control algorithms for a wide range of finite-dimensional applications. We focus on the most commonly used algorithms: GRAPE methods, which update all controls concurrently, and Krotov-type methods, which do so sequentially. Guidelines for their use are given and open research questions are pointed out. Moreover, we introduce a unifying algorithmic framework, DYNAMO (dynamic optimization platform), designed to provide the quantum-technology community with a convenient MATLAB-based tool set for optimal control. In addition, it gives researchers in optimal-control techniques a framework for benchmarking and comparing newly proposed algorithms with the state of the art. It allows a mix-and-match approach with various types of gradients, update and step-size methods as well as subspace choices. Open-source code including examples is made available at http://qlib.info.
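    As a hedged illustration of the GRAPE-style concurrent update that the abstract contrasts with Krotov's sequential one, the sketch below optimises a piecewise-constant single-qubit pulse toward a NOT gate. The drift strength, slice count, and finite-difference gradient are illustrative choices, not DYNAMO's implementation (which offers several gradient and update schemes).

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
DELTA = 1.0   # fixed drift strength (hypothetical units)

def slice_prop(u, dt):
    # exp(-i (DELTA*sz + u*sx) dt) via the closed-form Pauli rotation
    w = np.hypot(DELTA, u)
    H = DELTA * sz + u * sx
    return np.cos(w * dt) * I2 - 1j * np.sin(w * dt) * H / w

def fidelity(us, dt, U_tgt):
    U = I2
    for u in us:
        U = slice_prop(u, dt) @ U          # time-ordered product
    return abs(np.trace(U_tgt.conj().T @ U)) / 2

dt, N, eps, lr = 0.1, 20, 1e-6, 1.0
U_tgt = sx                                  # target: a NOT gate
us = 0.1 * np.ones(N)                       # initial piecewise-constant pulse
fid0 = fidelity(us, dt, U_tgt)
for _ in range(400):
    # concurrent (GRAPE-style) update: gradient for all slices at once,
    # estimated here by finite differences for simplicity
    base = fidelity(us, dt, U_tgt)
    grad = np.empty(N)
    for k in range(N):
        up = us.copy()
        up[k] += eps
        grad[k] = (fidelity(up, dt, U_tgt) - base) / eps
    us += lr * grad
final = fidelity(us, dt, U_tgt)
```

    A Krotov-type method would instead sweep through the slices one at a time, updating each control before recomputing the propagators, which is the sequential alternative the study benchmarks.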

  13. Prevention validation and accounting platform: a framework for establishing accountability and performance measures of substance abuse prevention programs.

    Science.gov (United States)

    Kim, S; McLeod, J H; Williams, C; Hepler, N

    2000-01-01

    The field of substance abuse prevention has neither an overarching conceptual framework nor a set of shared terminologies for establishing the accountability and performance outcome measures of substance abuse prevention services rendered. Hence, there is a wide gap between the data we currently have on one hand and the information required to meet the performance goals and accountability measures set by the Government Performance and Results Act of 1993 on the other. The task before us is: How can we establish the accountability and performance measures of substance abuse prevention programs and transform the field of prevention into prevention science? The intent of this volume is to serve that purpose and accelerate the processes of this transformation by identifying the requisite components of the transformation (i.e., theory, methodology, convention on terms, and data) and by introducing an open forum called the Prevention Validation and Accounting (PREVA) Platform. The entire PREVA Platform (for short, the Platform) is designed as an analytic framework, formulated from a collectivity of common concepts, terminologies, accounting units, protocols for counting the units, data elements, operationalizations of various constructs, and other summary measures intended to bring about an efficient and effective measurement of process input, program capacity, process output, performance outcome, and societal impact of substance abuse prevention programs. The measurement units and summary data elements are designed to be measured across time and across jurisdictions, i.e., from local to regional to state to national levels. In the Platform, the process input is captured by two dimensions of time and capital. Time is conceptualized in terms of service delivery time and time spent for research and development. Capital is measured by the monies expended for the delivery of program activities during a fiscal or reporting period. Program capacity is captured
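    The process-input accounting described above (time split into service delivery and research-and-development categories, plus capital per reporting period) can be sketched in a few lines of Python. The record fields and figures are hypothetical:

```python
# Hypothetical accounting records for one reporting period
records = [
    {"program": "A", "service_hours": 120.0, "rnd_hours": 30.0, "capital": 52000.0},
    {"program": "B", "service_hours": 310.0, "rnd_hours": 12.0, "capital": 87500.0},
]

def process_input(recs):
    # Aggregate process input along the Platform's two dimensions: time and capital
    time = sum(r["service_hours"] + r["rnd_hours"] for r in recs)
    capital = sum(r["capital"] for r in recs)
    return {"time_hours": time, "capital": capital}

totals = process_input(records)
```

    The same aggregation can be rolled up from local to regional to state to national levels, which is the cross-jurisdiction design the Platform describes.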

  14. A framework for 2-stage global sensitivity analysis of GastroPlus™ compartmental models.

    Science.gov (United States)

    Scherholz, Megerle L; Forder, James; Androulakis, Ioannis P

    2018-04-01

    Parameter sensitivity and uncertainty analysis for physiologically based pharmacokinetic (PBPK) models are becoming an important consideration for regulatory submissions, requiring further evaluation to establish the need for global sensitivity analysis. To demonstrate the benefits of an extensive analysis, global sensitivity was implemented for the GastroPlus™ model, a well-known commercially available platform, using four example drugs: acetaminophen, risperidone, atenolol, and furosemide. The capabilities of GastroPlus were expanded by developing an integrated framework to automate the GastroPlus graphical user interface with AutoIt and to execute the sensitivity analysis in MATLAB®. Global sensitivity analysis was performed in two stages, using the Morris method to screen over 50 parameters for significant factors, followed by quantitative assessment of variability using Sobol's sensitivity analysis. The 2-stage approach significantly reduced the computational cost for the larger model without sacrificing interpretation of model behavior, showing that the sensitivity results were well aligned with the biopharmaceutical classification system. Both methods detected nonlinearities and parameter interactions that would have otherwise been missed by local approaches. Future work includes further exploration of how the input domain influences the calculated global sensitivity measures, as well as extending the framework to consider a whole-body PBPK model.
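    The first (screening) stage can be illustrated with a self-contained Morris elementary-effects sketch in Python. The toy model and parameter bounds are invented stand-ins for GastroPlus inputs; a real analysis would evaluate the automated GastroPlus runs described above (e.g., via a library such as SALib).

```python
import numpy as np

def morris_screen(f, bounds, r=50, delta=0.25, seed=0):
    """One-at-a-time elementary-effects screening (Morris method).

    Returns mu_star: the mean absolute elementary effect per parameter,
    computed in the unit-normalised input space.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    k = len(bounds)
    effects = np.zeros((r, k))
    for i in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)   # random base point
        y0 = f(lo + (hi - lo) * x)
        for j in range(k):
            xp = x.copy()
            xp[j] += delta                           # perturb one parameter
            effects[i, j] = (f(lo + (hi - lo) * xp) - y0) / delta
    return np.abs(effects).mean(axis=0)

# Toy PK-like model: only the first two of five parameters matter
def model(p):
    return p[0] * np.exp(-p[1]) + 0.001 * p[2]

bounds = np.array([[0.5, 2.0], [0.1, 1.0], [0.0, 1.0], [0.0, 1.0], [0.0, 1.0]])
mu_star = morris_screen(model, bounds)
```

    Parameters with large mu_star survive to the second, Sobol stage, which is how the 2-stage design keeps the variance-based analysis affordable.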

  15. HiggsToFourLeptonsEV in the ATLAS EventView Analysis Framework

    CERN Document Server

    Lagouri, T; Del Peso, J

    2008-01-01

    ATLAS is one of the four experiments at the Large Hadron Collider (LHC) at CERN. This experiment has been designed to study a large range of physics topics, including searches for previously unobserved phenomena such as the Higgs Boson and super-symmetry. The physics analysis package HiggsToFourLeptonsEV for the Standard Model (SM) Higgs to four leptons channel with ATLAS is presented. The physics goal is to investigate, with the ATLAS detector, the SM Higgs boson discovery potential through its observation in the four-lepton (electron and muon) final state. HiggsToFourLeptonsEV is based on the official ATLAS software ATHENA and the EventView (EV) analysis framework. EventView is a highly flexible and modular analysis framework in ATHENA and it is one of several analysis schemes for ATLAS physics user analysis. At the core of the EventView is the representative "view" of an event, which defines the contents of event data suitable for event-level physics analysis. The HiggsToFourLeptonsEV package, presented in ...

  16. A Regularized Linear Dynamical System Framework for Multivariate Time Series Analysis.

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2015-01-01

    Linear Dynamical System (LDS) is an elegant mathematical framework for modeling and learning Multivariate Time Series (MTS). However, in general, it is difficult to set the dimension of an LDS's hidden state space. A small number of hidden states may not be able to model the complexities of an MTS, while a large number of hidden states can lead to overfitting. In this paper, we study learning methods that impose various regularization penalties on the transition matrix of the LDS model and propose a regularized LDS learning framework (rLDS) which aims to (1) automatically shut down LDSs' spurious and unnecessary dimensions, and consequently, address the problem of choosing the optimal number of hidden states; (2) prevent the overfitting problem given a small amount of MTS data; and (3) support accurate MTS forecasting. To learn the regularized LDS from data we incorporate a second-order cone program and a generalized gradient descent method into the Maximum a Posteriori framework and use Expectation Maximization to obtain a low-rank transition matrix of the LDS model. We propose two priors for modeling the matrix which lead to two instances of our rLDS. We show that our rLDS is able to recover well the intrinsic dimensionality of the time series dynamics and it improves the predictive performance when compared to baselines on both synthetic and real-world MTS datasets.
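    A hedged sketch of the core idea: penalising the transition matrix so that spurious hidden dimensions are shrunk toward zero. For simplicity the hidden states are treated as directly observed and a Frobenius (ridge) penalty stands in for the paper's low-rank priors; the dynamics below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth LDS: z_{t+1} = A z_t + noise; the third hidden dimension
# is spurious (its row of A is zero), mimicking an over-sized state space
A_true = np.array([[0.9,  0.1, 0.0],
                   [-0.1, 0.9, 0.0],
                   [0.0,  0.0, 0.0]])
T = 500
Z = np.zeros((T, 3))
Z[0] = rng.normal(size=3)
for t in range(T - 1):
    Z[t + 1] = A_true @ Z[t] + 0.05 * rng.normal(size=3)

# Ridge-regularized least squares for the transition matrix:
#   A_hat = argmin_A ||Z1 - Z0 A^T||_F^2 + lam ||A||_F^2
Z0, Z1 = Z[:-1], Z[1:]
lam = 0.1
A_hat = np.linalg.solve(Z0.T @ Z0 + lam * np.eye(3), Z0.T @ Z1).T
```

    The penalty biases coefficients tied to the noise-only dimension toward zero; the paper's priors go further, driving the transition matrix to low rank so the effective state dimension is recovered automatically.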

  17. Evaluating health inequity interventions: applying a contextual (external) validity framework to programs funded by the Canadian Health Services Research Foundation.

    Science.gov (United States)

    Phillips, Kaye; Müller-Clemm, Werner; Ysselstein, Margaretha; Sachs, Jonathan

    2013-02-01

    Including context in the measurement and evaluation of health inequity interventions is critical to understanding how events that occur in an intervention's environment might contribute to or impede its success. This study adapted and piloted a contextual validity assessment framework on a selection of health inequity-related programs funded by the Canadian Health Services Research Foundation (CHSRF) between 1998 and 2006. The two overarching objectives of this study were (1) to determine the relative amount and quality of attention given to conceptualizing, measuring and validating context within CHSRF-funded research final reports related to health inequity; and (2) to contribute evaluative evidence towards the incorporation of context into the assessment and measurement of health inequity interventions. The study found that of the 42 of 146 CHSRF programs and projects judged to be related to health inequity, 20 adequately reported on the conceptualization, measurement and validation of context. Amongst these health inequity-related project reports, greatest emphasis was placed on describing the socio-political and economic context rather than actually measuring and validating contextual evidence. Applying a contextual validity assessment framework was useful for distinguishing between the descriptive (conceptual) and the empirical (measurement and validation) inclusion of documented contextual evidence. Although contextual validity measurement frameworks need further development, this study contributes insight into identifying funded research related to health inequities and preliminary criteria for assessing interventions targeted at specific populations and jurisdictions. This study also feeds a larger critical dialogue (albeit beyond the scope of this study) regarding the relevance and utility of using evaluative techniques for understanding how specific external conditions support or impede the successful implementation of health inequity interventions.

  18. Capacity building program: Framework of Standards to secure and facilitate Global Trade

    Energy Technology Data Exchange (ETDEWEB)

    Koech, H K [Program Manager, CBP/DHS, U.S. Embassy Nairobi (Kenya)]

    2010-07-01

    Effective implementation of the capacity building program in Kenya will result in maximum protection against terrorist activity and support counter-terrorism worldwide, as participating countries meet the program's requirements through safety and security measures at land borders, seaports, and airports. It will also strengthen enforcement against illegal trade pertaining to terrorist financing, money laundering, trade fraud, strategic cases including weapons of mass destruction, child pornography, intellectual property rights violations, document fraud, alien smuggling, drug smuggling, and general smuggling. It will also facilitate legitimate commerce.

  19. Capacity building program: Framework of Standards to secure and facilitate Global Trade

    International Nuclear Information System (INIS)

    Koech, H.K.

    2010-01-01

    Effective implementation of the capacity building program in Kenya will result in maximum protection against terrorist activity and support counter-terrorism worldwide, as participating countries meet the program's requirements through safety and security measures at land borders, seaports, and airports. It will also strengthen enforcement against illegal trade pertaining to terrorist financing, money laundering, trade fraud, strategic cases including weapons of mass destruction, child pornography, intellectual property rights violations, document fraud, alien smuggling, drug smuggling, and general smuggling. It will also facilitate legitimate commerce.

  20. Control-rod parametrical studies in the framework of the PRE-RACINE and RACINE programs

    International Nuclear Information System (INIS)

    Humbert, G.; Ruelle, B.; Daguzan, G.; Stanculescu, A.; Kappler, F.; Scholtyssek, W.; Bouscavet, D.; Martini, M.; Broccoli, U.

    1982-01-01

    A control-rod experimental program is presented. This program, established in the framework of the PRE-RACINE and RACINE joint DEBENE, Italian, and French experiments at the MASURCA facility, is still in progress. Results are already available for single central-rod worths. For these experiments, a parametric approach has been used: the effects of rod worth, varied separately through rod side, boron enrichment, and core size, on the experiment-to-calculation relative discrepancy (E-C)/C can be drawn out.
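    The discrepancy metric used above is simply the relative difference between the experimental (E) and calculated (C) rod worths; a one-line Python helper makes that explicit (the worth values below are hypothetical):

```python
def relative_discrepancy(e, c):
    """Relative experiment-to-calculation discrepancy (E - C) / C."""
    return (e - c) / c

# Hypothetical central-rod worths in pcm
ec = relative_discrepancy(1520.0, 1480.0)   # about +0.027: calculation underpredicts
```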