WorldWideScience

Sample records for program analysis framework

  1. Programming Entity Framework

    CERN Document Server

    Lerman, Julia

    2010-01-01

    Get a thorough introduction to ADO.NET Entity Framework 4 -- Microsoft's core framework for modeling and interacting with data in .NET applications. The second edition of this acclaimed guide provides a hands-on tour of the framework's latest version in Visual Studio 2010 and .NET Framework 4. Not only will you learn how to use EF4 in a variety of applications, you'll also gain a deep understanding of its architecture and APIs. Written by Julia Lerman, the leading independent authority on the framework, Programming Entity Framework covers it all -- from the Entity Data Model and Object Service

  2. Programming Entity Framework

    CERN Document Server

    Lerman, Julia

    2009-01-01

    Programming Entity Framework is a thorough introduction to Microsoft's new core framework for modeling and interacting with data in .NET applications. This highly-acclaimed book not only gives experienced developers a hands-on tour of the Entity Framework and explains its use in a variety of applications, it also provides a deep understanding of its architecture and APIs -- knowledge that will be extremely valuable as you shift to the Entity Framework version in .NET Framework 4.0 and Visual Studio 2010. From the Entity Data Model (EDM) and Object Services to EntityClient and the Metadata Work

  3. Priority-queue framework: Programs

    DEFF Research Database (Denmark)

    Katajainen, Jyrki

    2009-01-01

    This is an electronic appendix to the article "Generic-programming framework for benchmarking weak queues and its relatives". The report contains the programs related to our priority-queue framework. Look at the CPH STL reports 2009-3 and 2009-4 to see examples of other component frameworks....

  4. An evaluation framework and comparative analysis of the widely used first programming languages.

    Science.gov (United States)

    Farooq, Muhammad Shoaib; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed; Abid, Adnan

    2014-01-01

    Computer programming is the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such languages are referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as the FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue given the many available choices. Many efforts have been made to design a good FPL; however, there is no established way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we propose a framework to evaluate existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework, we devise a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we evaluate the conformance of the widely used FPLs to the proposed framework and compute their suitability scores.
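
A customizable scoring function of the kind this abstract describes can be sketched as a weighted average of per-criterion ratings. The criteria names, weights, and ratings below are purely illustrative, not taken from the paper.

```python
# Hypothetical sketch of a customizable language-suitability score:
# per-criterion ratings are combined through a configurable weight vector.
# Criteria names, weights, and ratings are illustrative only.

def suitability_score(ratings, weights):
    """Weighted average of per-criterion ratings (higher is more suitable)."""
    assert ratings.keys() == weights.keys(), "each criterion needs a weight"
    total_weight = sum(weights.values())
    return sum(ratings[c] * weights[c] for c in ratings) / total_weight

# Example: rate one candidate FPL on three invented criteria (0-10 scale).
ratings = {"readability": 9, "simple_io": 8, "error_messages": 6}
weights = {"readability": 3, "simple_io": 2, "error_messages": 1}
score = suitability_score(ratings, weights)  # (27 + 16 + 6) / 6
```

Customizing the weight vector is what makes such a score adaptable: different instructors can emphasize different criteria while scores remain comparable across languages.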

  5. An evaluation framework and comparative analysis of the widely used first programming languages.

    Directory of Open Access Journals (Sweden)

    Muhammad Shoaib Farooq

    Full Text Available Computer programming is the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such languages are referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as the FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue given the many available choices. Many efforts have been made to design a good FPL; however, there is no established way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we propose a framework to evaluate existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework, we devise a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we evaluate the conformance of the widely used FPLs to the proposed framework and compute their suitability scores.

  6. Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, Douglas B.; Anderson, Dave M.; Belzer, David B.; Cort, Katherine A.; Dirks, James A.; Hostick, Donna J.

    2004-06-18

    The requirements of the Government Performance and Results Act (GPRA) of 1993 mandate the reporting of outcomes expected to result from programs of the Federal government. The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official metrics for its 11 major programs using its Office of Planning, Budget Formulation, and Analysis (OPBFA). OPBFA conducts an annual integrated modeling analysis to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. Two of EERE’s major programs are the Building Technologies Program (BT) and the Office of Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports the OPBFA effort by developing the program characterizations and other market information affecting these programs necessary to provide input to the EERE integrated modeling analysis. Throughout the report we refer to these programs as “buildings-related” programs, because the approach is not limited in application to BT or WIP. To adequately support OPBFA in the development of official GPRA metrics, PNNL communicates with the various activities and projects in BT and WIP to determine how best to characterize their activities planned for the upcoming budget request. PNNL then analyzes these projects to determine what the results of the characterizations would imply for energy markets, technology markets, and consumer behavior. This is accomplished by developing nonintegrated estimates of energy, environmental, and financial benefits (i.e., outcomes) of the technologies and practices expected to result from the budget request. These characterizations and nonintegrated modeling results are provided to OPBFA as inputs to the official benefits estimates developed for the Federal Budget. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits

  7. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends.

    Science.gov (United States)

    Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by
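
The Map and Reduce tasks mentioned in the abstract can be illustrated with a single-process word-count sketch; a real MapReduce/Hadoop job distributes these same phases across cluster nodes, but the structure is the same.

```python
# Minimal single-process sketch of the MapReduce pattern (word count).
# Real frameworks run the map, shuffle, and reduce phases across many nodes.
from collections import defaultdict

def map_phase(records):
    # Map: emit (key, value) pairs -- here, (word, 1) for each word.
    for record in records:
        for word in record.split():
            yield word, 1

def reduce_phase(pairs):
    # Shuffle: group values by key; Reduce: aggregate each group.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(map_phase(["big data", "big clusters"]))
# counts == {"big": 2, "data": 1, "clusters": 1}
```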

  8. Applications of the MapReduce programming framework to clinical big data analysis: current landscape and future trends

    Science.gov (United States)

    2014-01-01

    The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so-called “big data” challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by

  9. DEFENSE PROGRAMS RISK MANAGEMENT FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Constantin PREDA

    2012-01-01

    Full Text Available For the past several years, defense programs have faced delays in delivering defense capabilities and budget overruns. Stakeholders are looking for ways to improve program management and the decision-making process given the very fluid and uncertain economic and political environment. Consequently, they have increasingly resorted to risk management as the main management tool for achieving defense program objectives and for delivering, on time and within limited defense budgets, the defense capabilities strongly needed by the soldiers on the ground. Following a risk-management-based decision-making approach, the stakeholders are expected not only to protect program objectives against a wide range of risks but also, at the same time, to take advantage of opportunities to increase the likelihood of program success. The prerequisite for making risk management the main tool for achieving defense program objectives is the design and implementation of a strong risk management framework as a foundation providing efficient and effective application of the best risk management practices. The aim of this paper is to examine the risk management framework for defense programs based on the ISO 31000:2009 standard, best risk management practices, and the needs and particularities of defense programs. For the purposes of this article, the term defense programs refers to joint defense programs.

  10. A Framework for Designing Training Programs to Foster Self-Regulated Learning and Text Analysis Skills

    Directory of Open Access Journals (Sweden)

    Daniela Wagner

    2014-01-01

    Full Text Available The study’s aim was to develop an intervention program and to evaluate its contribution to students’ self-regulated learning (SRL) and text analysis skills. In a student-focused training approach, the students themselves acquired the training strategies, whereas in the teacher-focused training, the teachers were enabled to explicitly impart these strategies to their students. In order to investigate the effectiveness of the intervention in terms of transfer benefits on SRL and text analysis skills, 274 lower secondary students were examined in a pretest-training-posttest design. Based on two different training approaches, a distinction was made between four groups: student training (singleST), teacher training (singleTT), combination of student and teacher training (ComT), and control group (CG). Substantially more transfer was revealed in all training conditions as compared to the control group. Specifically, the singleST group showed the highest learning gains for all variables. Conversely, a combination of both approaches (ComT) did not result in synergetic effects, but rather in reciprocal interferences.

  11. PCC Framework for Program-Generators

    Science.gov (United States)

    Kong, Soonho; Choi, Wontae; Yi, Kwangkeun

    2009-01-01

    In this paper, we propose a proof-carrying code framework for program-generators. The enabling technique is abstract parsing, a static string analysis technique, which is used as a component for generating and validating certificates. Our framework provides an efficient solution for certifying program-generators whose safety properties are expressed in terms of the grammar representing the generated program. The fixed-point solution of the analysis is generated and attached to the program-generator on the code producer side. The consumer receives the code with a fixed-point solution and validates that the received fixed point is indeed a fixed point of the received code. This validation can be done in a single pass.
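
The single-pass validation step described here can be sketched abstractly: rather than recomputing the analysis, the consumer applies the analysis transfer function once to the received solution and checks that it is unchanged. The set-based transfer function below is a toy stand-in, not the paper's actual string-analysis domain.

```python
# Toy sketch of consumer-side certificate validation: a received solution
# is accepted iff applying the transfer function once leaves it unchanged.
# The transfer function here (closure under n -> n // 2) is illustrative
# only; the paper's analysis operates on grammars, not integer sets.

def is_fixed_point(transfer, solution):
    """Single-pass check: solution is valid iff transfer(solution) == solution."""
    return transfer(solution) == solution

def transfer(s):
    # Monotone toy transfer function: close the set under n -> n // 2.
    return s | {n // 2 for n in s}

valid = is_fixed_point(transfer, {0, 1, 2, 4})  # closed under n // 2
invalid = is_fixed_point(transfer, {4, 2})      # missing 1 and 0
```

The key economy, as in the abstract, is that checking a claimed fixed point is much cheaper than computing one from scratch.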

  12. A PROOF Analysis Framework

    CERN Document Server

    Gonzalez Caballero, Isidro

    2012-01-01

    The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed that can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of ...

  13. Replicating MISTERS: an epidemiological criminology framework analysis of a program for criminal justice-involved minority males in the community.

    Science.gov (United States)

    Potter, Roberto Hugh; Akers, Timothy A; Bowman, Daniel Richard

    2013-01-01

    The Men in STD Training and Empowerment Research Study (MISTERS) program and epidemiological criminology began their development in Atlanta at about the same time. MISTERS focuses on men recently released from jail to reduce both HIV/STD and crime-related risk factors through a brief educational intervention. This article examines ways in which MISTERS and epidemiological criminology have been used to inform one another in the replication of the MISTERS program in Orange County, Florida. Data from 110 MISTERS participants during the first 10 months of operation are analyzed to examine the overlapping occurrence of health and criminal risk behaviors in the men's lives. This provides a test of core hypotheses from the epidemiological criminology framework. This article also examines application of the epidemiological criminology framework to develop interventions to address health and crime risk factors simultaneously in Criminal Justice-Involved populations in the community.

  14. PROOF Analysis Framework (PAF)

    Science.gov (United States)

    Delgado Fernández, J.; Fernández del Castillo, E.; González Caballero, I.; Rodríguez Marrero, A.

    2015-12-01

    The PROOF Analysis Framework (PAF) has been designed to improve the ability of the physicist to develop software for the final stages of an analysis, where typically simple ROOT Trees are used and the amount of data is on the order of several terabytes. It hides the technicalities of dealing with PROOF, leaving the scientist to concentrate on the analysis. PAF is capable of using available non-specific resources on, for example, local batch systems, remote grid sites or clouds through the integration of other toolkits like PROOF Cluster or PoD. While it has been successfully used on LHC Run-1 data for some key analyses, including the H → WW dilepton channel, the higher instantaneous and integrated luminosity together with the increase of the center-of-mass energy foreseen for the LHC Run-2, which will increase the total size of the samples by a factor of 6 to 20, will demand that PAF improve its scalability and reduce latencies as much as possible. In this paper we address the possible problems of processing such big data volumes with PAF and the solutions implemented to overcome them. We also show the improvements made to render PAF more modular and accessible to other communities.

  15. Measuring the Performance of Vaccination Programs Using Cross-Sectional Surveys: A Likelihood Framework and Retrospective Analysis

    Science.gov (United States)

    Lessler, Justin; Metcalf, C. Jessica E.; Grais, Rebecca F.; Luquero, Francisco J.; Cummings, Derek A. T.; Grenfell, Bryan T.

    2011-01-01

    Background The performance of routine and supplemental immunization activities is usually measured by the administrative method: dividing the number of doses distributed by the size of the target population. This method leads to coverage estimates that are sometimes impossible (e.g., vaccination of 102% of the target population), and are generally inconsistent with the proportion found to be vaccinated in Demographic and Health Surveys (DHS). We describe a method that estimates the fraction of the population accessible to vaccination activities, as well as within-campaign inefficiencies, thus providing a consistent estimate of vaccination coverage. Methods and Findings We developed a likelihood framework for estimating the effective coverage of vaccination programs using cross-sectional surveys of vaccine coverage combined with administrative data. We applied our method to measles vaccination in three African countries: Ghana, Madagascar, and Sierra Leone, using data from each country's most recent DHS survey and administrative coverage data reported to the World Health Organization. We estimate that 93% (95% CI: 91, 94) of the population in Ghana was ever covered by any measles vaccination activity, 77% (95% CI: 78, 81) in Madagascar, and 69% (95% CI: 67, 70) in Sierra Leone. “Within-activity” inefficiencies were estimated to be low in Ghana, and higher in Sierra Leone and Madagascar. Our model successfully fits age-specific vaccination coverage levels seen in DHS data, which differ markedly from those predicted by naïve extrapolation from country-reported and World Health Organization–adjusted vaccination coverage. Conclusions Combining administrative data with survey data substantially improves estimates of vaccination coverage. Estimates of the inefficiency of past vaccination activities and the proportion not covered by any activity allow us to more accurately predict the results of future activities and provide insight into the ways in which

  16. Measuring the performance of vaccination programs using cross-sectional surveys: a likelihood framework and retrospective analysis.

    Science.gov (United States)

    Lessler, Justin; Metcalf, C Jessica E; Grais, Rebecca F; Luquero, Francisco J; Cummings, Derek A T; Grenfell, Bryan T

    2011-10-01

    The performance of routine and supplemental immunization activities is usually measured by the administrative method: dividing the number of doses distributed by the size of the target population. This method leads to coverage estimates that are sometimes impossible (e.g., vaccination of 102% of the target population), and are generally inconsistent with the proportion found to be vaccinated in Demographic and Health Surveys (DHS). We describe a method that estimates the fraction of the population accessible to vaccination activities, as well as within-campaign inefficiencies, thus providing a consistent estimate of vaccination coverage. We developed a likelihood framework for estimating the effective coverage of vaccination programs using cross-sectional surveys of vaccine coverage combined with administrative data. We applied our method to measles vaccination in three African countries: Ghana, Madagascar, and Sierra Leone, using data from each country's most recent DHS survey and administrative coverage data reported to the World Health Organization. We estimate that 93% (95% CI: 91, 94) of the population in Ghana was ever covered by any measles vaccination activity, 77% (95% CI: 78, 81) in Madagascar, and 69% (95% CI: 67, 70) in Sierra Leone. "Within-activity" inefficiencies were estimated to be low in Ghana, and higher in Sierra Leone and Madagascar. Our model successfully fits age-specific vaccination coverage levels seen in DHS data, which differ markedly from those predicted by naïve extrapolation from country-reported and World Health Organization-adjusted vaccination coverage. Combining administrative data with survey data substantially improves estimates of vaccination coverage. Estimates of the inefficiency of past vaccination activities and the proportion not covered by any activity allow us to more accurately predict the results of future activities and provide insight into the ways in which vaccination programs are failing to meet their goals. 
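
The likelihood approach this abstract describes can be illustrated with a deliberately simplified binomial model (an editor's construction, not the authors' actual model): a survey gives the number vaccinated among those sampled, and a grid search over the accessible fraction maximizes the likelihood under an assumed within-campaign efficiency.

```python
# Simplified illustration of likelihood-based coverage estimation (NOT the
# paper's model): true coverage is modeled as p_access * efficiency, and we
# maximize the binomial log-likelihood of the survey data over a grid of
# candidate p_access values, holding a hypothetical efficiency fixed.
import math

def log_likelihood(p_covered, k, n):
    # Binomial log-likelihood of k vaccinated among n sampled (constant dropped).
    return k * math.log(p_covered) + (n - k) * math.log(1 - p_covered)

def estimate_accessible(k, n, efficiency, grid_size=999):
    best_p, best_ll = None, -math.inf
    for i in range(1, grid_size):
        p_access = i / grid_size
        p_covered = min(p_access * efficiency, 1 - 1e-9)
        ll = log_likelihood(p_covered, k, n)
        if ll > best_ll:
            best_p, best_ll = p_access, ll
    return best_p

# 770 vaccinated among 1000 surveyed, assuming 90% within-campaign efficiency.
p_hat = estimate_accessible(770, 1000, efficiency=0.9)
```

Under this toy model the estimate simply inverts the observed proportion by the assumed efficiency; the paper's framework instead estimates accessibility and inefficiency jointly from survey plus administrative data.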

  17. Measuring the performance of vaccination programs using cross-sectional surveys: a likelihood framework and retrospective analysis.

    Directory of Open Access Journals (Sweden)

    Justin Lessler

    2011-10-01

    Full Text Available The performance of routine and supplemental immunization activities is usually measured by the administrative method: dividing the number of doses distributed by the size of the target population. This method leads to coverage estimates that are sometimes impossible (e.g., vaccination of 102% of the target population), and are generally inconsistent with the proportion found to be vaccinated in Demographic and Health Surveys (DHS). We describe a method that estimates the fraction of the population accessible to vaccination activities, as well as within-campaign inefficiencies, thus providing a consistent estimate of vaccination coverage. We developed a likelihood framework for estimating the effective coverage of vaccination programs using cross-sectional surveys of vaccine coverage combined with administrative data. We applied our method to measles vaccination in three African countries: Ghana, Madagascar, and Sierra Leone, using data from each country's most recent DHS survey and administrative coverage data reported to the World Health Organization. We estimate that 93% (95% CI: 91, 94) of the population in Ghana was ever covered by any measles vaccination activity, 77% (95% CI: 78, 81) in Madagascar, and 69% (95% CI: 67, 70) in Sierra Leone. "Within-activity" inefficiencies were estimated to be low in Ghana, and higher in Sierra Leone and Madagascar. Our model successfully fits age-specific vaccination coverage levels seen in DHS data, which differ markedly from those predicted by naïve extrapolation from country-reported and World Health Organization-adjusted vaccination coverage. Combining administrative data with survey data substantially improves estimates of vaccination coverage. 
Estimates of the inefficiency of past vaccination activities and the proportion not covered by any activity allow us to more accurately predict the results of future activities and provide insight into the ways in which vaccination programs are failing to meet their

  18. A Typology Framework of Loyalty Reward Programs

    Science.gov (United States)

    Cao, Yuheng; Nsakanda, Aaron Luntala; Mann, Inder Jit Singh

    Loyalty reward programs (LRPs), initially developed as marketing programs to enhance customer retention, have now become an important part of customer-focused business strategy. With the proliferation and increasing economic impact of the programs, the management complexity of the programs has also increased. However, despite widespread adoption of LRPs in business, academic research in the field seems to lag behind its practical application. Even fundamental questions such as what LRPs are and how to classify them have not yet been fully addressed. In this paper, a comprehensive framework for LRP classification is proposed, which provides a foundation for further study of LRP design and planning issues.

  19. Conceptual framework for a Danish human biomonitoring program

    DEFF Research Database (Denmark)

    Thomsen, Marianne; Knudsen, Lisbeth E.; Vorkamp, Katrin

    2008-01-01

    … for a structured and integrated environmental and human health surveillance program at national level. In Denmark, the initiative to implement such activities has been taken. The proposed framework of the Danish monitoring program constitutes four scientific expert groups, i.e. i. Prioritization of the strategy … for the monitoring program, ii. Collection of human samples, iii. Analysis and data management and iv. Dissemination of results produced within the program. This paper presents the overall framework for data requirements and information flow in the integrated environment and health surveillance program. The added … value of an HBM program, and in this respect the objectives of national and European HBM programs supporting environmental health integrated policy-decisions and human health targeted policies, are discussed. In Denmark environmental monitoring has been prioritized by extensive surveillance systems …

  20. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness, which are varied for the disk. Probabilistic analysis is carried out using software packages under development, such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
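
The response surface method mentioned above can be illustrated with a toy example: fit a cheap polynomial surrogate to a few runs of an expensive model, then query the surrogate instead of rerunning the model. The stress function and design points below are invented for illustration and are unrelated to PRODAF's actual models.

```python
# Sketch of the response surface idea: fit a quadratic surrogate to a
# handful of "expensive" analysis runs (a toy stress function stands in
# for the FEA model), then evaluate the surrogate at unsampled points.
import numpy as np

def expensive_analysis(thickness):
    # Stand-in for an FEA run: stress falls off with component thickness.
    return 500.0 / thickness + 2.0 * thickness

# Sample the expensive model at a few design points...
thicknesses = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
stresses = np.array([expensive_analysis(t) for t in thicknesses])

# ...and fit a quadratic response surface to those samples.
coeffs = np.polyfit(thicknesses, stresses, deg=2)
surrogate = np.poly1d(coeffs)

# The surrogate is now cheap to evaluate anywhere in the sampled range.
approx = surrogate(3.5)
exact = expensive_analysis(3.5)
```

In a probabilistic setting the surrogate, not the FEA model, is what gets sampled thousands of times, which is what makes component-to-system extrapolation tractable.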

  1. A Framework for Analysis of Case Studies of Reading Lessons

    Science.gov (United States)

    Carlisle, Joanne F.; Kelcey, Ben; Rosaen, Cheryl; Phelps, Geoffrey; Vereb, Anita

    2013-01-01

    This paper focuses on the development and study of a framework to provide direction and guidance for practicing teachers in using a web-based case studies program for professional development in early reading; the program is called Case Studies Reading Lessons (CSRL). The framework directs and guides teachers' analysis of reading instruction by…

  2. Initial Multidisciplinary Design and Analysis Framework

    Science.gov (United States)

    Ozoroski, L. P.; Geiselhart, K. A.; Padula, S. L.; Li, W.; Olson, E. D.; Campbell, R. L.; Shields, E. W.; Berton, J. J.; Gray, J. S.; Jones, S. M.

    2010-01-01

    Within the Supersonics (SUP) Project of the Fundamental Aeronautics Program (FAP), an initial multidisciplinary design & analysis framework has been developed. A set of low- and intermediate-fidelity discipline design and analysis codes were integrated within a multidisciplinary design and analysis framework and demonstrated on two challenging test cases. The first test case demonstrates an initial capability to design for low boom and performance. The second test case demonstrates rapid assessment of a well-characterized design. The current system has been shown to greatly increase the design and analysis speed and capability, and many future areas for development were identified. This work has established a state-of-the-art capability for immediate use by supersonic concept designers and systems analysts at NASA, while also providing a strong base to build upon for future releases as more multifidelity capabilities are developed and integrated.

  3. Luiza: Analysis Framework for GLORIA

    Directory of Open Access Journals (Sweden)

    Aleksander Filip Żarnecki

    2013-01-01

    Full Text Available The Luiza analysis framework for GLORIA is based on the Marlin package, which was originally developed for data analysis in the new High Energy Physics (HEP) project, the International Linear Collider (ILC). HEP experiments have to deal with enormous amounts of data, and distributed data analysis is therefore essential. The Marlin framework concept seems to be well suited to the needs of GLORIA. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure, and the additional output is also added to that collection. The advantage of this modular approach is that it keeps things as simple as possible. Each step of the full analysis chain, e.g. from raw images to light curves, can be processed step-by-step, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
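
The Marlin-style processor chain described above can be sketched in a few lines: each processor reads what it needs from a shared event collection and adds its own output back, so every step remains independently runnable and inspectable. The processor names below are invented, not Luiza's actual modules.

```python
# Minimal sketch of a Marlin-style modular analysis chain: processors share
# one event collection, each adding its output for the next step to consume.
# Processor names are illustrative only.

class Processor:
    def process(self, event):
        raise NotImplementedError

class CalibrateImage(Processor):
    def process(self, event):
        # Add a calibrated pixel list derived from the raw input.
        event["calibrated"] = [px - event["dark_level"] for px in event["raw_pixels"]]

class MeasureBrightness(Processor):
    def process(self, event):
        # Consume the previous step's output, add a new derived quantity.
        event["brightness"] = sum(event["calibrated"])

def run_chain(processors, event):
    # The full analysis chain is just the processors applied in order.
    for p in processors:
        p.process(event)
    return event

event = {"raw_pixels": [12, 15, 11], "dark_level": 10}
result = run_chain([CalibrateImage(), MeasureBrightness()], event)
```

Because each step only appends to the shared collection, intermediate outputs stay available for inspection, mirroring the "self-consistent output at every step" property the abstract highlights.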

  4. X-framework: Space system failure analysis framework

    Science.gov (United States)

    Newman, John Steven

    Space program and space system failures result in financial losses in the multi-hundred-million-dollar range every year. In addition to financial loss, space system failures may also represent the loss of opportunity, loss of critical scientific, commercial and/or national defense capabilities, as well as loss of public confidence. The need exists to improve learning and expand the scope of lessons documented and offered to the space industry project team. One of the barriers to incorporating lessons learned is the way in which space system failures are documented. Multiple classes of space system failure information are identified, ranging from "sound bite" summaries in space insurance compendia, to articles in journals, lengthy data-oriented (what happened) reports, and in some rare cases, reports that treat not only the what, but also the why. In addition, there are periodically published "corporate crisis" reports, typically issued after multiple or highly visible failures, that explore management roles in the failure, often within a politically oriented context. Given the general lack of consistency, it is clear that a good multi-level space system/program failure framework with analytical and predictive capability is needed. This research effort set out to develop such a model. The X-Framework (x-fw) is proposed as an innovative forensic failure analysis approach, providing a multi-level understanding of the space system failure event, beginning with the proximate cause, extending to the directly related work or operational processes, and upward through successive management layers. The x-fw focus is on capability and control at the process level and examines: (1) management accountability and control, (2) resource and requirement allocation, and (3) planning, analysis, and risk management at each level of management. The x-fw model provides an innovative failure analysis approach for acquiring a multi-level perspective, direct and indirect causation of

  5. COMP Superscalar, an interoperable programming framework

    Directory of Open Access Journals (Sweden)

    Rosa M. Badia

    2015-12-01

    Full Text Available COMPSs is a programming framework that aims to facilitate the parallelization of existing applications written in Java, C/C++ and Python scripts. For that purpose, it offers a simple programming model based on sequential development in which the user is mainly responsible for (i) identifying the functions to be executed as asynchronous parallel tasks and (ii) annotating them with annotations or standard Python decorators. A runtime system is in charge of exploiting the inherent concurrency of the code, automatically detecting and enforcing the data dependencies between tasks and spawning these tasks to the available resources, which can be nodes in a cluster, clouds or grids. In cloud environments, COMPSs provides scalability and elasticity features allowing the dynamic provision of resources.
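
    The decorator-based task model described above can be illustrated with a stand-in. COMPSs' real @task decorator performs dependency detection and schedules work on remote resources; the hypothetical decorator below merely runs functions asynchronously on a local thread pool, to show the programming style rather than the COMPSs runtime itself.

    ```python
    # Minimal sketch of the COMPSs-style task model (hypothetical stand-in):
    # the user writes sequential-looking code; the decorator turns each call
    # into an asynchronous task and returns a future in place of the result.
    from concurrent.futures import ThreadPoolExecutor

    _pool = ThreadPoolExecutor(max_workers=4)

    def task(fn):
        # mark a plain function as an asynchronous parallel task
        def submit(*args, **kwargs):
            return _pool.submit(fn, *args, **kwargs)
        return submit

    @task
    def increment(block):
        return [x + 1 for x in block]

    # calls look sequential; results are collected from the futures
    futures = [increment([i, i + 1]) for i in range(3)]
    results = [f.result() for f in futures]
    ```

    In COMPSs proper, the runtime additionally tracks which task outputs feed which task inputs, so independent tasks like the three calls above run concurrently while dependent ones are serialized.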

  6. 76 FR 38602 - Bovine Tuberculosis and Brucellosis; Program Framework

    Science.gov (United States)

    2011-07-01

    ...] Bovine Tuberculosis and Brucellosis; Program Framework AGENCY: Animal and Plant Health Inspection Service... framework being developed for the bovine tuberculosis and brucellosis programs in the United States. This... proposed revisions to its programs regarding bovine tuberculosis (TB) and bovine brucellosis in the United...

  7. Legal framework for a nuclear program

    International Nuclear Information System (INIS)

    Santos, A. de los; Corretjer, L.

    1977-01-01

    Introduction of a nuclear program requires the establishment of an adequate legal framework, as solutions to the problems posed by the use of nuclear energy are not included in common law. As far as Spain is concerned, legislation is capable of dealing with the main problems posed in this field. Spain is a Contracting Party in several International Conventions, participates in International Organizations related to this area, and takes their recommendations into account when revising its national legislation. Specific Spanish legislation is constituted by Law 25/1964, of April 29th, on Nuclear Energy, which outlines the legal system regarding nuclear energy and regulates all aspects which refer to it, from the competent organisms and authorities to the sanctions to be imposed for non-fulfilment of the provisions. In order to offer sufficient flexibility, so that it can be adapted to specific circumstances, the Law's provisions are very ample and development is foreseen by means of regulations. So far, two Regulations have been published: the Regulation relating to Coverage of Risk of Nuclear Damage, which refers to Civil Responsibility and its Coverage; and the Regulation relating to Nuclear and Radioactive Installations, which refers to the authorization and license system. At the present time, the Regulation relating to Radiation Protection is being elaborated, and it will replace the present Radiation Protection Ordinances. In addition to the foregoing, reference is made to others which, although not specifically "nuclear", include precepts related to this question, such as the Regulation regarding Nuisance, Unhealthy or Dangerous Industries, or some Labor Law provisions

  8. Public health program capacity for sustainability: a new framework.

    Science.gov (United States)

    Schell, Sarah F; Luke, Douglas A; Schooley, Michael W; Elliott, Michael B; Herbers, Stephanie H; Mueller, Nancy B; Bunger, Alicia C

    2013-02-01

    Public health programs can only deliver benefits if they are able to sustain activities over time. There is a broad literature on program sustainability in public health, but it is fragmented and there is a lack of consensus on core constructs. The purpose of this paper is to present a new conceptual framework for program sustainability in public health. This developmental study uses a comprehensive literature review, input from an expert panel, and the results of concept-mapping to identify the core domains of a conceptual framework for public health program capacity for sustainability. The concept-mapping process included three types of participants (scientists, funders, and practitioners) from several public health areas (e.g., tobacco control, heart disease and stroke, physical activity and nutrition, and injury prevention). The literature review identified 85 relevant studies focusing on program sustainability in public health. Most of the papers described empirical studies of prevention-oriented programs aimed at the community level. The concept-mapping process identified nine core domains that affect a program's capacity for sustainability: Political Support, Funding Stability, Partnerships, Organizational Capacity, Program Evaluation, Program Adaptation, Communications, Public Health Impacts, and Strategic Planning. Concept-mapping participants further identified 93 items across these domains that have strong face validity; 89% of the individual items composing the framework had specific support in the sustainability literature. The sustainability framework presented here suggests that a number of selected factors may be related to a program's ability to sustain its activities and benefits over time. These factors have been discussed in the literature, but this framework synthesizes and combines the factors and suggests how they may be interrelated with one another. The framework presents domains for public health decision makers to consider when developing

  9. CLARA: CLAS12 Reconstruction and Analysis Framework

    Energy Technology Data Exchange (ETDEWEB)

    Gyurjyan, Vardan [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)]; Matta, Sebastian Mancilla [Santa Maria U., Valparaiso, Chile]; Oyarzun, Ricardo [Santa Maria U., Valparaiso, Chile]

    2016-11-01

    In this paper we present the SOA-based CLAS12 event Reconstruction and Analyses (CLARA) framework. The CLARA design focuses on two main traits: real-time data stream processing, and service-oriented architecture (SOA) in a flow-based programming (FBP) paradigm. The data-driven, data-centric architecture of CLARA presents an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework presents solutions capable of processing large volumes of data interactively and substantially faster than batch systems.

  10. Ecosystem Analysis Program

    International Nuclear Information System (INIS)

    Burgess, R.L.

    1978-01-01

    Progress is reported on the following research programs: analysis and modeling of ecosystems; EDFB/IBP data center; biome analysis studies; land/water interaction studies; and computer programs for development of models

  11. Comparative analysis of enterprise architecture frameworks

    OpenAIRE

    Oblak, Danica

    2012-01-01

    Today's enterprises face competitive pressure in a dynamically changing business environment. With the increasing complexity of enterprises, enterprise architecture has become an important field. Creating an enterprise architecture can be a complex task, so enterprise architecture frameworks were created to simplify the process and guide an architect through all areas of architecture development. This study concentrates on the comparative analysis of enterprise architecture frameworks. T...

  12. A Program Management Framework for Facilities Managers

    Science.gov (United States)

    King, Dan

    2012-01-01

    The challenge faced by senior facility leaders is not how to execute a single project, but rather, how to successfully execute a large program consisting of hundreds of projects. Senior facilities officers at universities, school districts, hospitals, airports, and other organizations with extensive facility inventories, typically manage project…

  13. Communicative automata based programming. Society Framework

    Directory of Open Access Journals (Sweden)

    Andrei Micu

    2015-10-01

    Full Text Available One of the aims of this paper is to present a new programming paradigm based on paradigms intensively used in the IT industry. Implementation of these techniques can improve the quality of code through modularization, not only in terms of the entities used by a program, but also in terms of the states they pass through. Another aspect considered in this paper is that, in the development of software applications, the transition from the design to the source code is a very expensive step in terms of effort and time spent. Diagrams can hide very important details for simplicity of understanding, which can lead to incorrect or incomplete implementations. To improve this process, communicative-automaton-based programming introduces an intermediate step. We show how to go from modeling diagrams to communicative automata, and then to writing code for each of them, and how the transition from one step to the next becomes much easier and more intuitive.
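
    A toy rendering of the idea may help: each program entity is a small state machine, and automata interact only by exchanging messages. The states, messages, and class names below are invented for illustration and are not taken from the paper's Society Framework.

    ```python
    # Illustrative communicative automata (hypothetical names): explicit
    # (state, message) -> state transitions, with message passing between
    # automata as the only form of interaction.
    from collections import deque

    class Automaton:
        def __init__(self, name):
            self.name = name
            self.state = "idle"
            self.inbox = deque()

        def send(self, other, message):
            # communication: drop a message into the other automaton's inbox
            other.inbox.append((self.name, message))

        def step(self):
            # consume one message and apply the matching transition, if any
            if self.inbox:
                sender, message = self.inbox.popleft()
                if self.state == "idle" and message == "start":
                    self.state = "working"
                elif self.state == "working" and message == "stop":
                    self.state = "idle"

    a, b = Automaton("a"), Automaton("b")
    a.send(b, "start")
    b.step()  # b transitions from "idle" to "working"
    ```

    Because every transition is explicit, the code mirrors the modeling diagram state by state, which is the correspondence the paper argues makes the design-to-code step cheaper.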

  14. Framework for Developing a Multimodal Programming Interface Used on Industrial Robots

    Directory of Open Access Journals (Sweden)

    Bogdan Mocan

    2014-12-01

    Full Text Available The approach proposed in this paper shifts the focus from the coordinate-based programming of an industrial robot, which currently dominates the field, to an object-based programming scheme. The general framework proposed in this paper is designed to perform natural language understanding, gesture integration and semantic analysis, which together facilitate the development of a multimodal robot programming interface and enable intuitive programming.

  15. Program Assessment Framework for a Rural Palliative Supportive Service

    Science.gov (United States)

    Pesut, Barbara; Hooper, Brenda; Sawatzky, Richard; Robinson, Carole A; Bottorff, Joan L; Dalhuisen, Miranda

    2013-01-01

    Although there are a number of quality frameworks available for evaluating palliative services, it is necessary to adapt these frameworks to models of care designed for the rural context. The purpose of this paper was to describe the development of a program assessment framework for evaluating a rural palliative supportive service as part of a community-based research project designed to enhance the quality of care for patients and families living with life-limiting chronic illness. A review of key documents from electronic databases and grey literature resulted in the identification of general principles for high-quality palliative care in rural contexts. These principles were then adapted to provide an assessment framework for the evaluation of the rural palliative supportive service. This framework was evaluated and refined using a community-based advisory committee guiding the development of the service. The resulting program assessment framework includes 48 criteria organized under seven themes: embedded within community; palliative care is timely, comprehensive, and continuous; access to palliative care education and experts; effective teamwork and communication; family partnerships; policies and services that support rural capacity and values; and systematic approach for measuring and improving outcomes of care. It is important to identify essential elements for assessing the quality of services designed to improve rural palliative care, taking into account the strengths of rural communities and addressing common challenges. The program assessment framework has potential to increase the likelihood of desired outcomes in palliative care provisions in rural settings and requires further validation. PMID:25278757

  16. Utilizing the Theoretical Framework of Collective Identity to Understand Processes in Youth Programs

    Science.gov (United States)

    Futch, Valerie A.

    2016-01-01

    This article explores collective identity as a useful theoretical framework for understanding social and developmental processes that occur in youth programs. Through narrative analysis of past participant interviews (n = 21) from an after-school theater program, known as "The SOURCE", it was found that participants very clearly describe…

  17. A Modified Importance-Performance Framework for Evaluating Recreation-Based Experiential Learning Programs

    Science.gov (United States)

    Pitas, Nicholas; Murray, Alison; Olsen, Max; Graefe, Alan

    2017-01-01

    This article describes a modified importance-performance framework for use in evaluation of recreation-based experiential learning programs. Importance-performance analysis (IPA) provides an effective and readily applicable means of evaluating many programs, but the near universal satisfaction associated with recreation inhibits the use of IPA in…

  18. A Framework for a Multi-Participant Gis Program

    OpenAIRE

    Nabar, Maneesha Mangesh

    1998-01-01

    The objective of this paper is to develop a well-defined and sound framework for the implementation of a multi-participant GIS program and to illustrate the developed framework by its application to the Departments of the Town of Blacksburg. A multi-participant approach to implementing GIS technology faces greater challenges than a single-participant GIS project, due to the unique culture, structure, policy, decision-making rule and expectations of participants from implementation of GIS ...

  19. Evaluating Prior Learning Assessment Programs: A Suggested Framework

    Directory of Open Access Journals (Sweden)

    Nan L. Travers and Marnie T. Evans

    2011-01-01

    Full Text Available Over the past two decades, American institutions have been expected to include systematic program reviews to meet accrediting standards, either by independent or governmental review agencies. Program evaluation is critical for several reasons: it provides systematic ways to assess what needs improvement or what needs changing, and it provides ways to validate practices, whether to internal or external audiences (Mishra, 2007). Most program evaluation models are focused on academic programs, which do not fit the uniqueness of prior learning assessment programs. This paper proposes an evaluative framework for prior learning assessment programs, which takes into account the type of work within prior learning assessment programs and uses program portfolios, similar to how students are asked to document their work.

  20. Event Reconstruction and Analysis in the R3BRoot Framework

    International Nuclear Information System (INIS)

    Kresan, Dmytro; Al-Turany, Mohammad; Bertini, Denis; Karabowicz, Radoslaw; Manafov, Anar; Rybalchenko, Alexey; Uhlig, Florian

    2014-01-01

    The R3B experiment (Reaction studies with Relativistic Radioactive Beams) will be built within the future FAIR/GSI (Facility for Antiproton and Ion Research) in Darmstadt, Germany. The international R3B collaboration has a scientific program devoted to the physics of stable and radioactive beams at energies between 150 MeV and 1.5 GeV per nucleon. In preparation for the experiment, the R3BRoot software framework is under development; it delivers detector simulation, reconstruction and data analysis. The basic functionalities of the framework are handled by the FairRoot framework, which is also used by the other FAIR experiments (CBM, PANDA, ASYEOS, etc.), while the R3B detector specifics and reconstruction code are implemented inside R3BRoot. In this contribution, first results of data analysis from the detector prototype test in November 2012 are reported; moreover, a comparison of the tracker performance against experimental data is presented

  1. A Resilient Program technical baseline framework for future space systems

    Science.gov (United States)

    Nguyen, Tien M.; Guillen, Andy T.; Matsunaga, Sumner S.

    2015-05-01

    The recent Better Buying Power (BBP) initiative for improving DoD's effectiveness in developing complex systems includes "Owning the Technical Baseline" (OTB). This paper presents an innovative approach for the development of a "Resilient Program" Technical Baseline Framework (PTBF). The framework provides a recipe for generating the "Resilient Program" Technical Baseline (PTB) components using the Integrated Program Management (IPM) approach to integrate Key Program Elements (KPEs) with System Engineering (SE) processes/tools, acquisition policy/processes/tools, cost and schedule estimating tools, DOD Architecture Framework (DODAF) processes/tools, Open System Architecture (OSA) processes/tools, risk management processes/tools, the Critical Chain Program Management (CCPM) process, and Earned Value Management System (EVMS) processes/tools. The proposed resilient framework includes a matrix that maps the required tools/processes to technical features of a comprehensive reference U.S. DOD "owned" technical baseline. The Resilient PTBF employs a new Open System Approach (OSAP) combining the existing OSA and NOA (Naval Open Architecture) frameworks, supplemented by additional proposed OA (Open Architecture) principles. The new OSAP being recommended to SMC (Space and Missile Systems Center), presented in this paper, is referred to as SMC-OSAP. The Resilient PTBF and SMC-OSAP conform to the U.S. DOD Acquisition System (DAS), Joint Capabilities Integration and Development System (JCIDS), and DODAF processes. The paper also extends Ref. 21 on the "Program Resiliency" concept by describing how the new OSAP can be used to align SMC acquisition management with DOD BBP 3.0 and SMC's vision for resilient acquisition and sustainment efforts.

  2. Biodiesel Emissions Analysis Program

    Science.gov (United States)

    Using existing data, the EPA's biodiesel emissions analysis program sought to quantify the air pollution emission effects of biodiesel for diesel engines that have not been specifically modified to operate on biodiesel.

  3. An Extensible Model and Analysis Framework

    Science.gov (United States)

    2010-11-01

    of a pre-existing, open-source modeling and analysis framework known as Ptolemy II (http://ptolemy.org). The University of California, Berkeley...worked with the Air Force Research Laboratory, Rome Research Site on adapting Ptolemy II for modeling and simulation of large scale dynamics of Political...capabilities were prototyped in Ptolemy II and delivered via version control and software releases. Each of these capabilities specifically supports one or

  4. The 7th Framework Program of the EU

    International Nuclear Information System (INIS)

    Gonzalez, E. M.; Serrano, J. A.

    2007-01-01

    The Framework Program is the principal community initiative for fostering and supporting R and D in the European Union. Its main goal is to improve competitiveness, fundamentally by financing research, technological development, demonstration and innovation activities carried out through transnational collaboration between research institutes and firms belonging both to European Union countries and to states affiliated as third countries. In addition, it provides financial support for the enhancement and coordination of European research infrastructures, the promotion and training of research personnel, basic research and, particularly as of the current 7th Framework Program, the coordination of national R and D programs and the implementation of European technology platforms (PTEs), which have been conceived to promote strategic research agendas in key sectors with the cooperation of all the involved players. In the wake of the PTEs, different platforms have been implemented at the national level which are very active in different sectors. (Authors)

  5. The LHC Post Mortem Analysis Framework

    CERN Document Server

    Andreassen, O O; Castaneda, A; Gorbonosov, R; Khasbulatov, D; Reymond, H; Rijllart, A; Romera Ramirez, I; Trofimov, N; Zerlauth, M

    2010-01-01

    The LHC with its unprecedented complexity and criticality of beam operation will need thorough analysis of data taken from systems such as power converters, interlocks and beam instrumentation during events like magnet quenches and beam loss. The causes of beam aborts or in the worst case equipment damage have to be revealed to improve operational procedures and protection systems. The correct functioning of the protection systems with their required redundancy has to be verified after each such event. Post mortem analysis software for the control room has been prepared with automated analysis packages in view of the large number of systems and data volume. This paper recalls the requirements for the LHC Beam Post Mortem System (PM) and the necessity for highly reliable data collection. It describes in detail the redundant architecture for data collection as well as the chosen implementation of a multi-level analysis framework, allowing for automated analysis and qualification of a beam dump event based on ex...

  6. Structural Analysis in a Conceptual Design Framework

    Science.gov (United States)

    Padula, Sharon L.; Robinson, Jay H.; Eldred, Lloyd B.

    2012-01-01

    Supersonic aircraft designers must shape the outer mold line of the aircraft to improve multiple objectives, such as mission performance, cruise efficiency, and sonic-boom signatures. Conceptual designers have demonstrated an ability to assess these objectives for a large number of candidate designs. Other critical objectives and constraints, such as weight, fuel volume, aeroelastic effects, and structural soundness, are more difficult to address during the conceptual design process. The present research adds both static structural analysis and sizing to an existing conceptual design framework. The ultimate goal is to include structural analysis in the multidisciplinary optimization of a supersonic aircraft. Progress towards that goal is discussed and demonstrated.

  7. Framework for analysis of guaranteed QOS systems

    Science.gov (United States)

    Chaudhry, Shailender; Choudhary, Alok

    1997-01-01

    Multimedia data is isochronous in nature and entails managing and delivering high volumes of data. Multiprocessors, with their large processing power, vast memory, and fast interconnects, are an ideal candidate for the implementation of multimedia applications. Initially, multiprocessors were designed to execute scientific programs, and thus their architecture was optimized to provide low message latency and efficiently support regular communication patterns. Hence, they have a regular network topology and most use wormhole routing. The design offers the benefits of a simple router, small buffer size, and network latency that is almost independent of path length. Among the various multimedia applications, a video-on-demand (VOD) server is well suited for implementation on parallel multiprocessors. Logical models for VOD servers are presently mapped onto multiprocessors. Our paper provides a framework for calculating bounds on the utilization of system resources with which QoS parameters for each isochronous stream can be guaranteed. The effects of the architecture of multiprocessors, and the efficiency of various logical models and mappings onto particular architectures, can be investigated within our framework. Our framework is based on rigorous proofs and provides tight bounds. The results obtained may be used as the basis for admission control tests. To illustrate the versatility of our framework, we provide bounds on utilization for various logical models applied to mesh-connected architectures for a video-on-demand server. Our results show that wormhole routing can lead to packets waiting for transmission of other packets that apparently share no common resources. This situation is analogous to head-of-the-line blocking. We find that the provision of multiple VCs per link and multiple flit buffers improves utilization (even under guaranteed QoS parameters). This is analogous to parallel iterative matching.
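
    An admission control test of the kind the abstract describes can be sketched as follows. The 0.8 utilization bound and the rates below are placeholders for illustration, not one of the paper's derived bounds, which depend on the logical model and the mesh architecture.

    ```python
    # Hedged sketch of a utilization-based admission control test: a new
    # isochronous stream is admitted only if total reserved bandwidth stays
    # within a utilization bound under which QoS can be guaranteed.

    def admit(existing_rates_mbps, new_rate_mbps, link_capacity_mbps, bound=0.8):
        total = sum(existing_rates_mbps) + new_rate_mbps
        return total / link_capacity_mbps <= bound

    # a 100 Mb/s link with 60 Mb/s already reserved accepts a 15 Mb/s stream
    # under a 0.8 bound (75% utilization), but rejects a 25 Mb/s one (85%)
    ok = admit([30, 30], 15, 100)
    rejected = admit([30, 30], 25, 100)
    ```

    The framework's contribution is precisely in deriving how tight such a bound can be made for a given topology and routing scheme while still guaranteeing each stream's QoS.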

  8. Development Context Driven Change Awareness and Analysis Framework

    Science.gov (United States)

    Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian

    2014-01-01

    Recent work on workspace monitoring allows conflict prediction early in the development process; however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. Various client applications, such as task prioritization, early conflict detection, and providing advice on testing, can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.

  9. ComPWA: A common amplitude analysis framework for PANDA

    International Nuclear Information System (INIS)

    Michel, M; Feldbauer, F; Götzen, K; Jasinski, P; Peters, K; Fritsch, M; Karavdina, A

    2014-01-01

    A large part of the physics program of the PANDA experiment at FAIR deals with the search for new conventional and exotic hadronic states, such as hybrids and glueballs. For many analyses PANDA will need an amplitude analysis, e.g. a partial wave analysis (PWA), to identify possible candidates and for the classification of known states. Therefore, a new, agile and efficient amplitude analysis framework, ComPWA, is under development. It is modularized to provide easy extension with models and formalisms as well as fitting of multiple datasets, even from different experiments. Experience from existing PWA programs was used to define the requirements of the framework and to keep it free of their restrictions. It will provide standard estimation and optimization routines such as Minuit2 and the Geneva library, and will be open to the insertion of additional ones. The challenges involve parallelization, fitting with a high number of free parameters, managing complex meta-fits and quality assurance / comparability of fits. To test and develop the software, it will be used with data from running experiments like BaBar or BESIII. These proceedings show the status of the framework implementation as well as first test results.

  10. ComPWA: A common amplitude analysis framework for PANDA

    Science.gov (United States)

    Michel, M.; Feldbauer, F.; Götzen, K.; Jasinski, P.; Karavdina, A.; Peters, K.; Fritsch, M.

    2014-06-01

    A large part of the physics program of the PANDA experiment at FAIR deals with the search for new conventional and exotic hadronic states, such as hybrids and glueballs. For many analyses PANDA will need an amplitude analysis, e.g. a partial wave analysis (PWA), to identify possible candidates and for the classification of known states. Therefore, a new, agile and efficient amplitude analysis framework, ComPWA, is under development. It is modularized to provide easy extension with models and formalisms as well as fitting of multiple datasets, even from different experiments. Experience from existing PWA programs was used to define the requirements of the framework and to keep it free of their restrictions. It will provide standard estimation and optimization routines such as Minuit2 and the Geneva library, and will be open to the insertion of additional ones. The challenges involve parallelization, fitting with a high number of free parameters, managing complex meta-fits and quality assurance / comparability of fits. To test and develop the software, it will be used with data from running experiments like BaBar or BESIII. These proceedings show the status of the framework implementation as well as first test results.

  11. The Measurand Framework: Scaling Exploratory Data Analysis

    Science.gov (United States)

    Schneider, D.; MacLean, L. S.; Kappler, K. N.; Bleier, T.

    2017-12-01

    Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of Earth's time-varying magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. In order to analyze this sizable dataset, QF has developed an analytical framework to support processing of the time series input data and hypothesis testing to evaluate the statistical significance of potential precursory signals. The framework was developed to support legacy, in-house processing, but with an eye towards big-data processing with Apache Spark and other modern big-data technologies. In this presentation, we describe our framework, which supports rapid experimentation and iteration of candidate signal processing techniques via modular data transformation stages, tracking of provenance, and automatic re-computation of downstream data when upstream data is updated. Furthermore, we discuss how the processing modules can be ported to big-data platforms like Apache Spark and demonstrate a migration path from local, in-house processing to cloud-friendly processing.
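
    The core mechanism described, modular transformation stages with automatic re-computation of downstream data when upstream data changes, can be sketched in miniature. The stage names and version-tracking scheme below are invented for illustration; they are not the actual Measurand API.

    ```python
    # Toy pipeline of chained transformation stages (hypothetical names):
    # each stage caches its result and recomputes only when its upstream
    # stage's data version has changed.

    class Stage:
        def __init__(self, name, func, upstream=None):
            self.name, self.func, self.upstream = name, func, upstream
            self._cache, self._version = None, -1

        def set_data(self, data):
            # source stages get their data set directly; bump the version
            self._cache, self._version = data, self._version + 1

        def value(self):
            if self.upstream is None:
                return self._cache, self._version
            up_data, up_version = self.upstream.value()
            if up_version != self._version:  # upstream changed: recompute
                self._cache, self._version = self.func(up_data), up_version
            return self._cache, self._version

    raw = Stage("raw", None)
    detrended = Stage("detrend",
                      lambda xs: [x - sum(xs) / len(xs) for x in xs],
                      upstream=raw)

    raw.set_data([1.0, 2.0, 3.0])
    first, _ = detrended.value()     # computed from the first upload
    raw.set_data([2.0, 4.0, 6.0])    # upstream update
    second, _ = detrended.value()    # downstream recomputes automatically
    ```

    Version numbers here stand in for the provenance tracking the abstract mentions; a real system would also record which code version produced each cached result.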

  12. A flexible framework for secure and efficient program obfuscation.

    Energy Technology Data Exchange (ETDEWEB)

    Solis, John Hector

    2013-03-01

    In this paper, we present a modular framework for constructing a secure and efficient program obfuscation scheme. Our approach, inspired by the obfuscation with respect to oracle machines model of [4], retains an interactive online protocol with an oracle, but relaxes the original computational and storage restrictions. We argue this is reasonable given the computational resources of modern personal devices. Furthermore, we relax the information-theoretic security requirement to computational security in order to utilize established cryptographic primitives. With this additional flexibility we are free to explore different cryptographic building blocks. Our approach combines authenticated encryption with private information retrieval to construct a secure program obfuscation framework. We give a formal specification of our framework, based on desired functionality and security properties, and provide an example instantiation. In particular, we implement AES in Galois/Counter Mode for authenticated encryption and the Gentry-Ramzan [13] constant communication-rate private information retrieval scheme. We present our implementation results and show that non-trivial sized programs can be realized, but scalability is quickly limited by computational overhead. Finally, we include a discussion on security considerations when instantiating specific modules.
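    The authenticated-encryption role that AES-GCM plays in such a framework can be illustrated with standard-library primitives only. This is a toy sketch: a SHA-256 counter-mode keystream stands in for AES (which the Python standard library does not provide), and encrypt-then-MAC stands in for GCM's integrated authentication. It is illustration, not the paper's implementation, and not production cryptography.

```python
import hashlib, hmac, os

def keystream(key, nonce, n):
    """Toy keystream: SHA-256 in counter mode, a stand-in for AES."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def seal(enc_key, mac_key, nonce, block):
    """Encrypt-then-MAC: the authenticated-encryption role AES-GCM plays."""
    ct = bytes(a ^ b for a, b in zip(block, keystream(enc_key, nonce, len(block))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return ct, tag

def open_sealed(enc_key, mac_key, nonce, ct, tag):
    """Verify the tag in constant time, then decrypt; reject any tampering."""
    expect = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("authentication failed")
    return bytes(a ^ b for a, b in zip(ct, keystream(enc_key, nonce, len(ct))))

# the obfuscated "program" is stored as sealed blocks; the oracle releases
# one block per request, analogous to the PIR-style lookup in the framework
enc_key, mac_key, nonce = os.urandom(32), os.urandom(32), os.urandom(12)
block = b"MOV R0, #42"
ct, tag = seal(enc_key, mac_key, nonce, block)
recovered = open_sealed(enc_key, mac_key, nonce, ct, tag)
```

    The point of the sketch is the modular boundary: the sealing primitive can be replaced (e.g. by real AES-GCM) without touching the retrieval protocol around it.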

  13. Aura: A Multi-Featured Programming Framework in Python

    Directory of Open Access Journals (Sweden)

    2010-09-01

    This paper puts forward the design, programming and application of innovative educational software, ‘Aura’, made using Python and the PyQt Python bindings. The paper presents a new concept of using a single tool to relate the syntaxes of various programming languages and algorithms. This radically increases students' understanding and retention, since they can correlate between many programming languages. The software is an unorthodox attempt to help students who are having their first encounter with programming languages. The application is designed to help students understand how algorithms work and thus help them learn multiple programming languages on a single platform using an interactive graphical user interface. This paper elucidates how, using Python and PyQt bindings, a comprehensive, feature-rich application that implements an interactive algorithm building technique, a web browser, a multiple programming language framework, a code generator and a real-time code sharing hub can be embedded into a single interface. It also explains that, with Python as the building tool, the application requires much less coding than conventional feature-rich applications written in other programming languages, while not compromising the stability, interoperability and robustness of the application.

  14. High Speed Simulation Framework for Reliable Logic Programs

    International Nuclear Information System (INIS)

    Lee, Wan-Bok; Kim, Seog-Ju

    2006-01-01

    This paper shows a case study of designing a PLC logic simulator that was developed to simulate and verify PLC control programs for nuclear plant systems. A nuclear control system is subject to stricter restrictions than a normal process control system, since it operates in nuclear power plants that require high reliability under severe environments. One restriction is the safety of the control programs, which can be assured by rigorous testing. Another restriction is the simulation speed of the control programs, which must be fast enough to control multiple devices concurrently in real time. To cope with these restrictions, we devised a logic compiler which generates C-code programs from given PLC logic programs. Once a logic program is translated into C code, the program can be analyzed by conventional software analysis tools and used, after cross-compiling, to construct a fast logic simulator; this is, in effect, a form of compiled-code simulation.
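    The logic-compiler step, translating PLC-style boolean logic into C for compiled-code simulation, can be sketched like this. It is a hypothetical miniature: the rung syntax, function names, and naive token replacement are invented for illustration (a real compiler would parse the logic properly).

```python
def compile_logic_to_c(rungs):
    """Translate simple boolean rungs into a C scan-cycle function.
    Each rung is (output, expression) using AND/OR/NOT over named inputs.
    Token replacement is deliberately naive; a real compiler would parse."""
    inputs = sorted({t for _, expr in rungs for t in expr.split()
                     if t not in ("AND", "OR", "NOT")})
    args = ", ".join(f"bool {i}" for i in inputs)
    outs = ", ".join(f"bool *{o}" for o, _ in rungs)
    lines = ["#include <stdbool.h>", "", f"void scan_cycle({args}, {outs}) {{"]
    for out, expr in rungs:
        c_expr = expr.replace("AND", "&&").replace("OR", "||").replace("NOT", "!")
        lines.append(f"    *{out} = {c_expr};")
    lines.append("}")
    return "\n".join(lines)

# one rung: the pump runs when the level switch is on and no alarm is active
c_code = compile_logic_to_c([("pump", "level_ok AND NOT alarm")])
```

    The generated C can then be checked with conventional software analysis tools and cross-compiled into a fast simulator, as the abstract describes; each scan cycle becomes straight-line native code.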

  15. The Event Coordination Notation: Execution Engine and Programming Framework

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2012-01-01

    ECNO (Event Coordination Notation) is a notation for modelling the behaviour of a software system on top of some object-oriented data model. ECNO has two main objectives: on the one hand, ECNO should allow modelling the behaviour of a system on the domain level; on the other hand, it should be possible to completely generate code from ECNO and the underlying object-oriented domain models. Today, there are several approaches that would allow to do this, but most of them would require that the data models and the behaviour models use the same technology, and the code is generated together with code that was written manually. In this paper, we rephrase the main concepts of ECNO. The focus of this paper, however, is on the architecture of the ECNO execution engine and its programming framework. We will show how this framework allows us to integrate ECNO with object-oriented models and how it works without any…

  16. Design and Analysis of Web Application Frameworks

    DEFF Research Database (Denmark)

    Schwarz, Mathias Romme

    Numerous web application frameworks have been developed in recent years. These frameworks enable programmers to reuse common components and to avoid typical pitfalls in web application development. Although such frameworks help the programmer to avoid many common errors, we find that there are important, common errors that remain unhandled by web application frameworks. Guided by a survey of common web application errors and of web application frameworks, we identify the need for techniques to help the programmer avoid HTML invalidity and security vulnerabilities, in particular client-state manipulation vulnerabilities. The hypothesis of this dissertation is that we can design frameworks and static analyses that aid the programmer to avoid such errors. First, we present the JWIG web application framework for writing secure and maintainable web applications. We discuss how this framework solves…

  17. Evaluation and Policy Analysis: A Communicative Framework

    Directory of Open Access Journals (Sweden)

    Cynthia Wallat

    1997-07-01

    A major challenge for the next generation of students of human development is to help shape the paradigms by which we analyze and evaluate public policies for children and families. Advocates of building research and policy connections point to health care and stress experiences across home, school, and community as critical policy issues that expand the scope of contexts and outcomes studied. At a minimum, development researchers and practitioners will need to be well versed in available methods of inquiry; they will need to be "methodologically multilingual" when conducting evaluation and policy analysis, producing reports, and reporting their interpretations to consumer and policy audiences. This article suggests how traditional approaches to policy inquiry can be reconsidered in light of these research inquiry and communicative skills needed by all policy researchers. A fifteen year review of both policy and discourse processes research is presented to suggest ways to conduct policy studies within a communicative framework.

  18. An Analysis of Massachusetts Department of Elementary and Secondary Education Vocational Technical Education Framework for Culinary Arts and Its Effectiveness on Students Enrolled in Post-Secondary Culinary Programs

    Science.gov (United States)

    D'Addario, Albert S.

    2011-01-01

    This field-based action research practicum investigated how students who have completed culinary training programs in Massachusetts public secondary schools perform in post-secondary coursework. The Department of Elementary and Secondary Education has developed the Vocational Technical Education (VTE) Framework for Culinary Arts that outlines…

  19. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    Directory of Open Access Journals (Sweden)

    R. Schmidt

    2012-08-01

    The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of the different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high resolution DTM is showcased.

  20. Initial Risk Analysis and Decision Making Framework

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
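    The core mechanism described above, propagating uncertainty in technical and financial factors into variability of expected net returns, can be sketched as a small Monte Carlo model. All distributions, magnitudes, and the toy revenue factors below are invented for illustration and are not CCSI values.

```python
import random
import statistics

def retrofit_return_profile(capex_mu_sd, penalty_mu_sd, price_mu_sd,
                            draws=20000, seed=7):
    """Propagate uncertain technical/financial factors into a distribution
    of net returns; risk is reported as variability of those returns."""
    rng = random.Random(seed)
    returns = []
    for _ in range(draws):
        capex = rng.gauss(*capex_mu_sd)        # $M capital cost of retrofit
        penalty = rng.gauss(*penalty_mu_sd)    # fraction of plant output lost
        price = rng.gauss(*price_mu_sd)        # $/tonne CO2 avoided
        revenue = price * 3.0                  # $M/yr (toy conversion factor)
        lost_sales = penalty * 650 * 0.35      # $M/yr (toy factor, 650 MW plant)
        returns.append(10 * (revenue - lost_sales) - capex)  # 10-yr, undiscounted
    return statistics.mean(returns), statistics.stdev(returns)

expected_net, risk = retrofit_return_profile(
    capex_mu_sd=(400, 50), penalty_mu_sd=(0.25, 0.03), price_mu_sd=(45, 10))
```

    Adding a new risk factor, as the abstract proposes, amounts to drawing one more uncertain input per iteration, which is what makes the prototype extensible.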

  1. A Hybrid Programming Framework for Modeling and Solving Constraint Satisfaction and Optimization Problems

    OpenAIRE

    Paweł Sitek; Jarosław Wikarek

    2016-01-01

    This paper proposes a hybrid programming framework for modeling and solving constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with an original method of problem transformation, used in the framework as a presolving step. The transformation substantially reduces the feasible solution space. The framework a...

  2. Mississippi Curriculum Framework for Welding and Cutting Programs (Program CIP: 48.0508--Welder/Welding Technologist). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the welding and cutting programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and…

  3. Mississippi Curriculum Framework for Banking & Finance Technology (Program CIP: 52.0803--Banking and Related Financial Programs, Other). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the banking and finance technology program. Presented in the introduction are a program description and suggested course sequence. Section I is a curriculum guide consisting of outlines for…

  4. A Robust Actin Filaments Image Analysis Framework.

    Directory of Open Access Journals (Sweden)

    Mitchel Alioscha-Perez

    2016-08-01

    The cytoskeleton is a highly dynamical protein network that plays a central role in numerous cellular physiological processes, and is traditionally divided into three components according to its chemical composition, i.e. the actin, tubulin and intermediate filament cytoskeletons. Understanding cytoskeleton dynamics is of prime importance to unveil mechanisms involved in cell adaptation to any type of stress. Fluorescence imaging of cytoskeleton structures allows analyzing the impact of mechanical stimulation on the cytoskeleton, but it also imposes additional challenges in the image processing stage, such as the presence of imaging-related artifacts and the heavy blurring introduced by (high-throughput) automated scans. However, although there exists a considerable number of image-based analytical tools to address the image processing and analysis, most of them are unfit to cope with the aforementioned challenges. Filamentous structures in images can be considered as a piecewise composition of quasi-straight segments (at least at some finer or coarser scale). Based on this observation, we propose a three-step actin filament extraction methodology: (i) first the input image is decomposed into a 'cartoon' part corresponding to the filament structures in the image and a noise/texture part; (ii) on the 'cartoon' image, we apply a multi-scale line detector coupled with (iii) a quasi-straight filaments merging algorithm for fiber extraction. The proposed robust actin filaments image analysis framework allows extracting individual filaments in the presence of noise, artifacts and heavy blurring. Moreover, it provides numerous parameters such as filament orientation, position and length, useful for further analysis. Cell image decomposition is relatively under-exploited in biological image processing, and our study shows the benefits it provides when addressing such tasks. Experimental validation was conducted using publicly available datasets, and in…
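    Step (iii), merging quasi-straight segments into filaments, can be sketched in Python. This is a simplified greedy version with invented thresholds, not the paper's algorithm: orientation agreement plus endpoint proximity decide whether two segments belong to the same filament.

```python
import math

def orientation(seg):
    """Undirected angle of a segment, folded into [0, pi)."""
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1) % math.pi

def mergeable(a, b, max_gap=2.0, max_dangle=0.2):
    """Segments merge if their orientations agree (within max_dangle radians)
    and some pair of endpoints lies within max_gap of each other."""
    d = abs(orientation(a) - orientation(b))
    if min(d, math.pi - d) > max_dangle:
        return False
    return min(math.dist(p, q) for p in a for q in b) <= max_gap

def merge_segments(segments, **kw):
    """Greedy grouping of quasi-straight segments into filaments."""
    filaments = []
    for seg in segments:
        for fil in filaments:
            if any(mergeable(seg, other, **kw) for other in fil):
                fil.append(seg)
                break
        else:
            filaments.append([seg])
    return filaments

segs = [((0, 0), (10, 1)), ((11, 1), (20, 2)),   # nearly collinear, small gap
        ((0, 10), (10, 10))]                     # a separate horizontal fiber
filaments = merge_segments(segs)                 # two filaments expected
```

    From each merged group, aggregate properties such as total length and mean orientation, the per-filament parameters the abstract mentions, follow directly.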

  5. The international framework for safeguarding peaceful nuclear energy programs

    International Nuclear Information System (INIS)

    Mazer, B.M.

    1980-01-01

    International law, in response to the need for safeguard assurances, has provided a framework which can be utilized by supplier and recipient states. Multilateral treaties have created the International Atomic Energy Agency, which can serve a vital role in the establishment and supervision of safeguard agreements for nuclear energy programs. The Non-Proliferation Treaty has created definite obligations on nuclear-weapon and non-nuclear-weapon states to alleviate some possibilities of proliferation and has rejuvenated the function of the IAEA in providing safeguards, especially to non-nuclear-weapon states which are parties to the Non-Proliferation Treaty. States which are not parties to the Non-Proliferation Treaty may receive nuclear energy co-operation subject to IAEA safeguards. States like Canada have insisted through bilateral nuclear energy co-operation agreements that either individual or joint agreement be reached with the IAEA for the application of safeguards. Trilateral treaties among Canada, the recipient state and the IAEA have been employed and can provide the necessary assurances against the diversion of peaceful nuclear energy programs to military or non-peaceful uses. The advent of the Nuclear Suppliers Group and its guidelines has definitely advanced the cause of ensuring peaceful uses of nuclear energy. The ultimate objective should be the creation of an international structure incorporating the application of the most comprehensive safeguards which will be applied universally to all nuclear energy programs.

  6. Analysis of Implicit Uncertain Systems. Part 1: Theoretical Framework

    National Research Council Canada - National Science Library

    Paganini, Fernando; Doyle, John

    1994-01-01

    This paper introduces a general and powerful framework for the analysis of uncertain systems, encompassing linear fractional transformations, the behavioral approach for system theory and the integral...

  7. A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications

    Science.gov (United States)

    Llinas, James

    This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all descriptions, including artificial intelligence processes and techniques, could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.

  8. Environmental conditions analysis program

    International Nuclear Information System (INIS)

    Holten, J.

    1991-01-01

    The PC-based program discussed in this paper has the capability of determining the steady-state temperatures of environmental zones (rooms). A program overview will be provided along with examples of formula use. Required input to and output from the program will also be discussed. Specific applications of plant-monitored temperatures and utilization of the program will be offered. The presentation will show how the program can project individual room temperature profiles without continuous temperature monitoring of equipment. A discussion will also be provided of the application of the program-generated data. Evaluations of anticipated or planned plant modifications and the use of the subject program will also be covered.

  9. Program Theory Evaluation: Logic Analysis

    Science.gov (United States)

    Brousselle, Astrid; Champagne, Francois

    2011-01-01

    Program theory evaluation, which has grown in use over the past 10 years, assesses whether a program is designed in such a way that it can achieve its intended outcomes. This article describes a particular type of program theory evaluation--logic analysis--that allows us to test the plausibility of a program's theory using scientific knowledge.…

  10. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2012-01-01

    This text provides an excellent bridge between principal theories and concepts and their practical implementation. Topics include convex programming, duality, generalized convexity, analysis of selected nonlinear programs, techniques for numerical solutions, and unconstrained optimization methods.

  11. The Army's Occupational Analysis Program

    National Research Council Canada - National Science Library

    1996-01-01

    .... The OA Program is to be the Army's center of excellence for job analysis and design. The program is in a transition period, adapting its procedures and methods to meet the needs of today's fast-paced Army...

  12. A Simulation Modeling Framework to Optimize Programs Using Financial Incentives to Motivate Health Behavior Change.

    Science.gov (United States)

    Basu, Sanjay; Kiernan, Michaela

    2016-01-01

    While increasingly popular among mid- to large-size employers, using financial incentives to induce health behavior change among employees has been controversial, in part due to poor quality and generalizability of studies to date. Thus, fundamental questions have been left unanswered: To generate positive economic returns on investment, what level of incentive should be offered for any given type of incentive program and among which employees? We constructed a novel modeling framework that systematically identifies how to optimize marginal return on investment from programs incentivizing behavior change by integrating commonly collected data on health behaviors and associated costs. We integrated "demand curves" capturing individual differences in response to any given incentive with employee demographic and risk factor data. We also estimated the degree of self-selection that could be tolerated: that is, the maximum percentage of already-healthy employees who could enroll in a wellness program while still maintaining positive absolute return on investment. In a demonstration analysis, the modeling framework was applied to data from 3000 worksite physical activity programs across the nation. For physical activity programs, the incentive levels that would optimize marginal return on investment ($367/employee/year) were higher than average incentive levels currently offered ($143/employee/year). Yet a high degree of self-selection could undermine the economic benefits of the program; if more than 17% of participants came from the top 10% of the physical activity distribution, the cost of the program would be expected to always be greater than its benefits. Our generalizable framework integrates individual differences in behavior and risk to systematically estimate the incentive level that optimizes marginal return on investment. © The Author(s) 2015.
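    The core optimization the framework performs, choosing the incentive level that maximizes net return given a saturating enrollment "demand curve", can be sketched as follows. The demand-curve shape, sensitivity, and dollar figures below are hypothetical, not the paper's estimates.

```python
import math

def enrollment_rate(incentive, sensitivity=0.004):
    """Hypothetical demand curve: enrollment probability rises with the
    incentive but saturates, capturing diminishing marginal response."""
    return 1.0 - math.exp(-sensitivity * incentive)

def net_return(incentive, employees=1000, benefit_per_participant=600.0):
    """Annual health-cost savings from participants minus incentives paid."""
    participants = employees * enrollment_rate(incentive)
    return participants * (benefit_per_participant - incentive)

def optimal_incentive(lo=0, hi=600):
    """Grid search for the incentive level that maximizes net return."""
    return max(range(lo, hi + 1), key=net_return)

best = optimal_incentive()   # a mid-range value: a low incentive recruits
                             # few participants, a high one pays away the benefit
```

    The paper's richer version integrates individual differences and self-selection into the same objective, but the interior-optimum structure, neither the cheapest nor the largest incentive maximizes return, is the point of the sketch.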

  13. A Process Evaluation Framework for a Long-Term Care Garden Program

    OpenAIRE

    Willson, Brittany Nicole

    2016-01-01

    The purpose of this project is to develop a logic model and process evaluation framework for a therapeutic garden program at Banfield Pavilion, a residential long term care facility located in Vancouver, B.C. A six-step process evaluation design developed by Saunders, Evans, and Joshi (2005) is used to develop the evaluation framework. Steps of the framework include (1) describing the program, (2) describing complete and acceptable delivery of the program, (3) developing the potential list of...

  14. Hydrogeochemical framework and factor analysis of fluoride ...

    African Journals Online (AJOL)

    Fluoride contamination of groundwater within the Savelugu-Nanton District was assessed using a hydrogeochemical framework and a multivariate statistical approach. Eighty-one (81) boreholes were sampled for quality assessment in May and June 2008. The main objective of this study was to assess the fluoride levels in ...

  15. Linux Incident Response Volatile Data Analysis Framework

    Science.gov (United States)

    McFadden, Matthew

    2013-01-01

    Cyber incident response is an emphasized subject area in cybersecurity in information technology with increased need for the protection of data. Due to ongoing threats, cybersecurity imposes many challenges and requires new investigative response techniques. In this study a Linux Incident Response Framework is designed for collecting volatile data…

  16. Framework for Analysis of Mitigation in Courts

    Science.gov (United States)

    2005-01-01

    …courtroom constructions of social identity from the perspective of the defendant (Semiotica 71, 261-284; Andenaes, Johannes, 1968, The legal framework)… theory of communication, especially mitigation and issues related to the trial as a social activity. For instance, non-turn-taking confirmations by… credibility, and guilt issues, but directed to social face-work, whereas mitigation in juridical discourse also has relevance to the defense or to the case…

  17. A multilevel evolutionary framework for sustainability analysis

    Directory of Open Access Journals (Sweden)

    Timothy M. Waring

    2015-06-01

    Sustainability theory can help achieve desirable social-ecological states by generalizing lessons across contexts and improving the design of sustainability interventions. To accomplish these goals, we argue that theory in sustainability science must (1) explain the emergence and persistence of social-ecological states, (2) account for endogenous cultural change, (3) incorporate cooperation dynamics, and (4) address the complexities of multilevel social-ecological interactions. We suggest that cultural evolutionary theory broadly, and cultural multilevel selection in particular, can improve on these fronts. We outline a multilevel evolutionary framework for describing social-ecological change and detail how multilevel cooperative dynamics can determine outcomes in environmental dilemmas. We show how this framework complements existing sustainability frameworks with a description of the emergence and persistence of sustainable institutions and behavior, a means to generalize causal patterns across social-ecological contexts, and a heuristic for designing and evaluating effective sustainability interventions. We support these assertions with case examples from developed and developing countries in which we track cooperative change at multiple levels of social organization as they impact social-ecological outcomes. Finally, we make suggestions for further theoretical development, empirical testing, and application.

  18. A Practical Iterative Framework for Qualitative Data Analysis

    OpenAIRE

    Prachi Srivastava DPhil; Nick Hopwood DPhil

    2009-01-01

    The role of iteration in qualitative data analysis, not as a repetitive mechanical task but as a reflexive process, is key to sparking insight and developing meaning. In this paper the authors present a simple framework for qualitative data analysis comprising three iterative questions. They developed it to analyze qualitative data and to engage with the process of continuous meaning-making and progressive focusing inherent to analysis processes. They briefly present the framework and...

  19. The SBIRT program matrix: a conceptual framework for program implementation and evaluation.

    Science.gov (United States)

    Del Boca, Frances K; McRee, Bonnie; Vendetti, Janice; Damon, Donna

    2017-02-01

    Screening, Brief Intervention and Referral to Treatment (SBIRT) is a comprehensive, integrated, public health approach to the delivery of services to those at risk for the adverse consequences of alcohol and other drug use, and for those with probable substance use disorders. Research on successful SBIRT implementation has lagged behind studies of efficacy and effectiveness. This paper (1) outlines a conceptual framework, the SBIRT Program Matrix, to guide implementation research and program evaluation and (2) specifies potential implementation outcomes. Overview and narrative description of the SBIRT Program Matrix. The SBIRT Program Matrix has five components, each of which includes multiple elements: SBIRT services; performance sites; provider attributes; patient/client populations; and management structure and activities. Implementation outcomes include program adoption, acceptability, appropriateness, feasibility, fidelity, costs, penetration, sustainability, service provision and grant compliance. The Screening, Brief Intervention and Referral to Treatment Program Matrix provides a template for identifying, classifying and organizing the naturally occurring commonalities and variations within and across SBIRT programs, and for investigating which variables are associated with implementation success and, ultimately, with treatment outcomes and other impacts. © 2017 Society for the Study of Addiction.

  20. Environmental risk analysis for nanomaterials: Review and evaluation of frameworks

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    2012-01-01

    In response to the challenges of conducting traditional human health and ecological risk assessment for nanomaterials (NM), a number of alternative frameworks have been proposed for NM risk analysis. This paper evaluates various risk analysis frameworks proposed for NM based on a number of criteria… Most of the proposed frameworks are geared to occupational settings with minor environmental considerations, and most have not been thoroughly tested on a wide range of NM. Care should also be taken when selecting the most appropriate risk analysis strategy for a given risk context. Given this, we recommend a multi-faceted approach to assess the environmental risks of NM as well as increased applications and testing of the proposed frameworks for different NM.

  1. Hierarchical Scheduling Framework Based on Compositional Analysis Using Uppaal

    DEFF Research Database (Denmark)

    Boudjadar, Jalil; David, Alexandre; Kim, Jin Hyun

    2014-01-01

    …, which are some of the inputs for the parameterized timed automata that make up the framework. Components may have different scheduling policies, and each component is analyzed independently using Uppaal. We have applied our framework to the schedulability analysis of an avionics system.

  2. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a ...

  3. Framework of awareness : For the analysis of ergonomics in design

    NARCIS (Netherlands)

    Vieira, S.; Badke-Schaub, P.G.; Fernandes, A.

    2015-01-01

    The present paper introduces the Framework of Awareness to the analysis of ergonomics in design. The framework is part of a doctoral research project that took the Lean Thinking perspective by adopting the concept of MUDA and its set of principles as dimensions to study designers' behaviour in…

  4. Logical Framework Analysis (LFA): An Essential Tool for Designing ...

    African Journals Online (AJOL)

    Evaluation of a project at any stage of its life cycle, especially at its planning stage, is necessary for its successful execution and completion. The Logical Framework Analysis or the Logical Framework Approach (LFA) is an essential tool in designing such evaluation because it is a process that serves as a reference guide in ...

  5. Static Analysis of Mobile Programs

    Science.gov (United States)

    2017-02-01

    …and not allowed, to do. The second issue was that a fully static analysis was never a realistic possibility, because Java, the programming language… (not justified by the test data). This idea came to define the project: use dynamic analysis to guess the correct properties at program points of interest… To scale to large programs it had to handle essentially all of the features of Java and could also be used as a general-purpose analysis engine. The…

  6. Risk and train control : a framework for analysis

    Science.gov (United States)

    2001-01-01

    This report develops and demonstrates a framework for examining the effects of various train control strategies on some of the major risks of railroad operations. Analysis of a hypothetical 1200-mile corridor identified the main factors that increase r...

  7. Geologic Framework Model Analysis Model Report

    International Nuclear Information System (INIS)

    Clayton, R.

    2000-01-01

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M and O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM).
The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and

  8. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM).
The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  9. Security Analysis of Parlay/OSA Framework

    NARCIS (Netherlands)

    Corin, R.J.; Di Caprio, G.; Etalle, Sandro; Gnesi, S.; Lenzini, Gabriele; Moiso, C.

    This paper analyzes the security of the Trust and Security Management (TSM) protocol, an authentication protocol which is part of the Parlay/OSA Application Program Interfaces (APIs). Architectures based on Parlay/OSA APIs allow third party service providers to develop new services that can access,

  10. Security Analysis of Parlay/OSA Framework

    NARCIS (Netherlands)

    Corin, R.J.; Di Caprio, G.; Etalle, Sandro; Gnesi, S.; Lenzini, Gabriele; Moiso, C.; Villain, B.

    2004-01-01

    This paper analyzes the security of the Trust and Security Management (TSM) protocol, an authentication protocol which is part of the Parlay/OSA Application Program Interfaces (APIs). Architectures based on Parlay/OSA APIs allow third party service providers to develop new services that can access,

  11. Mississippi Curriculum Framework for Horticulture Technology Cluster (Program CIP: 01.0601--Horticulture Serv. Op. & Mgmt., Gen.) (Program CIP: 01.0605--Landscaping Op. & Mgmt.). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the horticulture technology programs cluster. Presented in the introductory section are a framework of programs and courses, description of the programs, and suggested course sequences for…

  12. Program Analysis as Model Checking

    DEFF Research Database (Denmark)

    Olesen, Mads Chr.

    and abstract interpretation. Model checking views the program as a finite automaton and tries to prove logical properties over the automaton model, or present a counter-example if not possible — with a focus on precision. Abstract interpretation translates the program semantics into abstract semantics...... are considered, among others numerical analysis of C programs, and worst-case execution time analysis of ARM programs. It is shown how lattice automata allow automatic and manual tuning of the precision and efficiency of the verification procedure. In the case of worst-case execution time analysis a sound......Software programs are proliferating throughout modern life, to a point where even the simplest appliances such as lightbulbs contain software, in addition to the software embedded in cars and airplanes. The correct functioning of these programs is therefore of the utmost importance, for the quality...

  13. SPATIAL ANALYSIS FRAMEWORK FOR MANGROVE FORESTS RESTORATION

    Directory of Open Access Journals (Sweden)

    Arimatéa de Carvalho Ximenes

    2016-09-01

    Mangroves are coastal ecosystems in transition between sea and land, localized worldwide in tropical and subtropical regions. However, anthropogenic pressure in coastal areas has led to the conversion of many mangrove areas to other uses. Due to the increased awareness of the importance of mangroves worldwide, restoration methods are being studied. Our aim is to develop a framework for selecting suitable sites for red mangrove planting using Geographic Information Systems (GIS). For this reason, the methodology is based on abiotic factors that influence the zonation (distribution and growth) of Rhizophora mangle. A total suitable area of 6.12 hectares was found, where 15,300 propagules could be planted.
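The GIS-based site-selection step described above can be sketched as a simple raster overlay: mark a grid cell suitable only when every abiotic factor is within range, then total the suitable area. The factors, thresholds, and grids below are illustrative assumptions, not the study's actual criteria.

```python
# Hypothetical sketch of a suitability overlay for mangrove planting.
# Rasters are plain 2-D grids (lists of lists); factor thresholds are
# invented for illustration.

def suitability_mask(elevation, salinity, max_elev=1.5, max_sal=40.0):
    """Mark a cell suitable when every abiotic factor is within range."""
    rows, cols = len(elevation), len(elevation[0])
    return [
        [elevation[r][c] <= max_elev and salinity[r][c] <= max_sal
         for c in range(cols)]
        for r in range(rows)
    ]

def suitable_area_ha(mask, cell_area_m2=100.0):
    """Total suitable area in hectares (1 ha = 10,000 m^2)."""
    cells = sum(cell for row in mask for cell in row)
    return cells * cell_area_m2 / 10_000.0

elevation = [[0.5, 2.0], [1.0, 1.2]]      # metres above datum
salinity = [[30.0, 35.0], [45.0, 38.0]]   # practical salinity units

mask = suitability_mask(elevation, salinity)
```

In a real GIS workflow each factor would be a georeferenced raster layer, but the overlay logic is the same boolean intersection.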

  14. A Formal Framework for Workflow Analysis

    Science.gov (United States)

    Cravo, Glória

    2010-09-01

    In this paper we provide a new formal framework to model and analyse workflows. A workflow is the formal definition of a business process that consists in the execution of tasks in order to achieve a certain objective. In our work we describe a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. Each task has an associated input/output logic operator. This logic operator can be the logical AND (•), the OR (⊗), or the XOR, exclusive-or (⊕). Moreover, we introduce algebraic concepts in order to completely describe the structure of workflows. We also introduce the concept of logical termination. Finally, we provide a necessary and sufficient condition for this property to hold.
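A minimal sketch of the workflow-as-graph idea above: tasks are vertices, each with an output logic operator (here reduced to AND and XOR), and logical termination is checked in a deliberately simplified form, namely that every execution reaches the final task. The paper's algebraic treatment is richer than this toy semantics.

```python
# Simplified workflow graph with per-task output operators.
# AND fires all outgoing arcs; XOR branches into alternative executions.
AND, XOR = "AND", "XOR"

def logically_terminates(tasks, start, end):
    """Check that every execution starting at `start` reaches `end`.

    `tasks` maps a task name to (operator, successor list).
    """
    def run(frontier, seen):
        if end in frontier:
            return True
        frontier = frontier - seen   # guard against cycles
        if not frontier:
            return False             # dead end: no enabled task left
        task = next(iter(frontier))
        op, succs = tasks.get(task, (AND, []))
        rest = frontier - {task}
        if op == AND:
            return run(rest | set(succs), seen | {task})
        # XOR: each alternative branch is a separate execution,
        # and all of them must terminate.
        return all(run(rest | {s}, seen | {task}) for s in succs)
    return run({start}, set())

workflow = {
    "t1": (XOR, ["t2", "t3"]),   # t1 branches into t2 or t3
    "t2": (AND, ["t4"]),
    "t3": (AND, ["t4"]),
    "t4": (AND, []),             # final task
}
```

Replacing `t3`'s successors with an empty list makes one XOR branch a dead end, and the check correctly reports non-termination.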

  15. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    This paper describes the mathematical basis and computational framework of a computer program developed for short circuit studies of electric power systems. The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of electric ...
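The core calculation behind a balanced-fault study like the one described above can be sketched in a few lines: the symmetrical three-phase fault current at a bus is the prefault voltage divided by the Thevenin impedance seen from that bus. All values below are per-unit and purely illustrative; a full program such as SCAP would first build the bus impedance matrix and also handle unbalanced faults via symmetrical components.

```python
# Hedged sketch of the balanced three-phase fault relation I = V / Z_th,
# in per-unit quantities. Numbers are invented for illustration.

def fault_current_pu(v_prefault=1.0, z_thevenin=0.1j):
    """Balanced fault current in per-unit: I = V / Z_th."""
    return v_prefault / z_thevenin

def fault_mva(i_pu, base_mva=100.0):
    """Fault level in MVA for a given system base."""
    return abs(i_pu) * base_mva

# A purely reactive Thevenin impedance of 0.1 pu gives a fault current
# of magnitude 10 pu, lagging the voltage by 90 degrees.
i_f = fault_current_pu(1.0, 0.1j)
```

On a 100 MVA base this corresponds to a 1000 MVA fault level at the bus.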

  16. Robust Sensitivity Analysis of the Optimal Value of Linear Programming

    OpenAIRE

    Xu, Guanglin; Burer, Samuel

    2015-01-01

    We propose a framework for sensitivity analysis of linear programs (LPs) in minimization form, allowing for simultaneous perturbations in the objective coefficients and right-hand sides, where the perturbations are modeled in a compact, convex uncertainty set. This framework unifies and extends multiple approaches for LP sensitivity analysis in the literature and has close ties to worst-case linear optimization and two-stage adaptive optimization. We define the minimum (best-case) and maximum...
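The best-case/worst-case idea above can be illustrated on a tiny 2-variable LP, minimize c·x subject to Ax ≤ b, x ≥ 0, by enumerating candidate vertices and sweeping the corner points of a box uncertainty set on the right-hand side b. The corner sweep is a simplification (exact here only because the optimal value happens to be linear in b); the paper's framework handles general compact convex uncertainty sets and simultaneous objective perturbations.

```python
# Brute-force illustration of LP sensitivity analysis. All data invented.
from itertools import combinations, product

def solve_lp_2d(c, A, b):
    """Optimal value of min c.x s.t. Ax <= b, x >= 0, by vertex enumeration."""
    # Treat x1 >= 0 and x2 >= 0 as extra constraints -x_i <= 0.
    rows = [list(r) for r in A] + [[-1.0, 0.0], [0.0, -1.0]]
    rhs = list(b) + [0.0, 0.0]
    best = None
    for i, j in combinations(range(len(rows)), 2):
        a11, a12 = rows[i]
        a21, a22 = rows[j]
        det = a11 * a22 - a12 * a21
        if abs(det) < 1e-12:
            continue  # parallel constraints, no vertex
        # Cramer's rule for the intersection point of constraints i and j.
        x1 = (rhs[i] * a22 - a12 * rhs[j]) / det
        x2 = (a11 * rhs[j] - rhs[i] * a21) / det
        if all(r[0] * x1 + r[1] * x2 <= rb + 1e-9 for r, rb in zip(rows, rhs)):
            val = c[0] * x1 + c[1] * x2
            best = val if best is None else min(best, val)
    return best

def optimal_value_range(c, A, b, delta):
    """Min/max optimal value as each b_i varies in [b_i - d_i, b_i + d_i]."""
    vals = []
    for signs in product((-1, 1), repeat=len(b)):
        bb = [bi + s * di for bi, s, di in zip(b, signs, delta)]
        v = solve_lp_2d(c, A, bb)
        if v is not None:
            vals.append(v)
    return min(vals), max(vals)

c = [-1.0, -1.0]                 # i.e. maximize x1 + x2
A = [[1.0, 0.0], [0.0, 1.0]]     # x1 <= b1, x2 <= b2
b = [1.0, 1.0]
lo, hi = optimal_value_range(c, A, b, delta=[0.5, 0.5])
```

For this box-constrained example the optimal value is -(b1 + b2), so the range over the uncertainty set is [-3, -1].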

  17. Interactive cutting path analysis programs

    Science.gov (United States)

    Weiner, J. M.; Williams, D. S.; Colley, S. R.

    1975-01-01

    The operation of numerically controlled machine tools is interactively simulated. Four programs were developed to graphically display the cutting paths for a Monarch lathe, Cintimatic mill, Strippit sheet metal punch, and the wiring path for a Standard wire wrap machine. These programs run on an IMLAC PDS-1D graphic display system under the DOS-3 disk operating system. The cutting path analysis programs accept input via both paper tape and disk file.

  18. Liquid Effluents Program mission analysis

    International Nuclear Information System (INIS)

    Lowe, S.S.

    1994-01-01

    Systems engineering is being used to identify work to cleanup the Hanford Site. The systems engineering process transforms an identified mission need into a set of performance parameters and a preferred system configuration. Mission analysis is the first step in the process. Mission analysis supports early decision-making by clearly defining the program objectives, and evaluating the feasibility and risks associated with achieving those objectives. The results of the mission analysis provide a consistent basis for subsequent systems engineering work. A mission analysis was performed earlier for the overall Hanford Site. This work was continued by a "capstone" team which developed a top-level functional analysis. Continuing in a top-down manner, systems engineering is now being applied at the program and project levels. A mission analysis was conducted for the Liquid Effluents Program. The results are described herein. This report identifies the initial conditions and acceptable final conditions, defines the programmatic and physical interfaces and sources of constraints, estimates the resources to carry out the mission, and establishes measures of success. The mission analysis reflects current program planning for the Liquid Effluents Program as described in Liquid Effluents FY 1995 Multi-Year Program Plan

  19. A Framework for Behavior-Based Malware Analysis in the Cloud

    Science.gov (United States)

    Martignoni, Lorenzo; Paleari, Roberto; Bruschi, Danilo

    To ease the analysis of potentially malicious programs, dynamic behavior-based techniques have been proposed in the literature. Unfortunately, these techniques often give incomplete results because the execution environments in which they are performed are synthetic and do not faithfully resemble the environments of end-users, the intended targets of the malicious activities. In this paper, we present a new framework for improving behavior-based analysis of suspicious programs. Our framework allows an end-user to delegate to security labs, in the cloud, the execution and the analysis of a program, and to force the program to behave as if it were executed directly in the environment of the former. The evaluation demonstrated that the proposed framework allows security labs to improve the completeness of the analysis, by analyzing a piece of malware on behalf of multiple end-users simultaneously, while performing a fine-grained analysis of the behavior of the program with no computational cost for end-users.

  20. Framework for T-tail flutter analysis

    CSIR Research Space (South Africa)

    Van Zyl, Lourens H

    2011-06-01

    Calculating unsteady generalized aerodynamic forces for a given Mach number, reduced frequency, steady load distribution and deflected shape. The p-k flutter solver was modified to call the DLM routine once to read the geometry. At each speed increment... for that condition can be calculated. The present paper presents a procedure for performing a flutter analysis of a T-tailed aircraft in a given loading condition over a range of speeds. The first step at each speed is the aeroelastic trim analysis. Flutter...

  1. Medicare Part D Program Analysis

    Data.gov (United States)

    U.S. Department of Health & Human Services — This page contains information on Part D program analysis performed by CMS. These reports will also be used to better identify, evaluate and measure the effects of...

  2. Establishing a framework for comparative analysis of genome sequences

    Energy Technology Data Exchange (ETDEWEB)

    Bansal, A.K.

    1995-06-01

    This paper describes a framework and a high-level language toolkit for comparative analysis of genome sequence alignments. The framework integrates the information derived from multiple sequence alignment and phylogenetic tree (hypothetical tree of evolution) to derive new properties about sequences. Multiple sequence alignments are treated as an abstract data type. Abstract operations have been described to manipulate a multiple sequence alignment and to derive mutation-related information from a phylogenetic tree by superimposing parsimonious analysis. The framework has been applied to protein alignments to derive constrained columns (in a multiple sequence alignment) that exhibit evolutionary pressure to preserve a common property in a column despite mutation. A Prolog toolkit based on the framework has been implemented and demonstrated on alignments containing 3000 sequences and 3904 columns.
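One of the abstract operations sketched above, deriving constrained columns, can be illustrated in a few lines (in Python rather than the toolkit's Prolog): a gap-free column is flagged when all of its residues fall in a single physicochemical class. The class groupings below are a common textbook convention, not the paper's actual property definitions.

```python
# Illustrative detection of constrained columns in a multiple sequence
# alignment: columns whose residues share a physicochemical property.
HYDROPHOBIC = set("AVLIMFWC")
POLAR = set("STNQYH")
CHARGED = set("DEKR")
CLASSES = [HYDROPHOBIC, POLAR, CHARGED]

def constrained_columns(alignment):
    """Indices of gap-free columns whose residues share a property class."""
    cols = []
    for i in range(len(alignment[0])):
        column = {seq[i] for seq in alignment}
        if "-" in column:
            continue                      # skip columns with gaps
        if any(column <= cls for cls in CLASSES):
            cols.append(i)
    return cols

# Toy alignment of three 4-residue sequences ('-' marks a gap).
alignment = ["AVKD", "LIKE", "AV-D"]
```

In the toy alignment, columns 0 and 1 are uniformly hydrophobic and column 3 uniformly charged, while column 2 is excluded by its gap.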

  3. Framework for the analysis of crystallization operations

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; Abdul Samad, Noor Asma Fazli Bin; Gernaey, Krist

    Crystallization is often applied in the production of salts and/or active pharmaceutical ingredients (API), and the crystallization step is an essential part of the manufacturing process for many chemicals-based products. In recent years the monitoring and analysis of crystallization operations has...

  4. Analysis of computer programming languages

    International Nuclear Information System (INIS)

    Risset, Claude Alain

    1967-01-01

    This research thesis aims at trying to identify some methods of syntax analysis which can be used for computer programming languages while putting aside computer devices which influence the choice of the programming language and methods of analysis and compilation. In a first part, the author proposes attempts of formalization of Chomsky grammar languages. In a second part, he studies analytical grammars, and then studies a compiler or analytic grammar for the Fortran language

  5. A Framework for Collaborative Networked Learning in Higher Education: Design & Analysis

    Directory of Open Access Journals (Sweden)

    Ghassan F. Issa

    2014-06-01

    This paper presents a comprehensive framework for building collaborative learning networks within higher educational institutions. This framework focuses on systems design and implementation issues in addition to a complete set of evaluation and analysis tools. The objective of this project is to improve the standards of higher education in Jordan through the implementation of transparent, collaborative, innovative, and modern quality educational programs. The framework highlights the major steps required to plan, design, and implement collaborative learning systems. Several issues are discussed such as unification of courses and programs of study, using an appropriate learning management system, software design and development using the Agile methodology, infrastructure design, access issues, proprietary data storage, and social network analysis (SNA) techniques.

  6. A framework for intelligent reliability centered maintenance analysis

    International Nuclear Information System (INIS)

    Cheng Zhonghua; Jia Xisheng; Gao Ping; Wu Su; Wang Jianzhao

    2008-01-01

    To improve the efficiency of reliability-centered maintenance (RCM) analysis, case-based reasoning (CBR), a kind of artificial intelligence (AI) technology, was successfully introduced into the RCM analysis process, and a framework for intelligent RCM analysis (IRCMA) was studied. The idea behind IRCMA is that the historical records of RCM analysis on similar items can be referenced and used for the current RCM analysis of a new item. Because many common or similar items may exist in the analyzed equipment, the repeated tasks of RCM analysis can be considerably simplified or avoided by revising the similar cases during RCM analysis. Based on previous theoretical studies, an intelligent RCM analysis system (IRCMAS) prototype was developed. This research has focused on describing the definition, basic principles, and framework of IRCMA, and on discussing critical techniques in IRCMA. Finally, an IRCMAS prototype is presented based on a case study
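The retrieval step of the case-based reasoning idea above can be sketched as a nearest-case lookup over weighted attribute matches: find the historical RCM analysis most similar to the new item so its failure modes and maintenance tasks can be revised rather than redone. The attributes, weights, and cases below are invented for illustration.

```python
# Hypothetical CBR retrieval for RCM analysis: weighted attribute matching.

def similarity(case_a, case_b, weights):
    """Weighted fraction of matching attributes, in [0, 1]."""
    total = sum(weights.values())
    score = sum(w for attr, w in weights.items()
                if case_a.get(attr) == case_b.get(attr))
    return score / total

def retrieve(case_base, new_item, weights):
    """Return the most similar historical case and its similarity score."""
    return max(((c, similarity(c, new_item, weights)) for c in case_base),
               key=lambda pair: pair[1])

# Invented attribute weights: item type matters most for reuse.
weights = {"item_type": 3, "duty_cycle": 1, "environment": 1}
case_base = [
    {"id": "pump-07", "item_type": "pump", "duty_cycle": "continuous",
     "environment": "indoor"},
    {"id": "fan-02", "item_type": "fan", "duty_cycle": "continuous",
     "environment": "outdoor"},
]
new_item = {"item_type": "pump", "duty_cycle": "intermittent",
            "environment": "indoor"}

best, score = retrieve(case_base, new_item, weights)
```

A real IRCMA system would then adapt (revise) the retrieved case's RCM worksheets to the new item rather than simply copying them.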

  7. XML Graphs in Program Analysis

    DEFF Research Database (Denmark)

    Møller, Anders; Schwartzbach, Michael I.

    2011-01-01

    XML graphs have been shown to be a simple and effective formalism for representing sets of XML documents in program analysis. The formalism has evolved through a six-year period with variants tailored for a range of applications. We present a unified definition, outline the key properties including validation...... of XML graphs against different XML schema languages, and provide a software package that enables others to make use of these ideas. We also survey the use of XML graphs for program analysis with four very different languages: XACT (XML in Java), Java Servlets (Web application programming), XSugar...... (transformations between XML and non-XML data), and XSLT (stylesheets for transforming XML documents)....

  8. STATIC CODE ANALYSIS FOR SOFTWARE QUALITY IMPROVEMENT: A CASE STUDY IN BCI FRAMEWORK DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Indar Sugiarto

    2008-01-01

    This paper shows how a systematic approach to software testing using the static code analysis method can be used to improve the software quality of a BCI framework. The method is best performed during the development phase of framework programs. In the proposed approach, we evaluate several software metrics based on the principles of object-oriented design. Since such a method depends on the underlying programming language, we describe it in terms of C++, with the Qt platform also currently in use. One of the most important metrics is the so-called software complexity. Applying the software complexity calculation using both the McCabe and Halstead methods to the BCI framework, which consists of two important types of BCI, namely SSVEP and P300, we found that there are two classes in the framework which are very complex and prone to violating the cohesion principle in OOP. The other metrics fit the criteria of the proposed framework aspects, such as: MPC is less than 20; average complexity is around 5; and the maximum depth is below 10 blocks. Such variables are considered very important when further developing the BCI framework in the future.
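The McCabe metric mentioned above can be sketched for Python source (rather than the C++/Qt code the study analyzed) by counting decision points in the syntax tree and adding one. This is a simplified variant; production analyzers count additional node types and handle boolean short-circuits more carefully.

```python
# Simplified McCabe-style cyclomatic complexity for Python functions.
import ast

# Node types treated as decision points in this sketch.
DECISIONS = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)

def cyclomatic_complexity(source):
    """1 + number of decision points found in the source."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(tree))

snippet = """
def classify(x):
    if x < 0:
        return "neg"
    for _ in range(3):
        if x > 10:
            return "big"
    return "small"
"""
```

The snippet has three decision points (two `if`s and one `for`), so its complexity is 4; a quality gate would flag functions whose value exceeds some threshold.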

  9. A framework for selecting performance measures for opioid treatment programs.

    Science.gov (United States)

    Pelletier, Luc R; Hoffman, Jeffrey A

    2002-01-01

    As a result of new federal regulations released in early 2001 that move the monitoring and evaluation of opioid treatment programs from a government regulation to an accreditation model, program staff members are now being challenged to develop performance measurement systems that improve care and service. Using measurement selection criteria is the first step in developing a performance measurement system as a component of an overall quality management (QM) strategy. Opioid treatment programs can "leapfrog" the development of such systems by using lessons learned from the healthcare quality industry. This article reviews performance measurement definitions, proposes performance measurement selection criteria, and makes a business case for Internet automation and accessibility. Performance measurement sets that are appropriate for opioid treatment programs are proposed, followed by a discussion on how performance measurement can be used within a comprehensive QM program. It is hoped that through development, adoption, and implementation of such a performance measurement program, treatment for clients and their families will continuously improve.

  10. ANALYSIS FRAMEWORKS OF THE COLLABORATIVE INNOVATION PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Dan SERGHIE

    2014-12-01

    Time management is one of the resources by which we can achieve improved innovation performance. This perspective of resource management and process efficiency, achieved by reducing the incubation time of ideas, selecting profitable innovations and turning them into added value, relates to absolute time, a time specific to human existence. In this article I will try to show that the main way to obtain high performance through inter-organizational innovation is by manipulating the context and manipulating knowledge, outside the arbitrary concept of “time”. This article presents the results of research suggesting a sequential analysis and evaluation model of performance through a rational and refined process of selection of the performance indicators, aiming at providing the shortest and most relevant list of criteria.

  11. A framework for analysis of sentinel events in medical student education.

    Science.gov (United States)

    Cohen, Daniel M; Clinchot, Daniel M; Werman, Howard A

    2013-11-01

    Although previous studies have addressed student factors contributing to dismissal or withdrawal from medical school for academic reasons, little information is available regarding institutional factors that may hinder student progress. The authors describe the development and application of a framework for sentinel event (SE) root cause analysis to evaluate cases in which students are dismissed or withdraw because of failure to progress in the medical school curriculum. The SE in medical student education (MSE) framework was piloted at the Ohio State University College of Medicine (OSUCOM) during 2010-2012. Faculty presented cases using the framework during academic oversight committee discussions. Nine SEs in MSE were presented using the framework. Major institution-level findings included the need for improved communication, documentation of cognitive and noncognitive (e.g., mental health) issues, clarification of requirements for remediation and fitness for duty, and additional psychological services. Challenges related to alternative and combined programs were identified as well. The OSUCOM undertook system changes based on the action plans developed through the discussions of these SEs. An SE analysis process appears to be a useful method for making system changes in response to institutional issues identified in evaluation of cases in which students fail to progress in the medical school curriculum. The authors plan to continue to refine the SE in MSE framework and analysis process. Next steps include assessing whether analysis using this framework yields improved student outcomes with universal applications for other institutions.

  12. Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course

    Science.gov (United States)

    McGowan, Ian S.

    2016-01-01

    Built on Dabbagh (2005), this paper presents a four component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body of exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…

  13. A Framework for Constraint-Programming based Configuration

    DEFF Research Database (Denmark)

    Queva, Matthieu Stéphane Benoit

    Product configuration systems play an important role in the development of Mass Customisation, allowing the companies to reduce their costs while offering highly customised products. Such systems are often based on a configuration model, representing the product knowledge necessary to perform...... and efficient algorithms to solve the dependencies of the models at runtime. In this dissertation, we present a constraint-based framework for configuration. The design of this framework is partly based on a study of product configuration requirements as well as a comparison of several general modelling......CoLa language is given and used to verify and analyse the configuration models. Another goal of this dissertation is to describe the semantics of ProCoLa by providing a translation to a Constraint Satisfaction Problem (CSP) representation. For that purpose, several CSP formalisms are discussed and a new...

  14. On the C++ Object Programming for Time Series, in the Linux framework

    OpenAIRE

    Mateescu, George Daniel

    2013-01-01

    We study the implementation of time series through C++ classes, using the fundamentals of the C++ programming language, in the Linux framework. Such an implementation may be useful in time series modelling.

  15. A Framework for Sentiment Analysis Implementation of Indonesian Language Tweet on Twitter

    Science.gov (United States)

    Asniar; Aditya, B. R.

    2017-01-01

    Sentiment analysis is the process of understanding, extracting, and processing textual data automatically to obtain information. Sentiment analysis can be used to see opinion on an issue and identify a response to something. Millions of digital records remain unexploited and therefore provide no useful information, especially for government. Sentiment analysis in government is used to monitor the work programs of the government, such as the Government of Bandung City, through social media data. The analysis can be used quickly as a tool to see the public response to the work programs, so the next strategic steps can be taken. This paper adopts the Support Vector Machine as a supervised algorithm for sentiment analysis. It presents a framework for sentiment analysis implementation of Indonesian language tweets on Twitter for the work programs of the Government of Bandung City. The results of this paper can be a reference for decision making in local government.
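The classification step of the pipeline above might look as follows. The paper uses a Support Vector Machine on Indonesian tweets; to keep this sketch dependency-free it substitutes a tiny naive Bayes classifier, and the tweets and labels are invented English stand-ins.

```python
# Hedged sketch of tweet-sentiment classification. A naive Bayes model
# with Laplace smoothing stands in for the paper's SVM; data is invented.
from collections import Counter
from math import log

def train(tweets):
    """Per-class word counts from (text, label) pairs."""
    counts = {"pos": Counter(), "neg": Counter()}
    for text, label in tweets:
        counts[label].update(text.lower().split())
    return counts

def classify(counts, text):
    """Pick the class with the highest smoothed log-likelihood."""
    vocab = set(counts["pos"]) | set(counts["neg"])
    scores = {}
    for label, words in counts.items():
        total = sum(words.values()) + len(vocab)   # Laplace smoothing
        scores[label] = sum(log((words[w] + 1) / total)
                            for w in text.lower().split())
    return max(scores, key=scores.get)

tweets = [
    ("great road repair program", "pos"),
    ("love the new park", "pos"),
    ("terrible traffic response", "neg"),
    ("slow and terrible service", "neg"),
]
model = train(tweets)
```

A production version would add tokenization and stop-word handling for Indonesian text, and swap in an SVM with TF-IDF features as the paper describes.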

  16. The LTS timing analysis program

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, Darrell Jewell; Schwarz, Jens

    2013-08-01

    The LTS Timing Analysis program described in this report uses signals from the Tempest Lasers, Pulse Forming Lines, and Laser Spark Detectors to carry out calculations to quantify and monitor the performance of the Z-Accelerator's laser-triggered SF6 switches. The program analyzes Z-shots beginning with Z2457, when Laser Spark Detector data became available for all lines.

  17. Matlab programming for numerical analysis

    CERN Document Server

    Lopez, Cesar

    2014-01-01

    MATLAB is a high-level language and environment for numerical computation, visualization, and programming. Using MATLAB, you can analyze data, develop algorithms, and create models and applications. The language, tools, and built-in math functions enable you to explore multiple approaches and reach a solution faster than with spreadsheets or traditional programming languages, such as C/C++ or Java. Programming MATLAB for Numerical Analysis introduces you to the MATLAB language with practical hands-on instructions and results, allowing you to quickly achieve your goals. You will first become

  18. R data analysis without programming

    CERN Document Server

    Gerbing, David W

    2013-01-01

    This book prepares readers to analyze data and interpret statistical results using R more quickly than other texts. R is a challenging program to learn because code must be created to get started. To alleviate that challenge, Professor Gerbing developed lessR. LessR extensions remove the need to program. By introducing R through lessR, readers learn how to organize data for analysis, read the data into R, and produce output without performing numerous functions and programming exercises first. With lessR, readers can select the necessary procedure and change the relevant variables without pro

  19. Mississippi Curriculum Framework for Fashion Marketing Technology (Program CIP: 08.0101--Apparel and Accessories Mkt. Op., Gen.). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the fashion marketing technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies,…

  20. Mississippi Curriculum Framework for Ophthalmic Technology (Program CIP: 51.1801--Opticianry/Dispensing Optician). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the ophthalmic technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and section II…

  1. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  2. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSALOF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Content analysis (CA) is a popular approach among researchers from different areas, but is still incipient in project management (PM). However, the volume of usage apparently does not translate into application quality. The method receives constant criticism about the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigates the use of CA in PM research. To accomplish this goal, a systematic literature review is combined with CA of 23 articles from the EBSCO database over the last 20 years (1996–2016). The findings showed that the proposed framework can help researchers apply CA better and suggest that the use of the method, in terms of both quantity and quality, should be expanded in PM research. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM over the last 20 years.

  3. A comparative analysis of protected area planning and management frameworks

    Science.gov (United States)

    Per Nilsen; Grant Tayler

    1997-01-01

    A comparative analysis of the Recreation Opportunity Spectrum (ROS), Limits of Acceptable Change (LAC), a Process for Visitor Impact Management (VIM), Visitor Experience and Resource Protection (VERP), and the Management Process for Visitor Activities (known as VAMP) decision frameworks examines their origins; methodology; use of factors, indicators, and standards;...

  4. A Unified Framework for Monetary Theory and Policy Analysis.

    Science.gov (United States)

    Lagos, Ricardo; Wright, Randall

    2005-01-01

    Search-theoretic models of monetary exchange are based on explicit descriptions of the frictions that make money essential. However, tractable versions of these models typically make strong assumptions that render them ill suited for monetary policy analysis. We propose a new framework, based on explicit micro foundations, within which macro…

  5. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  6. Agricultural Value Chains in Developing Countries; a Framework for Analysis

    NARCIS (Netherlands)

    Trienekens, J.H.

    2011-01-01

    The paper presents a framework for developing country value chain analysis made up of three components. The first consists of identifying major constraints for value chain upgrading: market access restrictions, weak infrastructures, lacking resources and institutional voids. In the second component

  7. Transactional Analysis: Conceptualizing a Framework for Illuminating Human Experience

    Directory of Open Access Journals (Sweden)

    Trevor Thomas Stewart PhD

    2011-09-01

    Myriad methods exist for analyzing qualitative data. It is, however, imperative for qualitative researchers to employ data analysis tools that are congruent with the theoretical frameworks underpinning their inquiries. In this paper, I have constructed a framework for analyzing data that could be useful for researchers interested in focusing on the transactional nature of language as they engage in Social Science research. Transactional Analysis (TA) is an inductive approach to data analysis that transcends constant comparative methods of exploring data. Drawing on elements of narrative and thematic analysis, TA uses the theories of Bakhtin and Rosenblatt to attend to the dynamic processes researchers identify as they generate themes in their data and seek to understand how their participants' worldviews are being shaped. This paper highlights the processes researchers can utilize to study the mutual shaping that occurs as participants read and enter into dialogue with the world around them.

  8. Designing Educational Games for Computer Programming: A Holistic Framework

    Science.gov (United States)

    Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios

    2014-01-01

    Computer science has been evolving continuously over the past decades. This has brought forth new knowledge that should be incorporated, and new learning strategies must be adopted for the successful teaching of all sub-domains. For example, computer programming is a vital knowledge area within computer science with a constantly changing curriculum…

  9. Programming-Languages as a Conceptual Framework for Teaching Mathematics

    Science.gov (United States)

    Feurzeig, Wallace; Papert, Seymour A.

    2011-01-01

    Formal mathematical methods remain, for most high school students, mysterious, artificial and not a part of their regular intuitive thinking. The authors develop some themes that could lead to a radically new approach. According to this thesis, the teaching of programming languages as a regular part of academic progress can contribute effectively…

  10. A Practical Ontology Framework for Static Model Analysis

    Science.gov (United States)

    2011-04-26

    throughout the model. We implement our analysis framework on top of Ptolemy II [3], an extensible open source model-based design tool written in Java...While Ptolemy II makes a good testbed for implementing and experimenting with new analyses, we also feel that the techniques we present here are...broadly useful. For this reason, we aim to make our analysis framework orthogonal to the execution semantics of Ptolemy II, allowing it to be

  11. Object-oriented data analysis framework for neutron scattering experiments

    International Nuclear Information System (INIS)

    Suzuki, Jiro; Nakatani, Takeshi; Ohhara, Takashi; Inamura, Yasuhiro; Yonemura, Masao; Morishima, Takahiro; Aoyagi, Tetsuo; Manabe, Atsushi; Otomo, Toshiya

    2009-01-01

    Materials and Life Science Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC) is one of the facilities that provide the highest-intensity pulsed neutron and muon beams. The MLF computing environment design group organizes the computing environments of MLF and its instruments. It is important that the computing environment be provided by the facility side, because meta-data formats, analysis functions and the data analysis strategy should be shared among the many instruments in MLF. The C++ class library named Manyo-lib is a framework for developing data reduction and analysis software. The framework is composed of the class library for data reduction and analysis operators, network-distributed data processing modules and data containers. The class library is wrapped by a Python interface created with SWIG. All classes of the framework can be called from the Python language, and Manyo-lib cooperates with the data acquisition and data-visualization components through the MLF-platform, a user interface unified in MLF, which runs on the Python language. Raw data in the event-data format obtained by the data acquisition systems are converted into histogram-format data on Manyo-lib with high performance, and data reduction and analysis are performed with user-application software developed on the basis of Manyo-lib. We enforce standardization of data containers with Manyo-lib, and many additional fundamental data containers in Manyo-lib have been designed and developed. Experimental and analysis data in the data containers can be converted into NeXus files. Manyo-lib is the standard framework for developing analysis software in MLF, and prototypes of data-analysis software for each instrument are being developed by the instrument teams.
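The event-to-histogram conversion mentioned in the abstract above can be sketched in a few lines of plain Python (an illustration of the idea only, not Manyo-lib's API; the bin width and event values are invented):

```python
def events_to_histogram(times_of_flight, bin_width, n_bins):
    """Bin raw event data (e.g. time-of-flight values) into a
    fixed-width histogram; events outside the range are dropped."""
    counts = [0] * n_bins
    for t in times_of_flight:
        i = int(t // bin_width)
        if 0 <= i < n_bins:
            counts[i] += 1
    return counts

events = [0.4, 1.2, 1.7, 2.9, 3.1]
hist = events_to_histogram(events, bin_width=1.0, n_bins=4)
# → [1, 2, 1, 1]
```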

  12. A Probabilistic Analysis Framework for Malicious Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Kammuller, Florian; Nemli, Ibrahim

    2015-01-01

    Malicious insider threats are difficult to detect and to mitigate. Many approaches for explaining behaviour exist, but there is little work to relate them to formal approaches to insider threat detection. In this work we present a general formal framework to perform analysis for malicious insider...... threats, based on probabilistic modelling, verification, and synthesis techniques. The framework first identifies insiders’ intention to perform an inside attack, using Bayesian networks, and in a second phase computes the probability of success for an inside attack by this actor, using probabilistic...

  13. MOOC Success Factors: Proposal of an Analysis Framework

    Directory of Open Access Journals (Sweden)

    Margarida M. Marques

    2017-10-01

    Aim/Purpose: From an idea of lifelong-learning-for-all to a phenomenon affecting higher education, Massive Open Online Courses (MOOCs) can be the next step to a truly universal education. Indeed, MOOC enrolment rates can be astoundingly high; still, their completion rates are frequently disappointingly low. Nevertheless, as courses, the participants' enrolment and learning within the MOOCs must be considered when assessing their success. In this paper, the authors' aim is to reflect on what makes a MOOC successful in order to propose an analysis framework of MOOC success factors. Background: A literature review was conducted to identify reported MOOC success factors and to propose an analysis framework. Methodology: This literature-based framework was tested against data of a specific MOOC and refined, within a qualitative interpretivist methodology. The data were collected from the ‘As alterações climáticas nos média escolares - Clima@EduMedia’ course, which was developed by the project Clima@EduMedia and was submitted to content analysis. This MOOC aimed to support science and school media teachers in the use of media to teach climate change. Contribution: By proposing a MOOC success factors framework the authors are attempting to help fill a literature gap regarding criteria for considering a specific MOOC successful. Findings: This work's major finding is a literature-based and empirically-refined MOOC success factors analysis framework. Recommendations for Practitioners: The proposed framework is also a set of best practices relevant to MOOC developers, particularly when targeting teachers as potential participants. Recommendation for Researchers: This work's relevance is also based on its contribution to increasing empirical research on MOOCs. Impact on Society: By providing a proposal of a framework on factors to make a MOOC successful, the authors hope to contribute to the quality of MOOCs. Future Research: Future

  14. A framework for evaluating and designing citizen science programs for natural resources monitoring.

    Science.gov (United States)

    Chase, Sarah K; Levine, Arielle

    2016-06-01

    We present a framework of resource characteristics critical to the design and assessment of citizen science programs that monitor natural resources. To develop the framework we reviewed 52 citizen science programs that monitored a wide range of resources and provided insights into what resource characteristics are most conducive to developing citizen science programs and how resource characteristics may constrain the use or growth of these programs. We focused on 4 types of resource characteristics: biophysical and geographical, management and monitoring, public awareness and knowledge, and social and cultural characteristics. We applied the framework to 2 programs, the Tucson (U.S.A.) Bird Count and the Maui (U.S.A.) Great Whale Count. We found that resource characteristics such as accessibility, diverse institutional involvement in resource management, and social or cultural importance of the resource affected program endurance and success. However, the relative influence of each characteristic was in turn affected by goals of the citizen science programs. Although the goals of public engagement and education sometimes complemented the goal of collecting reliable data, in many cases trade-offs must be made between these 2 goals. Program goals and priorities ultimately dictate the design of citizen science programs, but for a program to endure and successfully meet its goals, program managers must consider the diverse ways that the nature of the resource being monitored influences public participation in monitoring. © 2016 Society for Conservation Biology.

  15. Planetary protection in the framework of the Aurora exploration program

    Science.gov (United States)

    Kminek, G.

    The Aurora Exploration Program will give ESA new responsibilities in the field of planetary protection. Until now, ESA had only limited exposure to planetary protection from its own missions. With the proposed ExoMars and MSR missions, however, ESA will enter the realm of the highest planetary protection categories. As a consequence, the Aurora Exploration Program has initiated a number of activities in the field of planetary protection. The first and most important step was to establish a Planetary Protection Working Group (PPWG) that is advising the Exploration Program Advisory Committee (EPAC) on all matters concerning planetary protection. The main task of the PPWG is to provide recommendations regarding: Planetary protection for robotic missions to Mars; Planetary protection for a potential human mission to Mars; Review/evaluate standards & procedures for planetary protection; Identify research needs in the field of planetary protection. As a result of the PPWG deliberations, a number of activities have been initiated: Evaluation of the Microbial Diversity in SC Facilities; Working paper on legal issues of planetary protection and astrobiology; Feasibility study on a Mars Sample Return Containment Facility; Research activities on sterilization procedures; Training course on planetary protection (May, 2004); Workshop on sterilization techniques (fall 2004). In parallel to the PPWG, the Aurora Exploration Program has established an Ethical Working Group (EWG). This working group will address ethical issues related to astrobiology, planetary protection, and manned interplanetary missions. The recommendations of the working groups and the results of the R&D activities form the basis for defining planetary protection specification for Aurora mission studies, and for proposing modification and new inputs to the COSPAR planetary protection policy. Close cooperation and free exchange of relevant information with the NASA planetary protection program is strongly

  16. Framework for Interactive Parallel Dataset Analysis on the Grid

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, David A.; Ananthan, Balamurali; /Tech-X Corp.; Johnson, Tony; Serbo, Victor; /SLAC

    2007-01-10

    We present a framework for use at a typical Grid site to facilitate custom interactive parallel dataset analysis targeting terabyte-scale datasets of the type typically produced by large multi-institutional science experiments. We summarize the needs for interactive analysis and show a prototype solution that satisfies those needs. The solution consists of desktop client tool and a set of Web Services that allow scientists to sign onto a Grid site, compose analysis script code to carry out physics analysis on datasets, distribute the code and datasets to worker nodes, collect the results back to the client, and to construct professional-quality visualizations of the results.

  17. Mississippi Curriculum Framework for Physical Therapist Assistant (CIP: 51.0806--Physical Therapy Assistant). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the physical therapy assistant program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies, and section…

  18. Developing Global Standards Framework and Quality Integrated Models for Cooperative and Work-Integrated Education Programs

    Science.gov (United States)

    Khampirat, Buratin; McRae, Norah

    2016-01-01

    Cooperative and Work-integrated Education (CWIE) programs have been widely accepted as educational programs that can effectively connect what students are learning to the world of work through placements. Because a global quality standards framework could be a very valuable resource and guide to establishing, developing, and accrediting quality…

  19. The Data-to-Action Framework: A Rapid Program Improvement Process

    Science.gov (United States)

    Zakocs, Ronda; Hill, Jessica A.; Brown, Pamela; Wheaton, Jocelyn; Freire, Kimberley E.

    2015-01-01

    Although health education programs may benefit from quality improvement methods, scant resources exist to help practitioners apply these methods for program improvement. The purpose of this article is to describe the Data-to-Action framework, a process that guides practitioners through rapid-feedback cycles in order to generate actionable data to…

  20. pygrametl: A Powerful Programming Framework for Extract–Transform–Load Programmers

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    and operations. In this paper, we challenge this approach and propose to do ETL programming by writing code. To make the programming easy, we present the (Python-based) framework pygrametl which offers commonly used functionality for ETL development. By using the framework, the developer can efficiently create...... effective ETL solutions from which the full power of programming can be exploited. Our experiments show that when pygrametl is used, both the development time and running time are short when compared to an existing GUI-based tool...
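The "ETL programming by writing code" idea that this abstract advocates can be illustrated with a few lines of standard-library Python (sqlite3 stands in for both the source system and the warehouse; pygrametl's actual abstractions, such as its dimension and fact-table objects, are not shown, and the table names and data are invented):

```python
import sqlite3

# Source system with raw operational data.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE sales(product TEXT, amount INTEGER)")
src.executemany("INSERT INTO sales VALUES(?, ?)",
                [("apple", 3), ("pear", 5), ("apple", 2)])

# Target "warehouse" with a simple fact table.
dwh = sqlite3.connect(":memory:")
dwh.execute("CREATE TABLE fact_sales(product TEXT, total INTEGER)")

# Extract, transform (aggregate), load -- expressed directly as code
# rather than drawn in a GUI.
rows = src.execute(
    "SELECT product, SUM(amount) FROM sales GROUP BY product").fetchall()
dwh.executemany("INSERT INTO fact_sales VALUES(?, ?)", rows)
dwh.commit()

totals = dict(dwh.execute("SELECT * FROM fact_sales").fetchall())
# totals == {"apple": 5, "pear": 5}
```

Because the flow is ordinary code, the full power of the host language (loops, functions, tests) is available, which is the point the abstract makes.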

  1. Framework for assessing causality in disease management programs: principles.

    Science.gov (United States)

    Wilson, Thomas; MacDowell, Martin

    2003-01-01

    To credibly state that a disease management (DM) program "caused" a specific outcome it is required that metrics observed in the DM population be compared with metrics that would have been expected in the absence of a DM intervention. That requirement can be very difficult to achieve, and epidemiologists and others have developed guiding principles of causality by which credible estimates of DM impact can be made. This paper introduces those key principles. First, DM program metrics must be compared with metrics from a "reference population." This population should be "equivalent" to the DM intervention population on all factors that could independently impact the outcome. In addition, the metrics used in both groups should use the same defining criteria (i.e., they must be "comparable" to each other). The degree to which these populations fulfill the "equivalent" assumption and metrics fulfill the "comparability" assumption should be stated. Second, when "equivalence" or "comparability" is not achieved, the DM managers should acknowledge this fact and, where possible, "control" for those factors that may impact the outcome(s). Finally, it is highly unlikely that one study will provide definitive proof of any specific DM program value for all time; thus, we strongly recommend that studies be ongoing, at multiple points in time, and at multiple sites, and, when observational study designs are employed, that more than one type of study design be utilized. Methodologically sophisticated studies that follow these "principles of causality" will greatly enhance the reputation of the important and growing efforts in DM.

  2. A Hybrid Programming Framework for Modeling and Solving Constraint Satisfaction and Optimization Problems

    Directory of Open Access Journals (Sweden)

    Paweł Sitek

    2016-01-01

    This paper proposes a hybrid programming framework for modeling and solving constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with an original method of problem transformation, used in the framework as a presolving method. The transformation substantially reduces the feasible solution space. The framework automatically generates CSP and COP models based on current values of data instances, questions asked by a user, and the set of predicates and facts of the problem being modeled, which altogether constitute a knowledge database for the given problem. This dynamic generation of dedicated models, based on the knowledge base together with externally changing parameters, for example the user's questions, is an implementation of the autonomous-search concept. The models are solved using the internal or external solvers integrated with the framework. The architecture of the framework as well as an outline of its implementation is also included in the paper. The effectiveness of the framework regarding modeling and solution search is assessed through illustrative examples relating to scheduling problems with additional constrained resources.
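As a minimal stand-in for the constraint-satisfaction side of such a framework, the following toy backtracking solver shows what "modeling and solving a CSP" means at its simplest (the function names and the example problem are invented for illustration; real CLP/MP systems add propagation, pruning, and optimization objectives):

```python
def solve_csp(variables, domains, constraint):
    """Minimal backtracking search: assign variables in order and
    abandon any partial assignment that violates the constraint."""
    def backtrack(assignment):
        if len(assignment) == len(variables):
            return dict(assignment)          # complete, consistent solution
        var = variables[len(assignment)]
        for value in domains[var]:
            assignment[var] = value
            if constraint(assignment):
                result = backtrack(assignment)
                if result is not None:
                    return result
            del assignment[var]
        return None                          # no value works: backtrack
    return backtrack({})

# Toy problem: x, y, z in 1..3 with x < y < z.
sol = solve_csp(
    ["x", "y", "z"],
    {"x": [1, 2, 3], "y": [1, 2, 3], "z": [1, 2, 3]},
    lambda a: all(a[u] < a[v] for u, v in [("x", "y"), ("y", "z")]
                  if u in a and v in a))
# → {"x": 1, "y": 2, "z": 3}
```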

  3. A programming framework for data streaming on the Xeon Phi

    Science.gov (United States)

    Chapeland, S.; ALICE Collaboration

    2017-10-01

    ALICE (A Large Ion Collider Experiment) is the dedicated heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). After the second long shut-down of the LHC, the ALICE detector will be upgraded to cope with an interaction rate of 50 kHz in Pb-Pb collisions, producing in the online computing system (O2) a sustained throughput of 3.4 TB/s. This data will be processed on the fly so that the stream to permanent storage does not exceed 90 GB/s peak, the raw data being discarded. In the context of assessing different computing platforms for the O2 system, we have developed a framework for the Intel Xeon Phi processors (MIC). It provides the components to build a processing pipeline streaming the data from the PC memory to a pool of permanent threads running on the MIC, and back to the host after processing. It is based on explicit offloading mechanisms (data transfer, asynchronous tasks) and basic building blocks (FIFOs, memory pools, C++11 threads). The user only needs to implement the processing method to be run on the MIC. We present in this paper the architecture, implementation, and performance of this system.
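The pipeline the abstract describes (bounded FIFOs feeding a pool of persistent threads, with results streamed back to the host) has a simple host-side analogue in standard-library Python; this is a conceptual sketch only, not the MIC offload code, and the thread count and data blocks are invented:

```python
import queue
import threading

in_fifo = queue.Queue(maxsize=4)   # bounded FIFO toward the workers
out_fifo = queue.Queue()           # results back to the "host"

def worker():
    while True:
        block = in_fifo.get()
        if block is None:          # sentinel: shut this worker down
            break
        out_fifo.put(sum(block))   # stand-in for the processing method

threads = [threading.Thread(target=worker) for _ in range(3)]
for t in threads:
    t.start()

for i in range(6):                 # stream six data blocks
    in_fifo.put([i, i, i])
for _ in threads:                  # one sentinel per worker
    in_fifo.put(None)
for t in threads:
    t.join()

results = sorted(out_fifo.get() for _ in range(6))
# → [0, 3, 6, 9, 12, 15]
```

The bounded input queue models back-pressure: when the workers fall behind, the producer blocks instead of exhausting memory, which is the same role the framework's memory pools and FIFOs play.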

  4. Storing Clocked Programs Inside DNA A Simplifying Framework for Nanocomputing

    CERN Document Server

    Chang, Jessica

    2011-01-01

    In the history of modern computation, large mechanical calculators preceded computers. A person would sit there punching keys according to a procedure and a number would eventually appear. Once calculators became fast enough, it became obvious that the critical path was the punching rather than the calculation itself. That is what made the stored program concept vital to further progress. Once the instructions were stored in the machine, the entire computation could run at the speed of the machine. This book shows how to do the same thing for DNA computing. Rather than asking a robot or a pers

  5. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seyong [ORNL; Vetter, Jeffrey S [ORNL

    2014-01-01

    Directive-based accelerator programming models such as OpenACC have arisen as an alternative solution to program emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity of SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-source OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to (1) support hybrid programming of unified memory and separate memory and (2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while serving as a high-level research framework.

  6. The image of psychology programs: the value of the instrumental-symbolic framework

    OpenAIRE

    Van Hoye, Greet; Lievens, Filip; De Soete, Britt; Libbrecht, Nele; Schollaert, Eveline; Baligant, Dimphna

    2014-01-01

    As competition for funding and students intensifies, it becomes increasingly important for psychology programs to have an image that is attractive and makes them stand out from other programs. We use the instrumental-symbolic framework from the marketing domain to determine the image of different master’s programs in psychology and examine how these image dimensions relate to student attraction and competitor differentiation. The samples consist of both potential students (N = 114) and curren...

  7. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed-computing environments connected by a local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models at various scales. These correlated multi-scale structural system tasks are distributed among clusters, connected together in a multi-level hierarchy, and then coordinated over the internet. The software framework supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to demonstrate the proposed concept. The simulation results show that the software framework can increase the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing simulation of multi-scale structural analysis.

  8. The radiation protection research within the fourth Framework Program of the European Union (1994-1998)

    International Nuclear Information System (INIS)

    Siunaeve, J.; Mingot, F.; Arranz, L.; Cancio, D.

    1995-01-01

    The next research program on Radiation Protection within the Fourth Framework Program of the European Union was approved by the Council last December (O.I.N L 361, 12/31/94). The program includes important changes in its structure as well as in the way it is implemented in Europe. The most important change is that the main activities concerning Nuclear Safety, Waste Management and Radiation Protection have been included in a single program called Nuclear Fission Safety. The program also includes specific work with CIS countries on the management of the Chernobyl consequences as well as of other significant contamination in other areas of the former Soviet Union. (Author)

  9. Stochastic programming framework for Lithuanian pension payout modelling

    Directory of Open Access Journals (Sweden)

    Audrius Kabašinskas

    2014-12-01

    The paper provides a scientific approach to the problem of selecting a pension fund by taking into account some specific characteristics of the Lithuanian Republic (LR) pension accumulation system. A decision-making model that can be used to plan the long-term pension accrual of LR citizens in an optimal way is presented. This model focuses on factors that influence the sustainability of the pension system selection under macroeconomic, social and demographic uncertainty. The model is formalized as a single-stage stochastic optimization problem in which the long-term optimal strategy can be obtained based on the possible scenarios generated for a particular participant. Stochastic programming methods allow including the pension fund rebalancing moment and direction of investment, and taking into account possible changes of personal income, of society and of the global financial market. The collection of methods used to generate scenario trees was found useful for solving strategic planning problems.
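The single-stage, scenario-based formulation the abstract describes can be sketched very compactly: generate return scenarios per fund and pick the fund maximizing a mean-risk objective. All fund names, returns, and the risk-aversion weight below are invented; a real model would use properly generated scenario trees and constraints:

```python
import statistics

scenarios = {                      # annual-return scenarios per fund
    "conservative": [0.02, 0.03, 0.025, 0.02],
    "balanced":     [0.05, -0.01, 0.06, 0.02],
    "aggressive":   [0.12, -0.15, 0.18, -0.02],
}

def score(returns, risk_aversion=1.0):
    """Expected return penalized by standard deviation, a common
    mean-risk surrogate for a stochastic program's objective."""
    return statistics.mean(returns) - risk_aversion * statistics.stdev(returns)

best = max(scenarios, key=lambda fund: score(scenarios[fund]))
# with these (invented) scenarios, the low-volatility fund wins
```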

  10. A framework for evaluating OSH program effectiveness using leading and trailing metrics.

    Science.gov (United States)

    Wurzelbacher, Steve; Jin, Yan

    2011-06-01

    Many employers and regulators today rely primarily on a few past injury/illness metrics as criteria for rating the effectiveness of occupational safety and health (OSH) programs. Although such trailing data are necessary to assess program success, they may not be sufficient for developing proactive safety, ergonomic, and medical management plans. The goals of this pilot study were to create leading metrics (company self-assessment ratings) and trailing metrics (past loss data) that could be used to evaluate the effectiveness of OSH program elements that range from primary to tertiary prevention. The main hypothesis was that the new metrics would be explanatory variables for three standard future workers compensation (WC) outcomes in 2003 (rates of total cases, lost time cases, and costs) and that the framework for evaluating OSH programs could be justifiably expanded. For leading metrics, surveys were developed to allow respondents to assess OSH exposures and program prevention elements (management leadership/commitment, employee participation, hazard identification, hazard control, medical management, training, and program evaluation). After pre-testing, surveys were sent to companies covered by the same WC insurer in early 2003. A total of 33 completed surveys were used for final analysis. A series of trailing metrics were developed from 1999-2001 WC data for the surveyed companies. Data were analyzed using a method where each main 2003 WC outcome was dichotomized into high and low loss groups based on the median value of the variable. The mean and standard deviations of survey questions and 1999-2001 WC variables were compared between the dichotomized groups. Hypothesis testing was performed using an F-test with a significance level of 0.10. Companies that exhibited higher musculoskeletal disorder (MSD) WC case rates from 1999-2001 had higher total WC case rates in 2003. Higher levels of several self-reported OSH program elements (tracking progress in controlling
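The median-dichotomization step in the method above can be sketched in a few lines: split companies into high/low groups at the median of a trailing metric and compare a leading metric's mean between the groups. All numbers below are invented, and the F-test itself is omitted (it needs a statistics package beyond the standard library):

```python
import statistics

wc_rates   = [1.2, 0.4, 2.5, 0.9, 3.1, 0.7]   # trailing: WC case rates
self_score = [2.0, 4.5, 1.5, 3.8, 1.0, 4.0]   # leading: self-assessment

cut = statistics.median(wc_rates)              # dichotomize at the median
high = [s for r, s in zip(wc_rates, self_score) if r > cut]
low  = [s for r, s in zip(wc_rates, self_score) if r <= cut]

# Gap in mean self-assessment between low-loss and high-loss groups;
# a positive gap is consistent with the study's hypothesis.
gap = statistics.mean(low) - statistics.mean(high)
```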

  11. Agricultural Value Chains in Developing Countries A Framework for Analysis

    OpenAIRE

    Trienekens, Jacques H.

    2011-01-01

    The paper presents a framework for developing country value chain analysis made up of three components. The first consists of identifying major constraints for value chain upgrading: market access restrictions, weak infrastructures, lacking resources and institutional voids. In the second component three elements of a value chain are defined: value addition, horizontal and vertical chain-network structure and value chain governance mechanisms. Finally, upgrading options are defined in the are...

  12. Multi-order analysis framework for comprehensive biometric performance evaluation

    Science.gov (United States)

    Gorodnichy, Dmitry O.

    2010-04-01

    It is not uncommon for contemporary biometric systems to have more than one match below the matching threshold, or to have two or more matches with close matching scores. This is especially true for systems that store large quantities of identities and/or are applied to measure loosely constrained biometric traits, such as in identification from video or at a distance. Current biometric performance evaluation standards, however, are still largely based on measuring single-score statistics such as False Match and False Non-Match rates and the trade-off curves based thereon. Such methodology and reporting make it impossible to investigate the risks and risk mitigation strategies associated with not having a unique identifying score. To address the issue, the Canada Border Services Agency has developed a novel modality-agnostic multi-order performance analysis framework. The framework allows one to analyze system performance at several levels of detail, by defining the traditional single-score-based metrics as Order-1 analysis, and introducing Order-2 and Order-3 analysis to permit the investigation of the system's reliability and the confidence of its recognition decisions. Implemented in a toolkit called C-BET (Comprehensive Biometrics Evaluation Toolkit), the framework has been applied in a recent examination of state-of-the-art iris recognition systems, the results of which are presented, and is now recommended to other agencies interested in testing and tuning biometric systems.
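
    The distinction between single-score metrics and a higher-order view can be sketched in code. The snippet below is an illustrative Python sketch, not C-BET itself; the Order-2 measure shown (the fraction of searches that return more than one candidate under the matching threshold) is an assumption for illustration, since the abstract does not give the exact formulas. Scores are treated as distances, so lower is better.

```python
def order1_metrics(genuine, impostor, threshold):
    """Order-1: classic single-score error rates (FMR, FNMR)."""
    fnmr = sum(s > threshold for s in genuine) / len(genuine)
    fmr = sum(s <= threshold for s in impostor) / len(impostor)
    return fmr, fnmr

def order2_ambiguity(candidate_lists, threshold):
    """Illustrative Order-2-style measure: fraction of probes whose gallery
    search returns more than one candidate below the matching threshold."""
    ambiguous = sum(1 for scores in candidate_lists
                    if sum(s <= threshold for s in scores) > 1)
    return ambiguous / len(candidate_lists)
```

    With such a measure, two systems with identical Order-1 error rates can still differ sharply in how often they leave the recognition decision ambiguous.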

  13. Defining Smart City. A Conceptual Framework Based on Keyword Analysis

    Directory of Open Access Journals (Sweden)

    Farnaz Mosannenzadeh

    2014-05-01

    Full Text Available “Smart city” is a concept that has received increasing attention in urban planning and governance in recent years. The first step in creating Smart Cities is to understand the concept. However, a brief review of the literature shows that the concept of Smart City is the subject of controversy. Thus, the main purpose of this paper is to provide a conceptual framework to define Smart City. To this aim, an extensive literature review was done. Then, a keyword analysis of the literature was performed against the main research questions (why, what, who, when, where, how) and based on three main domains involved in the policy decision-making process and Smart City plan development: Academic, Industrial and Governmental. This resulted in a conceptual framework for Smart City. The result clarifies the definition of Smart City, while providing a framework to define each of Smart City’s sub-systems. Moreover, urban authorities can apply this framework in Smart City initiatives in order to recognize their main goals, main components, and key stakeholders.
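
    The keyword-analysis step can be illustrated with a small sketch. The research-question categories and keyword lists below are hypothetical placeholders, not the paper's actual corpus or coding scheme:

```python
from collections import Counter
import re

# Hypothetical mapping of research questions to candidate keywords.
QUESTIONS = {
    "what": ["technology", "infrastructure", "services"],
    "why": ["sustainability", "efficiency", "livability"],
    "who": ["government", "citizens", "industry"],
}

def keyword_profile(definitions):
    """Count how often each candidate keyword occurs across a set of
    Smart City definitions, grouped by research-question category."""
    words = Counter(re.findall(r"[a-z]+", " ".join(definitions).lower()))
    return {q: {k: words[k] for k in kws} for q, kws in QUESTIONS.items()}
```

    Tabulating such profiles across academic, industrial, and governmental sources is one simple way to see which aspects each domain emphasizes.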

  14. A new kernel discriminant analysis framework for electronic nose recognition

    International Nuclear Information System (INIS)

    Zhang, Lei; Tian, Feng-Chun

    2014-01-01

    Graphical abstract: Highlights: • This paper proposes a new discriminant analysis framework for feature extraction and recognition. • The principle of the proposed NDA is derived mathematically. • The NDA framework is coupled with kernel PCA for classification. • The proposed KNDA is compared with state-of-the-art e-Nose recognition methods. • The proposed KNDA shows the best performance in e-Nose experiments. Abstract: Electronic nose (e-Nose) technology based on metal oxide semiconductor gas sensor arrays is widely studied for the detection of gas components. This paper proposes a new discriminant analysis framework (NDA) for dimension reduction and e-Nose recognition. In the NDA, the between-class and within-class Laplacian scatter matrices are designed from sample to sample to characterize the between-class separability and the within-class compactness, by seeking a discriminant matrix that simultaneously maximizes the between-class Laplacian scatter and minimizes the within-class Laplacian scatter. In terms of the linear separability in the high-dimensional kernel mapping space and the dimension reduction of principal component analysis (PCA), an effective kernel PCA plus NDA method (KNDA) is proposed for rapid detection of gas mixture components by an e-Nose. The NDA framework is derived in this paper, as well as the specific implementations of the proposed KNDA method in the training and recognition process. The KNDA is examined on e-Nose datasets of six kinds of gas components, and compared with state-of-the-art e-Nose classification methods. Experimental results demonstrate that the proposed KNDA method shows the best performance, with an average recognition rate and a total recognition rate of 94.14% and 95.06%, which leads to promising feature extraction and multi-class recognition in e-Nose applications.
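
    A rough sketch of the KPCA-plus-discriminant pipeline is given below. This is an illustrative reconstruction from the abstract, not the authors' code: it uses an RBF kernel, standard kernel PCA, and a classical Fisher-style scatter-ratio criterion in place of the paper's Laplacian scatter matrices, whose exact definitions are not reproduced here.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca(X, n_components, gamma=1.0):
    """Project training samples onto the top kernel principal components."""
    K = rbf_kernel(X, X, gamma)
    n = len(X)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # center in feature space
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas

def fisher_directions(Z, y):
    """Directions maximizing between-class over within-class scatter
    (a Fisher criterion standing in for the paper's Laplacian scatters)."""
    classes = np.unique(y)
    mu = Z.mean(0)
    Sw = np.zeros((Z.shape[1], Z.shape[1]))
    Sb = np.zeros_like(Sw)
    for c in classes:
        Zc = Z[y == c]
        mc = Zc.mean(0)
        Sw += (Zc - mc).T @ (Zc - mc)
        Sb += len(Zc) * np.outer(mc - mu, mc - mu)
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(vals.real)[::-1]
    return vecs[:, order].real
```

    Sensor-array readings would be mapped through `kpca` first, and the discriminant directions then learned on the projected data before a classifier is applied.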

  15. Interactive Safety Analysis Framework of Autonomous Intelligent Vehicles

    Directory of Open Access Journals (Sweden)

    Cui You Xiang

    2016-01-01

    Full Text Available More than 100,000 people were killed and around 2.6 million injured in road accidents in the People’s Republic of China (PRC), four to eight times the rate of developed countries, equivalent to 6.2 deaths per 10 thousand vehicles—the highest rate in the world. There are more than 1,700 fatalities and 840,000 injuries yearly due to vehicle crashes off public highways. In this paper, we propose an interactive safety situation and threat analysis framework based on driver behaviour and vehicle dynamics risk analysis based on ISO26262…

  16. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis, Phase 2 Results

    Science.gov (United States)

    Murri, Daniel G.

    2011-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL-Systems Analysis (SA) team that is conducting studies of the technologies and architectures that are required to enable human and higher mass robotic missions to Mars. The findings, observations, and recommendations from the NESC are provided in this report.

  17. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 2; Appendices

    Science.gov (United States)

    Murri, Daniel G.

    2010-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures required to enable higher mass robotic and human missions to Mars. The appendices to the original report are contained in this document.

  18. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 1

    Science.gov (United States)

    Murri, Daniel G.

    2010-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures required to enable higher mass robotic and human missions to Mars. The findings of the assessment are contained in this report.

  19. NET-2 Network Analysis Program

    International Nuclear Information System (INIS)

    Malmberg, A.F.

    1974-01-01

    The NET-2 Network Analysis Program is a general purpose digital computer program which solves the nonlinear time domain response and the linearized small signal frequency domain response of an arbitrary network of interconnected components. NET-2 is capable of handling a variety of components and has been applied to problems in several engineering fields, including electronic circuit design and analysis, missile flight simulation, control systems, heat flow, fluid flow, mechanical systems, structural dynamics, digital logic, communications network design, solid state device physics, fluidic systems, and nuclear vulnerability due to blast, thermal, gamma radiation, neutron damage, and EMP effects. Network components may be selected from a repertoire of built-in models or they may be constructed by the user through appropriate combinations of mathematical, empirical, and topological functions. Higher-level components may be defined by subnetworks composed of any combination of user-defined components and built-in models. The program provides a modeling capability to represent and intermix system components on many levels, e.g., from hole and electron spatial charge distributions in solid state devices through discrete and integrated electronic components to functional system blocks. NET-2 is capable of simultaneous computation in both the time and frequency domain, and has statistical and optimization capability. Network topology may be controlled as a function of the network solution. (U.S.)
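
    The kind of linearized network solution a program like NET-2 performs can be illustrated with a minimal nodal-analysis sketch. This is the generic textbook method for a DC resistive network, not NET-2's actual engine or input language:

```python
import numpy as np

def solve_dc(num_nodes, resistors, current_sources):
    """Nodal analysis: assemble the conductance matrix G and solve G·v = i.
    resistors: (node_a, node_b, ohms) with node 0 = ground;
    current_sources: (node, amps) injected into the node;
    num_nodes counts the non-ground nodes 1..num_nodes."""
    G = np.zeros((num_nodes, num_nodes))
    i = np.zeros(num_nodes)
    for a, b, r in resistors:
        g = 1.0 / r
        for n in (a, b):
            if n:                      # ground has no row/column
                G[n - 1, n - 1] += g
        if a and b:
            G[a - 1, b - 1] -= g
            G[b - 1, a - 1] -= g
    for n, amps in current_sources:
        if n:
            i[n - 1] += amps
    return np.linalg.solve(G, i)       # voltages at nodes 1..num_nodes
```

    A nonlinear time-domain solver like NET-2 repeats a linearized solve of this shape at every iteration of every time step.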

  20. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    Grant FA2386-15-1-4068; Keiji Takeda, Keio University (keiji@sfc.keio.ac.jp). This research was conducted to develop components for automated ... binary program and by monitoring their behavior, then generate data for malware detection signatures and for developing countermeasures.

  1. The PandaRoot framework for simulation, reconstruction and analysis

    International Nuclear Information System (INIS)

    Spataro, Stefano

    2011-01-01

    The PANDA experiment at the future facility FAIR will study anti-proton proton and anti-proton nucleus collisions in a beam momentum range from 2 GeV/c up to 15 GeV/c. The PandaRoot framework is part of the FairRoot project, a common software framework for the future FAIR experiments, and is currently used to simulate detector performances and to evaluate different detector concepts. It is based on the packages ROOT and Virtual MonteCarlo with Geant3 and Geant4. Different reconstruction algorithms for tracking and particle identification are under development and optimization, in order to achieve the performance requirements of the experiment. In the central tracker a first track fit is performed using a conformal map transformation based on a helix assumption, then the track is used as input for a Kalman Filter (package genfit), using GEANE as track follower. The track is then correlated to the pid detectors (e.g. Cerenkov detectors, EM Calorimeter or Muon Chambers) to evaluate a global particle identification probability, using a Bayesian approach or multivariate methods. Further implemented packages in PandaRoot are: the analysis tools framework Rho, the kinematic fitter package for vertex and mass constraint fits, and a fast simulation code based upon parametrized detector responses. PandaRoot was also tested on an Alien-based GRID infrastructure. The contribution will report about the status of PandaRoot and show some example results for analysis of physics benchmark channels.

  2. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)
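
    The control-region/signal-region fit at the heart of such analyses can be caricatured in a few lines. The toy below is not HistFitter (which builds on RooStats and HistFactory); the yields, region names, and the coarse grid-search minimizer are all hypothetical simplifications:

```python
import numpy as np

def nll(mu, b_norm, data, expect):
    """Poisson negative log-likelihood over a control region (CR), which
    constrains the background normalization b_norm, and a signal region
    (SR), which is sensitive to the signal strength mu."""
    exp_cr = b_norm * expect["bkg_cr"]
    exp_sr = b_norm * expect["bkg_sr"] + mu * expect["sig_sr"]
    total = 0.0
    for n, lam in [(data["cr"], exp_cr), (data["sr"], exp_sr)]:
        total += lam - n * np.log(lam)   # constant terms dropped
    return total

def fit(data, expect):
    """Crude grid search for the maximum-likelihood (mu, b_norm)."""
    grid = np.linspace(0.0, 3.0, 301)
    best = min(((nll(mu, b, data, expect), mu, b)
                for mu in grid for b in grid[1:]), key=lambda t: t[0])
    return best[1], best[2]
```

    The point of a framework like HistFitter is to build, fit, and statistically interpret many such models at once, with far more regions, nuisance parameters, and proper minimizers.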

  3. The Boston Health Care for the Homeless Program: A Public Health Framework

    Science.gov (United States)

    Oppenheimer, Sarah C.; Judge, Christine M.; Taube, Robert L.; Blanchfield, Bonnie B.; Swain, Stacy E.; Koh, Howard K.

    2010-01-01

    During the past 25 years, the Boston Health Care for the Homeless Program has evolved into a service model embodying the core functions and essential services of public health. Each year the program provides integrated medical, behavioral, and oral health care, as well as preventive services, to more than 11 000 homeless people. Services are delivered in clinics located in 2 teaching hospitals, 80 shelters and soup kitchens, and an innovative 104-bed medical respite unit. We explain the program's principles of care, describe the public health framework that undergirds the program, and offer lessons for the elimination of health disparities suffered by this vulnerable population. PMID:20558804

  4. Institutional Analysis and Ecosystem-Based Management: The Institutional Analysis and Development Framework.

    Science.gov (United States)

    Imperial

    1999-11-01

    Scholars, government practitioners, and environmentalists are increasingly supportive of collaborative, ecosystem-based approaches to natural resource management. However, few researchers have focused their attention on examining the important administrative and institutional challenges surrounding ecosystem-based management. This paper describes how the institutional analysis and development (IAD) framework can be used to better understand the institutional arrangements used to implement ecosystem-based management programs. Some of the observations emanating from previous research on institutional design and performance are also discussed. The paper's central argument is that if this new resource management paradigm is to take hold and flourish, researchers and practitioners must pay closer attention to the questions surrounding institutional design and performance. This should help improve our understanding of the relationship between science and human values in decision making. It should also help researchers avoid making faulty policy recommendations and improve the implementation of ecosystem-based management programs. KEY WORDS: Ecosystem management; Watershed management; Common pool resources; Implementation; Institutional analysis; Evaluation; Policy analysis. http://link.springer-ny.com/link/service/journals/00267/bibs/24n4p449.html

  5. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck because the massive number of entities make space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.

  6. Considerations on a methodological framework for the analysis of texts

    Directory of Open Access Journals (Sweden)

    David Andrés Camargo Mayorga

    2017-03-01

    Full Text Available This article presents a review of literature relevant to the construction of a methodological framework for the analysis of texts in the applied social sciences, such as economics, which we have grounded in the main hermeneutical approaches from philosophy, linguistics and the social sciences. In essence, these approaches assume that every discourse carries meaning - be it truthful or not - and that discourses express complex social relations. Thus, any analysis of content ultimately amounts to a certain type of hermeneutics (interpretation), while trying to account for the multiple phenomena immersed in the production, application, use and reproduction of knowledge within the text. When applying discourse analysis to teaching texts in the economic sciences, we find traces of legalistic, political and ethnocentric tendencies, among other discourses hidden in the text. For this reason, the analysis of the internal discourse of the text allows us to delve into the state ideology and its underlying or latent discourses.

  7. Planetary Protection Bioburden Analysis Program

    Science.gov (United States)

    Beaudet, Robert A.

    2013-01-01

    This program is a Microsoft Access program that performs statistical analysis of the colony counts from assays performed on the Mars Science Laboratory (MSL) spacecraft to determine the bioburden density, 3-sigma biodensity, and the total bioburdens required for the MSL prelaunch reports. It also contains numerous tools that report the data in various ways to simplify the required reports. The program performs all the calculations directly in the MS Access program. Prior to this development, the data were exported to large Excel files that had to be cut and pasted to provide the desired results. The program contains a main menu and a number of submenus. Analyses can be performed by using either all the assays, or only the accountable assays that will be used in the final analysis. There are three options on the first menu: calculate using (1) the old MER (Mars Exploration Rover) statistics, (2) the MSL statistics for all the assays, or ...
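
    The statistical step described above can be illustrated with a simple sketch. The function below is hypothetical: it computes a sample mean and a mean-plus-three-standard-deviations upper estimate from colony counts, and does not reproduce the actual MER or MSL reporting formulas, which are set by NASA planetary protection requirements.

```python
import math

def bioburden_density(colony_counts, sampled_area_m2, pooling_factor=1.0):
    """Return (mean density, 3-sigma upper density) in spores per m^2 from
    per-assay colony counts over a common sampled area (illustrative only)."""
    n = len(colony_counts)
    densities = [c * pooling_factor / sampled_area_m2 for c in colony_counts]
    mean = sum(densities) / n
    var = sum((d - mean) ** 2 for d in densities) / (n - 1) if n > 1 else 0.0
    return mean, mean + 3.0 * math.sqrt(var)
```

    Multiplying such a density estimate by the surface area each assay represents gives a total-bioburden figure of the kind the prelaunch reports require.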

  8. Effects of donor proliferation in development aid for health on health program performance: A conceptual framework.

    Science.gov (United States)

    Pallas, Sarah Wood; Ruger, Jennifer Prah

    2017-02-01

    Development aid for health increased dramatically during the past two decades, raising concerns about inefficiency and lack of coherence among the growing number of global health donors. However, we lack a framework for how donor proliferation affects health program performance to inform theory-based evaluation of aid effectiveness policies. A review of academic and gray literature was conducted. Data were extracted from the literature sample on study design and evidence for hypothesized effects of donor proliferation on health program performance, which were iteratively grouped into categories and mapped into a new conceptual framework. In the framework, increases in the number of donors are hypothesized to increase inter-donor competition, transaction costs, donor poaching of recipient staff, recipient control over aid, and donor fragmentation, and to decrease donors' sense of accountability for overall development outcomes. There is mixed evidence on whether donor proliferation increases or decreases aid volume. These primary effects in turn affect donor innovation, information hoarding, and aid disbursement volatility, as well as recipient country health budget levels, human resource capacity, and corruption, and the determinants of health program performance. The net effect of donor proliferation on health will vary depending on the magnitude of the framework's competing effects in specific country settings. The conceptual framework provides a foundation for improving design of aid effectiveness practices to mitigate negative effects from donor proliferation while preserving its potential benefits. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Conducting a SWOT Analysis for Program Improvement

    Science.gov (United States)

    Orr, Betsy

    2013-01-01

    A SWOT (strengths, weaknesses, opportunities, and threats) analysis of a teacher education program, or any program, can be the driving force for implementing change. A SWOT analysis is used to assist faculty in initiating meaningful change in a program and to use the data for program improvement. This tool is useful in any undergraduate or degree…

  10. Designing the framework for competency-based master of public health programs in India.

    Science.gov (United States)

    Sharma, Kavya; Zodpey, Sanjay; Morgan, Alison; Gaidhane, Abhay; Syed, Zahiruddin Quazi; Kumar, Rajeev

    2013-01-01

    Competency in the practice of public health is the implicit goal of education institutions that offer master of public health (MPH) programs. With the expanding number of institutions offering courses in public health in India, it is timely to develop a common framework to ensure that graduates are proficient in critical public health. Steps such as situation assessment, a survey of public health care professionals in India, and national consultation were undertaken to develop a proposed competency-based framework for MPH programs in India. The existing curricula of all 23 Indian MPH courses vary significantly in content with regard to core, concentration, and crosscutting discipline areas and course durations. The competency or learning outcome is not well defined. The findings of the survey suggest that MPH graduates in India should have competencies ranging from monitoring of health problems and epidemics in the community, applying biostatistics in public health, conducting action research, understanding social and community influences on public health, developing indicators and instruments to monitor and evaluate community health programs, and developing proposals, to involving the community in planning, delivery, and monitoring of health programs. Competency statements were framed and mapped with domains including epidemiology, biostatistics, social and behavioral sciences, health care system, policy, planning, and financing, and environmental health sciences, and a crosscutting domain that includes health communication and informatics, health management and leadership, professionalism, systems thinking, and public health biology. The proposed competency-based framework for Indian MPH programs can be adapted to meet the needs of diverse, unique programs. The framework ensures the uniqueness and diversity of individual MPH programs in India while contributing to measures of overall program success.

  11. XML Graphs in Program Analysis

    DEFF Research Database (Denmark)

    Møller, Anders; Schwartzbach, Michael Ignatieff

    2007-01-01

    XML graphs have shown to be a simple and effective formalism for representing sets of XML documents in program analysis. It has evolved through a six year period with variants tailored for a range of applications. We present a unified definition, outline the key properties including validation...... of XML graphs against different XML schema languages, and provide a software package that enables others to make use of these ideas. We also survey four very different applications: XML in Java, Java Servlets and JSP, transformations between XML and non-XML data, and XSLT....

  12. VisRseq: R-based visual framework for analysis of sequencing data.

    Science.gov (United States)

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2015-01-01

    Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise to use, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.

  13. VisRseq: R-based visual framework for analysis of sequencing data

    Science.gov (United States)

    2015-01-01

    Background Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise to use, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. Results We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible framework for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. Conclusions To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights. PMID:26328469

  14. Introduction of blended learning in a master program: Developing an integrative mixed method evaluation framework.

    Science.gov (United States)

    Chmiel, Aviva S; Shaha, Maya; Schneider, Daniel K

    2017-01-01

    The aim of this research is to develop a comprehensive evaluation framework involving all actors in a higher education blended learning (BL) program. BL evaluation usually focuses on student, faculty, technological, or institutional aspects in isolation. Currently, no validated comprehensive monitoring tool exists that can support introduction and further implementation of BL in a higher education context. Starting from established evaluation principles and standards, concepts that were to be evaluated were firstly identified and grouped. In a second step, related BL evaluation tools referring to the student, faculty and institutional levels were selected. This allowed setting up and implementing an evaluation framework to monitor the introduction of BL during two succeeding recurrences of the program. The results of the evaluation allowed documenting strengths and weaknesses of the BL format in a comprehensive way, involving all actors. It has led to improvements at program, faculty and course level. The evaluation process and the reporting of the results proved to be demanding in terms of time and personnel resources. The evaluation framework allows measuring the most significant dimensions influencing the success of a BL implementation at program level. However, this comprehensive evaluation is resource intensive. Further steps will be to refine the framework towards a sustainable and transferable BL monitoring tool that finds a balance between comprehensiveness and efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    Energy Technology Data Exchange (ETDEWEB)

    Alioli, Simone [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Nason, Paolo [INFN, Milano-Bicocca (Italy); Oleari, Carlo [INFN, Milano-Bicocca (Italy); Milano-Bicocca Univ. (Italy); Re, Emanuele [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology

    2010-02-15

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized and a description of what a user should provide in order to use it. (orig.)

  16. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    International Nuclear Information System (INIS)

    Alioli, Simone; Nason, Paolo; Oleari, Carlo; Re, Emanuele

    2010-02-01

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized and a description of what a user should provide in order to use it. (orig.)

  17. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
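
    PCTAP's inlet-dependency ordering is, in essence, a topological sort of the component graph. A minimal sketch of the idea in Python (hypothetical component names; not PCTAP's actual C++ code):

```python
from collections import deque

def solution_vector(components, inlet_deps):
    """Order components so each appears after every component feeding its
    inlet (Kahn's algorithm; a simplified stand-in for PCTAP's
    dependency-ordered solution vector)."""
    indeg = {c: 0 for c in components}
    downstream = {c: [] for c in components}
    for comp, deps in inlet_deps.items():
        for d in deps:
            indeg[comp] += 1
            downstream[d].append(comp)
    ready = deque(c for c in components if indeg[c] == 0)
    order = []
    while ready:
        c = ready.popleft()
        order.append(c)
        for nxt in downstream[c]:
            indeg[nxt] -= 1
            if indeg[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(components):
        raise ValueError("cyclic inlet dependencies")
    return order

# Hypothetical three-component line: a tank feeds a tube, which feeds a cold plate
comps = ["cold_plate", "tube", "tank"]
deps = {"tube": ["tank"], "cold_plate": ["tube"], "tank": []}
vec = solution_vector(comps, deps)
print(vec)  # ['tank', 'tube', 'cold_plate']
```

    At each time step, a solver would then simply walk this vector front to back and update every component's outlet state, knowing its inlets are already current.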

  18. A Framework for Security Analysis of Mobile Wireless Networks

    DEFF Research Database (Denmark)

    Nanz, Sebastian; Hankin, Chris

    2006-01-01

    We present a framework for specification and security analysis of communication protocols for mobile wireless networks. This setting introduces new challenges which are not addressed by classical protocol analysis techniques. The main complication stems from the fact that the actions of intermediate nodes and their connectivity can no longer be abstracted into a single unstructured adversarial environment, as they form an inherent part of the system's security. In order to model this scenario faithfully, we present a broadcast calculus which makes a clear distinction between the protocol processes and the network's connectivity graph, which may change independently from protocol actions. We identify a property characterising an important aspect of security in this setting and express it using behavioural equivalences of the calculus. We complement this approach with a control flow analysis...

  19. Analysis and System Design Framework for Infrared Spatial Heterodyne Spectrometers

    Energy Technology Data Exchange (ETDEWEB)

    Cooke, B.J.; Smith, B.W.; Laubscher, B.E.; Villeneuve, P.V.; Briles, S.D.

    1999-04-05

    The authors present a preliminary analysis and design framework developed for the evaluation and optimization of infrared, Imaging Spatial Heterodyne Spectrometer (SHS) electro-optic systems. Commensurate with conventional interferometric spectrometers, SHS modeling requires an integrated analysis environment for rigorous evaluation of system error propagation due to detection process, detection noise, system motion, retrieval algorithm and calibration algorithm. The analysis tools provide for optimization of critical system parameters and components including : (1) optical aperture, f-number, and spectral transmission, (2) SHS interferometer grating and Littrow parameters, and (3) image plane requirements as well as cold shield, optical filtering, and focal-plane dimensions, pixel dimensions and quantum efficiency, (4) SHS spatial and temporal sampling parameters, and (5) retrieval and calibration algorithm issues.

  20. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
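
    The landscape-based cumulative effects modelling step described above boils down to ordinary least squares. A toy sketch, fit via the normal equations (all names and numbers below are invented for illustration, not data from the study):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y."""
    Xd = [[1.0] + row for row in X]  # prepend an intercept column
    k = len(Xd[0])
    XtX = [[sum(r[i] * r[j] for r in Xd) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(Xd, y)) for i in range(k)]
    return solve(XtX, Xty)

# Invented data: a biotic index modelled from two landscape stressors,
# e.g. percent mining (x1) and percent development (x2).
X = [[0, 0], [1, 0], [0, 1], [1, 1], [2, 1]]
y = [5.0, 7.0, 2.0, 4.0, 6.0]  # constructed to satisfy y = 5 + 2*x1 - 3*x2
b0, b1, b2 = fit_ols(X, y)
future = b0 + b1 * 3 + b2 * 2  # predicted index under a hypothetical scenario
print(round(b1, 6), round(b2, 6), round(future, 6))  # 2.0 -3.0 5.0
```

    The fitted coefficients quantify each stressor's marginal effect; scoring alternative (x1, x2) combinations is the scenario-analysis step.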

  1. A framework for sensitivity analysis of decision trees.

    Science.gov (United States)

    Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław

    2018-01-01

    In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
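
    The pessimistic-perturbation idea can be sketched for a single two-outcome chance node (the paper's framework is far more general; the payoffs, probabilities, and perturbation rule below are invented for illustration):

```python
def expected_value(payoffs, probs):
    """Expected value of a chance node."""
    return sum(p * v for p, v in zip(probs, payoffs))

def worst_case_ev(payoffs, probs, eps):
    """Pessimistic EV when up to eps of probability mass can move from the
    best outcome to the worst (a toy stand-in for the paper's
    distributional-uncertainty analysis)."""
    i_lo = payoffs.index(min(payoffs))
    i_hi = payoffs.index(max(payoffs))
    shift = min(eps, probs[i_hi])
    perturbed = list(probs)
    perturbed[i_hi] -= shift
    perturbed[i_lo] += shift
    return expected_value(payoffs, perturbed)

# Hypothetical tree: a risky branch (chance node) vs. a sure payoff of 50
risky_payoffs = [100, 0]
nominal_probs = [0.6, 0.4]
print(expected_value(risky_payoffs, nominal_probs))       # 60.0: risky beats 50
print(worst_case_ev(risky_payoffs, nominal_probs, 0.15))  # ~45: now safe 50 wins
```

    A strategy whose ranking survives such perturbations is robust in the sense studied above; one that flips, as here, deserves closer scrutiny of its probability estimates.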

  2. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    Deregulated electricity markets provide opportunities for Battery Systems (BS) to participate in energy arbitrage and ancillary services (regulation, operating reserves, contingency reserves, voltage regulation, power quality etc.). To evaluate the economic viability of BS with different business...... for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so...

  3. A Framework for Sustainable Design of Algal Biorefineries: Economic Aspects and Life Cycle Analysis

    DEFF Research Database (Denmark)

    Cheali, Peam; Loureiro da Costa Lira Gargalo, Carina; Gernaey, Krist

    2015-01-01

    In this chapter, a framework for sustainable design of algal biorefineries with respect to economic and environmental objectives is presented. As part of the framework, a superstructure is formulated to represent the design space, describing technologies developed for processing various types of algae feedstock for the production of biodiesel and co-products. Relevant data and parameters for each process, such as yield, conversion and operational cost, are then collected using a standardized format (a generic model) and stored in a database. The sustainable design problem is then formulated mathematically as a mixed integer nonlinear programming problem, and is solved first to identify the optimal designs with respect to economic optimality. These optimal designs are then analyzed further in terms of environmental performance using life cycle analysis. For sustainability analysis, in total five...

  4. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    International Nuclear Information System (INIS)

    Hartwig, Zachary S.

    2016-01-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms. - Highlights: • A new software framework for radiation detector data acquisition and analysis. • Integrated acquisition and analysis of real-world and simulated detector data. • C++ and Python libraries for data acquisition hardware control and readout. • Graphical program for control and readout of digital data acquisition hardware. • Graphical program for comprehensive analysis of real-world and simulated data.

  5. The image of psychology programs: the value of the instrumental-symbolic framework.

    Science.gov (United States)

    Van Hoye, Greet; Lievens, Filip; De Soete, Britt; Libbrecht, Nele; Schollaert, Eveline; Baligant, Dimphna

    2014-01-01

    As competition for funding and students intensifies, it becomes increasingly important for psychology programs to have an image that is attractive and makes them stand out from other programs. The current study uses the instrumental-symbolic framework from the marketing domain to determine the image of different master's programs in psychology and examines how these image dimensions relate to student attraction and competitor differentiation. The samples consist of both potential students (N = 114) and current students (N = 68) of three psychology programs at a Belgian university: industrial and organizational psychology, clinical psychology, and experimental psychology. The results demonstrate that both instrumental attributes (e.g., interpersonal activities) and symbolic trait inferences (e.g., sincerity) are key components of the image of psychology programs and predict attractiveness as well as differentiation. In addition, symbolic image dimensions seem more important for current students of psychology programs than for potential students.

  6. TomoPy: a framework for the analysis of synchrotron tomographic data

    International Nuclear Information System (INIS)

    Gürsoy, Doǧa; De Carlo, Francesco; Xiao, Xianghui; Jacobsen, Chris

    2014-01-01

    A collaborative framework for the analysis of synchrotron tomographic data which has the potential to unify the effort of different facilities and beamlines performing similar tasks is described. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports functional programming that many researchers prefer. Analysis of tomographic datasets at synchrotron light sources (including X-ray transmission tomography, X-ray fluorescence microscopy and X-ray diffraction tomography) is becoming progressively more challenging due to the increasing data acquisition rates that new technologies in X-ray sources and detectors enable. The next generation of synchrotron facilities that are currently under design or construction throughout the world will provide diffraction-limited X-ray sources and are expected to boost the current data rates by several orders of magnitude, stressing the need for the development and integration of efficient analysis tools. Here an attempt to provide a collaborative framework for the analysis of synchrotron tomographic data that has the potential to unify the effort of different facilities and beamlines performing similar tasks is described in detail. The proposed Python-based framework is open-source, platform- and data-format-independent, has multiprocessing capability and supports procedural programming that many researchers prefer. This collaborative platform could affect all major synchrotron facilities where new effort is now dedicated to developing new tools that can be deployed at the facility for real-time processing, as well as distributed to users for off-site data processing

  7. pygrametl: A Powerful Programming Framework for Extract-Transform-Load Programmers

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2009-01-01

    and operations. In this paper, we propose to do ETL programming by writing code to be more productive. To make the programming easy, we present the (Python-based) framework pygrametl which offers commonly used functionality for ETL development. By using the framework, the developer can efficiently create effective ETL solutions from which the full power of programming can be realized. Our experiments show that when pygrametl is used, both the development time and running time are shorter than when a GUI-based tool is used.
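
    To give a flavour of the code-based ETL style the record describes, here is a dict-backed dimension with a lookup-or-insert step; this is an invented toy, not pygrametl's actual API:

```python
class ToyDimension:
    """Dict-backed dimension table with a lookup-or-insert ensure() step.
    Illustrates ETL-by-code; an invented toy, not pygrametl's actual API."""

    def __init__(self, lookupatts):
        self.lookupatts = lookupatts  # attributes forming the natural key
        self.rows = {}                # natural key tuple -> surrogate key
        self.next_key = 1

    def ensure(self, row):
        """Return the surrogate key for row, inserting it if unseen."""
        key = tuple(row[a] for a in self.lookupatts)
        if key not in self.rows:
            self.rows[key] = self.next_key
            self.next_key += 1
        return self.rows[key]

product = ToyDimension(["name"])
facts = []
for sale in [{"name": "book", "qty": 2}, {"name": "pen", "qty": 5},
             {"name": "book", "qty": 1}]:
    facts.append({"productid": product.ensure(sale), "qty": sale["qty"]})
print(facts)  # "book" maps to the same surrogate key in both fact rows
```

    The appeal of this style is that lookups, transformations, and loading are ordinary code, so the full expressiveness of the host language is available.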

  8. The levers of control framework : An exploratory analysis of balance

    NARCIS (Netherlands)

    Kruis, A.; Speklé, R.F.; Widener, S.

    2016-01-01

    The impact of the Levers of Control (LOC) framework on the accounting literature is undeniably large. The framework, however, has also been criticized for being vague and ambiguous. One of the central, but unclear, concepts in the LOC framework is the notion of balance. That is, the framework holds

  9. Using the Five Senses of Success framework to understand the experiences of midwifery students enroled in an undergraduate degree program.

    Science.gov (United States)

    Sidebotham, M; Fenwick, J; Carter, A; Gamble, J

    2015-01-01

    A student's sense of capability, purpose, resourcefulness, identity and connectedness (the five senses of success) comprises key factors that may be important in predicting student satisfaction and progression within their university program. The study aimed to examine the expectations and experiences of second and third year midwifery students enroled in a Bachelor of Midwifery program and identify barriers and enablers to success. A descriptive exploratory qualitative design was used. Fifty-six students enroled in either year 2 or 3 of the Bachelor of Midwifery program in SE Queensland participated in an anonymous survey using open-ended questions. In addition, 16 students participated in two year-level focus groups. Template analysis, using the Five Senses Framework, was used to analyse the data set. Early exposure to 'hands on' clinical midwifery practice, as well as continuity of care experiences, provided students with an opportunity to link theory to practice and increased their perception of capability as they transitioned through the program. Students' sense of identity, purpose, resourcefulness, and capability was strongly influenced by the program's embedded meta-values, including a 'woman centred' approach. In addition, a student's ability to form strong positive relationships with women, peers, lecturers and supportive clinicians was central to developing connections and ultimately a sense of success. A sense of connection not only fostered an ongoing belief that challenges could be overcome but also that students themselves could initiate or influence change. The five senses framework provided a useful lens through which to analyse the student experience. Key factors to student satisfaction and retention within a Bachelor of Midwifery program include: a clearly articulated midwifery philosophy, strategies to promote student connectedness including the use of social media, and further development of clinicians' skills in preceptorship, clinical teaching and

  10. User's manual for the Composite HTGR Analysis Program (CHAP-1)

    International Nuclear Information System (INIS)

    Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.; Wecksung, M.J.; Willcutt, G.J.E. Jr.

    1977-03-01

    CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.

  11. A Framework for Bioacoustic Vocalization Analysis Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Ebenezer Out-Nyarko

    2009-11-01

    Using Hidden Markov Models (HMMs) as a recognition framework for automatic classification of animal vocalizations has a number of benefits, including the ability to handle duration variability through nonlinear time alignment, the ability to incorporate complex language or recognition constraints, and easy extendibility to continuous recognition and detection domains. In this work, we apply HMMs to several different species and bioacoustic tasks using generalized spectral features that can be easily adjusted across species and HMM network topologies suited to each task. This experimental work includes a simple call type classification task using one HMM per vocalization for repertoire analysis of Asian elephants, a language-constrained song recognition task using syllable models as base units for ortolan bunting vocalizations, and a stress stimulus differentiation task in poultry vocalizations using a non-sequential model via a one-state HMM with Gaussian mixtures. Results show strong performance across all tasks and illustrate the flexibility of the HMM framework for a variety of species, vocalization types, and analysis tasks.
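
    The one-HMM-per-call-type classification scheme can be sketched with the forward algorithm: score the observation sequence under each model and pick the likeliest. This toy uses discrete symbols rather than the paper's Gaussian-mixture observation models, and all model names and parameters are invented:

```python
import math

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in range(n)) * emit[s][o]
                 for s in range(n)]
    return math.log(sum(alpha))

# Two hypothetical call-type models over a 2-symbol syllable alphabet.
# Each model: (start probabilities, transition matrix, emission matrix).
models = {
    "rumble":  ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]], [[0.8, 0.2], [0.2, 0.8]]),
    "trumpet": ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]], [[0.3, 0.7], [0.4, 0.6]]),
}
obs = [0, 0, 1, 0, 0]  # an unknown vocalization as a symbol sequence
best = max(models, key=lambda m: forward_loglik(obs, *models[m]))
print(best)  # rumble
```

    In practice, log-domain or scaled recursions replace the raw products above to avoid underflow on long sequences.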

  12. Evaluating Comprehensive State Tobacco Prevention and Control Programs Using an Outcome Indicator Framework.

    Science.gov (United States)

    Fulmer, Erika; Rogers, Todd; Glasgow, LaShawn; Brown, Susan; Kuiper, Nicole

    2018-03-01

    The outcome indicator framework helps tobacco prevention and control programs (TCPs) plan and implement theory-driven evaluations of their efforts to reduce and prevent tobacco use. Tobacco use is the single most preventable cause of morbidity and mortality in the United States. The implementation of public health best practices by comprehensive state TCPs has been shown to prevent the initiation of tobacco use, reduce tobacco use prevalence, and decrease tobacco-related health care expenditures. Achieving and sustaining program goals require TCPs to evaluate the effectiveness and impact of their programs. To guide evaluation efforts by TCPs, the Centers for Disease Control and Prevention's Office on Smoking and Health developed an outcome indicator framework that includes a high-level logic model and evidence-based outcome indicators for each tobacco prevention and control goal area. In this article, we describe how TCPs and other community organizations can use the outcome indicator framework in their evaluation efforts. We also discuss how the framework is used at the national level to unify tobacco prevention and control efforts across varying state contexts, identify promising practices, and expand the public health evidence base.

  13. A hybrid Constraint Programming/Mixed Integer Programming framework for the preventive signaling maintenance crew scheduling problem

    DEFF Research Database (Denmark)

    Pour, Shahrzad M.; Drake, John H.; Ejlertsen, Lena Secher

    2017-01-01

    A railway signaling system is a complex and interdependent system which should ensure the safe operation of trains. We introduce and address a mixed integer optimisation model for the preventive signal maintenance crew scheduling problem in the Danish railway system. The problem contains many practical constraints, such as temporal dependencies between crew schedules, the splitting of tasks across multiple days, crew competency requirements and several other managerial constraints. We propose a novel hybrid framework using Constraint Programming (CP) to generate initial feasible solutions...

  14. Ion trajectory analysis program (ITAP)

    International Nuclear Information System (INIS)

    Youchison, D.L.; Nahemow, M.D.

    1991-01-01

    ITAP is a 2 1/2-dimensional FORTRAN code developed for the first-order design of charged-particle transport systems. The Ion Trajectory Analysis Program (ITAP) utilizes the paraxial-ray equation with no space charge to determine image size and divergence along the beam line. A discretized transfer-matrix technique is used to model particle transport through drift spaces, symmetrical electrostatic lenses, quadrupoles, Wien filters, sector mass separators, deflection plates, and solenoids. Dispersion effects are also included for the prisms. ITAP contains an iterative design option which can determine the excitations, thicknesses, or gaps required in the model to produce either a desired size or divergence at the pseudo-image. Multiple elements may be designed in sequence permitting complete model optimization. Output consists of trajectory data as well as summary tables which describe the focal properties of each ion-optical element. Plots of the trajectories, divergences, and specific transverse planes are also produced. (orig.)
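
    The transfer-matrix technique behind ITAP can be sketched for the paraxial case: each element is a 2×2 matrix acting on (position, divergence), and a beamline is the product of its element matrices. The element values below are invented for illustration, not taken from ITAP:

```python
def matmul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply_ray(M, ray):
    """Propagate a paraxial ray (position x, divergence x') through element M."""
    x, xp = ray
    return (M[0][0] * x + M[0][1] * xp, M[1][0] * x + M[1][1] * xp)

def drift(L):       # field-free flight of length L (metres)
    return [[1.0, L], [0.0, 1.0]]

def thin_lens(f):   # ideal thin lens of focal length f
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

# Hypothetical beamline: 0.2 m drift, lens with f = 0.1 m, then 0.1 m drift.
# Matrices compose right-to-left (the first element applied sits rightmost).
system = matmul(drift(0.1), matmul(thin_lens(0.1), drift(0.2)))
x, xp = apply_ray(system, (0.001, 0.0))  # ray 1 mm off-axis, initially parallel
print(x)  # ~0: the parallel ray crosses the axis one focal length past the lens
```

    Tracing a handful of such rays through the composed matrix gives the image size and divergence figures that a first-order design study needs.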

  15. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management

    Science.gov (United States)

    Convertino, Matteo; Valverde, L. James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of
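
    The Pareto-frontier step described above amounts to filtering out dominated portfolios. A toy cost/benefit sketch (portfolio names and numbers invented for illustration):

```python
def pareto_frontier(portfolios):
    """Return portfolios not dominated by any other, where plan Q dominates
    plan P if Q costs no more, benefits no less, and is strictly better in
    at least one of the two (a toy stand-in for the PDA selection step)."""
    front = []
    for name, cost, benefit in portfolios:
        dominated = any(
            c <= cost and b >= benefit and (c < cost or b > benefit)
            for n, c, b in portfolios if n != name
        )
        if not dominated:
            front.append(name)
    return front

# Hypothetical restoration portfolios: (name, cost in $M, habitat benefit score)
plans = [("A", 2.0, 5.0), ("B", 3.0, 4.0), ("C", 4.0, 8.0), ("D", 3.5, 8.0)]
print(pareto_frontier(plans))  # ['A', 'D'] -- B loses to A, C loses to D
```

    Decision-makers then choose among frontier plans according to their cost/benefit trade-off; any plan off the frontier can be improved in one criterion at no loss in the other.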

  16. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management.

    Science.gov (United States)

    Convertino, Matteo; Valverde, L James

    2013-01-01

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the needs of

  17. Portfolio Decision Analysis Framework for Value-Focused Ecosystem Management.

    Directory of Open Access Journals (Sweden)

    Matteo Convertino

    Management of natural resources in coastal ecosystems is a complex process that is made more challenging by the need for stakeholders to confront the prospect of sea level rise and a host of other environmental stressors. This situation is especially true for coastal military installations, where resource managers need to balance conflicting objectives of environmental conservation against military mission. The development of restoration plans will necessitate incorporating stakeholder preferences, and will, moreover, require compliance with applicable federal/state laws and regulations. To promote the efficient allocation of scarce resources in space and time, we develop a portfolio decision analytic (PDA) framework that integrates models yielding policy-dependent predictions for changes in land cover and species metapopulations in response to restoration plans, under different climate change scenarios. In a manner that is somewhat analogous to financial portfolios, infrastructure and natural resources are classified as human and natural assets requiring management. The predictions serve as inputs to a Multi Criteria Decision Analysis model (MCDA) that is used to measure the benefits of restoration plans, as well as to construct Pareto frontiers that represent optimal portfolio allocations of restoration actions and resources. Optimal plans allow managers to maintain or increase asset values by contrasting the overall degradation of the habitat and possible increased risk of species decline against the benefits of mission success. The optimal combination of restoration actions that emerge from the PDA framework allows decision-makers to achieve higher environmental benefits, with equal or lower costs, than those achievable by adopting the myopic prescriptions of the MCDA model. The analytic framework presented here is generalizable for the selection of optimal management plans in any ecosystem where human use of the environment conflicts with the

  18. Mississippi Curriculum Framework for Welding (Program CIP: 48.0508--Welder/Welding Technologist). Secondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for welding I and II. Presented first are a program description and course…

  19. a computer program for short circuit analysis of electric power systems

    African Journals Online (AJOL)

    ES Obe

    1981-03-01

    This paper presents the framework of a computer program developed for short circuit studies of electric power systems. The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of the electric power system. The program uses the symmetrical ...
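    The abstract's truncated reference to "the symmetrical ..." presumably points to the method of symmetrical components, the standard fault-analysis technique of the era. SCAP's actual implementation is not described here; the function below is only an illustrative sketch of the Fortescue decomposition of three phase phasors.

```python
import numpy as np

# 120-degree rotation operator of the Fortescue transformation
a = np.exp(2j * np.pi / 3)

def sequence_components(va, vb, vc):
    """Resolve three (possibly unbalanced) phase phasors into their
    zero-, positive-, and negative-sequence components."""
    v0 = (va + vb + vc) / 3
    v1 = (va + a * vb + a**2 * vc) / 3
    v2 = (va + a**2 * vb + a * vc) / 3
    return v0, v1, v2

# A balanced set (1, a^2, a) carries only a positive-sequence component
v0, v1, v2 = sequence_components(1, a**2, a)
```

    For a balanced set only the positive-sequence term survives, which is why unbalanced faults reveal themselves in the zero- and negative-sequence networks.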

  20. Framework for shape analysis of white matter fiber bundles.

    Science.gov (United States)

    Glozman, Tanya; Bruckert, Lisa; Pestilli, Franco; Yecies, Derek W; Guibas, Leonidas J; Yeom, Kristen W

    2018-02-15

    Diffusion imaging coupled with tractography algorithms allows researchers to image human white matter fiber bundles in-vivo. These bundles are three-dimensional structures with shapes that change over time during the course of development as well as in pathologic states. While most studies on white matter variability focus on analysis of tissue properties estimated from the diffusion data, e.g. fractional anisotropy, the shape variability of white matter fiber bundles is much less explored. In this paper, we present a set of tools for shape analysis of white matter fiber bundles, namely: (1) a concise geometric model of bundle shapes; (2) a method for bundle registration between subjects; (3) a method for deformation estimation. Our framework is useful for analysis of shape variability in white matter fiber bundles. We demonstrate our framework by applying our methods on two datasets: one consisting of data for 6 normal adults and another consisting of data for 38 normal children of age 11 days to 8.5 years. We suggest a robust and reproducible method to measure changes in the shape of white matter fiber bundles. We demonstrate how this method can be used to create a model to assess age-dependent changes in the shape of specific fiber bundles. We derive such models for an ensemble of white matter fiber bundles on our pediatric dataset and show that our results agree with normative human head and brain growth data. Creating these models for a large pediatric longitudinal dataset may improve understanding of both normal development and pathologic states and propose novel parameters for the examination of the pediatric brain. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Applications of TOPS Anomaly Detection Framework to Amazon Drought Analysis

    Science.gov (United States)

    Votava, P.; Nemani, R. R.; Ganguly, S.; Michaelis, A.; Hashimoto, H.

    2011-12-01

    The Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending TOPS to include capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. While there are large numbers of anomaly detection algorithms for multivariate datasets, we are extending this capability beyond the anomaly detection itself and towards an automated analysis that would discover the possible causes of the anomalies. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. The knowledge is captured using the OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. We have integrated this knowledge base with a framework for deploying an ensemble of anomaly detection algorithms on large volumes of Earth science datasets and applied it to specific scientific applications that support research conducted by our group. In one early application, we were able to process a large number of MODIS, TRMM, and CERES datasets along with ground-based weather and river flow observations to detect the evolution of the 2010 drought in the Amazon, identify the affected area, and publish the results in three weeks. A similar analysis of the 2005 drought using the same data sets took nearly 2 years, highlighting the potential contribution of our anomaly framework in accelerating scientific discoveries.

  2. Can Programming Frameworks Bring Smartphones into the Mainstream of Psychological Science?

    Science.gov (United States)

    Piwek, Lukasz; Ellis, David A

    2016-01-01

    Smartphones continue to provide huge potential for psychological science and the advent of novel research frameworks brings new opportunities for researchers who have previously struggled to develop smartphone applications. However, despite this renewed promise, smartphones have failed to become a standard item within psychological research. Here we consider the key issues that continue to limit smartphone adoption within psychological science and how these barriers might be diminishing in light of ResearchKit and other recent methodological developments. We conclude that while these programming frameworks are certainly a step in the right direction it remains challenging to create usable research-orientated applications with current frameworks. Smartphones may only become an asset for psychology and social science as a whole when development software that is both easy to use and secure becomes freely available.

  3. An Optimization Framework for Comparative Analysis of Multiple Vehicle Powertrains

    Directory of Open Access Journals (Sweden)

    Ganesh Mohan

    2013-10-01

    Full Text Available With a myriad of alternative vehicle powertrain architectures emerging in the industry, such as electric vehicles and hybrid electric vehicles, it is beneficial that the most appropriate system is chosen for the desired vehicle class and duty cycle, and to minimize a given cost function. This paper investigates this issue, by proposing a novel framework that evaluates different types of powertrain architectures under a unified modular powertrain structure. This framework provides a systematic and objective approach to comparing different types of powertrain architectures simultaneously, and will highlight the benefits that can be achieved from each architecture, thus making it possible to develop the reasoning for manufacturers to implement such systems, and potentially accelerate customer take-up of alternative powertrain technology. The results from this investigation have indicated that such analysis is indeed possible, by way of identifying the “cross-over point” between powertrain architectures, where one powertrain architecture transitions into a different architecture with increments in the required travel range.

  4. HistFitter: a flexible framework for statistical data analysis

    CERN Document Server

    Besjes, G J; Côté, D; Koutsman, A; Lorenz, J M; Short, D

    2015-01-01

    HistFitter is a software framework for statistical data analysis that has been used extensively in the ATLAS Collaboration to analyze data of proton-proton collisions produced by the Large Hadron Collider at CERN. Most notably, HistFitter has become a de-facto standard in searches for supersymmetric particles since 2012, with some usage for Exotic and Higgs boson physics. HistFitter coherently combines several statistics tools in a programmable and flexible framework that is capable of bookkeeping hundreds of data models under study using thousands of generated input histograms. HistFitter interfaces with the statistics tools HistFactory and RooStats to construct parametric models and to perform statistical tests of the data, and extends these tools in four key areas. The key innovations are to weave the concepts of control, validation and signal regions into the very fabric of HistFitter, and to treat these with rigorous methods. Multiple tools to visualize and interpret the results through a simple configura...

  5. Academic Libraries and Quality: An Analysis and Evaluation Framework

    Science.gov (United States)

    Atkinson, Jeremy

    2017-01-01

    The paper proposes and describes a framework for academic library quality to be used by new and more experienced library practitioners and by others involved in considering the quality of academic libraries' services and provision. The framework consists of eight themes and a number of questions to examine within each theme. The framework was…

  6. Roles of Variables and Program Analysis

    OpenAIRE

    Bishop, Craig; Johnson, Colin G.

    2005-01-01

    The idea of roles of variables is to provide a vocabulary for describing the way in which variables are used by experienced programmers. This paper presents work on a system that is designed to automatically check students' role assignments in simple procedural programming. This is achieved by applying program analysis techniques, in particular program slicing and data flow analysis, to programs that students have written and annotated with role assignments.

  7. Analytical framework for recurrence network analysis of time series.

    Science.gov (United States)

    Donges, Jonathan F; Heitzig, Jobst; Donner, Reik V; Kurths, Jürgen

    2012-04-01

    Recurrence networks are a powerful nonlinear tool for time series analysis of complex dynamical systems. While there are already many successful applications ranging from medicine to paleoclimatology, a solid theoretical foundation of the method has still been missing so far. Here, we interpret an ɛ-recurrence network as a discrete subnetwork of a "continuous" graph with uncountably many vertices and edges corresponding to the system's attractor. This step allows us to show that various statistical measures commonly used in complex network analysis can be seen as discrete estimators of newly defined continuous measures of certain complex geometric properties of the attractor on the scale given by ɛ. In particular, we introduce local measures such as the ɛ-clustering coefficient, mesoscopic measures such as ɛ-motif density, path-based measures such as ɛ-betweennesses, and global measures such as ɛ-efficiency. This new analytical basis for the so far heuristically motivated network measures also provides an objective criterion for the choice of ɛ via a percolation threshold, and it shows that estimation can be improved by so-called node splitting invariant versions of the measures. We finally illustrate the framework for a number of archetypical chaotic attractors such as those of the Bernoulli and logistic maps, periodic and two-dimensional quasiperiodic motions, and for hyperballs and hypercubes by deriving analytical expressions for the novel measures and comparing them with data from numerical experiments. More generally, the theoretical framework put forward in this work describes random geometric graphs and other networks with spatial constraints, which appear frequently in disciplines ranging from biology to climate science.
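    The ε-recurrence construction described above can be sketched directly from pairwise distances. This minimal illustration uses raw scalar samples as vertices (no time-delay embedding) and an arbitrarily chosen ε; the names are illustrative, not taken from the paper's software.

```python
import numpy as np

def recurrence_network(x, eps):
    """Adjacency matrix of the eps-recurrence network of a scalar series.

    Two state vectors (here: raw samples) are linked when their distance
    falls below eps; self-loops are excluded.
    """
    dist = np.abs(x[:, None] - x[None, :])  # pairwise distances
    A = (dist < eps).astype(int)
    np.fill_diagonal(A, 0)
    return A

def edge_density(A):
    """Fraction of realized edges, a simple global network measure."""
    n = len(A)
    return A.sum() / (n * (n - 1))

x = np.sin(np.linspace(0, 8 * np.pi, 200))  # periodic toy series
A = recurrence_network(x, eps=0.1)
density = edge_density(A)
```

    The percolation-threshold criterion mentioned in the abstract amounts to choosing the smallest ε for which this adjacency matrix describes a connected graph.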

  8. A framework for cognitive task analysis in systems design

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1985-08-01

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task. Such a cognitive task analysis will not aim at a description of the information processes suited for particular control situations. It will rather aim at an analysis in order to identify the requirements to be considered along various dimensions of the decision tasks, in order to give the user - i.e. a decision maker - the freedom to adapt his performance to system requirements in a way which matches his process resources and subjective preferences. To serve this purpose, a number of analyses at various levels are needed to relate the control requirements of the system to the information processes and to the processing resources offered by computers and humans. The paper discusses the cognitive task analysis in terms of the following domains: the problem domain, which is a representation of the functional properties of the system giving a consistent framework for identification of the control requirements of the system; the decision sequences required for typical situations; the mental strategies and heuristics which are effective and acceptable for the different decision functions; and the cognitive control mechanisms used, depending upon the level of skill which can/will be applied. Finally, the end-users' criteria for choice of mental strategies in the actual situation are considered, and the need for development of criteria for judging the ultimate user acceptance of computer support is

  9. Health and Big Data: An Ethical Framework for Health Information Collection by Corporate Wellness Programs.

    Science.gov (United States)

    Ajunwa, Ifeoma; Crawford, Kate; Ford, Joel S

    2016-09-01

    This essay details the resurgence of wellness programs as employed by large corporations with the aim of reducing healthcare costs. The essay narrows in on a discussion of how Big Data collection practices are being utilized in wellness programs and the potential negative impact on the worker with regard to privacy and employment discrimination. The essay offers an ethical framework to be adopted by wellness program vendors in order to conduct wellness programs that would achieve cost-saving goals without undue burdens on the worker. The essay also offers some innovative approaches to wellness that may well better serve the goals of healthcare cost reduction. © 2016 American Society of Law, Medicine & Ethics.

  10. A framework for automatic heart sound analysis without segmentation.

    Science.gov (United States)

    Yuenyong, Sumeth; Nishihara, Akinori; Kongprawechnon, Waree; Tungpimolrut, Kanokvate

    2011-02-09

    A new framework for heart sound analysis is proposed. One of the most difficult processes in heart sound analysis is segmentation, due to interference from murmurs. An equal number of cardiac cycles was extracted from heart sounds with different heart rates using information from envelopes of autocorrelation functions, without the need to label individual fundamental heart sounds (FHS). The complete method consists of envelope detection, calculation of cardiac cycle lengths using auto-correlation of envelope signals, feature extraction using the discrete wavelet transform, principal component analysis, and classification using neural network bagging predictors. The proposed method was tested on a set of heart sounds obtained from several on-line databases and recorded with an electronic stethoscope. The geometric mean was used as the performance index. Average classification performance using ten-fold cross-validation was 0.92 for the noise-free case, 0.90 under white noise with 10 dB signal-to-noise ratio (SNR), and 0.90 under impulse noise of up to 0.3 s duration. The proposed method showed promising results and high noise robustness to a wide range of heart sounds. However, more tests are needed to address any bias that may have been introduced by the different sources of heart sounds in the current training set, and to concretely validate the method. Further work includes building a new training set recorded from actual patients, then further evaluating the method based on this new training set.
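    The segmentation-free cycle-length step (autocorrelation of the envelope, with no FHS labelling) can be sketched as below. The toy envelope, sampling rate, and the 0.3 s minimum-lag heuristic are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def cycle_length(envelope, fs):
    """Estimate the cardiac cycle length, in samples, from an envelope.

    The lag of the dominant autocorrelation peak beyond a minimum lag
    approximates one cardiac cycle, without labelling individual FHS.
    """
    env = envelope - envelope.mean()
    ac = np.correlate(env, env, mode="full")[len(env) - 1:]  # lags 0..n-1
    min_lag = int(0.3 * fs)  # assume heart rate stays below 200 bpm
    return min_lag + int(np.argmax(ac[min_lag:]))

# Toy envelope: one smooth bump per 0.8 s cycle (75 bpm), 5 s at 1 kHz
t = np.arange(0, 5, 0.001)
env = np.maximum(np.sin(2 * np.pi * t / 0.8), 0) ** 8
est = cycle_length(env, fs=1000)  # ~800 samples, i.e. 0.8 s
```

    Fixing the number of extracted cycles per recording, rather than segmenting each FHS, is what makes the downstream wavelet/PCA features comparable across heart rates.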

  11. A framework for automatic heart sound analysis without segmentation

    Directory of Open Access Journals (Sweden)

    Tungpimolrut Kanokvate

    2011-02-01

    Background: A new framework for heart sound analysis is proposed. One of the most difficult processes in heart sound analysis is segmentation, due to interference from murmurs. Method: An equal number of cardiac cycles was extracted from heart sounds with different heart rates using information from envelopes of autocorrelation functions, without the need to label individual fundamental heart sounds (FHS). The complete method consists of envelope detection, calculation of cardiac cycle lengths using auto-correlation of envelope signals, feature extraction using the discrete wavelet transform, principal component analysis, and classification using neural network bagging predictors. Result: The proposed method was tested on a set of heart sounds obtained from several on-line databases and recorded with an electronic stethoscope. The geometric mean was used as the performance index. Average classification performance using ten-fold cross-validation was 0.92 for the noise-free case, 0.90 under white noise with 10 dB signal-to-noise ratio (SNR), and 0.90 under impulse noise of up to 0.3 s duration. Conclusion: The proposed method showed promising results and high noise robustness to a wide range of heart sounds. However, more tests are needed to address any bias that may have been introduced by the different sources of heart sounds in the current training set, and to concretely validate the method. Further work includes building a new training set recorded from actual patients, then further evaluating the method based on this new training set.

  12. Structural Equation Models in a Redundancy Analysis Framework With Covariates.

    Science.gov (United States)

    Lovaglio, Pietro Giorgio; Vittadini, Giorgio

    2014-01-01

    A recent method to specify and fit structural equation modeling in the Redundancy Analysis framework based on so-called Extended Redundancy Analysis (ERA) has been proposed in the literature. In this approach, the relationships between the observed exogenous variables and the observed endogenous variables are moderated by the presence of unobservable composites, estimated as linear combinations of exogenous variables. However, in the presence of direct effects linking exogenous and endogenous variables, or concomitant indicators, the composite scores are estimated by ignoring the presence of the specified direct effects. To fit structural equation models, we propose a new specification and estimation method, called Generalized Redundancy Analysis (GRA), allowing us to specify and fit a variety of relationships among composites, endogenous variables, and external covariates. The proposed methodology extends the ERA method, using a more suitable specification and estimation algorithm, by allowing for covariates that affect endogenous indicators indirectly through the composites and/or directly. To illustrate the advantages of GRA over ERA we propose a simulation study of small samples. Moreover, we propose an application aimed at estimating the impact of formal human capital on the initial earnings of graduates of an Italian university, utilizing a structural model consistent with well-established economic theory.

  13. FORTRAN program for induction motor analysis

    Science.gov (United States)

    Bollenbacher, G.

    1976-01-01

    A FORTRAN program for induction motor analysis is described. The analysis includes calculations of torque-speed characteristics, efficiency, losses, magnetic flux densities, weights, and various electrical parameters. The program is limited to three-phase Y-connected, squirrel-cage motors. Detailed instructions for using the program are given. The analysis equations are documented, and the sources of the equations are referenced. The appendixes include a FORTRAN symbol list, a complete explanation of input requirements, and a list of error messages.

  14. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    Science.gov (United States)

    Hartwig, Zachary S.

    2016-04-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms.

  15. Evolutionary squeaky wheel optimization: a new framework for analysis.

    Science.gov (United States)

    Li, Jingpeng; Parkes, Andrew J; Burke, Edmund K

    2011-01-01

    Squeaky wheel optimization (SWO) is a relatively new metaheuristic that has been shown to be effective for many real-world problems. At each iteration SWO does a complete construction of a solution starting from the empty assignment. Although the construction uses information from previous iterations, the complete rebuilding does mean that SWO is generally effective at diversification but can suffer from relatively weak intensification. Evolutionary SWO (ESWO) is a recent extension to SWO that is designed to improve the intensification by keeping the good components of solutions and only using SWO to reconstruct other, poorer components of the solution. In such algorithms a standard challenge is to understand how the various parameters affect the search process. In order to support the future study of such issues, we propose a formal framework for the analysis of ESWO. The framework is based on Markov chains, and the main novelty arises because ESWO moves through the space of partial assignments. This makes it significantly different from the analyses used in local search (such as simulated annealing), which only move through complete assignments. Generally, the exact details of ESWO will depend on various heuristics, so we focus our approach on a case of ESWO that we call ESWO-II and that has probabilistic, as opposed to heuristic, selection and construction operators. For ESWO-II, we study a simple problem instance and explicitly compute the stationary distribution probability over the states of the search space. We find interesting properties of the distribution. In particular, we find that the probabilities of states generally, but not always, increase with their fitness. This nonmonotonicity is quite different from the monotonicity expected in algorithms such as simulated annealing.
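    Stationary distributions of the kind computed in the paper can be obtained numerically as the left eigenvector of the transition matrix for eigenvalue 1. The two-state chain below is an invented stand-in for ESWO-II's space of partial assignments, shown purely to illustrate the mechanics:

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution pi of an ergodic chain: pi P = pi, sum(pi) = 1.

    Computed as the left eigenvector of P for the eigenvalue closest to 1.
    """
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

# Invented 2-state chain standing in for moves between partial assignments
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi = stationary_distribution(P)  # balance: 0.1*pi0 = 0.5*pi1 -> pi = [5/6, 1/6]
```

    For a real ESWO-II instance the states are partial assignments and P is built from the probabilistic selection and construction operators; the fitness-vs-probability comparison in the abstract is then a direct read-off from pi.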

  16. A novel holistic framework for genetic-based captive-breeding and reintroduction programs.

    Science.gov (United States)

    Attard, C R M; Möller, L M; Sasaki, M; Hammer, M P; Bice, C M; Brauer, C J; Carvalho, D C; Harris, J O; Beheregaray, L B

    2016-10-01

    Research in reintroduction biology has provided a greater understanding of the often limited success of species reintroductions and highlighted the need for scientifically rigorous approaches in reintroduction programs. We examined the recent genetic-based captive-breeding and reintroduction literature to showcase the underuse of the genetic data gathered. We devised a framework that takes full advantage of the genetic data through assessment of the genetic makeup of populations before (past component of the framework), during (present component), and after (future component) captive-breeding and reintroduction events to understand their conservation potential and maximize their success. We empirically applied our framework to two small fishes: Yarra pygmy perch (Nannoperca obscura) and southern pygmy perch (Nannoperca australis). Each of these species has a locally adapted and geographically isolated lineage that is endemic to the highly threatened lower Murray-Darling Basin in Australia. These two populations were rescued during Australia's recent decade-long Millennium Drought, when their persistence became entirely dependent on captive-breeding and subsequent reintroduction efforts. Using historical demographic analyses, we found differences and similarities between the species in the genetic impacts of past natural and anthropogenic events that occurred in situ, such as European settlement (past component). Subsequently, successful maintenance of genetic diversity in captivity-despite skewed brooder contribution to offspring-was achieved through carefully managed genetic-based breeding (present component). Finally, genetic monitoring revealed the survival and recruitment of released captive-bred offspring in the wild (future component). 
Our holistic framework often requires no additional data collection to that typically gathered in genetic-based breeding programs, is applicable to a wide range of species, advances the genetic considerations of reintroduction

  17. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Directory of Open Access Journals (Sweden)

    Ahmad Karim

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malware, and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except the target audience is particular to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adaptation. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing the back-propagation method of Artificial Neural Networks. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e. 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised, which will become a benchmark for future studies.
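    The best-performing classifier reported above is plain logistic regression. As a self-contained sketch of that technique on synthetic two-dimensional "behavioural features" (the data, feature meanings, and hyperparameters are invented, not SMARTbot's):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for behavioural features (e.g. network-call rate,
# background SMS rate) of benign (label 0) and botnet-like (label 1) apps.
X = np.vstack([rng.normal(0.0, 1.0, size=(100, 2)),
               rng.normal(3.0, 1.0, size=(100, 2))])
y = np.array([0] * 100 + [1] * 100)

def sigmoid(z):
    # clip to avoid overflow in exp for large |z|
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

# Plain batch gradient descent on the logistic (cross-entropy) loss
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = sigmoid(X @ w + b)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

accuracy = ((sigmoid(X @ w + b) > 0.5) == y).mean()
```

    On well-separated features like these, the linear decision boundary of logistic regression is hard to beat, which is consistent with the abstract's observation that the simple model outperformed more elaborate classifiers.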

  18. SMARTbot: A Behavioral Analysis Framework Augmented with Machine Learning to Identify Mobile Botnet Applications.

    Science.gov (United States)

    Karim, Ahmad; Salleh, Rosli; Khan, Muhammad Khurram

    2016-01-01

    The botnet phenomenon in smartphones is evolving with the proliferation of mobile phone technologies, after leaving an imperative impact on personal computers. It refers to a network of computers, laptops, mobile devices or tablets which is remotely controlled by cybercriminals to initiate various distributed coordinated attacks, including spam emails, ad-click fraud, Bitcoin mining, Distributed Denial of Service (DDoS), disseminating other malware, and much more. Like traditional PC-based botnets, mobile botnets have the same operational impact, except the target audience is particular to smartphone users. Therefore, it is important to uncover this security issue prior to its widespread adaptation. We propose SMARTbot, a novel dynamic analysis framework augmented with machine learning techniques to automatically detect botnet binaries from a malicious corpus. SMARTbot is a component-based, off-device behavioral analysis framework which can generate a mobile botnet learning model by inducing the back-propagation method of Artificial Neural Networks. Moreover, this framework can detect mobile botnet binaries with remarkable accuracy even in the case of obfuscated program code. The results conclude that a classifier model based on simple logistic regression outperforms other machine learning classifiers for botnet app detection, i.e. 99.49% accuracy is achieved. Further, from manual inspection of the botnet dataset we have extracted interesting trends in those applications. As an outcome of this research, a mobile botnet dataset is devised, which will become a benchmark for future studies.

  20. SIDEKICK: Genomic data driven analysis and decision-making framework

    Directory of Open Access Journals (Sweden)

    Yoon Kihoon

    2010-12-01

    Background: Scientists striving to unlock mysteries within complex biological systems face myriad barriers in effectively integrating available information to enhance their understanding. While experimental techniques and available data sources are rapidly evolving, useful information is dispersed across a variety of sources, and sources of the same information often do not use the same format or nomenclature. To harness these expanding resources, scientists need tools that bridge nomenclature differences and allow them to integrate, organize, and evaluate the quality of information without extensive computation. Results: Sidekick, a genomic data driven analysis and decision making framework, is a web-based tool that provides a user-friendly intuitive solution to the problem of information inaccessibility. Sidekick enables scientists without training in computation and data management to pursue answers to research questions like "What are the mechanisms for disease X" or "Does the set of genes associated with disease X also influence other diseases." Sidekick enables the process of combining heterogeneous data, finding and maintaining the most up-to-date data, evaluating data sources, quantifying confidence in results based on evidence, and managing the multi-step research tasks needed to answer these questions. We demonstrate Sidekick's effectiveness by showing how to accomplish a complex published analysis in a fraction of the original time with no computational effort using Sidekick. Conclusions: Sidekick is an easy-to-use web-based tool that organizes and facilitates complex genomic research, allowing scientists to explore genomic relationships and formulate hypotheses without computational effort. Possible analysis steps include gene list discovery, gene-pair list discovery, various enrichments for both types of lists, and convenient list manipulation. Further, Sidekick's ability to characterize pairs of genes offers new ways to

  1. Short Run Profit Maximization in a Convex Analysis Framework

    Directory of Open Access Journals (Sweden)

    Ilko Vrankic

    2017-03-01

    Full Text Available In this article we analyse the short run profit maximization problem in a convex analysis framework. The goal is to deductively apply the results of convex analysis, which are well suited to the structure of microeconomic phenomena, to the well-known short run profit maximization problem. In the primal optimization model the technology in the short run is represented by the short run production function, and the normalized profit function, which expresses profit in output units, is derived. In this approach the choice variable is the labour quantity. Alternatively, technology is represented by the real variable cost function, where costs are expressed in labour units, and the normalized profit function is derived, this time expressing profit in labour units. The choice variable in this approach is the quantity of production. In both perspectives of the primal approach the emphasis is on the first order necessary conditions, which are the consequence of enveloping the closed convex set describing technology with its tangents. The dual model starts from the normalized profit function and recovers the production function, and alternatively the real variable cost function. In the first perspective of the dual approach the choice variable is the real wage, and in the second it is the real product price expressed in labour units. It is shown that interchanging variables and parameters leads to two optimization models which give the same system of labour demand and product supply functions and their inverses. By deductively applying the results of convex analysis, comparative statics results are derived describing the firm's behaviour in the short run.
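    The first order condition described in this abstract can be illustrated with a toy example. The Cobb-Douglas production function, prices, and all numbers below are invented for illustration and are not taken from the article:

```python
# Short run profit maximization with f(L) = L**b, output price p, wage w.
# The first-order condition p * f'(L) = w gives the labor demand
# L* = (p*b/w) ** (1/(1-b)) for 0 < b < 1 (strictly concave technology).

def profit(L, p, w, b):
    return p * L**b - w * L

def labor_demand(p, w, b):
    # from the FOC: p * b * L**(b-1) = w
    return (p * b / w) ** (1 / (1 - b))

p, w, b = 10.0, 2.0, 0.5
L_star = labor_demand(p, w, b)

# crude numeric check: profit at L* beats nearby points
eps = 1e-4
assert profit(L_star, p, w, b) >= profit(L_star - eps, p, w, b)
assert profit(L_star, p, w, b) >= profit(L_star + eps, p, w, b)
print(L_star)  # 6.25
```

    The comparative statics follow the same logic: labor demand falls in the real wage w/p, exactly as the tangency argument in the abstract predicts.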

  2. Analysis of regulatory-ethical framework of clinical trials

    Directory of Open Access Journals (Sweden)

    Milošević-Georgiev Andrijana

    2013-01-01

    Full Text Available Introduction. Every clinical trial has to meet all ethical criteria in addition to the scientific ones. The basic ethical principles in clinical trials are the following: nonmaleficence, beneficence, respect for autonomy, and the principle of justice. Objective. The aim of the study was to analyze clinical cases whose outcomes led to changes in the regulatory-ethical framework related to clinical trials, as well as the outcomes of key clinical trials that influenced the introduction of ethical principles into clinical trials. Methods. This was descriptive research (methods of analysis and documentation; desk analysis of secondary data). Results. By analyzing the cases from the secondary sources, as well as clinical and ethical outcomes, it may be noticed that codes, declarations and regulations have often been preceded by certain events that caused their adoption. Moral concern and public awareness of ethical issues have initiated not only the development of numerous guidelines, codes, and declarations, but also their incorporation into legislative acts. Conclusion. It is desirable that ethical instruments become legally binding documents, because only in this way will it be possible to control all phases of clinical trials and prevent abuse of the respondents. [Projekat Ministarstva nauke Republike Srbije, br. 175036 i br. 41004

  3. Economic impacts of climate change in Australia: framework and analysis

    International Nuclear Information System (INIS)

    Ford, Melanie

    2007-01-01

    Full text: There is growing interest in understanding the potential impacts of climate change in Australia, and especially the economic impacts of 'inaction'. In this study, a preliminary analysis of the possible economic impacts of future climate change in Australia is undertaken using ABARE's general equilibrium model of the global economy, GTEM. In order to understand the potential economy-wide economic impacts, the broad climatic trends that Australia is likely to experience over the next several decades are canvassed, and the potential economic and non-economic impacts on key risk areas, such as water resources, agriculture and forests, health, industry and human settlements, and ecosystems, are identified. A more detailed analysis of the economic impacts of climate change is undertaken by developing two case studies. In the first case study, the economic impact of climate change and reduced water availability on the agricultural sector in the Murray-Darling Basin is assessed. In the second case study, the sectoral economic impacts on the Australian resources sector of a projected decline in global economic activity due to climate change are analysed. The key areas of development required to more fully understand the economy-wide and sectoral impacts of climate change are also discussed, including issues associated with estimating both non-market and market impacts. Finally, an analytical framework for undertaking integrated assessment of climate change impacts domestically and globally is developed.

  4. A benchmarking program to reduce red blood cell outdating: implementation, evaluation, and a conceptual framework.

    Science.gov (United States)

    Barty, Rebecca L; Gagliardi, Kathleen; Owens, Wendy; Lauzon, Deborah; Scheuermann, Sheena; Liu, Yang; Wang, Grace; Pai, Menaka; Heddle, Nancy M

    2015-07-01

    Benchmarking is a quality improvement tool that compares an organization's performance to that of its peers for selected indicators, to improve practice. Processes to develop evidence-based benchmarks for red blood cell (RBC) outdating in Ontario hospitals, based on RBC hospital disposition data from Canadian Blood Services, have been previously reported. These benchmarks were implemented in 160 hospitals provincewide with a multifaceted approach, which included hospital education, inventory management tools and resources, summaries of best practice recommendations, recognition of high-performing sites, and audit tools on the Transfusion Ontario website (http://transfusionontario.org). In this study we describe the implementation process and the impact of the benchmarking program on RBC outdating. A conceptual framework for continuous quality improvement of a benchmarking program was also developed. The RBC outdating rate for all hospitals trended downward continuously from April 2006 to February 2012, irrespective of hospitals' transfusion rates or their distance from the blood supplier. The highest annual outdating rate was 2.82%, at the beginning of the observation period. Each year brought further reductions, with a nadir outdating rate of 1.02% achieved in 2011. The key elements of the successful benchmarking strategy included dynamic targets, a comprehensive and evidence-based implementation strategy, ongoing information sharing, and a robust data system to track information. The Ontario benchmarking program for RBC outdating resulted in continuous and sustained quality improvement. Our conceptual iterative framework for benchmarking provides a guide for institutions implementing a benchmarking program. © 2015 AABB.

  5. MetaJC++: A flexible and automatic program transformation technique using meta framework

    Science.gov (United States)

    Beevi, Nadera; Reghu, M.; Chitraprasad, D.; Vinodchandra, S.

    2014-09-01

    Compiler is a tool to translate abstract code containing natural language terms to machine code. Meta compilers are available to compile more than one languages. We have developed a meta framework intends to combine two dissimilar programming languages, namely C++ and Java to provide a flexible object oriented programming platform for the user. Suitable constructs from both the languages have been combined, thereby forming a new and stronger Meta-Language. The framework is developed using the compiler writing tools, Flex and Yacc to design the front end of the compiler. The lexer and parser have been developed to accommodate the complete keyword set and syntax set of both the languages. Two intermediate representations have been used in between the translation of the source program to machine code. Abstract Syntax Tree has been used as a high level intermediate representation that preserves the hierarchical properties of the source program. A new machine-independent stack-based byte-code has also been devised to act as a low level intermediate representation. The byte-code is essentially organised into an output class file that can be used to produce an interpreted output. The results especially in the spheres of providing C++ concepts in Java have given an insight regarding the potential strong features of the resultant meta-language.
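    The two-stage intermediate representation pipeline described in this abstract (AST, then stack-based byte-code) can be sketched in miniature. The node shapes and opcodes below are invented for illustration and are not taken from MetaJC++:

```python
# Lower a tiny expression AST to a stack-based byte-code, then interpret it.
# AST nodes: ('num', n) or ('add' | 'mul', left, right).

def compile_expr(node, code):
    op = node[0]
    if op == 'num':
        code.append(('PUSH', node[1]))
    else:
        compile_expr(node[1], code)   # operands first (post-order)...
        compile_expr(node[2], code)
        code.append(('ADD',) if op == 'add' else ('MUL',))  # ...then operator
    return code

def run(code):
    stack = []
    for instr in code:
        if instr[0] == 'PUSH':
            stack.append(instr[1])
        elif instr[0] == 'ADD':
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif instr[0] == 'MUL':
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[0]

ast = ('add', ('num', 2), ('mul', ('num', 3), ('num', 4)))  # 2 + 3*4
bytecode = compile_expr(ast, [])
print(run(bytecode))  # 14
```

    A real class-file format would additionally carry a constant pool and method tables, but the post-order lowering shown here is the essence of an AST-to-byte-code translation.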

  6. Implementation analysis of lean enablers for managing engineering programs

    DEFF Research Database (Denmark)

    von Arnim, Joachim; Oehmen, Josef; Rebentisch, Eric

    2014-01-01

    This paper presents research to improve the applicability of the Lean Enablers and consists of two parts. The first is a case study of a very successful project management maturity improvement initiative at Siemens Industry Sector’s Industry Automation division in the US. It views the initiative...... from the perspective of the Lean Enablers [Oehmen 2012] and is based on information from [Sopko 2012a], [Sopko 2012b], [Sopko 2010], [Sopko 2009], interviews, internal documentation, and the used MSP program management methodology [UK 2011]. The analysis of Lean Enablers incorporated in the MSP...... framework reveals the potential of Lean Enablers being applied in change programs. Incorporating the knowledge gained in the case study, the second part shows the development of a framework for the implementation of Lean Enablers....

  7. Development Roadmap of an Evolvable and Extensible Multi-Mission Telecom Planning and Analysis Framework

    Science.gov (United States)

    Cheung, Kar-Ming; Tung, Ramona H.; Lee, Charles H.

    2003-01-01

    In this paper, we describe the development roadmap and discuss the various challenges of an evolvable and extensible multi-mission telecom planning and analysis framework. Our long-term goal is to develop a set of powerful, flexible telecommunications analysis tools that can be easily adapted to different missions while maintaining the common Deep Space Communication requirements. The ability to re-use the DSN ground models and the common software utilities in our adaptations has contributed significantly to our development efforts in terms of consistency, accuracy, and minimal effort redundancy, which translates into shorter development time and major cost savings for the individual missions. In our roadmap, we address the design principles, technical achievements, and associated challenges for the following telecom analysis tools: (i) Telecom Forecaster Predictor - TFP (ii) Unified Telecom Predictor - UTP (iii) Generalized Telecom Predictor - GTP (iv) Generic TFP (v) Web-based TFP (vi) Application Program Interface - API (vii) Mars Relay Network Planning Tool - MRNPT.

  8. Mississippi Curriculum Framework for Medical Radiologic Technology (Radiography) (CIP: 51.0907--Medical Radiologic Technology). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the radiologic technology program. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies for the program,…

  9. Probabilistic Output Analysis by Program Manipulation

    DEFF Research Database (Denmark)

    Rosendahl, Mads; Kirkeby, Maja Hanne

    2015-01-01

    The aim of a probabilistic output analysis is to derive a probability distribution of possible output values for a program from a probability distribution of its input. We present a method for performing static output analysis, based on program transformation techniques. It generates a probabilit...
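    The goal stated in this abstract can be illustrated with a toy example. For a finite input distribution, the induced output distribution can be derived by enumeration; the program and probabilities below are invented stand-ins, not the authors' transformation technique:

```python
# Derive a distribution over a program's outputs from a distribution
# over its inputs, by pushing each input's probability mass through
# the program and accumulating per output value.
from collections import defaultdict

def program(x):
    return x % 3  # the (toy) program under analysis

input_dist = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}  # P(x)

def output_distribution(f, dist):
    out = defaultdict(float)
    for x, p in dist.items():
        out[f(x)] += p  # mass of every input mapping to the same output
    return dict(out)

print(output_distribution(program, input_dist))
```

    Static analyses like the one described must approximate this enumeration symbolically, since real input domains are usually too large to enumerate.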

  10. Fast Fourier Transform Spectral Analysis Program

    Science.gov (United States)

    Daniel, J. A., Jr.; Graves, M. L.; Hovey, N. M.

    1969-01-01

    Fast Fourier Transform Spectral Analysis Program is used in frequency spectrum analysis of postflight, space vehicle telemetered trajectory data. This computer program with a digital algorithm can calculate power spectrum rms amplitudes and cross spectrum of sampled parameters at even time increments.
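    A minimal modern analogue of the kind of computation this program performs can be sketched with the standard library alone; the code below is an invented illustration (a direct DFT rather than an FFT, for brevity) of estimating per-frequency amplitudes of uniformly sampled data:

```python
# Single-sided amplitude spectrum of an evenly sampled real signal.
import cmath
import math

def amplitude_spectrum(samples):
    n = len(samples)
    amps = []
    for k in range(n // 2 + 1):
        coeff = sum(samples[j] * cmath.exp(-2j * math.pi * k * j / n)
                    for j in range(n))
        a = abs(coeff) / n
        if 0 < k < n / 2:
            a *= 2  # fold in the symmetric negative-frequency bin
        amps.append(a)
    return amps

# 64 samples of a unit-amplitude sine exactly on bin 4
n = 64
signal = [math.sin(2 * math.pi * 4 * j / n) for j in range(n)]
amps = amplitude_spectrum(signal)
peak = max(range(len(amps)), key=amps.__getitem__)
print(peak, round(amps[peak], 6))  # 4 1.0
# rms amplitude of a sinusoidal component is its amplitude / sqrt(2)
```

    A production version would use an O(n log n) FFT and windowing; the scaling conventions (single-sided folding, rms conversion) are the part that trips people up.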

  11. SIX SIGMA FRAMEWORKS: AN ANALYSIS BASED ON ROGERS’ DIFFUSION OF INNOVATION THEORY

    Directory of Open Access Journals (Sweden)

    Kifayah Amar

    2012-06-01

    Full Text Available This paper attempts to analyze frameworks related to Six Sigma and Lean Six Sigma. The basis for analyzing the frameworks is the diffusion of innovation theory. Several criteria were used to analyze the frameworks, e.g. relative advantage, compatibility, complexity, trialability, observability, communication channels, nature of the social system/culture, and extent of change agent. Based on this framework analysis, only one framework fits Rogers’ theory on the diffusion of innovation. It is a Lean Six Sigma framework which consists of elements such as owner/manager commitment and involvement, employee involvement, training, culture change, and external support. Even though these elements are similar to those of other Six Sigma frameworks, they put more attention on culture change and external support. Generally speaking, culture change and external support are the most important elements for the implementation of Six Sigma or other soft approaches, particularly for small organizations.

  13. A Framework for Formal Modeling and Analysis of Organizations

    NARCIS (Netherlands)

    Jonker, C.M.; Sharpanskykh, O.; Treur, J.; P., Yolum

    2007-01-01

    A new, formal, role-based, framework for modeling and analyzing both real world and artificial organizations is introduced. It exploits static and dynamic properties of the organizational model and includes the (frequently ignored) environment. The transition is described from a generic framework of

  14. A Descriptive Analysis of the Institutional Frameworks for Disaster ...

    African Journals Online (AJOL)

    Background:There is insufficient documentation of the institutional frameworks for disaster management and resilience at different levels in sub-Saharan Africa. The objective of this study was to describe the institutional framework for disaster management in Uganda, and to identify actionable gaps at the different levels.

  15. Life Cycle Inventory Analysis of Recycling: Mathematical and Graphical Frameworks

    Directory of Open Access Journals (Sweden)

    Jun Nakatani

    2014-09-01

    Full Text Available A mathematical framework of the life cycle inventory (LCI analysis in life cycle assessment (LCA of recycling is systematically reviewed with the aid of graphical interpretation. First, the zero burden approach, which has been applied to LCI analyses of waste management systems, is theoretically justified in terms of relative comparison of waste management options. As recycling is a multi-functional system including the dual functions of waste management and secondary material production, the allocation issue needs to be handled in LCIs of recycling, and two forms of system expansion, i.e., the avoided burden and product basket approaches, have dominated to avoid the allocation problem. Then, it is demonstrated that conclusions derived from both approaches should mathematically be identical as far as system boundaries are correctly defined. A criticism against system expansion is also reviewed from the viewpoint of ambiguity of what-if scenarios. As an approach to this issue, market-based consequential LCA is discussed in the context of LCI analyses of open-loop recycling.
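    The claim that the avoided-burden and product-basket forms of system expansion yield identical conclusions can be checked with toy numbers (all burdens below are invented for illustration):

```python
# Two options delivering the same waste-management function:
# recycling (which also displaces virgin material production) vs. landfill.
waste_mgmt_recycle = 3.0   # burden of the recycling process
waste_mgmt_landfill = 1.0  # burden of landfilling
virgin_production = 5.0    # burden of producing the displaced material

# Avoided burden: credit the recycling option with the displaced production
recycle_ab = waste_mgmt_recycle - virgin_production
landfill_ab = waste_mgmt_landfill

# Product basket: expand both systems so each delivers waste management
# AND one unit of the material
recycle_pb = waste_mgmt_recycle
landfill_pb = waste_mgmt_landfill + virgin_production

# The difference between options is identical under either expansion
assert (recycle_ab - landfill_ab) == (recycle_pb - landfill_pb)
print(recycle_ab - landfill_ab)  # -3.0
```

    The relative comparison is invariant because the two forms differ only by adding the same constant (the virgin production burden) to both sides, which is exactly the point the review makes about correctly defined system boundaries.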

  16. Sustainability of ARV provision in developing countries: challenging a framework based on program history

    Directory of Open Access Journals (Sweden)

    Thiago Botelho Azeredo

    Full Text Available Abstract The provision of ARVs is central to HIV/AIDS programs because of its impact on the course of the disease and on quality of life. Although first-line treatment costs have declined, treatment-associated expenses grow steeper each year. Sustainability is therefore an important variable for the success of treatment programs. A conceptual framework on the sustainability of ARV provision was developed, followed by data collection instruments. The pilot study was undertaken in Brazil; Bolivia, Peru, and Mozambique were visited. Key informants were identified and interviewed. Investigation of the sustainability of ARV provision covered the implementation and routinization of provision schemes. Evidence of greater sustainability potential was observed in Peru, where provision is implemented and routinized by the National HIV/AIDS program and expenditures are met by the government. In Mozambique, provision is dependent on donations and external aid, but the country displays a great effort to incorporate ARV provision and care into routine healthcare activities. Bolivia, in addition to external dependence on financing and management of drug supply, presents problems regarding implementation and routinization. The conceptual framework was useful in recognizing events that influence sustainable ARV provision in these countries.

  17. A framework for the probabilistic analysis of meteotsunamis

    Science.gov (United States)

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
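    The Monte Carlo aggregation described in this abstract can be sketched schematically. The amplitude "model" and parameter distribution below are invented stand-ins for the paper's hydrodynamic model and ASOS-derived squall-line parameters:

```python
# Build a hazard curve (annual exceedance rate vs. amplitude) by sampling
# a synthetic catalog of atmospheric disturbances.
import random

random.seed(1)
rate_per_year = 4.0  # assumed Poisson rate of qualifying disturbances

def sample_amplitude():
    # stand-in for: sample disturbance parameters, run the wave model
    pressure_jump = random.lognormvariate(0.0, 0.5)  # hPa, toy distribution
    return 0.1 * pressure_jump                       # metres, toy transfer

def hazard_curve(n_events, thresholds):
    amps = [sample_amplitude() for _ in range(n_events)]
    # annualized rate of exceedance at each threshold amplitude
    return [rate_per_year * sum(a > t for a in amps) / n_events
            for t in thresholds]

thresholds = [0.05, 0.1, 0.2, 0.4]
curve = hazard_curve(20000, thresholds)
# exceedance rate must decrease monotonically with amplitude
assert all(x >= y for x, y in zip(curve, curve[1:]))
```

    Resampling the catalog many times, as the paper does, turns this single curve into mean and quantile hazard curves.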

  18. A software architectural framework specification for neutron activation analysis

    International Nuclear Information System (INIS)

    Preston, J.A.; Grant, C.N.

    2013-01-01

    Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact, with existing or reducing budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is a lack of system architecture standards to establish acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers, sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with existing system components, in a facility looking to automate its NAA operations. This limits the availability of automation to a few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA, that provides the required set of functionalities. This paper describes such an 'architectural framework' (OpenNAA), and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRM's, can reduce the time required by over 80 %. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current system, while providing a solid path for development into the future. (author)

  19. Latest Results From the QuakeFinder Statistical Analysis Framework

    Science.gov (United States)

    Kappler, K. N.; MacLean, L. S.; Schneider, D.; Bleier, T.

    2017-12-01

    Since 2005 QuakeFinder (QF) has acquired a unique dataset with outstanding spatial and temporal sampling of earth's magnetic field along several active fault systems. This QF network consists of 124 stations in California and 45 stations along fault zones in Greece, Taiwan, Peru, Chile and Indonesia. Each station is equipped with three feedback induction magnetometers, two ion sensors, a 4 Hz geophone, a temperature sensor, and a humidity sensor. Data are continuously recorded at 50 Hz with GPS timing and transmitted daily to the QF data center in California for analysis. QF is attempting to detect and characterize anomalous EM activity occurring ahead of earthquakes. There have been many reports of anomalous variations in the earth's magnetic field preceding earthquakes. Specifically, several authors have drawn attention to apparent anomalous pulsations seen preceding earthquakes. Often studies in long term monitoring of seismic activity are limited by availability of event data. It is particularly difficult to acquire a large dataset for rigorous statistical analyses of the magnetic field near earthquake epicenters because large events are relatively rare. Since QF has acquired hundreds of earthquakes in more than 70 TB of data, we developed an automated approach for finding statistical significance of precursory behavior and developed an algorithm framework. Previously QF reported on the development of an Algorithmic Framework for data processing and hypothesis testing. The particular instance of the algorithm we discuss identifies and counts magnetic variations from time series data and ranks each station-day according to the aggregate number of pulses in a time window preceding the day in question. If the hypothesis is true that magnetic field activity increases over some time interval preceding earthquakes, this should reveal itself by the station-days on which earthquakes occur receiving higher ranks than they would if the ranking scheme were random. This can
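    The ranking test described in this abstract can be sketched on synthetic data; all numbers, names, and the uniform pulse distribution below are invented for illustration:

```python
# Rank station-days by pulse count, then compare the mean rank of days on
# which earthquakes occurred against the mean expected under random ranking.
import random

random.seed(7)
n_days = 1000
pulse_counts = [random.randint(0, 20) for _ in range(n_days)]  # per-day pulses
quake_days = random.sample(range(n_days), 25)                  # event days

# rank 1 = fewest pulses; under the null, mean rank of event days ~ (n+1)/2
order = sorted(range(n_days), key=lambda d: pulse_counts[d])
rank = {day: i + 1 for i, day in enumerate(order)}
mean_quake_rank = sum(rank[d] for d in quake_days) / len(quake_days)
print(mean_quake_rank, (n_days + 1) / 2)
```

    With real data, a persistent excess of the event-day mean rank over the null mean, across many resampled catalogs, would be the precursory signal the hypothesis predicts; here the data are random, so no excess is expected.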

  20. High-Fidelity Aerothermal Engineering Analysis for Planetary Probes Using DOTNET Framework and OLAP Cubes Database

    Directory of Open Access Journals (Sweden)

    Prabhakar Subrahmanyam

    2009-01-01

    Full Text Available This publication presents the architecture, integration, and implementation of the various modules in the Sparta framework. Sparta is a trajectory engine hooked to an Online Analytical Processing (OLAP) database for multi-dimensional analysis capability. The OLAP database has a comprehensive list of atmospheric entry probes and their vehicle dimensions, trajectory data, aero-thermal data, and material properties like carbon, silicon, and carbon-phenolic based ablators. An approach is presented for dynamic TPS design. OLAP has the capability to run several different trajectory conditions in one simulation; the output is stored back into the database and can be queried for the appropriate trajectory type. An OLAP simulation can be set up by spawning individual threads to run three types of trajectory: Nominal, Undershoot, and Overshoot. The Sparta graphical user interface provides capabilities to choose from a list of flight vehicles or to enter trajectory and geometry information for a vehicle in design. The DOTNET framework acts as a middleware layer between the trajectory engine and the user interface, and also between the web user interface and the OLAP database. Trajectory output can be obtained in TecPlot format, Excel output, or in KML (Keyhole Markup Language) format. The framework employs an API (application programming interface) to convert trajectory data into a formatted KML file that is used by Google Earth for simulating Earth-entry fly-by visualizations.
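    The per-trajectory threading pattern described in this abstract can be rendered generically; the function, the "peak heating" proxy, and the entry speeds below are invented stand-ins, not Sparta's actual computation:

```python
# Run the three trajectory cases on separate threads and collect results.
import threading

results = {}

def run_trajectory(case, entry_speed):
    # stand-in for the real trajectory / aero-thermal computation
    results[case] = 0.5 * entry_speed ** 2  # toy "peak heating" proxy

cases = {'nominal': 11.0, 'undershoot': 11.5, 'overshoot': 10.5}
threads = [threading.Thread(target=run_trajectory, args=(c, v))
           for c, v in cases.items()]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results))  # ['nominal', 'overshoot', 'undershoot']
```

    Each thread writes to a distinct key, so no lock is needed here; a real simulation writing shared state would need one.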

  1. Inclusiveness program - a SWOT analysis

    Science.gov (United States)

    Dósa, M.; Szegő, K.

    2017-09-01

    The Inclusiveness Program was created with the aim to integrate currently under-represented countries into the mainstream of European planetary research. Main stages of the working plan include setting up a database containing all the research institutes and universities where astronomical or geophysical research is carried out. It is necessary to identify their problems and needs. Challenging part of the project is to find exact means that help their work in a sustainable way. Strengths, weaknesses, opportunities and threats of the program were identified based on feedback from the inclusiveness community. Our conclusions, further suggestions are presented.

  2. iOS Game Development using SpriteKit Framework with Swift Programming Language

    OpenAIRE

    Gurung, Lal

    2016-01-01

    iOS is a mobile operating system for Apple-manufactured phones and tablets. The mobile gaming industry is growing very fast, and compatibility with iOS is becoming very popular among game developers. The aim of this Bachelor’s thesis was to find the best available game development tools for the iOS platform. The 2D game named Lapland was developed using Apple’s own native framework, SpriteKit, and was written in the Swift programming language. The combination of SpriteKit and Swift...

  3. The Jasper Framework: Towards a Platform Independent, Formal Treatment of Web Programming

    Directory of Open Access Journals (Sweden)

    James Smith

    2012-10-01

    Full Text Available This paper introduces Jasper, a web programming framework which allows web applications to be developed in an essentially platform independent manner and which is also suited to a formal treatment. It outlines Jasper conceptually and shows how Jasper is implemented on several commonplace platforms. It also introduces the Jasper Music Store, a web application powered by Jasper and implemented on each of these platforms. Finally, it briefly describes a formal treatment and outlines the planned tools and languages that will allow this treatment to be automated.

  4. Evaluation of capacity-building program of district health managers in India: a contextualized theoretical framework.

    Science.gov (United States)

    Prashanth, N S; Marchal, Bruno; Kegels, Guy; Criel, Bart

    2014-01-01

    Performance of local health services managers at district level is crucial to ensure that health services are of good quality and cater to the health needs of the population in the area. In many low- and middle-income countries, health services managers are poorly equipped with public health management capacities needed for planning and managing their local health system. In the south Indian Tumkur district, a consortium of five non-governmental organizations partnered with the state government to organize a capacity-building program for health managers. The program consisted of a mix of periodic contact classes, mentoring and assignments and was spread over 30 months. In this paper, we develop a theoretical framework in the form of a refined program theory to understand how such a capacity-building program could bring about organizational change. A well-formulated program theory enables an understanding of how interventions could bring about improvements and an evaluation of the intervention. In the refined program theory of the intervention, we identified various factors at individual, institutional, and environmental levels that could interact with the hypothesized mechanisms of organizational change, such as staff's perceived self-efficacy and commitment to their organizations. Based on this program theory, we formulated context-mechanism-outcome configurations that can be used to evaluate the intervention and, more specifically, to understand what worked, for whom and under what conditions. We discuss the application of program theory development in conducting a realist evaluation. Realist evaluation embraces principles of systems thinking by providing a method for understanding how elements of the system interact with one another in producing a given outcome.

  5. Analysis of higher education policy frameworks for open and distance education in Pakistan.

    Science.gov (United States)

    Ellahi, Abida; Zaka, Bilal

    2015-04-01

    The constant rise in demand for higher education has become the biggest challenge for educational planners. This high demand has paved the way for distance education across the globe. This article analyzes the policy documentation of a major distance education initiative in Pakistan to assess its validity and identify the utility of policy linkages. The study adopted a qualitative research design that consisted of two steps. In the first step, a content analysis of the distance learning policy framework was made. For this purpose, two documents were accessed, titled "Framework for Launching Distance Learning Programs in HEIs of Pakistan" and "Guideline on Quality of Distance Education for External Students at the HEIs of Pakistan." In the second step, the policy guidelines mentioned in these two documents were evaluated at two levels. At the first level, the overall policy documents were assessed against a criterion proposed by Cheung, Mirzaei, and Leeder. At the second level, the proposed program of distance learning was assessed against criteria set by Gellman-Danley and Fetzner and by Berge. The distance education program initiative in Pakistan is promising and needs to be assessed regularly. This study has made an initial attempt to assess the policy document against criteria identified from the literature. The analysis shows that the current policy documents do offer some strengths at this initial level; however, they cannot be considered a comprehensive policy guide. The inclusion or correction of missing or vague areas identified in this study would make this policy guideline document a valuable tool for the Higher Education Commission (HEC). For distance education policy makers, this distance education policy framework model recognizes several fundamental areas with which they should be concerned. The findings of this study in the light of two different policy framework measures highlight certain opportunities that can help strengthen the

  6. An equity-effectiveness framework linking health programs and healthy life expectancy.

    Science.gov (United States)

    Banham, David; Lynch, John; Karnon, Jon

    2011-01-01

    South Australia's Strategic Plan includes a target to improve the population's healthy life expectancy. A common question among health policy and service planners is: 'How do health programs and services in the community relate to healthy life expectancy?' In response, this paper outlines an effectiveness and equity framework (EEF) for evaluating health interventions in applied settings. Using the example of coronary heart disease (CHD) management in general practice in South Australia, the EEF: (1) applies an internally consistent approach to accounting for population healthy life expectancy at state and smaller geographic levels; (2) estimates average population health gains from health programs, and gains across different socioeconomic subgroups within the community; (3) conducts economic evaluation by equating health gains against health system costs in population subgroups; (4) summarises relevant information about candidate intervention programs within a multi-criteria performance matrix for presentation to decision makers; (5) reassesses outcomes (and processes) following the implementation of a program and iteratively adds to the relevant knowledge and evidence base. The EEF offers a practical approach to selecting and evaluating intervention programs. The challenge is to develop system culture and data capture methods clearly focussed on linking health system activities to population health outcomes.

  7. ArcGIS Framework for Scientific Data Analysis and Serving

    Science.gov (United States)

    Xu, H.; Ju, W.; Zhang, J.

    2015-12-01

ArcGIS is a platform for managing, visualizing, analyzing, and serving geospatial data. Scientific data, as part of geospatial data, feature multiple dimensions (X, Y, time, and depth) and large volume. The multidimensional mosaic dataset (MDMD), a newly enhanced data model in ArcGIS, models multidimensional gridded data (e.g. raster or image data) as a hypercube and extends ArcGIS's capabilities to handle large-volume and near-real-time scientific data. Built on top of a geodatabase, the MDMD stores the dimension values and the variables (2D arrays) in a geodatabase table, which allows accessing a slice or slices of the hypercube through a simple query and supports animating changes along the time or vertical dimension using ArcGIS desktop or web clients. Through raster types, the MDMD can manage not only netCDF, GRIB, and HDF formats but also many other formats and satellite data. It is scalable and can handle large data volumes. The parallel geo-processing engine makes data ingestion fast and easy. A raster function, the definition of a raster processing algorithm, is a very important component of the ArcGIS platform for on-demand raster processing and analysis. Scientific data analytics is achieved through the MDMD and raster function templates, which perform on-demand scientific computation with variables ingested in the MDMD: for example, aggregating monthly averages from daily data; computing the total rainfall of a year; calculating the heat index for forecast data; and identifying fishing habitat zones. Additionally, the MDMD with its associated raster function templates can be served through ArcGIS Server as image services, which provide a framework for on-demand server-side computation and analysis; the published services can be accessed by multiple clients such as ArcMap, ArcGIS Online, JavaScript, REST, WCS, and WMS. This presentation will focus on the MDMD model and raster processing templates. In addition, MODIS land cover, NDFD weather service, and HYCOM ocean model
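    The slicing and on-demand aggregation described above can be illustrated without ArcGIS itself. The sketch below uses a plain NumPy array as a stand-in for the MDMD hypercube; the array shape and variable names are illustrative and not part of the ArcGIS API:

    ```python
    import numpy as np

    # Hypothetical daily data cube with dimensions (time, depth, y, x);
    # a plain numpy array stands in for the MDMD hypercube here.
    days, depth, ny, nx = 30, 1, 4, 5
    rng = np.random.default_rng(0)
    cube = rng.random((days, depth, ny, nx))

    # "Query" a single time slice, analogous to selecting one slice
    # of the hypercube from the geodatabase table.
    slice_day10 = cube[10]            # shape (depth, ny, nx)

    # On-demand aggregation: a 30-day mean per grid cell, the
    # monthly-average-from-daily example mentioned in the abstract.
    monthly_mean = cube.mean(axis=0)  # shape (depth, ny, nx)

    # Total accumulation over the period per cell (cf. total rainfall).
    monthly_total = cube.sum(axis=0)
    ```

    The same slice-then-aggregate pattern underlies the raster function templates: the query selects variables along the dimension values, and the computation runs only on the selected slices.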

  8. Alternative Frameworks for Improving Government Organizational Performance: A Comparative Analysis

    National Research Council Canada - National Science Library

    Simon, Cary

    1997-01-01

    .... Six major frameworks emerging in the U.S. since 1980, applicable to the public sector, and designed to enhance organizational change toward improved performance are reviewed and analyzed: Total Quality; 'Excellence...

  9. Parametric experiments on control rod heterogeneity in the framework of the Balzac program

    International Nuclear Information System (INIS)

    Kim, Y.C.; Palmiotti, G.; Salvatores, M.; Soule, R.

    1986-09-01

The calculation of the so-called heterogeneity effect on control rod reactivity worth is still to be considered one of the major sources of uncertainty in assessing a power reactor's control rod worth. The reason for the uncertainty is related both to the fact that at present no specific calculation tool can handle this effect with sufficient precision, and to the limited experimental validation of the available approximate methods. The present note briefly recalls the nature of the problem and its implications, in particular in the framework of the Super Phenix start-up experiments, and describes an experimental program on Masurca aimed at a systematic study of the heterogeneity effect, in the frame of the Balzac program

  10. CaKernel – A Parallel Application Programming Framework for Heterogeneous Computing Architectures

    Directory of Open Access Journals (Sweden)

    Marek Blazewicz

    2011-01-01

    Full Text Available With the recent advent of new heterogeneous computing architectures, there is still a lack of parallel problem-solving environments that can help scientists use hybrid supercomputers easily and efficiently. Many scientific simulations that use structured grids to solve partial differential equations in fact rely on stencil computations. Stencil computations have become crucial in solving many challenging problems in various domains, e.g., engineering or physics. Although many parallel stencil computing approaches have been proposed, in most cases they solve only particular problems. As a result, scientists struggle when it comes to implementing a new stencil-based simulation, especially on high-performance hybrid supercomputers. In response to this need, we extend our previous work on CaCUDA, a parallel programming framework for CUDA, which now supports OpenCL. We present CaKernel – a tool that simplifies the development of parallel scientific applications on hybrid systems. CaKernel is built on the highly scalable and portable Cactus framework. In the CaKernel framework, Cactus manages the inter-process communication via MPI, while CaKernel manages the code running on Graphics Processing Units (GPUs) and the interactions between them. As a non-trivial test case, we have developed a 3D CFD code to demonstrate the performance and scalability of the automatically generated code.

  11. MSNoise: A framework for Continuous Seismic Noise Analysis

    Science.gov (United States)

    Lecocq, Thomas; Caudron, Corentin; De Plaen, Raphaël; Mordret, Aurélien

    2016-04-01

    MSNoise is an open and free Python package, known to be the only complete integrated workflow designed to analyse ambient seismic noise and study relative velocity changes (dv/v) in the crust. It is based on state-of-the-art and well-maintained Python modules, among which ObsPy plays an important role. To our knowledge, it is officially used for continuous monitoring in at least three notable places: the Observatory of the Piton de la Fournaise volcano (OVPF, France), the Auckland Volcanic Field (New Zealand), and on the South Napa earthquake (Berkeley, USA). It is also used by many researchers to process archive data to focus on, e.g., fault zones, intraplate Europe, geothermal exploitations, or Antarctica. We first present the general working of MSNoise, originally written in 2010 to automatically scan data archives and process seismic data in order to produce dv/v time series. We demonstrate that its modularity provides a new potential to easily test new algorithms for each processing step. For example, one could experiment with new methods of cross-correlation (done by default in the frequency domain), stacking (the default is linear stacking, i.e. averaging), or dv/v estimation (the default is the moving-window cross-spectrum "MWCS", the so-called "doublet" method), etc. We present the last major evolution of MSNoise from a single workflow (data archive to dv/v) to a framework system that allows plugins and modules to be developed and integrated into the MSNoise ecosystem. Small-scale plugins will be shown as examples, such as "continuous PPSD" (à la McNamara & Buland) or "Seismic Amplitude Ratio Analysis" (Taisne, Caudron). We will also present the new MSNoise-TOMO package, using MSNoise as a "cross-correlation" toolbox and demystifying surface wave tomography! Finally, the poster will be a meeting point for all those using or willing to use MSNoise, to meet the developer, exchange ideas and wishes!
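    The frequency-domain cross-correlation that MSNoise performs by default can be sketched in a few lines of NumPy. The synthetic traces and the 5-sample shift below are illustrative only, not MSNoise's actual implementation:

    ```python
    import numpy as np

    # Two synthetic "noise" traces; the second is the first delayed
    # by 5 samples (the shift is an invented example value).
    rng = np.random.default_rng(42)
    n, true_lag = 1024, 5
    a = rng.standard_normal(n)
    b = np.roll(a, true_lag)

    # Cross-correlation computed in the frequency domain:
    # ifft(FFT(a) * conj(FFT(b))), the classic FFT shortcut.
    spec = np.fft.fft(a) * np.conj(np.fft.fft(b))
    cc = np.real(np.fft.ifft(spec))

    # The index of the maximum gives the lag; unwrap the circular
    # convention so negative lags map to the second half.
    lag = int(np.argmax(cc))
    lag = lag - n if lag > n // 2 else lag
    ```

    In a dv/v workflow, many such correlations are stacked before the lag measurements are turned into velocity-change estimates.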

  12. Disordered eating patterns in coeliac disease: a framework analysis.

    Science.gov (United States)

    Satherley, R-M; Higgs, S; Howard, R

    2017-12-01

    The need for dietary-management in coeliac disease may lead to the development of disordered eating patterns. A theoretical model of disordered eating has been proposed to explain disordered eating in coeliac disease. The aim of this study was to explore the experiences of typical and disordered eating in coeliac disease to gain a greater understanding of these processes and explore specific pathways within this model. We interviewed 21 individuals with coeliac disease, recruited from a previous database, about their experiences with food and food environments. Information about disordered eating status was assessed via questionnaire. The interviews were analysed qualitatively using Framework analysis, which was underpinned by the theoretical model of disordered eating in coeliac disease. Experiences differed between participants scoring high on measures of disordered eating and those who scored low (typical eaters). Participants scoring high on measures of disordered eating were concerned about the consequences of their gluten-free diet on body image and they described eating patterns similar to binge/restrict cycles. Typical eaters reported being able to integrate their dietary self-management into their daily lives; however, general concerns around food and cross-contamination were associated with a restriction in food intake. Coeliac disease has a varied impact on eating patterns. The need to follow a gluten-free diet and to be vigilant around food has to be balanced with concerns around food availability and cross-contamination which have the potential to contribute towards disordered eating attitudes and behaviours. The findings suggest that the theoretical model of disordered eating provides an adequate explanation of disordered eating patterns in coeliac disease. © 2017 The British Dietetic Association Ltd.

  13. A Conceptual Framework over Contextual Analysis of Concept Learning within Human-Machine Interplays

    DEFF Research Database (Denmark)

    Badie, Farshad

    2016-01-01

    This research provides a contextual description concerning existential and structural analysis of ‘relations’ between human beings and machines. Subsequently, it focuses on conceptual and epistemological analysis of (i) my own semantics-based framework [for human meaning construction] and of (ii) a well-structured machine concept learning framework. Accordingly, I will, semantically and epistemologically, focus on linking those two frameworks for logical analysis of concept learning in the context of human-machine interrelationships. It will be demonstrated that the proposed framework provides a supportive structure over the described contextualisation of ‘relations’ between human beings and machines within concept learning processes.

  14. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic, and industrial use, as well as use by public and municipal authorities and for street lighting) and examine their relation with variables such as population, total area, population density, and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012, with annual temporal resolution and spatial resolution down to the level of the prefecture. We both visualize the results of the analysis and perform cluster and outlier analysis using the Anselin local Moran's I statistic, as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight and better understanding of the regional development model in Greece, and provides the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
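    The clustering logic behind the statistics named above can be illustrated with the global form of Moran's I, computed from scratch on a toy grid. The Anselin local variant and the Getis-Ord Gi* used in the study apply the same weighted-neighbour idea cell by cell; the grid and values here are invented:

    ```python
    import numpy as np

    def morans_i(values, w):
        """Global Moran's I: (n / W) * sum_ij w_ij z_i z_j / sum_i z_i^2."""
        z = values - values.mean()
        n = len(values)
        W = w.sum()
        return (n / W) * (z @ w @ z) / (z @ z)

    def rook_weights(rows, cols):
        """Binary rook-contiguity weight matrix for a regular grid."""
        n = rows * cols
        w = np.zeros((n, n))
        for r in range(rows):
            for c in range(cols):
                i = r * cols + c
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        w[i, rr * cols + cc] = 1
        return w

    w = rook_weights(4, 4)
    # High values clustered in the top half -> positive spatial autocorrelation.
    clustered = np.array([5.0] * 8 + [1.0] * 8)
    # Perfect checkerboard -> maximal negative spatial autocorrelation.
    checker = np.array([(-1.0) ** (r + c) for r in range(4) for c in range(4)])
    ```

    For the checkerboard every neighbour pair has opposite sign, so I comes out exactly -1; the clustered pattern gives a clearly positive value.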

  15. A framework for conducting a national study of substance abuse treatment programs serving American Indian and Alaska native communities.

    Science.gov (United States)

    Novins, Douglas K; Moore, Laurie A; Beals, Janette; Aarons, Gregory A; Rieckmann, Traci; Kaufman, Carol E

    2012-09-01

    Because of their broad geographic distribution, diverse ownership and operation, and funding instability, it is a challenge to develop a framework for studying substance abuse treatment programs serving American Indian and Alaska Native communities at a national level. This is further complicated by the historic reluctance of American Indian and Alaska Native communities to participate in research. We developed a framework for studying these substance abuse treatment programs (n ≈ 293) at a national level as part of a study of attitudes toward, and use of, evidence-based treatments among substance abuse treatment programs serving AI/AN communities with the goal of assuring participation of a broad array of programs and the communities that they serve. Because of the complexities of identifying specific substance abuse treatment programs, the sampling framework divides these programs into strata based on the American Indian and Alaska Native communities that they serve: (1) the 20 largest tribes (by population); (2) urban AI/AN clinics; (3) Alaska Native Health Corporations; (4) other Tribes; and (5) other regional programs unaffiliated with a specific AI/AN community. In addition, the recruitment framework was designed to be sensitive to likely concerns about participating in research. This systematic approach for studying substance abuse and other clinical programs serving AI/AN communities assures the participation of diverse AI/AN programs and communities and may be useful in designing similar national studies.

  16. The PUMA test program and data analysis

    International Nuclear Information System (INIS)

    Han, J.T.; Morrison, D.L.

    1997-01-01

    The PUMA test program is sponsored by the U.S. Nuclear Regulatory Commission to provide data relevant to various Boiling Water Reactor phenomena. The authors briefly describe the PUMA test program and facility, present the objectives of the program, provide data analysis for a large-break loss-of-coolant accident test, and compare the data with a RELAP5/MOD 3.1.2 calculation

  17. A Practical Framework for Evaluating Health Services Management Educational Program: The Application of The Mixed-Method Sequential Explanatory Design

    Directory of Open Access Journals (Sweden)

    Bazrafshan Azam

    2015-07-01

    Full Text Available Introduction: Health services managers are responsible for improving the efficiency and quality of healthcare service delivery. In this regard, Health Services Management (HSM) programs have been widely established to provide health providers with skilled, professional managers to address those needs. It is therefore important to ascertain the quality of these programs. The purpose of this study was to synthesize and develop a framework to evaluate the quality of the Health Services Management (HSM) program at Kerman University of Medical Sciences. Methods: This study followed a mixed-method sequential explanatory approach in which data were collected through a CIPP survey and semi-structured interviews. In phase 1, participants included 10 faculty members, 64 students, and 90 alumni. In phase 2, in-depth semi-structured interviews with purposeful sampling were conducted with 27 participants to better understand their perceptions of the HSM program. All interviews were audio-taped and transcribed verbatim. NVivo N8 was used to analyze the qualitative data and extract the themes. Results: The data analysis revealed both positive and negative attitudes toward the HSM program. According to the CIPP survey, program objectives (74%), curriculum content (59.5%), and graduate skills (79%) were the major sources of dissatisfaction. However, most respondents (n=48) reported that the classes are well equipped, and that learning resources are well prepared (n=41). Most respondents (n=41) reported that the students are actively involved in classroom activities. The majority of respondents (n=43) pointed out that the instructors implemented appropriate teaching strategies. Qualitative analysis of the interviews revealed that regular community needs assessment, content revision, and directing attention to graduate skills and expertise are the key solutions for improving the program's quality. Conclusion: This study revealed to what extent the HSM program objectives are being

  18. Framework and criteria for program evaluation in the Office of Conservation and Renewable Energy

    Energy Technology Data Exchange (ETDEWEB)

    1981-04-30

    This study addresses the development of a framework and generic criteria for conducting program evaluation in the Office of Conservation and Renewable Energy. The evaluation process is intended to provide the Assistant Secretary with comprehensive and consistent evaluation data for management decisions regarding policy and strategy, crosscutting energy impacts and resource allocation and justification. The study defines evaluation objectives, identifies basic information requirements (criteria), and identifies a process for collecting evaluation results at the basic program level, integrating the results, and summarizing information upward through the CE organization to the Assistant Secretary. Methods are described by which initial criteria were tested, analyzed, and refined for CE program applicability. General guidelines pertaining to evaluation and the Sunset Review requirements are examined and various types, designs, and models for evaluation are identified. Existing CE evaluation reports are reviewed and comments on their adequacy for meeting current needs are provided. An inventory and status survey of CE program evaluation activities is presented, as are issues, findings, and recommendations pertaining to CE evaluation and Sunset Review requirements. Also, sources of data for use in evaluation and the Sunset Review response are identified. An inventory of CE evaluation-related documents and reports is provided.

  19. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  20. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be identified directly from data, using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with a distance, in conjunction with classical machine learning techniques, to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework on human activity classification and on time series forecasting/anomaly detection in a power grid application.
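    The "identified directly from data" step can be illustrated with dynamic mode decomposition (DMD), the simplest data-driven estimate of Koopman spectral properties: fit a best-fit linear operator to snapshot pairs and read off its eigenvalues. The 2-by-2 generator and trajectory below are an invented example, not from the paper:

    ```python
    import numpy as np

    # Invented linear generator whose eigenvalues (0.9 +/- 0.2j) we
    # will try to recover purely from trajectory data.
    A_true = np.array([[0.9, -0.2],
                       [0.2,  0.9]])
    x = np.zeros((2, 50))
    x[:, 0] = [1.0, 0.0]
    for k in range(49):
        x[:, k + 1] = A_true @ x[:, k]

    # Snapshot matrices: columns of Y are the columns of X advanced one step.
    X, Y = x[:, :-1], x[:, 1:]

    # Best-fit linear operator (least squares): A = Y X^+.
    A_dmd = Y @ np.linalg.pinv(X)

    # Its eigenvalues approximate Koopman eigenvalues of the system.
    eigvals = np.sort_complex(np.linalg.eigvals(A_dmd))
    ```

    Because the data here are generated by a linear system, DMD recovers the spectrum exactly; for nonlinear series one typically lifts the data through observables first.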

  1. Toward improving the reliability of hydrologic prediction: Model structure uncertainty and its quantification using ensemble-based genetic programming framework

    Science.gov (United States)

    Parasuraman, Kamban; Elshorbagy, Amin

    2008-12-01

    Uncertainty analysis is starting to be widely acknowledged as an integral part of hydrological modeling. The conventional treatment of uncertainty analysis in hydrologic modeling is to assume a deterministic model structure and treat its associated parameters as imperfectly known, thereby neglecting the uncertainty associated with the model structure. In this paper, a modeling framework that can explicitly account for the effect of model structure uncertainty is proposed. The framework is based on first generating different realizations of the original data set using a non-parametric bootstrap method, and then exploiting the ability of self-organizing algorithms, namely genetic programming, to evolve their own model structure for each of the resampled data sets. The resulting ensemble of models is then used to quantify the uncertainty associated with the model structure. The performance of the proposed modeling framework is analyzed with regard to its ability to characterize the evapotranspiration process at the Southwest Sand Storage facility, located near Ft. McMurray, Alberta. Eddy-covariance-measured actual evapotranspiration is modeled as a function of net radiation, air temperature, ground temperature, relative humidity, and wind speed. To investigate the relation between model complexity, prediction accuracy, and uncertainty, two sets of experiments were carried out by varying the level of mathematical operators that can be used to define the predictand-predictor relationship. While the first set uses just the additive operators, the second set uses both the additive and the multiplicative operators. The results suggest that increasing the model complexity may lead to better prediction accuracy, but at the expense of increased uncertainty. Compared to the model parameter uncertainty, the relative contribution of model structure uncertainty to the predictive uncertainty of a model is
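    The bootstrap-plus-flexible-structure idea can be sketched as follows. Polynomial fits whose degree is chosen per resample stand in for the model structures that genetic programming would evolve, and the data are synthetic; only the overall pattern (resample, let the structure vary, summarize the ensemble spread) mirrors the framework:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(0, 1, 60)
    y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(60)  # synthetic data

    # Non-parametric bootstrap: resample (x, y) pairs with replacement,
    # then let the model structure vary per resample (polynomial degree
    # chosen by training error here, a crude stand-in for GP evolution).
    preds = []
    for _ in range(200):
        idx = rng.integers(0, len(x), len(x))
        xb, yb = x[idx], y[idx]
        best = min(range(1, 8),
                   key=lambda d: np.mean(
                       (np.polyval(np.polyfit(xb, yb, d), x) - y) ** 2))
        preds.append(np.polyval(np.polyfit(xb, yb, best), x))

    preds = np.array(preds)
    mean_pred = preds.mean(axis=0)  # ensemble prediction
    band = preds.std(axis=0)        # spread: structure + sampling uncertainty
    ```

    The pointwise spread `band` is what turns the ensemble into an uncertainty estimate; with a single fixed structure it would capture only sampling variability.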

  2. Present status of structural analysis computer programs

    International Nuclear Information System (INIS)

    Ikushima, Takeshi; Sanokawa, Konomo; Takeda, Hiroshi.

    1981-01-01

    The computer programs for structural analysis by the finite element method have been used widely, and the authors have already carried out benchmark tests on such programs. As a result, they pointed out a number of problems concerning the use of finite element computer programs. In this paper, the details of their development, their analytical functions, and examples of calculation are described, centering on the versatile computer programs used in the previous study. As versatile computer programs for the finite element method, ANSYS developed by Swanson Analysis System Co., USA, ASKA developed by ISD, West Germany, MARC developed by MARC Analysis Research Institute, NASTRAN developed by NASA, USA, SAP-4 developed by the University of California, ADINA developed by MIT, NEPSAP developed by Lockheed Missile Space Co., BERSAFE developed by CEGB, Great Britain, EPACA developed by Franklin Research Institute, USA, and CREEP-PLAST developed by GE are briefly introduced. As examples of calculation, the thermal elastoplastic creep analysis of a cylinder by ANSYS, the elastoplastic analysis of a pressure vessel by ASKA, the analysis of a plate with double cracks by MARC, the analysis of the buckling of a shallow arch by MSC-NASTRAN, and the elastoplastic analysis of primary cooling pipes by ADINA are explained. (Kako, I.)

  3. A Framework for Analysis of Music Similarity Measures

    DEFF Research Database (Denmark)

    Jensen, Jesper Højvang; Christensen, Mads G.; Jensen, Søren Holdt

    2007-01-01

    To analyze specific properties of music similarity measures that the commonly used genre classification evaluation procedure does not reveal, we introduce a MIDI-based test framework for music similarity measures. We introduce the framework by example and thus outline an experiment to analyze the dependency of a music similarity measure on the instrumentation of a song compared to the melody, and to analyze its sensitivity to transpositions. Using the outlined experiment, we analyze music similarity measures from three software packages, namely Marsyas, the MA toolbox, and Intelligent Sound Processing...

  4. Attack Pattern Analysis Framework for a Multiagent Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Krzysztof Juszczyszyn

    2008-08-01

    Full Text Available The paper proposes the use of an attack pattern ontology and a formal framework for network traffic anomaly detection within a distributed multi-agent Intrusion Detection System architecture. Our framework assumes ontology-based attack definition and a distributed processing scheme with exchange of communications between agents. The role of traffic anomaly detection is presented, and it is then discussed how some specific values characterizing network communication can be used to detect network anomalies caused by security incidents (worm attacks, virus spreading). Finally, it is defined how to use the proposed techniques in a distributed IDS using the attack pattern ontology.

  5. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine

    Directory of Open Access Journals (Sweden)

    Haerin Lee

    2017-02-01

    Full Text Available In this paper, we propose a three-dimensional design and evaluation framework and process, based on a probabilistic motion synthesis algorithm and a biomechanical analysis system, for the design of the Smith machine and of squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and a biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between the human body and the machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program, based on a Gaussian process regression algorithm, with a set of given values for the independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and the squat motion. We validated the model and algorithm through physical experiments measuring electromyography (EMG) signals, ground forces, and squat motions, as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.
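    The probabilistic motion-synthesis step rests on Gaussian process regression. A minimal NumPy sketch of the GP posterior mean is shown below; the one-dimensional data, kernel settings, and variable roles (e.g. predicting a motion variable from a control variable) are invented stand-ins, not taken from the study:

    ```python
    import numpy as np

    def rbf(a, b, length=0.3):
        """Squared-exponential kernel matrix between 1-D point sets a and b."""
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

    # Toy training data: one motion variable as a smooth function of one
    # control variable (both hypothetical).
    x_train = np.linspace(0, 1, 8)
    y_train = np.sin(3 * x_train)
    x_test = np.array([0.25, 0.5, 0.75])

    # Kernel matrices; the small jitter keeps the solve numerically stable.
    K = rbf(x_train, x_train) + 1e-6 * np.eye(len(x_train))
    k_star = rbf(x_test, x_train)

    # GP regression posterior mean: k(x*, X) K^{-1} y.
    y_pred = k_star @ np.linalg.solve(K, y_train)
    ```

    With noiseless training data the posterior mean interpolates the samples, which is why the toy predictions land close to the underlying function.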

  6. Mississippi Curriculum Framework for Diesel Equipment Technology (CIP: 47.0605--Diesel Engine Mechanic & Repairer). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for the course sequences in the diesel equipment technology programs cluster. Presented in the introductory section are a description of the program and suggested course sequence. Section I lists baseline competencies,…

  7. Polyglot programming in applications used for genetic data analysis.

    Science.gov (United States)

    Nowak, Robert M

    2014-01-01

    Applications used for the analysis of genetic data process large volumes of data with complex algorithms. These solutions require high performance, flexibility, and a user interface accessible through a web browser, which can be achieved by using multiple programming languages. In this study, I developed a freely available framework for building software to analyze genetic data, which uses C++, Python, JavaScript, and several libraries. This system was used to build a number of genetic data processing applications, and it reduced the time and costs of development.

  8. Analysis of Idiom Variation in the Framework of Linguistic Subjectivity

    Science.gov (United States)

    Liu, Zhengyuan

    2012-01-01

    Idiom variation is a ubiquitous linguistic phenomenon which has raised a lot of research questions. Past approaches were either formal or functional, and neither paid much attention to the cognitive factors of language users. By putting idiom variation in the framework of linguistic subjectivity, we have offered a new perspective in the…

  9. A Cognitive Framework for the Analysis of Online Chemistry Courses

    Science.gov (United States)

    Evans, Karen L.; Leinhardt, Gaea

    2008-01-01

    Many students now are receiving instruction in online environments created by universities, museums, corporations, and even students. What features of a given online course contribute to its effectiveness? This paper addresses that query by proposing and applying an analytic framework to five online introductory chemistry courses. Introductory…

  10. Micro-costing in public health economics: steps towards a standardized framework, using the incredible years toddler parenting program as a worked example.

    Science.gov (United States)

    Charles, J M; Edwards, R T; Bywater, T; Hutchings, J

    2013-08-01

    Complex interventions, such as parenting programs, are rarely evaluated from a public sector, multi-agency perspective. An exception is the Incredible Years (IY) Basic Parenting Program; which has a growing clinical and cost-effectiveness evidence base for preventing or reducing children's conduct problems. The aim of this paper was to provide a micro-costing framework for use by future researchers, by micro-costing the 12-session IY Toddler Parenting Program from a public sector, multi-agency perspective. This micro-costing was undertaken as part of a community-based randomized controlled trial of the program in disadvantaged Flying Start areas in Wales, U.K. Program delivery costs were collected by group leader cost diaries. Training and supervision costs were recorded. Sensitivity analysis assessed the effects of a London cost weighting and group size. Costs were reported in 2008/2009 pounds sterling. Direct program initial set-up costs were £3305.73; recurrent delivery costs for the program based on eight parents attending a group were £752.63 per child, falling to £633.61 based on 10 parents. Under research contexts (with weekly supervision) delivery costs were £1509.28 per child based on eight parents, falling to £1238.94 per child based on 10 parents. When applying a London weighting, overall program costs increased in all contexts. Costs at a micro-level must be accurately calculated to conduct meaningful cost-effectiveness/cost-benefit analysis. A standardized framework for assessing costs is needed; this paper outlines a suggested framework. In prevention science it is important for decision makers to be aware of intervention costs in order to allocate scarce resources effectively.

  11. Tracking and Analysis Framework (TAF) model documentation and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Bloyd, C.; Camp, J.; Conzelmann, G. [and others

    1996-12-01

    With passage of the 1990 Clean Air Act Amendments, the United States embarked on a policy for controlling acid deposition that has been estimated to cost at least $2 billion. Title IV of the Act created a major innovation in environmental regulation by introducing market-based incentives - specifically, by allowing electric utility companies to trade allowances to emit sulfur dioxide (SO2). The National Acid Precipitation Assessment Program (NAPAP) has been tasked by Congress to assess what Senator Moynihan has termed this "grand experiment." Such a comprehensive assessment of the economic and environmental effects of this legislation has been a major challenge. To help NAPAP face this challenge, the U.S. Department of Energy (DOE) has sponsored development of an integrated assessment model, known as the Tracking and Analysis Framework (TAF). This section summarizes TAF's objectives and its overall design.

  12. Energy Analysis Program 1990 annual report

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    The Energy Analysis Program has played an active role in the analysis and discussion of energy and environmental issues at several levels: (1) at the international level, with programs such as developing scenarios for long-term energy demand in developing countries and organizing and leading an analytic effort, "Energy Efficiency, Developing Countries, and Eastern Europe," part of a major effort to increase support for energy efficiency programs worldwide; (2) at the national level, the Program has been responsible for assessing energy forecasts and policies affecting energy use (e.g., appliance standards, National Energy Strategy scenarios); and (3) at the state and utility levels, the Program has been a leader in promoting integrated utility resource planning; the collaborative process has led to agreement on a new generation of utility demand-side programs in California, providing an opportunity to use the knowledge and analytic techniques of the Program's researchers. We continue to place the highest priority on analyzing energy efficiency, with particular attention given to energy use in buildings. The Program continues its active analysis of international energy issues in Asia (including China), the Soviet Union, South America, and Western Europe. Analyzing the costs and benefits of different levels of standards for residential appliances continues to be the largest single area of research within the Program. The group has developed and applied techniques for forecasting energy demand (or constructing scenarios) for the United States. We have built a new model of industrial energy demand, are in the process of making major changes in our tools for forecasting residential energy demand, have built an extensive and documented energy conservation supply curve of residential energy use, and are beginning an analysis of energy-demand forecasting for commercial buildings.
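
The "energy conservation supply curve" this abstract mentions has a simple computational core: rank conservation measures by their cost of conserved energy (CCE) and accumulate the savings. A minimal sketch, with invented measures and an assumed 7% discount rate (none of these numbers come from the report):

```python
# Build a toy conservation supply curve: each measure's investment is
# annualized with a capital recovery factor and divided by annual energy
# saved to get its cost of conserved energy (CCE); measures are then
# ranked by CCE. All measures and figures are invented for illustration.

def cce(investment, annual_kwh_saved, lifetime_yr, discount=0.07):
    """Cost of conserved energy: annualized investment per kWh saved."""
    crf = discount / (1 - (1 + discount) ** -lifetime_yr)  # capital recovery factor
    return investment * crf / annual_kwh_saved

measures = [  # (name, investment $, kWh saved per year, lifetime in years)
    ("attic insulation", 900, 1200, 20),
    ("LED lighting", 120, 400, 10),
    ("heat pump upgrade", 4000, 3000, 15),
]
curve = sorted(measures, key=lambda m: cce(m[1], m[2], m[3]))
cumulative = 0
for name, inv, saved, life in curve:
    cumulative += saved
    print(f"{name}: {cce(inv, saved, life):.3f} $/kWh, cumulative {cumulative} kWh/yr")
```

Reading the curve left to right gives the cheapest savings first, which is exactly how such curves are used to compare conservation against the price of supplied energy.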

  13. A model-based framework for the analysis of team communication in nuclear power plants

    International Nuclear Information System (INIS)

    Chung, Yun Hyung; Yoon, Wan Chul; Min, Daihwan

    2009-01-01

    Advanced human-machine interfaces are rapidly changing the interaction between humans and systems, with the level of abstraction of the presented information, the human task characteristics, and the modes of communication all affected. To accommodate the changes in the human/system co-working environment, an extended communication analysis framework is needed that can describe and relate the tasks, verbal exchanges, and information interface. This paper proposes an extended analytic framework, referred to as the H-H-S (human-human-system) communication analysis framework, which can model the changes in team communication that are emerging in these new working environments. The stage-specific decision-making model and analysis tool of the proposed framework make the analysis of team communication easier by providing visual clues. The usefulness of the proposed framework is demonstrated with an in-depth comparison of the characteristics of communication in the conventional and advanced main control rooms of nuclear power plants.

  14. Constructing functional programs for grammar analysis problems

    NARCIS (Netherlands)

    Jeuring, J.T.; Swierstra, S.D.

    1995-01-01

    This paper discusses the derivation of functional programs for grammar analysis problems, such as the Empty problem and the Reachable problem. Grammar analysis problems can be divided into two classes: top-down problems such as Follow and Reachable, which are described in terms of the contexts of

  15. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    Science.gov (United States)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Database Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, the general one and the aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
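
The AHP step used in this approach reduces a reciprocal pairwise-comparison matrix over the criteria to a priority vector. A minimal sketch with a made-up 3x3 matrix (the paper's actual criteria hierarchy is larger and different):

```python
# Approximate the AHP priority vector (the principal eigenvector of the
# pairwise-comparison matrix) by power iteration. The 3x3 reciprocal
# matrix below is invented for illustration; the rows/columns stand for
# hypothetical framework criteria rated on Saaty's 1-9 scale.

def ahp_priorities(matrix, iters=100):
    """Return the normalized principal eigenvector of a reciprocal matrix."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

pairwise = [              # criterion A vs. A, B, C
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_priorities(pairwise)
print([round(x, 3) for x in weights])
```

The resulting weights can then feed the QFD step, where each candidate framework's scores against the criteria are multiplied by these priorities and summed.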

  16. Formulation, construction and analysis of kinetic models of metabolism: A review of modelling frameworks

    DEFF Research Database (Denmark)

    Saa, Pedro A.; Nielsen, Lars K.

    2017-01-01

    Kinetic models are critical to predict the dynamic behaviour of metabolic networks. Mechanistic kinetic models for large networks remain uncommon due to the difficulty of fitting their parameters. Recent modelling frameworks promise new ways to overcome this obstacle while retaining predictive capabilities. In this review, we present an overview of the relevant mathematical frameworks for kinetic formulation, construction and analysis. Starting with kinetic formalisms, we next review statistical methods for parameter inference, as well as recent computational frameworks applied to the construction…

  17. Implementation and Evaluation of Technology Mentoring Program Developed for Teacher Educators: A 6M-Framework

    Directory of Open Access Journals (Sweden)

    Selim Gunuc

    2015-06-01

    The purpose of this basic research is to determine the problems experienced in the Technology Mentoring Program (TMP), and the study discusses how these problems affect the process in general. The implementation was carried out with teacher educators in the education faculty. 8 doctorate students (mentors) provided technology mentoring for one academic term to 9 teacher educators (mentees) employed in the Education Faculty. The data were collected via the mentee and mentor interview forms, mentor reflections and organization meeting reflections. As a result, problems based on the mentor, on the mentee and on the organization/institution were determined. In order to carry out TMP more effectively and successfully, a 6M-framework (Modifying, Meeting, Matching, Managing, Mentoring, Monitoring) was suggested within the scope of this study. It could be stated that fewer problems will be encountered and that the process will be carried out more effectively and successfully when the structure in this framework is taken into consideration.

  18. A methodological approach and framework for sustainability assessment in NGO-implemented primary health care programs.

    Science.gov (United States)

    Sarriot, Eric G; Winch, Peter J; Ryan, Leo J; Bowie, Janice; Kouletio, Michelle; Swedberg, Eric; LeBan, Karen; Edison, Jay; Welch, Rikki; Pacqué, Michel C

    2004-01-01

    An estimated 10.8 million children under 5 continue to die each year in developing countries from causes easily treatable or preventable. Non-governmental organizations (NGOs) are frontline implementers of low-cost and effective child health interventions, but their progress toward sustainable child health gains is a challenge to evaluate. This paper presents the Child Survival Sustainability Assessment (CSSA) methodology--a framework and process--to map progress towards sustainable child health from the community level and upward. The CSSA was developed with NGOs through a participatory process of research and dialogue. Commitment to sustainability requires a systematic and systemic consideration of human, social and organizational processes beyond a purely biomedical perspective. The CSSA is organized around three interrelated dimensions of evaluation: (1) health and health services; (2) capacity and viability of local organizations; (3) capacity of the community in its social ecological context. The CSSA uses a participatory, action-planning process, engaging a 'local system' of stakeholders in the contextual definition of objectives and indicators. Improved conditions measured in the three dimensions correspond to progress toward a sustainable health situation for the population. This framework opens new opportunities for evaluation and research design and places sustainability at the center of primary health care programming.

  19. Tatool: a Java-based open-source programming framework for psychological studies.

    Science.gov (United States)

    von Bastian, Claudia C; Locher, André; Ruflin, Michael

    2013-03-01

    Tatool (Training and Testing Tool) was developed to assist researchers with programming training software, experiments, and questionnaires. Tatool is Java-based, and thus is a platform-independent and object-oriented framework. The architecture was designed to meet the requirements of experimental designs and provides a large number of predefined functions that are useful in psychological studies. Tatool comprises features crucial for training studies (e.g., configurable training schedules, adaptive training algorithms, and individual training statistics) and allows for running studies online via Java Web Start. The accompanying "Tatool Online" platform provides the possibility to manage studies and participants' data easily with a Web-based interface. Tatool is published open source under the GNU Lesser General Public License, and is available at www.tatool.ch.

  20. Assessing environmental assets for health promotion program planning: a practical framework for health promotion practitioners.

    Science.gov (United States)

    Springer, Andrew E; Evans, Alexandra E

    2016-01-01

    Conducting a health needs assessment is an important, if not essential, first step in health promotion planning. This paper explores how health needs assessments may be further strengthened for health promotion planning via an assessment of environmental assets rooted in the multiple environments (policy, information, social and physical environments) that shape health and behavior. Guided by a behavioral-ecological perspective, one that seeks to identify environmental assets that can influence health behavior, and an implementation science perspective, one that seeks to interweave health promotion strategies into existing environmental assets, we present a basic framework for assessing environmental assets and review examples from the literature to illustrate the incorporation of environmental assets into health program design. Health promotion practitioners and researchers implicitly identify and apply environmental assets in the design and implementation of health promotion interventions; this paper provides a foundation for greater intentionality in assessing environmental assets for health promotion planning.

  1. Energy Analysis Program 1990 annual report

    International Nuclear Information System (INIS)

    1992-01-01

    The Energy Analysis Program has played an active role in the analysis and discussion of energy and environmental issues at several levels: (1) at the international level, with programs such as developing scenarios for long-term energy demand in developing countries and organizing and leading an analytic effort, ''Energy Efficiency, Developing Countries, and Eastern Europe,'' part of a major effort to increase support for energy efficiency programs worldwide; (2) at the national level, the Program has been responsible for assessing energy forecasts and policies affecting energy use (e.g., appliance standards, National Energy Strategy scenarios); and (3) at the state and utility levels, the Program has been a leader in promoting integrated utility resource planning; the collaborative process has led to agreement on a new generation of utility demand-side programs in California, providing an opportunity to use the knowledge and analytic techniques of the Program's researchers. We continue to place the highest priority on analyzing energy efficiency, with particular attention given to energy use in buildings. The Program continues its active analysis of international energy issues in Asia (including China), the Soviet Union, South America, and Western Europe. Analyzing the costs and benefits of different levels of standards for residential appliances continues to be the largest single area of research within the Program. The group has developed and applied techniques for forecasting energy demand (or constructing scenarios) for the United States. We have built a new model of industrial energy demand, are in the process of making major changes in our tools for forecasting residential energy demand, have built an extensive and documented energy conservation supply curve of residential energy use, and are beginning an analysis of energy-demand forecasting for commercial buildings.

  3. Parametric design and analysis framework with integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2014-01-01

    In the wake of uncompromising requirements on building performance and the current emphasis on sustainability, including building energy and indoor environment, designing buildings involves elements of expertise of multiple disciplines. However, building performance analyses, including those of building energy and indoor environment, are generally confined to late in the design process. Consequence based design is a framework intended for the early design stage. It involves interdisciplinary expertise that secures validity and quality assurance with a simulationist while sustaining autonomous control with the building designer. Consequence based design is defined by the specific use of integrated dynamic modeling, which includes the parametric capabilities of a scripting tool and building simulation features of a building performance simulation tool. The framework can lead to enhanced…

  4. ROOT — A C++ framework for petabyte data storage, statistical analysis and visualization

    Science.gov (United States)

    Antcheva, I.; Ballintijn, M.; Bellenot, B.; Biskup, M.; Brun, R.; Buncic, N.; Canal, Ph.; Casadei, D.; Couet, O.; Fine, V.; Franco, L.; Ganis, G.; Gheata, A.; Maline, D. Gonzalez; Goto, M.; Iwaszkiewicz, J.; Kreshuk, A.; Segura, D. Marcos; Maunder, R.; Moneta, L.; Naumann, A.; Offermann, E.; Onuchin, V.; Panacek, S.; Rademakers, F.; Russo, P.; Tadel, M.

    2009-12-01

    ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web, or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting, while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. Central to these analysis tools are the histogram classes, which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like Postscript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks — e.g. data mining in HEP — by using PROOF, which will take care of optimally…

  5. Property Regimes in Resource Conservation-A Framework for Analysis

    OpenAIRE

    Hasan, Lubna

    2000-01-01

    This paper develops a conceptual framework for analysing property regimes in the conservation of natural resources. Human beings' interaction with their environment is governed through institutions of property; these institutions therefore play an important role in the conservation of natural resources. The paper uses concepts from the New Institutional Economics school of thought and from theories of property to develop normative criteria for assessing property institutions in resource management.

  6. WWW-based remote analysis framework for UniSampo and Shaman analysis software

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Ala-Heikkilae, J.J.; Routti, J.T.; Nikkinen, M.T.

    2005-01-01

    UniSampo and Shaman are well-established analytical tools for gamma-ray spectrum analysis and subsequent radionuclide identification. These tools are normally run locally on a Unix or Linux workstation in interactive mode. However, it is also possible to run them in batch/non-interactive mode by starting them with the correct parameters. This is how they are used in standard analysis pipeline operation, and this functionality also makes it possible to use them for remote operation over the network. A framework for running UniSampo and Shaman analyses using the standard WWW protocol has been developed. A WWW server receives requests from the client WWW browser and runs the analysis software via a set of CGI scripts. Authentication, input data transfer, and output and display of the final analysis results are all carried out using standard WWW mechanisms. This WWW framework can be utilized, for example, by organizations that have radioactivity surveillance stations over a wide area. A computer with a standard internet/intranet connection suffices for on-site analyses. (author)

  7. Globalization and health: a framework for analysis and action.

    Science.gov (United States)

    Woodward, D.; Drager, N.; Beaglehole, R.; Lipson, D.

    2001-01-01

    Globalization is a key challenge to public health, especially in developing countries, but the linkages between globalization and health are complex. Although a growing amount of literature has appeared on the subject, it is piecemeal, and suffers from a lack of an agreed framework for assessing the direct and indirect health effects of different aspects of globalization. This paper presents a conceptual framework for the linkages between economic globalization and health, with the intention that it will serve as a basis for synthesizing existing relevant literature, identifying gaps in knowledge, and ultimately developing national and international policies more favourable to health. The framework encompasses both the indirect effects on health, operating through the national economy, household economies and health-related sectors such as water, sanitation and education, as well as more direct effects on population-level and individual risk factors for health and on the health care system. Proposed also is a set of broad objectives for a programme of action to optimize the health effects of economic globalization. The paper concludes by identifying priorities for research corresponding with the five linkages identified as critical to the effects of globalization on health. PMID:11584737

  8. Testing a model of L2 communication among Iranian EFL learners: A Path Analysis Framework

    Directory of Open Access Journals (Sweden)

    Nasser Fallah

    2015-02-01

    Using the willingness to communicate (WTC) and socio-educational models as a framework, the present study examined WTC in English and its underlying variables in a sample of 372 Iranian non-English-major EFL learners. The data were collected through self-reported questionnaires. A path analysis framework using the Amos program with maximum likelihood estimation was utilized to examine the hypothesized model and the potential relationships between the variables. The final model showed a very good fit to the data. The results of structural equation modeling revealed that self-perceived communication competence (SPCC), international posture and motivation were significant predictors of L2 WTC. The findings also showed that L2 communication anxiety (CA), motivation, the personality trait of agreeableness and teacher immediacy could exert indirect effects on L2 WTC. Furthermore, the teacher immediacy and agreeableness variables each predicted both international posture and CA among the EFL learners. Following these findings, potential factors affecting learners' WTC should receive sufficient attention from teachers, administrators and learners alike. By adopting more immediacy behaviors, EFL teachers can also establish a relaxing and supportive classroom climate and lower the learners' affective filter. In such an atmosphere learners are more emotionally secure, suffer less communication apprehension, perceive themselves to be more proficient and motivated, develop a stronger international posture by forming realistic attitudes toward different cultures, and consequently become more willing to communicate in English.

  9. The Tracking and Analysis Framework (TAF): A tool for the integrated assessment of acid deposition

    International Nuclear Information System (INIS)

    Bloyd, C.N.; Henrion, M.; Marnicio, R.J.

    1995-01-01

    A major challenge that has faced policy makers concerned with acid deposition is obtaining an integrated view of the underlying science related to acid deposition. In response to this challenge, the US Department of Energy is sponsoring the development of an integrated Tracking and Analysis Framework (TAF) which links together the key acid deposition components of emissions, air transport, atmospheric deposition, and aquatic effects in a single modeling structure. The goal of TAF is to integrate credible models of the scientific and technical issues into an assessment framework that can directly address key policy issues, and in doing so act as a bridge between science and policy. Key objectives of TAF are to support coordination and communication among scientific researchers; to support communications with policy makers and provide rapid response for analyzing newly emerging policy issues; and to provide guidance for prioritizing research programs. This paper briefly describes how TAF was formulated to meet those objectives and the underlying principles that form the basis for its development.

  10. APRECOT - analysis program for reactivity coefficient tests

    International Nuclear Information System (INIS)

    Telford, A.R.R.

    1979-05-01

    A computer program has been written which provides a rapid and convenient analysis route for fuel temperature coefficient of reactivity measurements, as carried out at Hinkley Point 'B' Power Station. This replaces the earlier, more tedious, iterative analysis using KINAGRAX. The program has been tested by analysing computer simulations of reactor tests. This has shown that APRECOT introduces errors which are small (approximately 1.5%) in comparison with other sources of error (approximately 10%), that the effect of axial flux shape changes is acceptably small, and that effects due to xenon, which is not modelled in the current version of the program, can be dealt with adequately. This note describes the APRECOT method, including details of input and output to the program, and gives results of the numerical tests made of the method. (author)

  11. Probabilistic Resource Analysis by Program Transformation

    DEFF Research Database (Denmark)

    Kirkeby, Maja Hanne; Rosendahl, Mads

    2016-01-01

    The aim of a probabilistic resource analysis is to derive a probability distribution of possible resource usage for a program from a probability distribution of its input. We present an automated multi-phase rewriting based method to analyze programs written in a subset of C. It generates a probability distribution of the resource usage as a possibly uncomputable expression and then transforms it into a closed form expression using over-approximations. We present the technique, outline the implementation, and show results from experiments with the system.
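
The core idea, pushing an input distribution forward through a resource-cost function, can be illustrated by enumeration for a toy program (our own sketch; the paper's method works by rewriting C programs into closed-form expressions, not by enumeration):

```python
# Derive the distribution of resource usage (loop iterations) of a toy
# program from an assumed distribution over its inputs, by enumeration.
from collections import defaultdict

def cost(n):
    """Toy resource measure: iterations of a loop that halves n until n <= 1."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

input_dist = {1: 0.25, 2: 0.25, 4: 0.25, 8: 0.25}  # assumed input distribution
usage_dist = defaultdict(float)
for value, p in input_dist.items():
    usage_dist[cost(value)] += p
print(dict(usage_dist))  # {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}
```

Enumeration only works for small finite input domains; the point of the paper's rewriting approach is to obtain such usage distributions symbolically, without running the program on every input.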

  12. A causal analysis framework for land-use change and the potential role of bioenergy policy

    NARCIS (Netherlands)

    Efroymson, Rebecca A.; Kline, Keith L.; Angelsen, Arild; Verburg, Peter H.; Dale, Virginia H.; Langeveld, Johannes W.A.; McBride, Allen

    2016-01-01

    We propose a causal analysis framework to increase understanding of land-use change (LUC) and the reliability of LUC models. This health-sciences-inspired framework can be applied to determine probable causes of LUC in the context of bioenergy. Calculations of net greenhouse gas (GHG) emissions for

  13. Protocol Analysis of Group Problem Solving in Mathematics: A Cognitive-Metacognitive Framework for Assessment.

    Science.gov (United States)

    Artzt, Alice F.; Armour-Thomas, Eleanor

    The roles of cognition and metacognition were examined in the mathematical problem-solving behaviors of students as they worked in small groups. As an outcome, a framework that links the literature of cognitive science and mathematical problem solving was developed for protocol analysis of mathematical problem solving. Within this framework, each…

  14. Automated Program Analysis for Cybersecurity (APAC)

    Science.gov (United States)

    2016-07-14

    Automated Program Analysis for Cybersecurity (APAC). Five Directions, Inc., July 2016. Final technical report. Contract number: FA8750-14-C-0050; program element number: 61101E; author: William Arbaugh; project: APAC. Performing organization: Five Directions, Inc.

  15. Framework for Financial Ratio Analysis of Audited Federal Financial Reports

    National Research Council Canada - National Science Library

    Brady, Richard

    1999-01-01

    The disclosure of this type of information, it was believed, would enable decision-makers to understand the financial implications of budgetary, policy and program issues and provide an analytical…

  16. Debugging Nondeterministic Failures in Linux Programs through Replay Analysis

    Directory of Open Access Journals (Sweden)

    Shakaiba Majeed

    2018-01-01

    Reproducing a failure is the first and most important step in debugging because it enables us to understand the failure and track down its source. However, many programs are susceptible to nondeterministic failures that are hard to reproduce, which makes debugging extremely difficult. We first address the reproducibility problem by proposing an OS-level replay system for a uniprocessor environment that can capture and replay the nondeterministic events needed to reproduce a failure in Linux interactive and event-based programs. We then present an analysis method, called replay analysis, based on the proposed record and replay system to diagnose concurrency bugs in such programs. The replay analysis method uses a combination of static analysis, dynamic tracing during replay, and delta debugging to identify failure-inducing memory access patterns that lead to concurrency failure. The experimental results show that the presented record and replay system has low recording overhead and hence can be safely used in production systems to catch rarely occurring bugs. We also present a few concurrency bug case studies from real-world applications to demonstrate the effectiveness of the proposed bug diagnosis framework.
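
The delta-debugging component this abstract refers to can be sketched as a minimal ddmin over a list of events. This simplified version (the function name and the toy failure predicate are ours; the paper applies the idea to recorded memory access patterns, not plain lists) shrinks a failing trace to a small subsequence that still fails:

```python
# Simplified ddmin: repeatedly try removing chunks of the failing input;
# keep any smaller input on which the failure predicate still holds.

def ddmin(items, fails):
    """Return a smaller subsequence of `items` on which `fails` is still true."""
    n = 2  # current granularity: number of chunks
    while len(items) >= 2:
        chunk = max(1, len(items) // n)
        subsets = [items[i:i + chunk] for i in range(0, len(items), chunk)]
        reduced = False
        for i in range(len(subsets)):
            # Try the complement of subset i (i.e., drop that chunk).
            complement = [x for j, s in enumerate(subsets) if j != i for x in s]
            if fails(complement):
                items, n, reduced = complement, max(n - 1, 2), True
                break
        if not reduced:
            if n >= len(items):
                break          # already at single-element granularity
            n = min(n * 2, len(items))  # refine granularity and retry
    return items

# Toy failure: the program crashes whenever events 3 and 7 both occur.
failing_trace = list(range(10))
minimal = ddmin(failing_trace, lambda t: 3 in t and 7 in t)
print(minimal)  # [3, 7]
```

In a replay setting the predicate would re-run the recorded execution with the reduced event schedule and report whether the failure reproduces, which is exactly what makes deterministic replay a prerequisite for this kind of automated reduction.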

  17. Implementation of a Systematic Accountability Framework in 2014 to Improve the Performance of the Nigerian Polio Program.

    Science.gov (United States)

    Tegegne, Sisay G; MKanda, Pascal; Yehualashet, Yared G; Erbeto, Tesfaye B; Touray, Kebba; Nsubuga, Peter; Banda, Richard; Vaz, Rui G

    2016-05-01

    An accountability framework is a central feature of managing human and financial resources. One of its primary goals is to improve program performance through close monitoring of selected priority activities. The principal objective of this study was to determine the contribution of a systematic accountability framework to improving the performance of the World Health Organization (WHO)-Nigeria polio program staff, as well as the program itself. The effect of implementation of the accountability framework was evaluated using data on administrative actions and select process indicators associated with acute flaccid paralysis (AFP) surveillance, routine immunization, and polio supplemental immunization activities. Data were collected in 2014 during supportive supervision, using the Magpi mobile data collection software. A total of 2500 staff were studied. Data on administrative actions and process indicators from quarters 2-4 in 2014 were compared. With respect to administrative actions, 1631 personnel (74%) received positive feedback (written or verbal commendation) in quarter 4 through the accountability framework, compared with 1569 (73%) and 1152 (61%) during quarters 3 and 2, respectively. These findings accorded with data on process indicators associated with AFP surveillance and routine immunization, showing statistically significant improvements in staff performance at the end of quarter 4, compared with other quarters. Improvements in staff performance and process indicators were observed for the WHO-Nigeria polio program after implementation of a systematic accountability framework. © 2016 World Health Organization; licensee Oxford Journals.

  18. An Inside Look at a U.S. Department of Energy Impact Evaluation Framework for Deployment Programs

    Energy Technology Data Exchange (ETDEWEB)

    Vine, Edward; Jordan, Gretchen; Reed, John H.; Dowd, Jeff

    2006-04-01

    The U.S. Department of Energy's (DOE) Office of Energy Efficiency and Renewable Energy (EERE) is developing a theory-based approach to impact evaluation that could be used by its deployment programs for evaluating energy savings and market effects with credible attribution of impacts (DOE forthcoming). The purpose of this paper is to describe the framework and its research design. The framework also provides information for program improvement in a consistent and structured manner. It joins Everett Rogers' diffusion of innovation theory with logic models to examine linkages between program activities, target audiences, behavioral and institutional changes, and energy savings or adoption of cleaner energy sources. Using the framework's templates, a program can describe its outcome goals and program logic, as well as identify key outcome questions and indicators (metrics). Evaluators could use the framework to understand where to look within the program logic for measured outcomes such as sales or adopted technologies and practices. Finally, by using the framework a causal link between the program and outcomes can be tested and alternative explanations investigated.

  19. A framework program for the teaching of alternative methods (replacement, reduction, refinement) to animal experimentation.

    Science.gov (United States)

    Daneshian, Mardas; Akbarsha, Mohammad A; Blaauboer, Bas; Caloni, Francesca; Cosson, Pierre; Curren, Rodger; Goldberg, Alan; Gruber, Franz; Ohl, Frauke; Pfaller, Walter; van der Valk, Jan; Vinardell, Pilar; Zurlo, Joanne; Hartung, Thomas; Leist, Marcel

    2011-01-01

    Development of improved communication and education strategies is important to make alternatives to the use of animals, and the broad range of applications of the 3Rs concept, better known and understood by different audiences. For this purpose, the Center for Alternatives to Animal Testing in Europe (CAAT-Europe) together with the Transatlantic Think Tank for Toxicology (t(4)) hosted a three-day workshop on "Teaching Alternative Methods to Animal Experimentation". A compilation of the recommendations by a group of international specialists in the field is summarized in this report. Initially, the workshop participants identified the different audience groups to be addressed and also the communication media that may be used. The main outcome of the workshop was a framework for a comprehensive educational program. The modular structure of the teaching program presented here allows adaptation to different audiences with their specific needs; different time schedules can be easily accommodated on this basis. The topics cover the 3Rs principle, basic research, toxicological applications, method development and validation, regulatory aspects, case studies, and ethical aspects of 3Rs approaches. This expert consortium agreed to generate teaching materials covering all modules and to provide them in an open-access online repository.

  20. Design and Analysis of a Service Migration Framework

    DEFF Research Database (Denmark)

    Saeed, Aamir; Olsen, Rasmus Løvenstein; Pedersen, Jens Myrup

    2013-01-01

    Users often use several heterogeneous devices such as mobile phones, PDAs, tablets, handheld devices, PCs, and laptops to carry out their tasks. These devices foster a need for task migration from one device to another at runtime, making it easier for the user to continue a task...... on another device. For such a need, an architecture is proposed to design and develop applications that migrate from one device to another and resume operation. A simple application was constructed based on the proposed framework. Experiments were carried out to demonstrate its applicability...

  1. Barriers to renewable energy penetration. A framework for analysis

    DEFF Research Database (Denmark)

    Painuly, Jyoti P.

    2001-01-01

    Renewable energy has the potential to play an important role in providing energy with sustainability to the vast populations in developing countries who as yet have no access to clean energy. Although economically viable for several applications, renewable energy has not been able to realise its...... potential due to several barriers to its penetration. A framework has been developed in this paper to identify the barriers to renewable energy penetration and to suggest measures to overcome them. (C) 2001 Elsevier Science Ltd. All rights reserved....

  2. A computer program for spectrochemical analysis

    International Nuclear Information System (INIS)

    Sastry, M.D.; Page, A.G.; Joshi, B.D.

    1976-01-01

    A simple and versatile computer program has been developed in FORTRAN IV for the routine analysis of metallic impurities by the emission spectrographic method. From the optical densities, transformed transmittances have been obtained using the Kaiser transformation, such that they are linearly related to exposure. The background correction for the spectral lines has been carried out using the Gauss differential logarithm method. In addition to the final analysis results in terms of the concentration of each element in ppm, the advantages of the program include the printout of concentration and intensity ratios and a graphical presentation of working curves, log (concentration) vs log (intensity ratio). (author)
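    The working curve described above amounts to fitting a straight line to log(concentration) versus log(intensity ratio). A minimal sketch of that calibration step follows; the standards and intensity ratios are hypothetical values, not data from the paper.

```python
import numpy as np

# Hypothetical calibration standards: concentration in ppm and the
# measured line-to-internal-standard intensity ratio for each.
conc_ppm = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
intensity_ratio = np.array([0.11, 0.52, 1.05, 4.8, 10.2])

# Working curve: fit log C = m * log R + b in log-log space.
m, b = np.polyfit(np.log10(intensity_ratio), np.log10(conc_ppm), 1)

def concentration(ratio):
    """Read an unknown's concentration (ppm) off the working curve."""
    return 10 ** (m * np.log10(ratio) + b)

print(concentration(2.0))  # ppm estimate for an intensity ratio of 2.0
```

    With roughly proportional synthetic data like this, the fitted slope is close to 1 and the curve reads back concentrations near ratio/0.1.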

  3. A conceptual framework for formulating a focused and cost-effective fire protection program based on analyses of risk and the dynamics of fire effects

    International Nuclear Information System (INIS)

    Dey, M.K.

    1999-01-01

    This paper proposes a conceptual framework for developing a fire protection program at nuclear power plants based on probabilistic risk analysis (PRA) of fire hazards and modeling of the dynamics of fire effects. The process for categorizing nuclear power plant fire areas based on risk is described, followed by a discussion of fire safety design methods that can be used for different areas of the plant, depending on the degree of threat to plant safety from the fire hazard. This alternative framework has the potential to make programs more cost-effective and comprehensive, since it will allow a more systematic and broader examination of fire risk and provide a means to distinguish between high- and low-risk fire contributors. (orig.)

  4. Framework for applying probabilistic safety analysis in nuclear regulation

    International Nuclear Information System (INIS)

    Dimitrijevic, V.B.

    1997-01-01

    The traditional regulatory framework has served well to assure the protection of public health and safety. It has been recognized, however, that in a few circumstances this deterministic framework has led to extensive expenditure on matters that have little to do with the safe and reliable operation of the plant. The development of plant-specific PSAs has offered a new and powerful analytical tool for evaluating the safety of the plant. Using PSA insights as an aid to decision making in the regulatory process is now known as 'risk-based' or 'risk-informed' regulation. Numerous activities in the U.S. nuclear industry are focusing on applying this new approach to modify regulatory requirements. In addition, other approaches to regulation are in the developmental phase and are being evaluated. One is based on performance monitoring and results, and is known as performance-based regulation. The other, called the blended approach, combines traditional deterministic principles with PSA insights and performance results. (author)

  5. Hybrid segmentation framework for 3D medical image analysis

    Science.gov (United States)

    Chen, Ting; Metaxas, Dimitri N.

    2003-05-01

    Medical image segmentation is the process that defines the region of interest in the image volume. Classical segmentation methods such as region-based and boundary-based methods cannot make full use of the information provided by the image. In this paper we propose a general hybrid framework for 3D medical image segmentation that combines the Gibbs prior model and the deformable model. First, Gibbs prior models are applied to each slice in a 3D medical image volume and the segmentation results are combined into a 3D binary mask of the object. We then create a deformable mesh based on this binary mask. The deformable model is led to the edge features in the volume with the help of image-derived external forces, and its segmentation result can in turn be used to update the parameters of the Gibbs prior models. The two methods work recursively to reach a global segmentation solution. The hybrid segmentation framework has been applied to images of the lung, heart, colon, jaw, tumors, and brain. The experimental data include MRI (T1, T2, PD), CT, X-ray, and ultrasound images. High-quality results are achieved at relatively low computational cost. We also validated the method using expert manual segmentation as the ground truth; the results show that the hybrid segmentation may have further clinical use.

  6. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a Python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment, or FOSM) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example Jupyter notebooks available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of the parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
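    The FOSM analyses mentioned above rest on the standard linear-Bayes result: under linear-Gaussian assumptions, the posterior parameter covariance is (JᵀC_obs⁻¹J + C_prior⁻¹)⁻¹. A generic NumPy sketch of that calculation follows; this is not pyEMU's API, and all matrices below are made-up illustrations.

```python
import numpy as np

# First-order, second-moment (FOSM) uncertainty propagation.
J = np.array([[1.0, 0.5],          # Jacobian: d(observation)/d(parameter)
              [0.2, 1.5],
              [0.8, 0.3]])
C_obs = np.diag([0.1, 0.1, 0.2])   # observation noise covariance
C_prior = np.diag([1.0, 1.0])      # prior parameter covariance

# Posterior (conditional) parameter covariance: (J^T C_obs^-1 J + C_prior^-1)^-1
C_post = np.linalg.inv(J.T @ np.linalg.inv(C_obs) @ J + np.linalg.inv(C_prior))

# Forecast uncertainty: propagate through a forecast sensitivity vector s,
# so var(forecast) = s^T C s for the prior and posterior covariances.
s = np.array([1.0, -1.0])
var_prior = s @ C_prior @ s
var_post = s @ C_post @ s
print(var_post < var_prior)  # True: conditioning on data reduces forecast variance
```

    Because this is linear algebra on the Jacobian alone, the same analysis can be run before parameter estimation, which is how such results inform parameterization choices.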

  7. A framework for understanding international medical graduate challenges during transition into fellowship programs.

    Science.gov (United States)

    Sockalingam, Sanjeev; Khan, Attia; Tan, Adrienne; Hawa, Raed; Abbey, Susan; Jackson, Timothy; Zaretsky, Ari; Okrainec, Allan

    2014-01-01

    Previous studies have highlighted unique needs of international medical graduates (IMG) during their transition into medical training programs; however, limited data exist on IMG needs specific to fellowship training. We conducted a mixed-methods study to determine IMG fellows' training needs during the transition into fellowship programs in psychiatry and surgery. The study consisted of an online survey of IMG fellows and their supervisors in psychiatry or surgery fellowship training programs and individual interviews of IMG fellows. The survey assessed (a) fellows' and supervisors' perceptions of IMG challenges in clinical communication, health systems, and education domains and (b) past orientation initiatives. In the second phase of the study, IMG fellows were interviewed during the latter half of their fellowship training, and perceptions regarding orientation and adaptation to fellowship in Canada were assessed. Survey data were analyzed using descriptive and Mann-Whitney U statistics. Qualitative interviews were analyzed using grounded theory methodology. The survey response rate was 76% (35/46) and 69% (35/51) for IMG fellows and supervisors, respectively. Fellows reported the greatest difficulty with adapting to the hospital system, medical documentation, and balancing one's professional and personal life. Supervisors believed that fellows had the greatest difficulty with managing language and slang in Canada, the healthcare system, and an interprofessional team. In Phase 2, fellows generated themes of disorientation, disconnection, interprofessional team challenges, a need for IMG fellow resources, and a benefit from training in a multicultural setting. Our study results highlight the need for IMG-specific orientation resources for fellows and supervisors. Maslow's Hierarchy of Needs may be a useful framework for understanding IMG training needs.
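    The Mann-Whitney U statistic used in the survey analysis can be computed directly by pair counting, which makes the idea concrete. The rating data below are invented Likert-style difficulty scores, not the study's data.

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for sample `a` versus sample `b`,
    computed by direct pair counting (ties contribute 1/2)."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical 1-5 difficulty ratings from fellows and supervisors
# for one survey domain.
fellows     = [4, 5, 3, 4, 4, 5, 2, 4, 3, 5]
supervisors = [3, 2, 3, 4, 2, 3, 3, 2, 4, 3]
print(mann_whitney_u(fellows, supervisors))  # 78.5
```

    A U value near the maximum (here 10 × 10 = 100) indicates that one group's ratings are systematically higher; statistical packages add the tie-corrected p-value on top of this same statistic.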

  8. Wolf Creek quality trend analysis program

    International Nuclear Information System (INIS)

    Rudolph, W.J. II; Lindsay, W.M.

    1987-01-01

    The Wolf Creek quality trend analysis program has been designed with three primary objectives in mind: (1) to provide a statistically relevant diagnostic and trend identification tool to improve plant availability and reliability; (2) to communicate need-to-know information clearly and concisely to management personnel; and (3) to provide an additional method of obtaining corrective actions on significant quality issues. The analysis methodology uses a relatively sophisticated computer program to continuously evaluate a large database of current, significant problems. The evaluation process groups similar problems according to their alphanumeric codes and highlights these problems whenever they exceed an established statistical control limit. A root cause analysis is performed by quality department personnel, who then combine the various computer-generated graphical summaries into a short, concise trend analysis report. Other essential features of the program include measures for following identified adverse trends and implementing formal corrective actions when necessary. The diagnostic and trend analysis graphical summaries are considered important additions to the corrective action program at Wolf Creek. The report provides all levels of management with concise and easily interpreted information concerning quality indicators and trends.
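    The control-limit flagging described above can be sketched in a few lines: group problem counts by code and flag any code whose current count exceeds a simple mean-plus-three-sigma limit over its history. The cause codes and counts below are invented, and the actual program's statistics may differ.

```python
from statistics import mean, stdev

# Hypothetical weekly counts of problem reports per alphanumeric cause code.
history = {
    "VALVE-MAINT": [4, 3, 5, 4, 2, 4, 3, 12],   # spikes in the latest week
    "DOC-ERROR":   [6, 7, 5, 6, 8, 7, 6, 7],
}

def adverse_trends(history):
    """Flag codes whose latest count exceeds mean + 3*sigma of their history."""
    flagged = []
    for code, counts in history.items():
        baseline, current = counts[:-1], counts[-1]
        limit = mean(baseline) + 3 * stdev(baseline)
        if current > limit:
            flagged.append(code)
    return flagged

print(adverse_trends(history))  # ['VALVE-MAINT']
```

    Only the code with the genuine spike trips its control limit; steady noise in the other code stays below it, which is the point of using a statistical threshold rather than a fixed one.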

  9. Conceptual risk assessment framework for global change risk analysis SRP

    CSIR Research Space (South Africa)

    Elphinstone, CD

    2007-12-01

    Full Text Available This report is submitted as a deliverable of the SRP project Global Change Risk Analysis which aims at applying risk analysis as a unifying notion for quantifying and communicating threats to ecosystem services originating from global change...

  10. A Demonstrative Analysis of News Articles Using Fairclough’s Critical Discourse Analysis Framework

    Directory of Open Access Journals (Sweden)

    Roy Randy Y. Briones

    2017-07-01

    Full Text Available This paper attempts to demonstrate Norman Fairclough’s Critical Discourse Analysis (CDA framework by conducting internal and external level analyses on two online news articles that report on the Moro Islamic Liberation Front’s (MILF submission of its findings on the “Mamasapano Incident” that happened in the Philippines in 2015. In performing analyses using this framework, the social context and background for these texts, as well as the relationship between the internal discourse features and the external social practices and structures in which the texts were produced are thoroughly examined. As a result, it can be noted that from the texts’ internal discourse features, the news articles portray ideological and social distinctions among social actors such as the Philippine Senate, the SAF troopers, the MILF, the MILF fighters, and the civilians. Moreover, from the viewpoint of the texts as being external social practices, the texts maintain institutional identities as news reports, but they also reveal some evaluative stance as exemplified by the adjectival phrases that the writers employed. Having both the internal and external features examined, it can be said that the way these texts were written seems to portray power relations that exist between the Philippine government and the MILF. Key words: Critical Discourse Analysis, discourse analysis, news articles, social practices, social structures, power relations

  11. Strengths, Weaknesses, Opportunities and Threats: A SWOT analysis of the ecosystem services framework

    CSIR Research Space (South Africa)

    Bull, JW

    2016-02-01

    Full Text Available –Weaknesses–Opportunities–Threats (SWOT) analysis of ES through YESS member surveys. Strengths include the approach being interdisciplinary, and a useful communication tool. Weaknesses include an incomplete scientific basis, frameworks being inconsistently applied, and accounting...

  12. A unified framework of descent algorithms for nonlinear programs and variational inequalities

    International Nuclear Information System (INIS)

    Patriksson, M.

    1993-01-01

    We present a framework of algorithms for the solution of continuous optimization and variational inequality problems. In the general algorithm, a search direction is found by solving an auxiliary problem, obtained by replacing the original cost function with an approximating monotone cost function. The proposed framework encompasses algorithm classes presented earlier by Cohen, Dafermos, Migdalas, and Tseng, and includes numerous descent and successive-approximation type methods, such as Newton methods, Jacobi and Gauss-Seidel type decomposition methods for problems defined over Cartesian product sets, and proximal point methods, among others. The auxiliary problem of the general algorithm also induces equivalent optimization reformulations and descent methods for asymmetric variational inequalities. We study the convergence properties of the general algorithm when applied to unconstrained optimization, nondifferentiable optimization, constrained differentiable optimization, and variational inequalities; the emphasis of the convergence analyses is placed on basic convergence results, convergence under different line search strategies and truncated subproblem solutions, and convergence rate results. This analysis offers a unification of known results; moreover, it strengthens convergence results for many existing algorithms and indicates possible improvements of their realizations. 482 refs
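    One concrete instance of the auxiliary-problem idea is projected gradient descent: replacing the cost in each subproblem by its linearization plus a strongly convex quadratic term makes the subproblem's solution a projection step. The sketch below applies this to an invented box-constrained least-squares problem; it illustrates the principle, not any specific algorithm from the paper.

```python
import numpy as np

# Minimize f(x) = ||Ax - b||^2 over the box [0, 1]^2.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])

def grad(x):
    return 2 * A.T @ (A @ x - b)

x = np.array([1.0, 1.0])
t = 0.02                                    # conservative fixed step for this A
for _ in range(500):
    # Auxiliary subproblem: argmin_y grad(x)·y + (1/2t)||y - x||^2 over the box,
    # whose closed-form solution is a projected gradient step.
    x = np.clip(x - t * grad(x), 0.0, 1.0)

f_val = float(np.sum((A @ x - b) ** 2))
print(x, f_val)  # converges to the constrained minimizer, here [0.2, 0.6]
```

    Here the unconstrained minimizer of Ax = b happens to lie inside the box, so the iterates converge to it and the residual vanishes; with a tighter box, the same iteration would stop on the boundary.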

  13. Framework for generating expert systems to perform computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1985-01-01

    At Los Alamos we are developing a framework to generate knowledge-based expert systems for performing automated risk analyses upon a subject system. The expert system is a computer program that models experts' knowledge about a topic, including facts, assumptions, insights, and decision rationale. The subject system, defined as the collection of information, procedures, devices, and real property upon which the risk analysis is to be performed, is a member of the class of systems that have three identifying characteristics: a set of desirable assets (or targets), a set of adversaries (or threats) desiring to obtain or to do harm to the assets, and a set of protective mechanisms to safeguard the assets from the adversaries. Risk analysis evaluates both vulnerability to and the impact of successful threats against the targets by determining the overall effectiveness of the subject system safeguards, identifying vulnerabilities in that set of safeguards, and determining cost-effective improvements to the safeguards. As a testbed, we evaluate the inherent vulnerabilities and risks in a system of computer security safeguards. The method considers safeguards protecting four generic targets (physical plant of the computer installation, its hardware, its software, and its documents and displays) against three generic threats (natural hazards, direct human actions requiring the presence of the adversary, and indirect human actions wherein the adversary is not on the premises-perhaps using such access tools as wiretaps, dialup lines, and so forth). Our automated procedure to assess the effectiveness of computer security safeguards differs from traditional risk analysis methods

  14. Asthma education program for First Nations children: an exemplar of the knowledge-to-action framework.

    Science.gov (United States)

    Douglas, Maureen L; McGhan, Shawna L; Tougas, Danielle; Fenton, Nancy; Sarin, Christopher; Latycheva, Oxana; Befus, A Dean

    2013-01-01

    The prevalence of asthma in Aboriginal children is 6% to 14%. Gaps in knowledge regarding asthma and its management exist in First Nations (FN) communities, and culturally relevant education and resources are required. Studies have recommended that the children's asthma education program, the 'Roaring Adventures of Puff', be modified through partnership with FN communities to be culturally appropriate. To adapt this knowledge tool and design an effective implementation process for FN knowledge users (children with asthma and care providers), guided by the Canadian Institutes of Health Research knowledge translation framework. The problem was identified; knowledge was identified/reviewed/selected (literature review); knowledge was adapted to the local context (FN working and advisory groups); barriers to knowledge use were assessed (by knowledge users); and interventions were selected, tailored and implemented (modified curricula and the creation of a new activity book and web-based resources, and regional coordinators, asthma educator mentors and community teams were recruited). Major outcomes were the adapted tools and blueprints for tailoring implementation. Additional outcomes were preliminary observations and outputs from the iterative processes, including information about local context and barriers. Specific additions were roles for community members supported by asthma educators (applying FN teaching models and addressing health care demands); relevant triggers (addressing knowledge gaps); and FN images and stories, themes of circle, sacred teachings, nature and family/elders (culture and addressing low reading levels). The framework model provides a logical, valuable tool for adapting a knowledge tool and implementation process to new knowledge users. Future research should measure uptake, effect on health outcomes of FN asthma sufferers, and sustainability.

  15. Evaluation of a large-scale weight management program using the consolidated framework for implementation research (CFIR).

    Science.gov (United States)

    Damschroder, Laura J; Lowery, Julie C

    2013-05-10

    In the United States, as in many other parts of the world, the prevalence of overweight/obesity is at epidemic proportions in the adult population and even higher among Veterans. To address the high prevalence of overweight/obesity among Veterans, the MOVE!(®) weight management program was disseminated nationally to Veterans Affairs (VA) medical centers. The objective of this paper is two-fold: to describe factors that explain the wide variation in implementation of MOVE!; and to illustrate, step-by-step, how to apply a theory-based framework using qualitative data. Five VA facilities were selected to maximize variation in implementation effectiveness and geographic location. Twenty-four key stakeholders were interviewed about their experiences in implementing MOVE!. The Consolidated Framework for Implementation Research (CFIR) was used to guide collection and analysis of qualitative data. Constructs that most strongly influence implementation effectiveness were identified through a cross-case comparison of ratings. Of the 31 CFIR constructs assessed, ten strongly distinguished between facilities with low versus high program implementation effectiveness. The majority (six) were related to the inner setting: networks and communications; tension for change; relative priority; goals and feedback; learning climate; and leadership engagement. One construct each from intervention characteristics (relative advantage) and outer setting (patient needs and resources), plus two from process (executing and reflecting), also strongly distinguished between high and low implementation. Two additional constructs weakly distinguished, 16 were mixed, three had insufficient data to assess, and one was not applicable. Detailed descriptions of how each distinguishing construct manifested in the study facilities and a table of recommendations are provided. This paper presents an approach for using the CFIR to code and rate qualitative data in a way that will facilitate

  16. SIMS analysis: Development and evaluation program summary

    International Nuclear Information System (INIS)

    Groenewold, G.S.; Appelhans, A.D.; Ingram, J.C.; Delmore, J.E.; Dahl, D.A.

    1996-11-01

    This report provides an overview of the "SIMS Analysis: Development and Evaluation Program", which was executed at the Idaho National Engineering Laboratory from mid-FY-92 to the end of FY-96. It should be noted that prior to FY-1994 the name of the program was "In-Situ SIMS Analysis". This report will not go into exhaustive detail regarding program accomplishments, because this information is contained in the annual reports referenced herein. In summary, the program resulted in the design and construction of an ion trap secondary ion mass spectrometer (IT-SIMS), which is capable of the rapid analysis of environmental samples for adsorbed surface contaminants. The instrument achieves efficient secondary ion desorption by use of a massive molecular ReO4(-) primary ion. It manages surface charge buildup using a self-discharging principle, which is compatible with the pulsed nature of the ion trap, and can achieve high selectivity and sensitivity using its selective ion storage and MS/MS capability. The instrument was used for detection of tri-n-butyl phosphate, salt cake (tank cake) characterization, and toxic metal speciation studies (specifically mercury). Technology transfer was also an important component of this program; the approach taken was that of component transfer. This resulted in the transfer of data acquisition and instrument control software in FY-94, and in ongoing efforts to transfer primary ion gun and detector technology to other manufacturers.

  17. Learner Analysis Framework for Globalized E-Learning

    Science.gov (United States)

    Saxena, Mamta

    2010-01-01

    The digital shift to technology-mediated modes of instructional delivery and the increased global connectivity has led to the rise in globalized e-learning programs. Educational institutions face multiple challenges as they seek to design effective, engaging and culturally competent instruction for an increasingly diverse learner population. The…

  18. CIMS: A FRAMEWORK FOR INFRASTRUCTURE INTERDEPENDENCY MODELING AND ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Donald D. Dudenhoeffer; May R. Permann; Milos Manic

    2006-12-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks, such as transportation, utilities, telecommunication, and even financial networks. While modeling and simulation tools have provided insight into the behavior of individual infrastructure networks, a far less understood area is that of the interrelationships among multiple infrastructure networks, including the potential cascading effects that may result from these interdependencies. This paper first describes infrastructure interdependencies and presents a formalization of interdependency types. It then describes a modeling and simulation framework called CIMS© and the work being conducted at the Idaho National Laboratory (INL) to model and simulate infrastructure interdependencies and the complex behaviors that can result.
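    The cascading effect of interdependencies can be sketched as failure propagation to a fixed point over a dependency graph. The infrastructure names and edges below are invented for illustration; CIMS itself models far richer behavior.

```python
# Toy dependency graph: a node fails when any node it depends on fails.
depends_on = {
    "power_grid":  [],
    "water_pumps": ["power_grid"],
    "telecom":     ["power_grid"],
    "banking":     ["telecom"],
    "transport":   ["power_grid", "telecom"],
}

def cascade(initial_failures):
    """Propagate failures across the dependency graph until no new
    node fails (a fixed point), returning the full failed set."""
    failed = set(initial_failures)
    changed = True
    while changed:
        changed = False
        for node, deps in depends_on.items():
            if node not in failed and any(d in failed for d in deps):
                failed.add(node)
                changed = True
    return sorted(failed)

print(cascade({"power_grid"}))  # every dependent infrastructure eventually fails
```

    A single power-grid failure cascades to all four dependent networks, while a telecom failure alone takes down only banking and transport; contrasting such scenarios is the kind of question interdependency modeling is meant to answer.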

  19. Studying creativity training programs: A methodological analysis

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

    Throughout decades of creativity research, a range of creativity training programs have been developed, tested, and analyzed. In 2004 Scott and colleagues published a meta‐analysis of all creativity training programs to date, and the review presented here set out to identify and analyze studies...... published since the seminal 2004 review. Focusing on quantitative studies of creativity training programs for adults, our systematic review resulted in 22 publications. All studies were analyzed, but comparing the reported effectiveness of training across studies proved difficult due to methodological...... inconsistencies, variations in reporting of results as well as types of measures used. Thus a consensus for future studies is called for to answer the question: Which elements make one creativity training program more effective than another? This is a question of equal relevance to academia and industry...

  20. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper provides a description of the analysis framework and evaluation metrics, presents example performance-prediction results, and describes the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.

  1. Economic and Nonproliferation Analysis Framework for Assessing Reliable Nuclear Fuel Service Arrangements

    International Nuclear Information System (INIS)

    Phillips, Jon R.; Kreyling, Sean J.; Short, Steven M.; Weimar, Mark R.

    2010-01-01

    Nuclear power is now broadly recognized as an essential technology in national strategies to provide energy security while meeting carbon management goals. Yet a long standing conundrum remains: how to enable rapid growth in the global nuclear power infrastructure while controlling the spread of sensitive enrichment and reprocessing technologies that lie at the heart of nuclear fuel supply and nuclear weapons programs. Reducing the latent proliferation risk posed by a broader horizontal spread of enrichment and reprocessing technology has been a primary goal of national nuclear supplier policies since the beginning of the nuclear power age. Attempts to control the spread of sensitive nuclear technology have been the subject of numerous initiatives in the intervening decades sometimes taking the form of calls to develop fuel supply and service assurances to reduce market pull to increase the number of states with fuel cycle capabilities. A clear understanding of what characteristics of specific reliable nuclear fuel service (RNFS) and supply arrangements qualify them as 'attractive offers' is critical to the success of current and future efforts. At a minimum, RNFS arrangements should provide economic value to all participants and help reduce latent proliferation risks posed by the global expansion of nuclear power. In order to inform the technical debate and the development of policy, Pacific Northwest National Laboratory has been developing an analytical framework to evaluate the economics and nonproliferation merits of alternative approaches to RNFS arrangements. This paper provides a brief overview of the economic analysis framework developed and applied to a model problem of current interest: full-service nuclear fuel leasing arrangements. Furthermore, this paper presents an extended outline of a proposed analysis approach to evaluate the non-proliferation merits of various RNFS alternatives.

  2. Damage analysis and fundamental studies program

    International Nuclear Information System (INIS)

    Doran, D.G.; Farrar, H. IV; Goland, A.N.

    1978-01-01

    The Damage Analysis and Fundamental Studies (DAFS) Task Group has been formed by the Office of Fusion Energy to develop procedures for applying data obtained in various irradiation test facilities to projected fusion environments. A long-range program plan has been prepared and implementation has begun. The plan and technical status are briefly described

  3. Counter Trafficking System Development "Analysis Training Program"

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Dennis C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-12-01

    This document details the training curriculum for the Counter-Trafficking System Development (CTSD) Analysis Modules. Lesson Plans are derived from United States Military and Department of Energy doctrine and from the Lawrence Livermore National Laboratory (LLNL) Global Security (GS) S Program.

  4. An in-depth analysis of theoretical frameworks for the study of care coordination

    Directory of Open Access Journals (Sweden)

    Sabine Van Houdt

    2013-06-01

    Introduction: Complex chronic conditions often require long-term care from various healthcare professionals. Thus, maintaining quality care requires care coordination. Concepts for the study of care coordination require clarification to develop, study and evaluate coordination strategies. In 2007, the Agency for Healthcare Research and Quality defined care coordination and proposed five theoretical frameworks for exploring care coordination. This study aimed to update current theoretical frameworks and clarify key concepts related to care coordination. Methods: We performed a literature review to update existing theoretical frameworks. An in-depth analysis of these theoretical frameworks was conducted to formulate key concepts related to care coordination. Results: Our literature review found seven previously unidentified theoretical frameworks for studying care coordination. The in-depth analysis identified fourteen key concepts that the theoretical frameworks addressed. These were ‘external factors’, ‘structure’, ‘task characteristics’, ‘cultural factors’, ‘knowledge and technology’, ‘need for coordination’, ‘administrative operational processes’, ‘exchange of information’, ‘goals’, ‘roles’, ‘quality of relationship’, ‘patient outcome’, ‘team outcome’, and ‘(inter)organizational outcome’. Conclusion: These 14 interrelated key concepts provide a base to develop or choose a framework for studying care coordination. The relational coordination theory and the multi-level framework are the most interesting, as they are the most comprehensive.

  5. Value Frameworks in Oncology: Comparative Analysis and Implications to the Pharmaceutical Industry.

    Science.gov (United States)

    Slomiany, Mark; Madhavan, Priya; Kuehn, Michael; Richardson, Sasha

    2017-07-01

    As the cost of oncology care continues to rise, composite value models that variably capture the diverse concerns of patients, physicians, payers, policymakers, and the pharmaceutical industry have begun to take shape. To review the capabilities and limitations of 5 of the most notable value frameworks in oncology that have emerged in recent years and to compare their relative value and application among the intended stakeholders. We compared the methodology of the American Society of Clinical Oncology (ASCO) Value Framework (version 2.0), the National Comprehensive Cancer Network Evidence Blocks, Memorial Sloan Kettering Cancer Center DrugAbacus, the Institute for Clinical and Economic Review Value Assessment Framework, and the European Society for Medical Oncology Magnitude of Clinical Benefit Scale, using a side-by-side comparative approach in terms of the input, scoring methodology, and output of each framework. In addition, we gleaned stakeholder insights about these frameworks and their potential real-world applications through dialogues with physicians and payers, as well as through secondary research and an aggregate analysis of previously published survey results. The analysis identified several framework-specific themes in their respective focus on clinical trial elements, breadth of evidence, evidence weighting, scoring methodology, and value to stakeholders. Our dialogues with physicians and our aggregate analysis of previous surveys revealed a varying level of awareness of, and use of, each of the value frameworks in clinical practice. For example, although the ASCO Value Framework appears nascent in clinical practice, physicians believe that the frameworks will be more useful in practice in the future as they become more established and as their outputs are more widely accepted. Along with patients and payers, who bear the burden of treatment costs, physicians and policymakers have waded into the discussion of defining value in oncology care, as well

  6. 7 CFR 1700.32 - Program Accounting and Regulatory Analysis.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false Program Accounting and Regulatory Analysis. 1700.32... Accounting and Regulatory Analysis. RUS, through Program Accounting and Regulatory Analysis, monitors and... Assistant Administrator, Program Accounting and Regulatory Analysis, directs and coordinates program...

  7. A Framework for Examining Mathematics Teacher Knowledge as Used in Error Analysis

    Science.gov (United States)

    Peng, Aihui; Luo, Zengru

    2009-01-01

    Error analysis is a basic and important task for mathematics teachers. Unfortunately, in the present literature there is a lack of detailed understanding about teacher knowledge as used in it. Based on a synthesis of the literature in error analysis, a framework for prescribing and assessing mathematics teacher knowledge in error analysis was…

  8. A novel joint analysis framework improves identification of differentially expressed genes in cross disease transcriptomic analysis

    Directory of Open Access Journals (Sweden)

    Wenyi Qin

    2018-02-01

    Motivation: Detecting differentially expressed (DE) genes between a disease group and a normal control group is one of the most common analyses of genome-wide transcriptomic data. Since most studies have few samples, researchers have used meta-analysis to pool different datasets for the same disease. Even then, in many cases the statistical power is still not enough. Taking into account the fact that many diseases share the same disease genes, it is desirable to design a statistical framework that can identify diseases' common and specific DE genes simultaneously to improve the identification power. Results: We developed a novel empirical-Bayes-based mixture model to identify DE genes in a specific study by leveraging the shared information across multiple different disease expression data sets. The effectiveness of joint analysis was demonstrated through comprehensive simulation studies and two real data applications. The simulation results showed that our method consistently outperformed single-data-set analysis and two other meta-analysis methods in identification power. In real data analysis, our method overall demonstrated better identification power in detecting DE genes and prioritized more disease-related genes and disease-related pathways than single-data-set analysis. Over 150% more disease-related genes are identified by our method in application to Huntington's disease. We expect that our method will provide researchers a new way of utilizing available data sets from different diseases when the sample size of the focused disease is limited.
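
    The power gain from joint analysis can be illustrated with a much simpler stand-in for the authors' empirical Bayes mixture model: Stouffer's method, which combines per-dataset z-scores for a gene across related disease datasets. The z-scores below are invented for illustration:

```python
import math

def z_to_p(z):
    # One-sided p-value from a z-score via the complementary error function.
    return 0.5 * math.erfc(z / math.sqrt(2))

def stouffer_combined_z(z_scores):
    # Stouffer's method: pooled evidence across k independent datasets.
    return sum(z_scores) / math.sqrt(len(z_scores))

# A gene with modest evidence in each of three related disease datasets:
z_per_dataset = [1.8, 1.6, 1.9]

p_single = z_to_p(z_per_dataset[0])                    # one dataset alone
p_joint = z_to_p(stouffer_combined_z(z_per_dataset))   # borrowing strength

print(f"single-dataset p = {p_single:.4f}")  # not significant at 0.01
print(f"joint p = {p_joint:.6f}")            # far stronger evidence
```

    None of the three datasets alone clears a 0.01 cutoff, but their combination does, which is the intuition behind sharing information across diseases.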

  9. CREATION OF IT-ORIENTED ONTOLOGICAL FRAMEWORK FOR THE PURPOSE OF MAKING EDUCATIONAL PROGRAMS ON THE BASE OF COMPETENCIES

    Directory of Open Access Journals (Sweden)

    G. M. Korotenko

    2017-08-01

    Purpose. Taking into account the expansion of computing application scopes, there is a need to identify the links and features of the constantly emerging professional competencies of the new sections of computing knowledge, in order to improve the process of forming new curricula. Methodology. The authors propose a new approach aimed at building specialized knowledge bases, generated using artificial intelligence technology and focused on the use of multiple heterogeneous resources or data sources on specific educational topics. As a tool ensuring the formation of the base ontology, the Protégé 4.2 ontology editor is used. As one of the modules of the developed system of semantic analysis, which provides access to the ontology and the possibility of its processing, the Apache Jena Java framework is used; it forms the software environment for working with data in the RDF, RDFS and OWL formats, and also supports the ability to form queries to ontologies in the SPARQL language. The peculiarity of this approach is the binding of information resources of the three-platform presentation of the disciplinary structure in the context of identifying the links of professional competencies. Findings. The model and structure of the IT-oriented ontological framework, designed to ensure the convergence of the components of the university's three-platform information and communication environment, are developed. The structure of the ontology underlying the knowledge base, describing the main essence of the educational standards of the "Information Technologies" branch, is formed. Originality. Within the framework of the design and formation of the disciplinary structure of the "Information Technologies" knowledge sector, in the context of the competence approach to education, the architecture of the competence-descriptor semantic analysis system is proposed. It implements the algorithm for integrating the ontological and product models of knowledge representation about the subject domain
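
    The SPARQL-style querying that Apache Jena provides over the competency ontology can be sketched, in miniature, as triple-pattern matching over (subject, predicate, object) facts. The course and competency names below are hypothetical, not taken from the paper's ontology:

```python
# Toy triple store: RDF-like facts linking hypothetical courses to the
# professional competencies they develop or require.
triples = [
    ("DataStructures", "develops", "AlgorithmicThinking"),
    ("Databases", "develops", "DataModelling"),
    ("Databases", "requires", "AlgorithmicThinking"),
    ("MachineLearning", "requires", "DataModelling"),
]

def query(s=None, p=None, o=None):
    """Match triples against a pattern; None plays the role of a SPARQL variable."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Which competencies does the Databases course develop?
develops = query(s="Databases", p="develops")
print(develops)  # → [('Databases', 'develops', 'DataModelling')]

# Which courses depend on AlgorithmicThinking?
dependents = [ts for ts, _, _ in query(p="requires", o="AlgorithmicThinking")]
print(dependents)  # → ['Databases']
```

    A real system would express the same patterns as SPARQL basic graph patterns executed by Jena over the OWL ontology; the sketch only shows the matching idea.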

  10. Using Campinha-Bacote's Framework to Examine Cultural Competence from an Interdisciplinary International Service Learning Program

    Science.gov (United States)

    Wall-Bassett, Elizabeth DeVane; Hegde, Archana Vasudeva; Craft, Katelyn; Oberlin, Amber Louise

    2018-01-01

    The purpose of this study was to investigate an interdisciplinary international service learning program and its impact on student sense of cultural awareness and competence using the Campinha-Bacote's (2002) framework of cultural competency model. Seven undergraduate and one graduate student from Human Development and Nutrition Science…

  11. A framework for improving the cost-effectiveness of DSM program evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Sonnenblick, R.; Eto, J.

    1995-09-01

    The prudence of utility demand-side management (DSM) investments hinges on their performance, yet evaluating performance is complicated because the energy saved by DSM programs can never be observed directly but only inferred. This study frames and begins to answer the following questions: (1) how well do current evaluation methods perform in improving confidence in the measurement of energy savings produced by DSM programs; (2) in view of this performance, how can limited evaluation resources be best allocated to maximize the value of the information they provide? The authors review three major classes of methods for estimating annual energy savings: tracking database (sometimes called engineering estimates), end-use metering, and billing analysis and examine them in light of the uncertainties in current estimates of DSM program measure lifetimes. The authors assess the accuracy and precision of each method and construct trade-off curves to examine the costs of increases in accuracy or precision. Several approaches for improving evaluations for the purpose of assessing program cost effectiveness are demonstrated. The methods can be easily generalized to other evaluation objectives, such as shared savings incentive payments.
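
    One standard way to combine a cheap-but-noisy tracking-database (engineering) estimate with a costlier billing analysis is inverse-variance weighting, which yields a pooled savings estimate more precise than either input. This is a generic statistical sketch, not the paper's method, and the savings figures are invented:

```python
def inverse_variance_combine(estimates):
    """Pool independent (value, std_error) estimates into the
    minimum-variance weighted average."""
    weights = [1.0 / (se ** 2) for _, se in estimates]
    total_w = sum(weights)
    combined = sum(w * v for w, (v, _) in zip(weights, estimates)) / total_w
    combined_se = (1.0 / total_w) ** 0.5
    return combined, combined_se

# Hypothetical annual savings estimates (GWh) for one DSM program:
engineering = (12.0, 3.0)   # tracking-database estimate, wide uncertainty
billing = (9.5, 1.5)        # billing analysis, tighter uncertainty

value, se = inverse_variance_combine([engineering, billing])
print(f"combined savings: {value:.2f} GWh ± {se:.2f}")  # → 10.00 GWh ± 1.34
```

    The pooled standard error is smaller than that of either method alone, which is the trade-off between evaluation cost and precision that the framework quantifies.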

  12. A Framework for the Cognitive Task Analysis in Systems Design

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    The present rapid development of advanced information technology and its use for support of operators of complex technical systems are changing the content of task analysis towards the analysis of mental activities in decision making. Automation removes the humans from routine tasks, and operators are left with disturbance control and critical diagnostic tasks, for which computers are suitable for support, if it is possible to match the computer strategies and interface formats dynamically to the requirements of the current task by means of an analysis of the cognitive task.

  13. The Macroeconomic Framework of Support Analysis for Sustainable Businesses Development

    Directory of Open Access Journals (Sweden)

    Constantin Mitrut

    2015-08-01

    The state of satisfaction of an economy results from the quality of the economic products it produces and consumes, in agreement with ensuring environmental protection, as a source of producing present and future economic goods, and with intensive utilisation of human capital, as a source of innovation growth. Knowledge transfer happens in a sustainable economy, whose principles are the rational use of resources, the limiting of waste, and protection, enabling future generations to have access to resources as well. The present research is based on a multifactorial linear regression model which outlines the direct correlation between the dependent variable, welfare, and the independent variable of concentration, measured by the Gini coefficient of wealth concentration on the one hand and by the GDP level on the other, at the level of the year 2012. The aim of this research is to identify the correlation between the indicator of quality-of-life satisfaction (the welfare function) at the level of the EU in 2012 and the assurance of a macroeconomic framework for sustainable business development.

  14. GEOPOLITICS - A NEW FRAMEWORK OF ANALYSIS: GLOBAL CHALLENGES AND PERSPECTIVES

    Directory of Open Access Journals (Sweden)

    Laura Cătălina PAȘCU

    2015-04-01

    Geopolitics was born at the end of the 19th century and reborn at the end of the 20th century, from the need to explain certain issues arising out of the general evolution of human society and the growing influence of permanent politico-economic changes on human consciousness and the entire system of socio-political life and culture. Geopolitics also gives us the opportunity to reflect on the manifestation and evolution of power relations within a particular historical period, and to assess and track changes and trends in the current system of international relations, providing indicators and analytical methods for the reality of international relations. The relations of competition versus cooperation between international actors have changed gradually in the 21st century; however, development cooperation can provide solutions or opportunities for defining global problems. Globalization creates a new framework in security and international relations and has also introduced geo-economic perspectives, which assumed new-found geopolitical importance at the outset of the twenty-first century.

  15. Drainage network extraction from a high-resolution DEM using parallel programming in the .NET Framework

    Science.gov (United States)

    Du, Chao; Ye, Aizhong; Gan, Yanjun; You, Jinjun; Duan, Qinyun; Ma, Feng; Hou, Jingwen

    2017-12-01

    High-resolution Digital Elevation Models (DEMs) can be used to extract high-accuracy prerequisite drainage networks. A higher resolution represents a larger number of grids. With an increase in the number of grids, the flow direction determination will require substantial computer resources and computing time. Parallel computing is a feasible method with which to resolve this problem. In this paper, we proposed a parallel programming method within the .NET Framework with a C# Compiler in a Windows environment. The basin is divided into sub-basins, and subsequently the different sub-basins operate on multiple threads concurrently to calculate flow directions. The method was applied to calculate the flow direction of the Yellow River basin from 3 arc-second resolution SRTM DEM. Drainage networks were extracted and compared with HydroSHEDS river network to assess their accuracy. The results demonstrate that this method can calculate the flow direction from high-resolution DEMs efficiently and extract high-precision continuous drainage networks.
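
    The divide-by-sub-basin strategy can be sketched with the classic D8 flow-direction rule (each cell drains to its steepest-descent neighbour) and independent blocks computed concurrently. This Python sketch mirrors the idea of the paper's C#/.NET implementation without reproducing it; the toy DEM and the row-block partition standing in for sub-basins are invented:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy DEM (elevations); independent row blocks stand in for sub-basins.
DEM = [
    [9, 8, 7, 6],
    [8, 7, 6, 5],
    [7, 6, 5, 4],
    [6, 5, 4, 3],
]
ROWS, COLS = len(DEM), len(DEM[0])
# The eight D8 neighbours with power-of-two direction codes (ESRI-style).
D8 = [((0, 1), 1), ((1, 1), 2), ((1, 0), 4), ((1, -1), 8),
      ((0, -1), 16), ((-1, -1), 32), ((-1, 0), 64), ((-1, 1), 128)]

def flow_direction(r, c):
    # Pick the neighbour with the steepest elevation drop per unit distance.
    best_code, best_drop = 0, 0.0
    for (dr, dc), code in D8:
        nr, nc = r + dr, c + dc
        if 0 <= nr < ROWS and 0 <= nc < COLS:
            dist = 2 ** 0.5 if dr and dc else 1.0
            drop = (DEM[r][c] - DEM[nr][nc]) / dist
            if drop > best_drop:
                best_drop, best_code = drop, code
    return best_code  # 0 means a sink or edge outlet

def block_directions(rows):
    # One "sub-basin": an independent block, safe to compute concurrently.
    return {(r, c): flow_direction(r, c) for r in rows for c in range(COLS)}

with ThreadPoolExecutor(max_workers=2) as pool:
    halves = [range(0, ROWS // 2), range(ROWS // 2, ROWS)]
    results = {}
    for part in pool.map(block_directions, halves):
        results.update(part)

print(results[(0, 0)])  # → 2 (steepest drop is diagonally down-right)
```

    Because each cell's direction depends only on the read-only DEM, the blocks share no mutable state and the partition can scale to many workers.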

  16. A Functional Analysis Framework for Modeling, Estimation and Control in Science and Engineering

    CERN Document Server

    Banks, HT

    2012-01-01

    A Modern Framework Based on Time-Tested Material A Functional Analysis Framework for Modeling, Estimation and Control in Science and Engineering presents functional analysis as a tool for understanding and treating distributed parameter systems. Drawing on his extensive research and teaching from the past 20 years, the author explains how functional analysis can be the basis of modern partial differential equation (PDE) and delay differential equation (DDE) techniques. Recent Examples of Functional Analysis in Biology, Electromagnetics, Materials, and Mechanics Through numerous application exa

  17. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  18. CRBLASTER: A Parallel-Processing Computational Framework for Embarrassingly-Parallel Image-Analysis Algorithms

    Science.gov (United States)

    Mighell, Kenneth John

    2011-11-01

    The development of parallel-processing image-analysis codes is generally a challenging task that requires complicated choreography of interprocessor communications. If, however, the image-analysis algorithm is embarrassingly parallel, then the development of a parallel-processing implementation of that algorithm can be a much easier task to accomplish because, by definition, there is little need for communication between the compute processes. I describe the design, implementation, and performance of a parallel-processing image-analysis application, called CRBLASTER, which does cosmic-ray rejection of CCD (charge-coupled device) images using the embarrassingly-parallel L.A.COSMIC algorithm. CRBLASTER is written in C using the high-performance computing industry standard Message Passing Interface (MPI) library. The code has been designed to be used by research scientists who are familiar with C as a parallel-processing computational framework that enables the easy development of parallel-processing image-analysis programs based on embarrassingly-parallel algorithms. The CRBLASTER source code is freely available at the official application website at the National Optical Astronomy Observatory. Removing cosmic rays from a single 800x800 pixel Hubble Space Telescope WFPC2 image takes 44 seconds with the IRAF script lacos_im.cl running on a single core of an Apple Mac Pro computer with two 2.8-GHz quad-core Intel Xeon processors. CRBLASTER is 7.4 times faster processing the same image on a single core on the same machine. Processing the same image with CRBLASTER simultaneously on all 8 cores of the same machine takes 0.875 seconds - which is a speedup factor of 50.3 times faster than the IRAF script. A detailed analysis is presented of the performance of CRBLASTER using between 1 and 57 processors on a low-power Tilera 700-MHz 64-core TILE64 processor.
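
    The per-pixel kernel of cosmic-ray rejection can be sketched as replacing pixels that spike far above their local median; because each pixel (or image tile) is processed independently of the others, the task is embarrassingly parallel in exactly the sense described above. This crude stand-in is not the actual L.A.Cosmic algorithm, and the image values are invented:

```python
import statistics

def reject_cosmic_rays(image, threshold=5.0):
    """Replace pixels that spike far above their 8-neighbour median.
    A crude stand-in for a cosmic-ray rejection kernel."""
    rows, cols = len(image), len(image[0])
    cleaned = [row[:] for row in image]  # work on a copy of the input
    for r in range(rows):
        for c in range(cols):
            neighbours = [image[nr][nc]
                          for nr in range(max(r - 1, 0), min(r + 2, rows))
                          for nc in range(max(c - 1, 0), min(c + 2, cols))
                          if (nr, nc) != (r, c)]
            med = statistics.median(neighbours)
            if image[r][c] - med > threshold:
                cleaned[r][c] = med  # treat the spike as a cosmic-ray hit
    return cleaned

# A flat 10-count background with one cosmic-ray spike:
image = [[10] * 5 for _ in range(5)]
image[2][2] = 500

cleaned = reject_cosmic_rays(image)
print(cleaned[2][2])  # the 500-count spike is replaced by the local median
```

    Splitting the image into overlapping tiles and running this kernel on one MPI rank (or process) per tile is what makes the near-linear speedups reported above achievable.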

  19. A Program Transformation for Backwards Analysis of Logic Programs

    DEFF Research Database (Denmark)

    Gallagher, John Patrick

    2003-01-01

    The backwards analysis of logic programs presented here is based on a transformation of the input program, which makes explicit the dependencies of the given program points on the initial goals. The transformation is derived from the resultants semantics of logic programs. The transformed program is then analysed using a standard

  20. Energy pathway analysis - a hydrogen fuel cycle framework for system studies

    International Nuclear Information System (INIS)

    Badin, J.S.; Tagore, S.

    1997-01-01

    An analytical framework has been developed that can be used to estimate a range of life-cycle costs and impacts that result from the incremental production, storage, transport, and use of different fuels or energy carriers, such as hydrogen, electricity, natural gas, and gasoline. This information is used in a comparative analysis of energy pathways. The pathways provide the U.S. Department of Energy (DOE) with an indication of near-, mid-, and long-term technologies that have the greatest potential for advancement and can meet the cost goals. The methodology and conceptual issues are discussed. Also presented are results for selected pathways from the E3 (Energy, Economics, Emissions) Pathway Analysis Model. This model will be expanded to consider networks of pathways and to be compatible with a linear programming optimization processor. Scenarios and sets of constraints (energy demands, sources, emissions) will be defined so the effects on energy transformation activities included in the solution and on the total optimized system cost can be investigated. This evaluation will be used as a guide to eliminate technically feasible pathways if they are not cost effective or do not meet the threshold requirements for the market acceptance. (Author)
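
    A pathway in this framework is a chain of cost-bearing stages (production, storage, transport, end use), and comparing pathways amounts to ranking their summed life-cycle costs. The sketch below uses illustrative numbers only, not data from the E3 Pathway Analysis Model:

```python
# Hypothetical stage costs ($/GJ delivered) for three energy pathways;
# the figures are invented for illustration, not DOE estimates.
pathways = {
    "hydrogen (electrolysis)": {"production": 14.0, "storage": 4.0,
                                "transport": 5.0, "end_use": 3.0},
    "natural gas": {"production": 3.0, "storage": 1.0,
                    "transport": 2.0, "end_use": 2.5},
    "gasoline": {"production": 6.0, "storage": 0.5,
                 "transport": 1.5, "end_use": 4.0},
}

def life_cycle_cost(stages):
    # Total cost along the pathway: production -> storage -> transport -> use.
    return sum(stages.values())

ranked = sorted(pathways, key=lambda name: life_cycle_cost(pathways[name]))
for name in ranked:
    print(f"{name}: {life_cycle_cost(pathways[name]):.1f} $/GJ")
```

    Extending this to networks of pathways under emission and demand constraints is where the linear-programming optimization mentioned above would take over from simple ranking.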

  1. A framework for performing workplace hazard and risk analysis: a participative ergonomics approach.

    Science.gov (United States)

    Morag, Ido; Luria, Gil

    2013-01-01

    Despite the unanimity among researchers about the centrality of workplace analysis based on participatory ergonomics (PE) as a basis for preventive interventions, there is still little agreement about the necessity of a theoretical framework for providing practical guidance. In an effort to develop a conceptual PE framework, the authors, focusing on 20 studies, found five primary dimensions for characterising an analytical structure: (1) extent of workforce involvement; (2) analysis duration; (3) diversity of reporter role types; (4) scope of analysis and (5) supportive information system for analysis management. An ergonomics analysis carried out in a chemical manufacturing plant serves as a case study for evaluating the proposed framework. The study simultaneously demonstrates the five dimensions and evaluates their feasibility. The study showed that managerial leadership was fundamental to the successful implementation of the analysis, that all job holders should participate in analysing their own workplace, and that simplified reporting methods contributed to a desirable outcome. This paper seeks to clarify the scope of workplace ergonomics analysis by offering a theoretical and structured framework for providing practical advice and guidance. Essential to successfully implementing the analytical framework are managerial involvement, participation of all job holders and simplified reporting methods.

  2. Needs Analysis and Course Design; A Framework for Designing Exam Courses

    Directory of Open Access Journals (Sweden)

    Reza Eshtehardi

    2017-09-01

    This paper introduces a framework for designing exam courses and highlights the importance of needs analysis in designing exam courses. The main objectives of this paper are to highlight the key role of needs analysis in designing exam courses, to offer a framework for designing exam courses, to show the language needs of different students for the IELTS (International English Language Testing System) exam, to offer an analysis of those needs and to explain how they will be taken into account in the design of the course. First, I will concentrate on some distinguishing features of exam classes, which make them different from general English classes. Secondly, I will introduce a framework for needs analysis and diagnostic testing and highlight the importance of needs analysis for the design of syllabi and language courses. Thirdly, I will describe significant features of syllabus design, course assessment, and evaluation procedures.

  3. Conceptual Framework for Gentrification Analysis of Iskandar Malaysia

    Directory of Open Access Journals (Sweden)

    Rabiyatul Adawiyah Abd Khalil

    2015-05-01

    Gentrification is generally defined as the transformation of a working-class population living in the central city into a middle-upper-class society. It has both positive and negative consequences. Gentrification causes the loss of affordable homes; however, it is also beneficial because it rejuvenates the tax base and stimulates mixed incomes. The question arises whether the characteristics of gentrification in developing countries will appear to be the same as, or different from, those in developed countries. Because of this research growth, a review of the body of literature related to the mutation of gentrification, i.e. the types of gentrification and their characteristics, is believed necessary. This will serve as the basis for a conceptual framework to analyze what is happening in Iskandar Malaysia (IM). As a globalized urbanization area, IM offers a particularly interesting case, as there are already signs of gentrification due to its rapid urbanization. In the residential market, house prices in IM show rapid and continuous increments. Many foreigners are attracted to the new residential areas in IM being promoted as exclusive while promising a quality lifestyle. The locals meanwhile face difficulties in owning a home because of the upward spiral of house prices. In certain areas, the local low-income people are displaced by middle- and upper-income groups. The identification of such characteristics and the associated attributes, which is the second phase of the study, will determine to what extent IM is in the process of gentrification. The paper finally concludes that the signs of gentrification in IM are similar to those in other developing countries.

  4. EU Science Diplomacy and Framework Programs as Instruments of STI Cooperation

    Directory of Open Access Journals (Sweden)

    K. A. Ibragimova

    2017-01-01

    This article examines the tools that the EU uses in its interactions with third countries in the field of STI. The EU is a pioneer in the use of science and technology in the international arena, in the creation of strategic bilateral agreements on science and technology, and in the conduct of political dialogues at the highest political level (at the country and regional levels). The EU actively uses its foreign-policy instruments of influence, including the provision of access to its framework programs to researchers from third countries, as well as scientific diplomacy. The success of these programs and of scientific diplomacy shows the effectiveness of the EU as a global actor. In its foreign-policy global innovation strategy, the EU proceeds from the premise that no state in the world today can cope independently with modern global challenges such as climate change, migration, terrorism, etc. Therefore, the solution of these issues requires both expert evaluation from an independent world scientific community and the perseverance of diplomats and officials of the branch ministries of national states, capable of conveying the views of their government in international negotiations and defending the national interests of the country in order to find a solution that suits everyone. The EU has the resources to create a "cumulative effect" by developing and applying common norms on the territory of the Union, analyzing the innovation policies of member states and the possibility of sharing best practices. At the same time, the EU shares its vision of problems, values and priorities with partners and uses the tools of "soft power" (including its smart and normative power) and scientific diplomacy in the field of STI. The soft power of the EU in the field of STI lies in the attractiveness of the EU as a research area in which it is possible to conduct modern high-quality international research with the involvement of scientific teams from different countries in both physical

  5. The social impacts of dams: A new framework for scholarly analysis

    International Nuclear Information System (INIS)

    Kirchherr, Julian; Charles, Katrina J.

    2016-01-01

    No commonly used framework exists in the scholarly study of the social impacts of dams. This hinders comparisons of analyses and thus the accumulation of knowledge. The aim of this paper is to unify scholarly understanding of dams' social impacts via the analysis and aggregation of the various frameworks currently used in the scholarly literature. For this purpose, we have systematically analyzed and aggregated 27 frameworks employed by academics analyzing dams' social impacts (found in a set of 217 articles). A key finding of the analysis is that currently used frameworks are often not specific to dams and thus omit key impacts associated with them. The result of our analysis and aggregation is a new framework for scholarly analysis (which we call ‘matrix framework’) specifically on dams' social impacts, with space, time and value as its key dimensions as well as infrastructure, community and livelihood as its key components. Building on the scholarly understanding of this topic enables us to conceptualize the inherently complex and multidimensional issues of dams' social impacts in a holistic manner. If commonly employed in academia (and possibly in practice), this framework would enable more transparent assessment and comparison of projects.

  6. The social impacts of dams: A new framework for scholarly analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kirchherr, Julian, E-mail: julian.kirchherr@sant.ox.ac.uk; Charles, Katrina J., E-mail: katrina.charles@ouce.ox.ac.uk

    2016-09-15

    No commonly used framework exists in the scholarly study of the social impacts of dams. This hinders comparisons of analyses and thus the accumulation of knowledge. The aim of this paper is to unify scholarly understanding of dams' social impacts via the analysis and aggregation of the various frameworks currently used in the scholarly literature. For this purpose, we have systematically analyzed and aggregated 27 frameworks employed by academics analyzing dams' social impacts (found in a set of 217 articles). A key finding of the analysis is that currently used frameworks are often not specific to dams and thus omit key impacts associated with them. The result of our analysis and aggregation is a new framework for scholarly analysis (which we call ‘matrix framework’) specifically on dams' social impacts, with space, time and value as its key dimensions as well as infrastructure, community and livelihood as its key components. Building on the scholarly understanding of this topic enables us to conceptualize the inherently complex and multidimensional issues of dams' social impacts in a holistic manner. If commonly employed in academia (and possibly in practice), this framework would enable more transparent assessment and comparison of projects.

  7. A framework for biodynamic feedthrough analysis--part I: theoretical foundations.

    Science.gov (United States)

    Venrooij, Joost; van Paassen, Marinus M; Mulder, Mark; Abbink, David A; Mulder, Max; van der Helm, Frans C T; Bulthoff, Heinrich H

    2014-09-01

    Biodynamic feedthrough (BDFT) is a complex phenomenon, which has been studied for several decades. However, there is little consensus on how to approach the BDFT problem in terms of definitions, nomenclature, and mathematical descriptions. In this paper, a framework for biodynamic feedthrough analysis is presented. The goal of this framework is two-fold. First, it provides some common ground between the seemingly large range of different approaches existing in the BDFT literature. Second, the framework itself allows for gaining new insights into BDFT phenomena. It will be shown how relevant signals can be obtained from measurement, how different BDFT dynamics can be derived from them, and how these different dynamics are related. Using the framework, BDFT can be dissected into several dynamical relationships, each relevant in understanding BDFT phenomena in more detail. The presentation of the BDFT framework is divided into two parts. This paper, Part I, addresses the theoretical foundations of the framework. Part II, which is also published in this issue, addresses the validation of the framework. The work is presented in two separate papers to allow for a detailed discussion of both the framework's theoretical background and its validation.

  8. Continuous quality improvement in a Maltese hospital using logical framework analysis.

    Science.gov (United States)

    Buttigieg, Sandra C; Gauci, Dorothy; Dey, Prasanta

    2016-10-10

    Purpose The purpose of this paper is to present the application of logical framework analysis (LFA) for implementing continuous quality improvement (CQI) across multiple settings in a tertiary care hospital. Design/methodology/approach This study adopts a multiple case study approach. LFA is implemented within three diverse settings, namely, an intensive care unit, a surgical ward, and an acute in-patient psychiatric ward. First, problem trees are developed in order to determine the root causes of quality issues specific to the three settings. Second, objective trees are formed suggesting solutions to the quality issues. Third, a project plan template using the logical framework (LOGFRAME) is created for each setting. Findings This study shows substantial improvement in quality across the three settings. LFA proved effective in analysing quality issues and suggesting improvement measures objectively. Research limitations/implications This paper applies LFA in specific, albeit diverse, settings in one hospital. For validation purposes, it would be ideal to analyse other settings within the same hospital, as well as in several hospitals. It also adopts a bottom-up approach, which could be triangulated with other sources of data. Practical implications LFA enables top management to obtain an integrated view of performance. It also provides a basis for further quantitative research on quality management through the identification of key performance indicators, and facilitates the development of a business case for improvement. Originality/value LFA is a novel approach for the implementation of CQI programs. Although LFA has been used extensively for project development to source funds from development banks, its application in quality improvement within healthcare projects is scant.

  9. A framework for smartphone-enabled, patient-generated health data analysis

    Directory of Open Access Journals (Sweden)

    Shreya S. Gollamudi

    2016-08-01

    Full Text Available Background: Digital medicine and smartphone-enabled health technologies provide a novel source of human health and human biology data. However, in part due to its intricacies, few methods have been established to analyze and interpret data in this domain. We previously conducted a six-month interventional trial examining the efficacy of a comprehensive smartphone-based health monitoring program for individuals with chronic disease. This included 38 individuals with hypertension who recorded 6,290 blood pressure readings over the trial. Methods: In the present study, we provide a hypothesis testing framework for unstructured time series data, typical of patient-generated mobile device data. We used a mixed model approach for unequally spaced repeated measures using autoregressive and generalized autoregressive models, and applied this to the blood pressure data generated in this trial. Results: We were able to detect, roughly, a 2 mmHg decrease in both systolic and diastolic blood pressure over the course of the trial despite considerable intra- and inter-individual variation. Furthermore, by supplementing this finding by using a sequential analysis approach, we observed this result over three months prior to the official study end—highlighting the effectiveness of leveraging the digital nature of this data source to form timely conclusions. Conclusions: Health data generated through the use of smartphones and other mobile devices allow individuals the opportunity to make informed health decisions, and provide researchers the opportunity to address innovative health and biology questions. The hypothesis testing framework we present can be applied in future studies utilizing digital medicine technology or implemented in the technology itself to support the quantified self.
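As a rough, self-contained illustration of the kind of analysis described above, the sketch below simulates unequally spaced blood pressure readings for 38 hypothetical subjects and averages per-subject least-squares trends. This is a deliberate simplification of the study's autoregressive mixed-model approach, and all data and numbers are synthetic, not from the trial:

```python
import numpy as np

rng = np.random.default_rng(0)

def per_subject_slopes(n_subjects=38, true_decline=-2.0, days=180.0):
    """Simulate unequally spaced systolic BP readings per subject and
    estimate each subject's linear trend (mmHg change over the trial)."""
    slopes = []
    for _ in range(n_subjects):
        n_obs = rng.integers(50, 200)              # uneven sampling density
        t = np.sort(rng.uniform(0, days, n_obs))   # unequally spaced times
        subject_mean = rng.normal(130, 10)         # inter-individual variation
        bp = subject_mean + true_decline * t / days + rng.normal(0, 8, n_obs)
        slope_per_day = np.polyfit(t, bp, 1)[0]    # least-squares trend
        slopes.append(slope_per_day * days)        # change over full trial
    return np.array(slopes)

trial_effect = per_subject_slopes().mean()
print(round(trial_effect, 1))  # close to the simulated -2 mmHg decline
```

With enough readings per subject, the averaged trend recovers the simulated decline of roughly 2 mmHg despite the intra- and inter-individual noise, mirroring the size of the effect the study reports.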

  10. The 7th Framework Program of the EU; 7 Programa Marco de I+D de la Union Europea

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, E. M.; Serrano, J. A.

    2007-07-01

    The Framework Program is the principal community initiative for fostering and supporting R and D in the European Union. Its main goal is to improve competitiveness, fundamentally by financing research, technological development, demonstration and innovation activities through transnational collaboration between research institutes and firms belonging both to European Union countries and to states affiliated as third countries. In addition, it provides financial support for the enhancement and coordination of European research infrastructures, the promotion and training of research personnel, basic research and, particularly as of the current 7th Framework Program, the coordination of national R and D programs and the implementation of European technology platforms (PTEs), which have been conceived to promote strategic research agendas in key sectors with the cooperation of all the involved players. In the wake of the PTEs, counterpart platforms have been implemented at the national level and are very active in different sectors. (Authors)

  11. A FRAMEWORK ANALYSIS OF EUROPEAN LABOUR MARKET POLICIES

    Directory of Open Access Journals (Sweden)

    Graţiela Georgiana Carica

    2011-03-01

    Full Text Available The purpose of the paper is to analyse European labour market policies and their integrated guidelines, highlighting the various measures that need to be adopted in order to increase labour productivity, with positive effects on long-term economic development. The paper systematizes the main conditions that structural reforms must satisfy in order to encourage employment, as well as the policies that frame a more efficient unemployment insurance system, one crucial to increasing security while encouraging the unemployed to look for and accept job offers, i.e. flexicurity policies. We found that high employment rates are generally associated with large expenditures on labour market policies and with an increased number of participants in the programs developed within these policies. The degree of influence and the strong dependence between outcomes and labour market policies are illustrated in various ways and discussed within the paper.

  12. A framework for the nationwide multimode transportation demand analysis.

    Science.gov (United States)

    2010-09-01

    This study attempts to analyze the impact of traffic on the US highway system considering both passenger vehicles and trucks. For the analysis, a pseudo-dynamic traffic assignment model is proposed to estimate the time-dependent link flow from th...

  13. Value Chain Analysis: A Framework for Management of Distance Education.

    Science.gov (United States)

    Woudstra, Andrew; Powell, Richard

    1989-01-01

    Discussion of the benefits of value chain analysis in the management of distance education organizations focuses on an example at Athabasca University. The effects of policies and decisions on the organization and its value system are considered, cost drivers for activities are described, and a future-oriented perspective is emphasized. (14…

  14. Muon g-2 Reconstruction and Analysis Framework for the Muon Anomalous Precession Frequency

    Energy Technology Data Exchange (ETDEWEB)

    Khaw, Kim Siang [Washington U., Seattle

    2017-10-21

    The Muon g-2 experiment at Fermilab, with the aim to measure the muon anomalous magnetic moment to an unprecedented level of 140 ppb, started beam and detector commissioning in Summer 2017. To deal with incoming data projected to be around tens of petabytes, a robust data reconstruction and analysis chain based on Fermilab's art event-processing framework has been developed. Herein, I report the current status of the framework, together with its novel features such as multi-threaded algorithms for the online data quality monitor (DQM) and fast-turnaround operation (nearline). Performance of the framework during the commissioning run is also discussed.

  15. A Conceptual Framework for the Analysis of Risk and Problem Behaviors: The Case of Adolescent Sexual Behavior

    Science.gov (United States)

    Guilamo-Ramos, Vincent; Jaccard, James; Dittus, Patricia; Gonzalez, Bernardo; Bouris, Alida

    2008-01-01

    A framework for the analysis of adolescent problem behaviors was explicated that draws on five major theories of human behavior. The framework emphasizes intentions to perform behaviors and factors that influence intentions as well as moderate the impact of intentions on behavior. The framework was applied to the analysis of adolescent sexual risk…

  16. A framework for product analysis: Modelling and design of release and uptake of pesticides

    DEFF Research Database (Denmark)

    Muro Sunè, Nuria; Munir, Ahsan; Gani, Rafiqul

    2005-01-01

    This paper presents a framework for chemical product (pesticide) design and analysis. The framework consists of a set of computer-aided methods and tools that have been integrated to tackle the needs with respect to solution of chemical product design problems related to pesticide formulations. Two of the mathematical models (controlled release and pesticide uptake) that provide the principal calculation options are highlighted, together with selected results from case studies.

  17. The Australian Health Informatics Competencies Framework and Its Role in the Certified Health Informatician Australasia (CHIA) Program.

    Science.gov (United States)

    Martin-Sanchez, Fernando; Rowlands, David; Schaper, Louise; Hansen, David

    2017-01-01

    The Certified Health Informatician Australasia (CHIA) program consists of an online exam, which aims to test whether a candidate has the knowledge and skills that are identified in the competencies framework to perform as a health informatics professional. The CHIA Health Informatics Competencies Framework provides the context in which the questions for the exam have been developed. The core competencies for health informatics that are tested in the exam have been developed with reference to similar programs by the American Medical Informatics Association, the International Medical Informatics Association and COACH, Canada's Health Informatics Association, and build on the previous work done by the Australian Health Informatics Education Council. This paper shows how the development of this competency framework is helping to raise the profile of health informaticians in Australasia, contributing to a wider recognition of the profession, and defining more clearly the body of knowledge underpinning this discipline. This framework can also be used as a set of guidelines for recruiting purposes, definitions of career pathways, or the design of educational and training activities. We discuss here the current status of the program, its results and prospects for the future.

  18. Network analysis: An innovative framework for understanding eating disorder psychopathology.

    Science.gov (United States)

    Smith, Kathryn E; Crosby, Ross D; Wonderlich, Stephen A; Forbush, Kelsie T; Mason, Tyler B; Moessner, Markus

    2018-03-01

    Network theory and analysis is an emerging approach in psychopathology research that has received increasing attention across fields of study. In contrast to medical models or latent variable approaches, network theory suggests that psychiatric syndromes result from systems of causal and reciprocal symptom relationships. Despite the promise of this approach to elucidate key mechanisms contributing to the development and maintenance of eating disorders (EDs), thus far, few applications of network analysis have been tested in ED samples. We first present an overview of network theory, review the existing findings in the ED literature, and discuss the limitations of this literature to date. In particular, the reliance on cross-sectional designs, use of single-item self-reports of symptoms, and instability of results have raised concern about the inferences that can be made from network analyses. We outline several areas to address in future ED network analytic research, which include the use of prospective designs and adoption of multimodal assessment methods. Doing so will provide a clearer understanding of whether network analysis can enhance our current understanding of ED psychopathology and inform clinical interventions. © 2018 Wiley Periodicals, Inc.
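The core computation behind many symptom-network papers is a partial-correlation (Gaussian graphical) model, in which edges come from the scaled inverse covariance matrix. The sketch below applies this to synthetic data; the symptom names and the causal chain are invented for illustration and are not taken from the ED literature:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic cross-sectional data: 500 respondents x 4 symptom scores.
# 'restraint' drives 'binge', 'binge' drives 'distress'; 'body_check' is noise.
n = 500
restraint = rng.normal(size=n)
binge = 0.6 * restraint + rng.normal(scale=0.8, size=n)
distress = 0.5 * binge + rng.normal(scale=0.8, size=n)
body_check = rng.normal(size=n)
X = np.column_stack([restraint, binge, distress, body_check])

# Partial correlations: scale the inverse covariance (precision) matrix.
prec = np.linalg.inv(np.cov(X, rowvar=False))
d = np.sqrt(np.diag(prec))
partial = -prec / np.outer(d, d)
np.fill_diagonal(partial, 1.0)
print(np.round(partial, 2))
```

Here the restraint-distress edge nearly vanishes once binge eating is conditioned on, which is exactly the kind of conditional-independence structure network analysis is meant to expose, and which, per the limitations above, cross-sectional estimates may still render unstably.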

  19. Ovis: A framework for visual analysis of ocean forecast ensembles

    KAUST Repository

    Hollt, Thomas

    2014-08-01

    We present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations of the sea surface height that is used in ocean forecasting. The position of eddies can be derived directly from the sea surface height and our visualization approach enables their interactive exploration and analysis. The behavior of eddies is important in different application settings, of which we present two in this paper. First, we show an application for interactive planning of placement as well as operation of off-shore structures using real-world ensemble simulation data of the Gulf of Mexico. Off-shore structures, such as those used for oil exploration, are vulnerable to hazards caused by eddies, and the oil and gas industry relies on ocean forecasts for efficient operations. We enable analysis of the spatial domain, as well as the temporal evolution, for planning the placement and operation of structures. Eddies are also important for marine life. They transport water over large distances and with it also heat and other physical properties as well as biological organisms. In the second application we present the usefulness of our tool, which could be used for planning the paths of autonomous underwater vehicles, so called gliders, for marine scientists to study simulation data of the largely unexplored Red Sea. © 1995-2012 IEEE.

  20. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  1. A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures

    Science.gov (United States)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2012-01-01

    A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.
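As a minimal illustration of the homogenization step (relating constituent properties to an effective structural-scale property), the sketch below computes the classical Voigt and Reuss bounds for a two-phase composite. This is textbook micromechanics, not the ImMAC implementation, and the moduli are only indicative values for a SiC/Ti system:

```python
def voigt_reuss_bounds(E_fiber, E_matrix, v_fiber):
    """Upper (Voigt, iso-strain) and lower (Reuss, iso-stress) bounds
    on the effective Young's modulus of a two-phase composite."""
    v_matrix = 1.0 - v_fiber
    E_voigt = v_fiber * E_fiber + v_matrix * E_matrix           # parallel
    E_reuss = 1.0 / (v_fiber / E_fiber + v_matrix / E_matrix)   # series
    return E_voigt, E_reuss

# SiC fiber in a Ti matrix, 35% fiber volume fraction (illustrative, GPa).
upper, lower = voigt_reuss_bounds(E_fiber=400.0, E_matrix=110.0, v_fiber=0.35)
print(round(upper, 1), round(lower, 1))  # → 211.5 147.4
```

Micromechanics models such as those embedded in the framework predict effective properties between these two bounds and additionally track local nonlinearity and failure, which simple bounds cannot.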

  2. The new BNL partial wave analysis programs

    International Nuclear Information System (INIS)

    Cummings, J.P.; Weygand, D.P.

    1997-01-01

    Experiment E852 at Brookhaven National Laboratory is a meson spectroscopy experiment which took data at the Multi-Particle Spectrometer facility of the Alternating Gradient Synchrotron. Upgrades to the spectrometer's data acquisition and trigger electronics allowed over 900 million data events, of numerous topologies, to be recorded to tape in the 1995 running alone. One of the primary goals of E852 is the identification of states beyond the quark model, i.e., states with gluonic degrees of freedom. Identification of such states involves the measurement of a system's spin-parity. Such a measurement is usually done using Partial Wave Analysis. Programs to perform such analyses exist; in fact, one was written at BNL and used in previous experiments by some of this group. That program, however, was optimized for a particular final state, and modification to allow analysis of the broad range of final states in E852 would have been difficult. The authors therefore decided to write a new program, with an eye towards a generality that would allow analysis of a large class of reactions.

  3. Use of probabilistic methods for analysis of cost and duration uncertainties in a decision analysis framework

    International Nuclear Information System (INIS)

    Boak, D.M.; Painton, L.

    1995-01-01

    Probabilistic forecasting techniques have been used in many risk assessment and performance assessment applications on radioactive waste disposal projects such as Yucca Mountain and the Waste Isolation Pilot Plant (WIPP). Probabilistic techniques such as Monte Carlo and Latin Hypercube sampling methods are routinely used to treat uncertainties in physical parameters important in simulating radionuclide transport in a coupled geohydrologic system and assessing the ability of that system to comply with regulatory release limits. However, the use of probabilistic techniques in the treatment of uncertainties in the cost and duration of programmatic alternatives on risk and performance assessment projects is less common. Where significant uncertainties exist and where programmatic decisions must be made despite existing uncertainties, probabilistic techniques may yield important insights into decision options, especially when used in a decision analysis framework and when properly balanced with deterministic analyses. For relatively simple evaluations, these types of probabilistic evaluations can be made using personal computer-based software
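A minimal sketch of the two sampling techniques named above, applied to a hypothetical programmatic alternative with uncertain cost and schedule inputs (all distributions and numbers are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube_uniform(n, low, high):
    """1-D Latin Hypercube sample of a uniform variable: one draw per stratum."""
    strata = (np.arange(n) + rng.random(n)) / n      # one point per 1/n bin
    return low + rng.permutation(strata) * (high - low)

n = 10_000
# Hypothetical alternative: uncertain costs (in $M) and a licensing delay
# (in months); distributions are illustrative only.
design_cost = latin_hypercube_uniform(n, 2.0, 5.0)
construct_cost = rng.triangular(8.0, 10.0, 15.0, n)          # Monte Carlo draw
license_delay = rng.lognormal(mean=1.0, sigma=0.5, size=n)   # months

total_cost = design_cost + construct_cost
print("mean cost  :", round(total_cost.mean(), 2))
print("90th pct   :", round(np.percentile(total_cost, 90), 2))
print("P(delay>6m):", round((license_delay > 6.0).mean(), 3))
```

Latin Hypercube sampling stratifies each input so that far fewer samples are needed than with plain Monte Carlo to stabilize the distribution tails that drive decision-analysis comparisons.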

  4. Proposed framework for the Western Area Power Administration Environmental Risk Management Program

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, C.S.; DiMassa, F.V.; Pelto, P.J.; Brothers, A.J. [Pacific Northwest Lab., Richland, WA (United States); Roybal, A.L. [Western Area Power Administration, Golden, CO (United States)

    1994-12-01

    The Western Area Power Administration (Western) views environmental protection and compliance as a top priority as it manages the construction, operation, and maintenance of its vast network of transmission lines, substations, and other facilities. A recent Department of Energy audit of Western's environmental management activities recommends that Western adopt a formal environmental risk program. To accomplish this goal, Western, in conjunction with Pacific Northwest Laboratory, is in the process of developing a centrally coordinated environmental risk program. This report presents the results of this design effort, and indicates the direction in which Western's environmental risk program is heading. Western's environmental risk program will consist of three main components: risk communication, risk assessment, and risk management/decision making. Risk communication is defined as an exchange of information on the potential for threats to human health, public safety, or the environment. This information exchange provides a mechanism for public involvement, and also for participation in the risk assessment and management process by diverse groups or offices within Western. The objective of risk assessment is to evaluate and rank the relative magnitude of risks associated with specific environmental issues that are facing Western. The evaluation and ranking is based on the best available scientific information and judgment and serves as input to the risk management process. Risk management takes risk information and combines it with relevant non-risk factors (e.g., legal mandates, public opinion, costs) to generate risk management options. A risk management tool, such as decision analysis, can be used to help make risk management choices.

  5. Leveraging Data Analysis for Domain Experts: An Embeddable Framework for Basic Data Science Tasks

    Science.gov (United States)

    Lohrer, Johannes-Y.; Kaltenthaler, Daniel; Kröger, Peer

    2016-01-01

    In this paper, we describe a framework for data analysis that can be embedded into a base application. Since it is important to analyze the data directly inside the application where the data is entered, a tool that allows the scientists to easily work with their data, supports and motivates the execution of further analysis of their data, which…

  6. Hanford Site Composite Analysis Technical Approach Description: Integrated Computational Framework.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, K. J. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-09-14

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.

  7. 77 FR 18963 - Energy Conservation Program: Public Meeting and Availability of the Framework Document for High...

    Science.gov (United States)

    2012-03-29

    ... of the Framework Document for High-Intensity Discharge Lamps AGENCY: Office of Energy Efficiency and Renewable Energy, Department of Energy. ACTION: Extension of public comment period. SUMMARY: This document... Availability of Framework Document Regarding Energy Conservation Standards for High-Intensity Discharge (HID...

  8. A semidefinite programming based branch-and-bound framework for the quadratic assignment problem

    NARCIS (Netherlands)

    Truetsch, U.

    2014-01-01

    The practical approach to calculate an exact solution for a quadratic assignment problem (QAP) via a branch-and-bound framework depends strongly on a "smart" choice of different strategies within the framework, for example the branching strategy, heuristics for the upper bound or relaxations for the

  9. Using a Mixed-Methods RE-AIM Framework to Evaluate Community Health Programs for Older Latinas.

    Science.gov (United States)

    Schwingel, Andiara; Gálvez, Patricia; Linares, Deborah; Sebastião, Emerson

    2017-06-01

    This study used the RE-AIM (Reach, Effectiveness, Adoption, Implementation, and Maintenance) framework to evaluate a promotora-led community health program designed for Latinas ages 50 and older that sought to improve physical activity, nutrition, and stress management. A mixed-methods evaluation approach was administered at participant and organizational levels with a focus on the efficacy, adoption, implementation, and maintenance components of the RE-AIM theoretical model. The program was shown to be effective at improving participants' eating behaviors, increasing their physical activity levels, and lowering their depressive symptoms. Promotoras felt motivated and sufficiently prepared to deliver the program. Some implementation challenges were reported. More child care opportunities and an increased focus on mental well-being were suggested. The promotora delivery model has promise for program sustainability with both promotoras and participants alike expressing interest in leading future programs.

  10. Environmental Education Organizations and Programs in Texas: Identifying Patterns through a Database and Survey Approach for Establishing Frameworks for Assessment and Progress

    Science.gov (United States)

    Lloyd-Strovas, Jenny D.; Arsuffi, Thomas L.

    2016-01-01

    We examined the diversity of environmental education (EE) in Texas, USA, by developing a framework to assess EE organizations and programs at a large scale: the Environmental Education Database of Organizations and Programs (EEDOP). This framework consisted of the following characteristics: organization/visitor demographics, pedagogy/curriculum,…

  11. Programmed Effects in Neurobehavior and Antioxidative Physiology in Zebrafish Embryonically Exposed to Cadmium: Observations and Hypothesized Adverse Outcome Pathway Framework

    Directory of Open Access Journals (Sweden)

    Sander Ruiter

    2016-11-01

    Full Text Available Non-communicable diseases (NCDs) are a major cause of premature mortality. Recent studies show that predispositions for NCDs may arise from early-life exposure to low concentrations of environmental contaminants. This developmental origins of health and disease (DOHaD) paradigm suggests that programming of an embryo can be disrupted, changing the homeostatic set point of biological functions. Epigenetic alterations are a possible underlying mechanism. Here, we investigated the DOHaD paradigm by exposing zebrafish to subtoxic concentrations of the ubiquitous contaminant cadmium during embryogenesis, followed by growth under normal conditions. Prolonged behavioral responses to physical stress and altered antioxidative physiology were observed approximately ten weeks after termination of embryonal exposure, at concentrations that were 50–3200-fold below the direct embryotoxic concentration, and interpreted as altered developmental programming. Literature was explored for possible mechanistic pathways that link embryonic subtoxic cadmium to the observed apical phenotypes, more specifically, the probability of molecular mechanisms induced by cadmium exposure leading to altered DNA methylation and subsequently to the observed apical phenotypes. This was done using the adverse outcome pathway model framework, and assessing key event relationship plausibility by tailored Bradford-Hill analysis. Thus, cadmium interaction with thiols appeared to be the major contributor to late-life effects. Cadmium-thiol interactions may lead to depletion of the methyl donor S-adenosyl-methionine, resulting in methylome alterations, and may, additionally, result in oxidative stress, which may lead to DNA oxidation, and subsequently altered DNA methyltransferase activity. In this way, DNA methylation may be affected at a critical developmental stage, causing the observed apical phenotypes.

  12. A Framework for RFID Survivability Requirement Analysis and Specification

    Science.gov (United States)

    Zuo, Yanjun; Pimple, Malvika; Lande, Suhas

    Many industries are becoming dependent on Radio Frequency Identification (RFID) technology for inventory management and asset tracking. The data collected about tagged objects through RFID is used in various high-level business operations. The RFID system should hence be highly available, reliable, dependable, and secure. In addition, this system should be able to resist attacks and perform recovery in case of security incidents. Together, these requirements give rise to the notion of a survivable RFID system. The main goal of this paper is to analyze and specify the requirements for an RFID system to become survivable. These requirements, if utilized, can assist the system in resisting devastating attacks and recovering quickly from damage. This paper proposes techniques and approaches for RFID survivability requirements analysis and specification. From the perspective of system acquisition and engineering, the survivability requirement is the important first step in survivability specification, compliance formulation, and proof verification.

  13. Reliability analysis framework for computer-assisted medical decision systems

    International Nuclear Information System (INIS)

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-01-01

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
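
    The neighborhood-based reliability idea can be sketched in a few lines: pick the k known cases closest to the query in feature space, then report the CAD model's accuracy on just those cases as the reliability of its answer. This is a minimal illustration of the general scheme, not the authors' implementation; the toy threshold classifier and all names here are ours.

    ```python
    import numpy as np

    def local_reliability(query, known_X, known_y, predict, k=15):
        """Estimate query-specific reliability as the classifier's accuracy
        on the k known cases nearest to the query in feature space."""
        d = np.linalg.norm(known_X - query, axis=1)       # Euclidean distances
        nearest = np.argsort(d)[:k]                       # indices of k nearest cases
        preds = predict(known_X[nearest])                 # CAD decisions on those cases
        return float(np.mean(preds == known_y[nearest]))  # local accuracy = reliability

    # Toy demo: a threshold "classifier" on a one-feature problem.
    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 1.0, size=(200, 1))
    y = (X[:, 0] > 0.5).astype(int)
    clf = lambda A: (A[:, 0] > 0.5).astype(int)           # perfect on this toy data

    r = local_reliability(np.array([0.9]), X, y, clf, k=10)
    ```

    On this noiseless toy data the classifier is locally perfect everywhere, so the reliability is 1.0; with label noise near the decision boundary, queries close to the boundary would receive lower reliability scores, which is exactly the alerting behavior described above.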

  14. A Generalized Framework for Non-Stationary Extreme Value Analysis

    Science.gov (United States)

    Ragno, E.; Cheng, L.; Sadegh, M.; AghaKouchak, A.

    2017-12-01

    Empirical trends in climate variables including precipitation, temperature, snow-water equivalent at regional to continental scales are evidence of changes in climate over time. The evolving climate conditions and human activity-related factors such as urbanization and population growth can exert further changes in weather and climate extremes. As a result, the scientific community faces an increasing demand for updated appraisal of the time-varying climate extremes. The purpose of this study is to offer a robust and flexible statistical tool for non-stationary extreme value analysis which can better characterize the severity and likelihood of extreme climatic variables. This is critical to ensure a more resilient environment in a changing climate. Following the positive feedback on the first version of the Non-Stationary Extreme Value Analysis (NEVA) Toolbox by Cheng et al. (2014), we present an improved version, i.e. NEVA2.0. The upgraded version builds upon a newly developed hybrid evolution Markov Chain Monte Carlo (MCMC) approach for numerical parameter estimation and uncertainty assessment. This addition leads to more robust uncertainty estimates of return levels, return periods, and risks of climatic extremes under both stationary and non-stationary assumptions. Moreover, NEVA2.0 is flexible in incorporating any user-specified covariate other than the default time covariate (e.g., CO2 emissions, large-scale climatic oscillation patterns). The new feature will allow users to examine non-stationarity of extremes induced by physical conditions that underlie the extreme events (e.g. antecedent soil moisture deficit, large-scale climatic teleconnections, urbanization). In addition, the new version offers an option to generate stationary and/or non-stationary rainfall Intensity - Duration - Frequency (IDF) curves that are widely used for risk assessment and infrastructure design. Finally, a Graphical User Interface (GUI) of the package is provided, making NEVA
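
    The core statistical idea, a GEV distribution whose location parameter varies with a covariate, can be sketched with a plain maximum-likelihood fit. NEVA2.0 itself uses a hybrid-evolution MCMC sampler for parameter and uncertainty estimation; this sketch and its synthetic data are only illustrative.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import genextreme

    # Synthetic annual maxima with a linear trend in the location parameter.
    rng = np.random.default_rng(1)
    t = np.arange(150.0)                                   # covariate (e.g. year index)
    x = genextreme.rvs(c=-0.1, loc=20.0 + 0.08 * t, scale=2.0, random_state=rng)

    def nll(theta):
        """Negative log-likelihood of a GEV with location mu(t) = mu0 + mu1*t."""
        mu0, mu1, log_sigma, c = theta
        return -genextreme.logpdf(x, c, loc=mu0 + mu1 * t,
                                  scale=np.exp(log_sigma)).sum()

    # Initialize the trend term from an ordinary least-squares fit.
    slope, intercept = np.polyfit(t, x, 1)
    res = minimize(nll, x0=[intercept, slope, np.log(x.std()), 0.0],
                   method="Nelder-Mead")
    mu0_hat, mu1_hat = res.x[0], res.x[1]
    ```

    Comparing this model's likelihood against a stationary fit (mu1 fixed at 0) gives a simple test for non-stationarity, and replacing t with CO2 concentration or a climate index reproduces the user-specified covariate flexibility described above.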

  15. Big data analysis framework for healthcare and social sectors in Korea.

    Science.gov (United States)

    Song, Tae-Min; Ryu, Seewon

    2015-01-01

    We reviewed applications of big data analysis of healthcare and social services in developed countries, and subsequently devised a framework for such an analysis in Korea. We reviewed the status of implementing big data analysis of health care and social services in developed countries, and strategies used by the Ministry of Health and Welfare of Korea (Government 3.0). We formulated a conceptual framework of big data in the healthcare and social service sectors at the national level. As a specific case, we designed a process and method of social big data analysis on suicide buzz. Developed countries (e.g., the United States, the UK, Singapore, Australia, and even OECD and EU) are emphasizing the potential of big data, and using it as a tool to solve their long-standing problems. Big data strategies for the healthcare and social service sectors were formulated based on an ICT-based policy of current government and the strategic goals of the Ministry of Health and Welfare. We suggest a framework of big data analysis in the healthcare and welfare service sectors separately and assigned them tentative names: 'health risk analysis center' and 'integrated social welfare service network'. A framework of social big data analysis is presented by applying it to the prevention and proactive detection of suicide in Korea. There are some concerns with the utilization of big data in the healthcare and social welfare sectors. Thus, research on these issues must be conducted so that sophisticated and practical solutions can be reached.

  16. Hybrid Information Flow Analysis for Programs with Arrays

    Directory of Open Access Journals (Sweden)

    Gergö Barany

    2016-07-01

    Full Text Available Information flow analysis checks whether certain pieces of (confidential) data may affect the results of computations in unwanted ways and thus leak information. Dynamic information flow analysis adds instrumentation code to the target software to track flows at run time and raise alarms if a flow policy is violated; hybrid analyses combine this with preliminary static analysis. Using a subset of C as the target language, we extend previous work on hybrid information flow analysis that handled pointers to scalars. Our extended formulation handles arrays, pointers to array elements, and pointer arithmetic. Information flow through arrays of pointers is tracked precisely while arrays of non-pointer types are summarized efficiently. A prototype of our approach is implemented using the Frama-C program analysis and transformation framework. Work on a full machine-checked proof of the correctness of our approach using Isabelle/HOL is well underway; we present the existing parts and sketch the rest of the correctness argument.
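
    The array-summarization policy described above can be illustrated with a toy dynamic monitor (ours, not Frama-C's): scalar values carry their own taint bit, while an array of non-pointer elements is summarized by a single taint bit that is weakly updated on every store, trading precision for efficiency.

    ```python
    class Tainted:
        """A scalar value carrying a taint (information-flow) label."""
        def __init__(self, value, tainted=False):
            self.value, self.tainted = value, tainted

    def add(a, b):
        # Taint propagates through arithmetic: the result is tainted if any input is.
        return Tainted(a.value + b.value, a.tainted or b.tainted)

    class SummarizedArray:
        """Array of non-pointer elements: one taint bit summarizes the whole
        array (efficient but conservative), as in the hybrid analysis."""
        def __init__(self, values):
            self.values = list(values)
            self.tainted = False
        def store(self, i, v):
            self.values[i] = v.value
            self.tainted = self.tainted or v.tainted   # weak update: taint sticks
        def load(self, i):
            return Tainted(self.values[i], self.tainted)

    secret = Tainted(42, tainted=True)
    buf = SummarizedArray([0, 0, 0])
    buf.store(1, secret)          # the whole array becomes tainted (summary)
    leak = buf.load(0)            # even index 0 now reads as tainted
    ```

    Arrays of pointers, by contrast, would keep a taint label per element, which is the precise tracking the paper describes for that case.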

  17. Energy Analysis Program. 1992 Annual report

    Energy Technology Data Exchange (ETDEWEB)

    1993-06-01

    The Program became deeply involved in establishing a Washington, D.C., project office during the last few months of fiscal year 1992. This project office, which reports to the Energy & Environment Division, will receive the majority of its support from the Energy Analysis Program. We anticipate having two staff scientists and support personnel in offices within a few blocks of DOE. Our expectation is that this office will carry out a series of projects that are better managed closer to DOE. We also anticipate that our representation in Washington will improve, and we hope to expand the Program, its activities, and its impact in policy-relevant analyses. In spite of the growth that we have achieved, the Program continues to emphasize (1) energy efficiency of buildings, (2) appliance energy efficiency standards, (3) energy demand forecasting, (4) utility policy studies, especially integrated resource planning issues, and (5) international energy studies, with considerable emphasis on developing countries and economies in transition. These continuing interests are reflected in the articles that appear in this report.

  18. Insight into dynamic genome imaging: Canonical framework identification and high-throughput analysis.

    Science.gov (United States)

    Ronquist, Scott; Meixner, Walter; Rajapakse, Indika; Snyder, John

    2017-07-01

    The human genome is dynamic in structure, complicating researchers' attempts at fully understanding it. Time-series fluorescence in situ hybridization (FISH) imaging has increased our ability to observe genome structure, but due to cell-type and experimental variability these data are often noisy and difficult to analyze. Furthermore, computational analysis techniques are needed for homolog discrimination and canonical framework detection in the case of time-series images. In this paper we introduce novel ideas for nucleus imaging analysis, present findings extracted using dynamic genome imaging, and propose an objective algorithm for high-throughput, time-series FISH imaging. While a canonical framework could not be detected beyond statistical significance in the analyzed dataset, a mathematical framework for detection has been outlined with extension to 3D image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. A statistical framework for differential network analysis from microarray data

    Directory of Open Access Journals (Sweden)

    Datta Somnath

    2010-02-01

    Full Text Available Abstract Background It has long been known that genes do not act alone; rather, groups of genes act in concert during a biological process. Consequently, the expression levels of genes are dependent on each other. Experimental techniques to detect such interacting pairs of genes have been in place for quite some time. With the advent of microarray technology, newer computational techniques to detect such interaction or association between gene expressions are being proposed, which lead to an association network. While most microarray analyses look for genes that are differentially expressed, it is of potentially greater significance to identify how entire association network structures change between two or more biological settings, say normal versus diseased cell types. Results We provide a recipe for conducting a differential analysis of networks constructed from microarray data under two experimental settings. At the core of our approach lies a connectivity score that represents the strength of genetic association or interaction between two genes. We use this score to propose formal statistical tests for each of the following queries: (i) whether the overall modular structures of the two networks are different, (ii) whether the connectivity of a particular set of "interesting genes" has changed between the two networks, and (iii) whether the connectivity of a given single gene has changed between the two networks. A number of examples of this score are provided. We carried out our method on two types of simulated data: Gaussian networks and networks based on differential equations. We show that, for appropriate choices of the connectivity scores and tuning parameters, our method works well on simulated data. We also analyze a real data set involving normal versus heavy mice and identify an interesting set of genes that may play key roles in obesity. Conclusions Examining changes in network structure can provide valuable information about the
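
    A minimal version of such a differential network test, using absolute Pearson correlation as the connectivity score and a permutation test for the first query (whether the overall network structure differs), might look like the following. The score, the summary statistic, and the toy data are illustrative choices, not the paper's exact scores or tuning parameters.

    ```python
    import numpy as np

    def connectivity(X):
        """Connectivity scores: absolute pairwise correlations between genes
        (samples in rows, genes in columns)."""
        return np.abs(np.corrcoef(X, rowvar=False))

    def network_difference(X1, X2):
        """Overall dissimilarity between the two association networks."""
        d = connectivity(X1) - connectivity(X2)
        return np.sqrt((d ** 2).mean())

    def permutation_test(X1, X2, n_perm=200, seed=0):
        """Permute sample labels to build a null distribution for the difference."""
        rng = np.random.default_rng(seed)
        obs = network_difference(X1, X2)
        pooled = np.vstack([X1, X2])
        n1 = len(X1)
        null = []
        for _ in range(n_perm):
            idx = rng.permutation(len(pooled))
            null.append(network_difference(pooled[idx[:n1]], pooled[idx[n1:]]))
        return obs, float(np.mean(np.array(null) >= obs))  # permutation p-value

    # Toy data: genes 0 and 1 are strongly associated only in condition 2.
    rng = np.random.default_rng(1)
    X1 = rng.normal(size=(60, 4))
    X2 = rng.normal(size=(60, 4))
    X2[:, 1] = X2[:, 0] + 0.1 * rng.normal(size=60)
    obs, p = permutation_test(X1, X2)
    ```

    The same permutation scheme restricted to a gene subset (or a single gene's row of the connectivity matrix) yields tests for the second and third queries.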

  20. Informing the NCA: EPA's Climate Change Impact and Risk Analysis Framework

    Science.gov (United States)

    Sarofim, M. C.; Martinich, J.; Kolian, M.; Crimmins, A. R.

    2017-12-01

    The Climate Change Impact and Risk Analysis (CIRA) framework is designed to quantify the physical impacts and economic damages in the United States under future climate change scenarios. To date, the framework has been applied to 25 sectors, using scenarios and projections developed for the Fourth National Climate Assessment. The strength of this framework has been in the use of consistent climatic, socioeconomic, and technological assumptions and inputs across the impact sectors to maximize the ease of cross-sector comparison. The results of the underlying CIRA sectoral analyses are informing the sustained assessment process by helping to address key gaps related to economic valuation and risk. Advancing capacity and scientific literature in this area has created opportunity to consider future applications and strengthening of the framework. This presentation will describe the CIRA framework, present results for various sectors such as heat mortality, air & water quality, winter recreation, and sea level rise, and introduce potential enhancements that can improve the utility of the framework for decision analysis.

  1. Spatially-Distributed Cost-Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution.

    Science.gov (United States)

    Geng, Runzhe; Wang, Xiaoyan; Sharpley, Andrew N; Meng, Fande

    2015-01-01

    Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, quantifying the benefits of BMP implementation for receiving water quality at multiple spatial scales is an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., a phosphorus (P) index), model simulation techniques (Hydrological Simulation Program-FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreased P loss from the watershed. Remedial strategies where BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies where BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement in particular watersheds is the most cost-effective scenario to decrease P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales from field to farm, to watersheds, to regions. Further, model estimates showed targeting at multiple scales is necessary to optimize program efficiency
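
    One way to trace a cost-effectiveness curve of the first type (total pollution reduction against spending) is greedy targeting by reduction per unit cost. The data and names below are hypothetical, and this is only one of many possible allocation approaches, not the paper's full risk-index-plus-simulation model.

    ```python
    def cost_effectiveness_allocation(bmps, budget):
        """Greedy targeting: fund BMP placements in order of P-reduction per unit
        cost until the budget is exhausted; returns the funded placements and the
        total P reduction achieved."""
        ranked = sorted(bmps, key=lambda b: b["reduction"] / b["cost"], reverse=True)
        funded, spent, total = [], 0.0, 0.0
        for b in ranked:
            if spent + b["cost"] <= budget:
                funded.append(b["name"])
                spent += b["cost"]
                total += b["reduction"]
        return funded, total

    # Hypothetical candidate placements (cost and P-load reduction per field).
    bmps = [
        {"name": "buffer_strip_A", "cost": 10.0, "reduction": 8.0},   # high-risk field
        {"name": "cover_crop_B",   "cost": 5.0,  "reduction": 2.0},
        {"name": "no_till_C",      "cost": 20.0, "reduction": 10.0},
    ]
    funded, total = cost_effectiveness_allocation(bmps, budget=16.0)
    ```

    Re-running the function over a grid of budgets traces the curve, and shuffling the candidate list instead of sorting it gives the random-placement baseline that targeted strategies are compared against above.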

  2. Spatially-Distributed Cost-Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution.

    Directory of Open Access Journals (Sweden)

    Runzhe Geng

    Full Text Available Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, quantifying the benefits of BMP implementation for receiving water quality at multiple spatial scales is an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., a phosphorus (P) index), model simulation techniques (Hydrological Simulation Program-FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreased P loss from the watershed. Remedial strategies where BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies where BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement in particular watersheds is the most cost-effective scenario to decrease P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales from field to farm, to watersheds, to regions. Further, model estimates showed targeting at multiple scales is necessary to optimize program

  3. SAFE: A Sentiment Analysis Framework for E-Learning

    Directory of Open Access Journals (Sweden)

    Francesco Colace

    2014-12-01

    Full Text Available The spread of social networks allows sharing opinions on different aspects of life, and millions of messages appear daily on the web. This textual information can be a rich source of data for opinion mining and sentiment analysis: the computational study of opinions, sentiments and emotions expressed in text. Its main aim is the identification of agreement or disagreement statements that deal with positive or negative feelings in comments or reviews. In this paper, we investigate the adoption, in the field of e-learning, of a probabilistic approach based on Latent Dirichlet Allocation (LDA) as a sentiment grabber. With this approach, for a set of documents belonging to the same knowledge domain, a graph, the Mixed Graph of Terms, can be automatically extracted. The paper shows how this graph contains a set of weighted word pairs, which are discriminative for sentiment classification. In this way, the system can detect the feelings of students on some topics and the teacher can better tune his/her teaching approach. The proposed method has been tested on datasets coming from e-learning platforms. A preliminary experimental campaign shows that the proposed approach is effective and satisfactory.
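
    The end product described above, a set of weighted word pairs used to score sentiment, can be imitated with plain co-occurrence counts. This is a crude stand-in for the LDA-based Mixed Graph of Terms; the training data, weighting scheme, and function names are ours.

    ```python
    from collections import Counter
    from itertools import combinations

    def word_pairs(text):
        """All unordered pairs of distinct words in a comment."""
        words = sorted(set(text.lower().split()))
        return list(combinations(words, 2))

    def train_pair_weights(docs, labels):
        """Weight each word pair by how much more often it occurs in positive
        than in negative comments (stand-in for LDA-derived pair weights)."""
        pos, neg = Counter(), Counter()
        for doc, label in zip(docs, labels):
            (pos if label == 1 else neg).update(word_pairs(doc))
        return {p: pos[p] - neg[p] for p in set(pos) | set(neg)}

    def sentiment(doc, weights):
        """Classify a comment by summing the weights of its word pairs."""
        score = sum(weights.get(p, 0) for p in word_pairs(doc))
        return 1 if score > 0 else 0

    # Hypothetical student comments from an e-learning platform.
    docs = ["great clear lecture", "very clear examples",
            "boring confusing lecture", "confusing and boring"]
    labels = [1, 1, 0, 0]
    w = train_pair_weights(docs, labels)
    pred = sentiment("clear great examples", w)
    ```

    The dictionary w plays the role of the weighted-pair graph: pairs like ("clear", "great") carry positive weight, pairs like ("boring", "confusing") negative weight, and unseen pairs contribute nothing.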

  4. A Web Service Framework for Interactive Analysis of Metabolomics Data.

    Science.gov (United States)

    Lyutvinskiy, Yaroslav; Watrous, Jeramie D; Jain, Mohit; Nilsson, Roland

    2017-06-06

    Analyzing mass spectrometry-based metabolomics data presents a major challenge to metabolism researchers, as it requires downloading and processing large data volumes through complex "pipelines", even in cases where only a single metabolite or peak is of interest. This presents a significant hurdle for data sharing, reanalysis, or meta-analysis of existing data sets, whether locally stored or available from public repositories. Here we introduce mzAccess, a software system that provides interactive, online access to primary mass spectrometry data in real-time via a Web service protocol, circumventing the need for bulk data processing. mzAccess allows querying instrument data for spectra, chromatograms, or two-dimensional MZ-RT areas in either profile or centroid modes through a simple, uniform interface that is independent of vendor or instrument type. Using a cache mechanism, mzAccess achieves response times in the millisecond range for typical liquid chromatography-mass spectrometry (LC-MS) peaks, enabling real-time browsing of large data sets with hundreds or even thousands of samples. By simplifying access to metabolite data, we hope that this system will help enable data sharing and reanalysis in the metabolomics field.

  5. Sediment Analysis Using a Structured Programming Approach

    Directory of Open Access Journals (Sweden)

    Daniela Arias-Madrid

    2012-12-01

    Full Text Available This paper presents an algorithm designed for the analysis of a sedimentary sample of unconsolidated material, which seeks to identify very quickly the main features that occur in a sediment and thus classify it fast and efficiently. For this purpose, the weight of each particle-size class is entered into the program, and using the method of moments, which is based on four equations representing the mean, standard deviation, skewness and kurtosis, the attributes of the sample are found in a few seconds. With the program these calculations are performed effectively and more accurately, also providing explanations of the results for features such as grain size, sorting, symmetry and origin, which helps to improve the study of sediments and, in general, the study of sedimentary rocks.
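
    The four moment equations are straightforward to implement from the class weights. This sketch assumes size-class midpoints in phi units and uses the standard weight-fraction form of the method of moments; it is an illustration of the technique, not the paper's program.

    ```python
    import math

    def moment_statistics(midpoints, weights):
        """Method of moments for grain-size data: weights are the sample weight
        retained in each size class, midpoints the class midpoints (phi units).
        Returns (mean, standard deviation, skewness, kurtosis)."""
        n = sum(weights)
        mean = sum(w * m for m, w in zip(midpoints, weights)) / n
        var = sum(w * (m - mean) ** 2 for m, w in zip(midpoints, weights)) / n
        sd = math.sqrt(var)  # sorting
        skew = sum(w * (m - mean) ** 3 for m, w in zip(midpoints, weights)) / (n * sd ** 3)
        kurt = sum(w * (m - mean) ** 4 for m, w in zip(midpoints, weights)) / (n * sd ** 4)
        return mean, sd, skew, kurt

    # Symmetric toy sample: mean at the centre class, zero skewness.
    phi = [1.0, 2.0, 3.0]
    grams = [10.0, 20.0, 10.0]
    mean, sd, skew, kurt = moment_statistics(phi, grams)
    ```

    The mean and sorting (standard deviation) then map to grain-size and sorting classes, while the sign of the skewness indicates whether the sample is fine- or coarse-skewed.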

  6. A 3-month jump-landing training program: a feasibility study using the RE-AIM framework.

    Science.gov (United States)

    Aerts, Inne; Cumps, Elke; Verhagen, Evert; Mathieu, Niels; Van Schuerbeeck, Sander; Meeusen, Romain

    2013-01-01

    Evaluating the translatability and feasibility of an intervention program has become as important as determining the effectiveness of the intervention. To evaluate the applicability of a 3-month jump-landing training program in basketball players, using the RE-AIM (reach, effectiveness, adoption, implementation, and maintenance) framework. Randomized controlled trial. National and regional basketball teams. Twenty-four teams of the second highest national division and regional basketball divisions in Flanders, Belgium, were randomly assigned (1:1) to a control group and intervention group. A total of 243 athletes (control group = 129, intervention group = 114), ages 15 to 41 years, volunteered. All exercises in the intervention program followed a progressive development, emphasizing lower extremity alignment during jump-landing activities. The results of the process evaluation of the intervention program were based on the 5 dimensions of the RE-AIM framework. The injury incidence density, hazard ratios, and 95% confidence intervals were determined. The participation rate of the total sample was 100% (reach). The hazard ratio was different between the intervention group and the control group (0.40 [95% confidence interval = 0.16, 0.99]; effectiveness). Of the 12 teams in the intervention group, 8 teams (66.7%) agreed to participate in the study (adoption). Eight of the participating coaches (66.7%) felt positively about the intervention program and stated that they had implemented the training sessions of the program as intended (implementation). All coaches except 1 (87.5%) intended to continue the intervention program the next season (maintenance). Compliance of the coaches in this coach-supervised jump-landing training program was high. In addition, the program was effective in preventing lower extremity injuries.

  7. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity tests, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper
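
    The parallel uncertainty-propagation pattern, many independent runs of a simulation code farmed out to workers, can be sketched as follows. A thread pool here stands in for PAPIRUS's server-client processor setup, and the one-line model is a placeholder for the expensive engineering code; all names are illustrative.

    ```python
    import random
    import statistics
    from concurrent.futures import ThreadPoolExecutor

    def simulation(params):
        """Stand-in for one run of the engineering simulation code."""
        k, q = params
        return q / k  # e.g. a steady-state response to load q with stiffness k

    def propagate(n_samples=500, workers=4, seed=0):
        """Monte Carlo uncertainty propagation: sample uncertain inputs, run
        the model for each sample in parallel, summarize the output spread."""
        rng = random.Random(seed)
        samples = [(rng.gauss(10.0, 1.0), rng.gauss(50.0, 5.0))
                   for _ in range(n_samples)]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            outputs = list(pool.map(simulation, samples))
        return statistics.mean(outputs), statistics.stdev(outputs)

    mean_out, sd_out = propagate()
    ```

    Because each sample is independent, the same pattern scales to processes or cluster nodes, which is where the "significant reductions in computational effort" for large studies come from.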

  8. Science and scientists in regulatory governance: a mezzo-level framework for analysis

    OpenAIRE

    G Bruce Doern; Ted Reed

    2001-01-01

    The article argues that the study of science in government needs a viable mezzo- or middle-level framework to deal adequately with the analysis of science in regulatory governance and then advances a possible framework. The case for mezzo-level analysis is developed through a brief review of relevant literature on science in government policy and regulatory decisionmaking, which is basically on macro and micro ‘science in government’ relationships, and tends to neglect the ‘mezzo-ham’ in the ...

  9. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    operation will be changed by various parameters of DERs. This article proposes a modelling framework for an overview analysis of the correlation between DERs. Furthermore, to validate the framework, the authors described the reference models of different categories of DERs with their unique characteristics......, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation

  10. A complexity science-based framework for global joint operations analysis to support force projection: LDRD Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). System Sustainment & Readiness Technologies Dept.

    2015-01-01

    The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents significant opportunity to Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.

  11. From fatalism to mitigation: a conceptual framework for mitigating fetal programming of chronic disease by maternal obesity

    OpenAIRE

    Boone-Heinonen, Janne; Messer, Lynne C.; Fortmann, Stephen P.; Wallack, Lawrence; Thornburg, Kent L.

    2015-01-01

    Prenatal development is recognized as a critical period in the etiology of obesity and cardiometabolic disease. Potential strategies to reduce maternal obesity-induced risk later in life have been largely overlooked. In this paper, we first propose a conceptual framework for the role of public health and preventive medicine in mitigating the effects of fetal programming. Second, we review a small but growing body of research (through August 2015) that examines interactive effects of maternal ...

  12. A metric and frameworks for resilience analysis of engineered and infrastructure systems

    International Nuclear Information System (INIS)

    Francis, Royce; Bekera, Behailu

    2014-01-01

    In this paper, we have reviewed various approaches to defining resilience and the assessment of resilience. We have seen that while resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. In this paper, we have proposed a resilience analysis framework and a metric for measuring resilience. Our analysis framework consists of system identification, resilience objective setting, vulnerability analysis, and stakeholder engagement. The implementation of this framework is focused on the achievement of three resilience capacities: adaptive capacity, absorptive capacity, and recoverability. These three capacities also form the basis of our proposed resilience factor and uncertainty-weighted resilience metric. We have also identified two important unresolved discussions emerging in the literature: the idea of resilience as an epistemological versus inherent property of the system, and design for ecological versus engineered resilience in socio-technical systems. While we have not resolved this tension, we have shown that our framework and metric promote the development of methodologies for investigating “deep” uncertainties in resilience assessment while retaining the use of probability for expressing uncertainties about highly uncertain, unforeseeable, or unknowable hazards in design and management activities. - Highlights: • While resilience is a useful concept, its diversity in usage complicates its interpretation and measurement. • We proposed a resilience analysis framework whose implementation is encapsulated within resilience metric incorporating absorptive, adaptive, and restorative capacities. • We have shown that our framework and metric can support the investigation of “deep” uncertainties in resilience assessment or analysis. • We have discussed the role of quantitative metrics in design for ecological versus engineered resilience in socio-technical systems. • Our resilience metric supports
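
    A simplified capacity-based resilience factor in the spirit of this framework multiplies the three capacities: performance retained immediately after the disruption (absorptive), performance ultimately recovered (adaptive/restorative), and a recovery-speed factor. The exact functional form and the uncertainty weighting in the paper are richer than this sketch, and the numbers below are hypothetical.

    ```python
    def resilience_factor(f0, fd, fr, sp):
        """Simplified capacity-based resilience factor:
        fd/f0  - absorptive capacity (performance kept right after the shock),
        fr/f0  - adaptive/restorative capacity (performance recovered),
        sp     - recovery-speed factor (1.0 = recovery within the target time)."""
        return sp * (fr / f0) * (fd / f0)

    # A system that keeps 60% of service during the shock and recovers to 95%.
    rho = resilience_factor(f0=100.0, fd=60.0, fr=95.0, sp=1.0)
    ```

    A value of 1.0 means full absorption and full, timely recovery; hardening the system raises fd/f0, while better contingency planning raises fr/f0 and sp.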

  13. Using the Knowledge to Action Framework in practice: a citation analysis and systematic review.

    Science.gov (United States)

    Field, Becky; Booth, Andrew; Ilott, Irene; Gerrish, Kate

    2014-11-23

    Conceptual frameworks are recommended as a way of applying theory to enhance implementation efforts. The Knowledge to Action (KTA) Framework was developed in Canada by Graham and colleagues in the 2000s, following a review of 31 planned action theories. The framework has two components: Knowledge Creation and an Action Cycle, each of which comprises multiple phases. This review sought to answer two questions: 'Is the KTA Framework used in practice? And if so, how?' This study is a citation analysis and systematic review. The index citation for the original paper was identified on three databases (Web of Science, Scopus and Google Scholar) with the facility for citation searching. Limitations of English language and year of publication 2006-June 2013 were set. A taxonomy categorising the continuum of usage was developed. Only studies applying the framework to implementation projects were included. Data were extracted and mapped against each phase of the framework for studies where it was integral to the implementation project. The citation search yielded 1,787 records. A total of 1,057 titles and abstracts were screened. One hundred and forty-six studies described usage to varying degrees, ranging from referenced to integrated. In ten studies, the KTA Framework was integral to the design, delivery and evaluation of the implementation activities. All ten described using the Action Cycle and seven referred to Knowledge Creation. The KTA Framework was enacted in different health care and academic settings with projects targeted at patients, the public, and nursing and allied health professionals. The KTA Framework is being used in practice with varying degrees of completeness. It is frequently cited, with usage ranging from simple attribution via a reference, through informing planning, to making an intellectual contribution. When the framework was integral to knowledge translation, it guided action in idiosyncratic ways and there was theory fidelity. Prevailing wisdom

  14. Vibration Analysis of a Framework Structure by Generalized Transfer Stiffness Coefficient Method

    Science.gov (United States)

    Bonkobara, Yasuhiro; Kondou, Takahiro; Ayabe, Takashi; Choi, Myung-Soo

    A generalized transfer stiffness coefficient method using graph theory is developed in order to improve the applicability of the transfer stiffness coefficient method. In the new method, an analytical model is expressed by a weighted signal-flow graph, and the graph is contracted according to the series and parallel contraction rules. The computational complexity and the memory requirement for the contraction process are both minimized by choosing the optimal contraction route. In addition, it is possible to develop a data-driving program that is applicable to various structures without updating the source program. An algorithm based on the present method is formulated for the in-plane longitudinal and flexural coupled free and forced vibration analyses of a two-dimensional framework structure. Furthermore, an overview for applying the method to a three-dimensional framework structure is briefly presented. The validity of the present algorithm is confirmed by the results of numerical computations.
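
    The series and parallel contraction rules have a simple scalar analogue in spring stiffnesses: series-connected elements combine through their compliances, parallel elements by direct addition. The actual method contracts stiffness coefficient matrices over a weighted signal-flow graph, so this is only the shape of the idea, with toy values.

    ```python
    def series(k1, k2):
        """Contract two stiffness elements joined end-to-end (springs in
        series): compliances add, so 1/k = 1/k1 + 1/k2."""
        return 1.0 / (1.0 / k1 + 1.0 / k2)

    def parallel(k1, k2):
        """Contract two elements sharing both end nodes: stiffnesses add."""
        return k1 + k2

    # Reduce a small ladder: (k1 in series with k2) in parallel with k3.
    k_eq = parallel(series(2.0, 2.0), 3.0)
    ```

    Repeating such contractions along an optimal route collapses the whole graph to a single equivalent coefficient, which is why the contraction order governs both computational complexity and memory use in the method.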

  15. Disease Management, Case Management, Care Management, and Care Coordination: A Framework and a Brief Manual for Care Programs and Staff.

    Science.gov (United States)

    Ahmed, Osman I

    2016-01-01

With the changing landscape of health care delivery in the United States since the passage of the Patient Protection and Affordable Care Act in 2010, health care organizations have struggled to keep pace with the evolving paradigm, particularly as it pertains to population health management. New nomenclature emerged to describe components of the new environment, and familiar words were put to use in an entirely different context. This article proposes a working framework for activities performed in case management, disease management, care management, and care coordination. The author offers standard working definitions for some of the most frequently used words in the health care industry, with the goal of increasing consistency in their use, especially against the backdrop of the Centers for Medicare & Medicaid Services offering a "chronic case management fee" to primary care providers for managing the sickest, high-cost Medicare patients. The framework is intended for health care organizations performing case management, care management, disease management, and care coordination, and serves as a road map for consistency among users, in reporting and comparison, and for the success of care management/coordination programs. Beyond suggesting standard definitions for disease management, case management, care management, and care coordination, the use of clear terminology will facilitate comparing, contrasting, and evaluating all care programs and increase consistency. The article can improve understanding of care program components and success factors, help estimate program value and effectiveness, heighten awareness of consumer engagement tools, clarify the current state of and challenges for care programs, explain the role of health information technology solutions in care programs, and support using the information and knowledge gained to assess, improve, and design the "next generation" of care programs.

  16. A Study of Prescriptive Analysis Framework for Human Care Services Based On CKAN Cloud

    Directory of Open Access Journals (Sweden)

    Jangwon Gim

    2018-01-01

Full Text Available A number of sensor devices are widely distributed and used today owing to the accelerated development of IoT technology. In particular, this technological advancement has allowed users to carry IoT devices with more convenience and efficiency. Based on IoT sensor data, studies are being actively carried out to recognize the current situation or to analyze and predict future events. However, research on existing smart healthcare services focuses on analyzing users’ behavior from single-sensor data and on analyzing and diagnosing the users’ current situation. Therefore, a method for effectively managing and integrating a large amount of IoT sensor data has become necessary, as has a framework that takes data interoperability into account. In addition, an analysis framework is needed that not only analyzes the users’ environment and situation from the integrated data, but also provides guide information for predicting future events and taking appropriate action. In this paper, we propose a prescriptive analysis framework using a 5W1H method based on CKAN cloud. Through the CKAN cloud environment, IoT sensor data stored in individual CKANs can be integrated based on common concepts. As a result, it is possible to generate an integrated knowledge graph that accounts for data interoperability, and the underlying data serve as the base data for prescriptive analysis. In addition, the proposed prescriptive analysis framework can diagnose the users’ situation through analysis of user environment information and supports users’ decision making by recommending possible behaviors for upcoming situations. We have verified the applicability of the 5W1H prescriptive analysis framework based on the use case of collecting and analyzing data obtained from various IoT sensors.

  17. RIPOSTE: a framework for improving the design and analysis of laboratory-based research.

    Science.gov (United States)

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn

    2015-05-07

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  18. Analysis of System-Wide Investment in the National Airspace System: A Portfolio Analytical Framework and an Example

    Science.gov (United States)

    Bhadra, Dipasis; Morser, Frederick R.

    2006-01-01

In this paper, the authors review the FAA's current program investments and lay out a preliminary analytical framework to undertake projects that may address some of the noted deficiencies. Drawing upon well-developed theories from corporate finance, they offer an analytical framework that can be used for choosing FAA investments while taking into account risk, expected returns, and inherent dependencies across NAS programs. The framework can be extended to multiple assets and realistic parameter values to draw an efficient risk-return frontier for the FAA's entire investment portfolio.
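The risk-return frontier borrowed from corporate finance can be sketched numerically. A two-asset illustration with hypothetical return, volatility, and correlation figures (invented for illustration, not FAA data):

```python
import math

def portfolio_stats(w, mu1, mu2, s1, s2, rho):
    """Expected return and risk (std. dev.) of a two-asset mix with weight w on asset 1."""
    mu = w * mu1 + (1 - w) * mu2
    var = (w * s1) ** 2 + ((1 - w) * s2) ** 2 + 2 * w * (1 - w) * rho * s1 * s2
    return mu, math.sqrt(var)

# Sweep the weight to trace the frontier, then pick the minimum-risk mix.
# Because the two "programs" are imperfectly correlated (rho < 1), the
# minimum-risk portfolio is less risky than the safer program alone.
points = [portfolio_stats(w / 100, 0.08, 0.05, 0.20, 0.10, 0.3) for w in range(101)]
risk_min = min(points, key=lambda p: p[1])
```

The same sweep generalizes to many assets, where the frontier is found by optimization rather than enumeration; that is the multi-asset extension the abstract alludes to.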

  19. Using the Communities of Practice Framework to Examine an After-School Environmental Education Program for Hispanic Youth

    Science.gov (United States)

    Aguilar, Olivia M.; Krasny, Marianne E.

    2011-01-01

    Environmental education researchers have called for a greater analysis of "learning" in environmental education in relation to contemporary theories and explanatory frameworks of learning. Situated learning, as a prominent example, is a sociocultural theory that contends that learning is a social process that occurs as individuals…

  20. Causality Analysis of fMRI Data Based on the Directed Information Theory Framework.

    Science.gov (United States)

    Wang, Zhe; Alahmadi, Ahmed; Zhu, David C; Li, Tongtong

    2016-05-01

This paper aims to conduct fMRI-based causality analysis in brain connectivity by exploiting the directed information (DI) theory framework. Unlike the well-known Granger causality (GC) analysis, which relies on the linear prediction technique, the DI theory framework does not have any modeling constraints on the sequences to be evaluated and ensures estimation convergence. Moreover, it can be used to generate the GC graphs. In this paper, first, we introduce the core concepts in the DI framework. Second, we present how to conduct causality analysis using DI measures between two time series. We provide the detailed procedure on how to calculate the DI for two finite time series. The two major steps involved here are optimal bin size selection for data digitization and probability estimation. Finally, we demonstrate the applicability of DI-based causality analysis using both the simulated data and experimental fMRI data, and compare the results with those of the GC analysis. Our analysis indicates that GC analysis is effective in detecting linear or nearly linear causal relationships, but may have difficulty in capturing nonlinear causal relationships. On the other hand, DI-based causality analysis is more effective in capturing both linear and nonlinear causal relationships. Moreover, it is observed that brain connectivity among different regions generally involves dynamic two-way information transmissions between them. Our results show that when bidirectional information flow is present, DI is more effective than GC to quantify the overall causal relationship.
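The two-step recipe in the abstract (digitize the series into bins, then estimate probabilities) can be sketched with a plug-in estimator. The following is a simplified first-order approximation of the directed information rate for already-digitized (binary) sequences, not the authors' full procedure:

```python
import random
from collections import Counter
from math import log2

def conditional_mutual_info(triples):
    """Plug-in estimate of I(X; Y | Z) in bits from (x, y, z) samples."""
    n = len(triples)
    pxyz = Counter(triples)
    pxz = Counter((x, z) for x, _, z in triples)
    pyz = Counter((y, z) for _, y, z in triples)
    pz = Counter(z for _, _, z in triples)
    return sum((c / n) * log2(c * pz[z] / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), c in pxyz.items())

def di_rate(x, y):
    """First-order approximation of the directed information rate X -> Y:
    I(X_{t-1}; Y_t | Y_{t-1}), estimated from a single realization."""
    triples = [(x[t - 1], y[t], y[t - 1]) for t in range(1, len(x))]
    return conditional_mutual_info(triples)

random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]   # y copies x with a one-step lag
```

With this construction, di_rate(x, y) comes out near 1 bit while di_rate(y, x) stays near 0, matching the directional asymmetry DI is designed to capture.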

  1. CRBLASTER: A Parallel-Processing Computational Framework for Embarrassingly Parallel Image-Analysis Algorithms

    Science.gov (United States)

    Mighell, Kenneth John

    2010-10-01

    The development of parallel-processing image-analysis codes is generally a challenging task that requires complicated choreography of interprocessor communications. If, however, the image-analysis algorithm is embarrassingly parallel, then the development of a parallel-processing implementation of that algorithm can be a much easier task to accomplish because, by definition, there is little need for communication between the compute processes. I describe the design, implementation, and performance of a parallel-processing image-analysis application, called crblaster, which does cosmic-ray rejection of CCD images using the embarrassingly parallel l.a.cosmic algorithm. crblaster is written in C using the high-performance computing industry standard Message Passing Interface (MPI) library. crblaster uses a two-dimensional image partitioning algorithm that partitions an input image into N rectangular subimages of nearly equal area; the subimages include sufficient additional pixels along common image partition edges such that the need for communication between computer processes is eliminated. The code has been designed to be used by research scientists who are familiar with C as a parallel-processing computational framework that enables the easy development of parallel-processing image-analysis programs based on embarrassingly parallel algorithms. The crblaster source code is freely available at the official application Web site at the National Optical Astronomy Observatory. Removing cosmic rays from a single 800 × 800 pixel Hubble Space Telescope WFPC2 image takes 44 s with the IRAF script lacos_im.cl running on a single core of an Apple Mac Pro computer with two 2.8 GHz quad-core Intel Xeon processors. crblaster is 7.4 times faster when processing the same image on a single core on the same machine. Processing the same image with crblaster simultaneously on all eight cores of the same machine takes 0.875 s—which is a speedup factor of 50.3 times faster than the
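The partition-with-overlap idea that eliminates interprocess communication is easy to sketch. Below is a simplified one-dimensional (strip) variant with a hypothetical halo width (crblaster itself partitions in two dimensions):

```python
def partition_with_halo(height, n, halo):
    """Split `height` image rows into n strips of near-equal size, padding each
    strip with `halo` extra rows at shared edges so neighbouring strips can be
    processed independently (no inter-process communication needed)."""
    bounds = [height * i // n for i in range(n + 1)]
    tiles = []
    for i in range(n):
        top = max(bounds[i] - halo, 0)
        bottom = min(bounds[i + 1] + halo, height)
        tiles.append((top, bottom))   # strip i "owns" rows bounds[i]..bounds[i+1]
    return tiles

tiles = partition_with_halo(800, 8, 2)
print(tiles[0], tiles[1], tiles[-1])  # (0, 102) (98, 202) (698, 800)
```

Each worker processes its padded strip and writes back only the rows it owns, so results are identical to a serial run as long as the halo covers the algorithm's neighbourhood size.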

  2. IKOS: A Framework for Static Analysis based on Abstract Interpretation (Tool Paper)

    Science.gov (United States)

    Brat, Guillaume P.; Laserna, Jorge A.; Shi, Nija; Venet, Arnaud Jean

    2014-01-01

The RTCA standard (DO-178C) for developing avionic software and getting certification credits includes an extension (DO-333) that describes how developers can use static analysis in certification. In this paper, we give an overview of the IKOS static analysis framework that helps develop static analyses that are both precise and scalable. IKOS harnesses the power of Abstract Interpretation and makes it accessible to a larger class of static analysis developers by separating concerns such as code parsing, model development, abstract domain management, results management, and analysis strategy. The benefits of the approach are demonstrated by a buffer overflow analysis applied to flight control systems.
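The kind of property IKOS checks can be illustrated with the simplest abstract domain, intervals. A toy sketch of interval abstraction and a buffer overflow check (an illustration of the general technique only, far removed from IKOS's actual C++ implementation):

```python
class Interval:
    """A numeric value abstracted as a closed range [lo, hi]."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Abstract addition: bounds add.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def join(self, other):
        # Least upper bound: the smallest interval covering both branches.
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

def index_is_safe(idx, size):
    """Prove absence of a buffer overflow: idx must lie within [0, size)."""
    return idx.lo >= 0 and idx.hi < size

# Two branches assign i = [0, 4] or i = [2, 9]; after the merge i = [0, 9].
i = Interval(0, 4).join(Interval(2, 9))
print(index_is_safe(i, 10))                   # True: buf[i] is safe for a 10-element buffer
print(index_is_safe(i + Interval(1, 1), 10))  # False: buf[i + 1] may overflow
```

The analysis is sound but conservative: a "may overflow" verdict flags every real overflow at the cost of possible false alarms, which is exactly the trade-off precise abstract domains aim to improve.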

  3. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F.; Bell, Scott

    2016-03-01

This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the needs for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change was assessed using a suite of indicators that were subsequently rolled up into a single, multi-dimensional index that is easy to understand and communicate, to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.
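The roll-up of landscape indicators into a single index can be sketched as min-max normalization followed by a weighted mean. The indicator names, observed ranges, and weights below are hypothetical, not the study's actual suite:

```python
def normalize(value, worst, best):
    """Min-max rescale so 0 is the worst observed state and 1 the best.
    Also handles indicators where a lower raw value is better (worst > best)."""
    return (value - worst) / (best - worst)

def wetland_index(scores, weights):
    """Weighted mean of normalized indicator scores -> one index in [0, 1]."""
    total = sum(weights.values())
    return sum(weights[k] * scores[k] for k in scores) / total

scores = {
    "area":         normalize(310, worst=200, best=600),   # wetland area, ha
    "connectivity": normalize(0.42, worst=0.1, best=0.9),
    "edge_density": normalize(55, worst=120, best=30),     # lower raw value is better
}
index = wetland_index(scores, {"area": 0.5, "connectivity": 0.3, "edge_density": 0.2})
print(round(index, 3))  # 0.402
```

Computing the index per time slice (e.g., 1985 vs. 2011) then turns a multi-indicator comparison into a single trend line that is easy to communicate to planners.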

  4. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments.

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F; Bell, Scott

    2016-03-01

This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the needs for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change was assessed using a suite of indicators that were subsequently rolled up into a single, multi-dimensional index that is easy to understand and communicate, to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.

  5. Health systems strengthening: a common classification and framework for investment analysis.

    Science.gov (United States)

    Shakarishvili, George; Lansang, Mary Ann; Mitta, Vinod; Bornemisza, Olga; Blakley, Matthew; Kley, Nicole; Burgess, Craig; Atun, Rifat

    2011-07-01

    Significant scale-up of donors' investments in health systems strengthening (HSS), and the increased application of harmonization mechanisms for jointly channelling donor resources in countries, necessitate the development of a common framework for tracking donors' HSS expenditures. Such a framework would make it possible to comparatively analyse donors' contributions to strengthening specific aspects of countries' health systems in multi-donor-supported HSS environments. Four pre-requisite factors are required for developing such a framework: (i) harmonization of conceptual and operational understanding of what constitutes HSS; (ii) development of a common set of criteria to define health expenditures as contributors to HSS; (iii) development of a common HSS classification system; and (iv) harmonization of HSS programmatic and financial data to allow for inter-agency comparative analyses. Building on the analysis of these aspects, the paper proposes a framework for tracking donors' investments in HSS, as a departure point for further discussions aimed at developing a commonly agreed approach. Comparative analysis of financial allocations by the Global Fund to Fight AIDS, Tuberculosis and Malaria and the GAVI Alliance for HSS, as an illustrative example of applying the proposed framework in practice, is also presented.

  6. Parsing Schemata - a framework for specification and analysis of parsing algorithms

    NARCIS (Netherlands)

    Sikkel, Nicolaas

    1997-01-01

    Parsing schemata provide a general framework for specification, analysis and comparison of (sequential and/or parallel) parsing algorithms. A grammar specifies implicitly what the valid parses of a sentence are; a parsing algorithm specifies explicitly how to compute these. Parsing schemata form a

  7. A FRAMEWORK FOR DOCUMENT PRE-PROCESSING IN FORENSIC HANDWRITING ANALYSIS

    NARCIS (Netherlands)

    Franke, K.; Köppen, M.

    2004-01-01

    We propose an open layered framework, which might be adapted to fulfill sophisticated demands in forensic handwriting analysis. Due to the contradicting requirements of processing a huge amount of different document types as well as providing high quality processed images of singular document

  8. Complexity and Intensionality in a Type-1 Framework for Computable Analysis

    DEFF Research Database (Denmark)

    Lambov, Branimir Zdravkov

    2005-01-01

    This paper describes a type-1 framework for computable analysis designed to facilitate efficient implementations and discusses properties that have not been well studied before for type-1 approaches: the introduction of complexity measures for type-1 representations of real functions, and ways to...... of machine precision floating point....

  9. State Civic Education Policy: Framework and Gap Analysis Tool. Special Report

    Science.gov (United States)

    Baumann, Paul; Brennan, Jan

    2017-01-01

    The civic education policy framework and gap analysis tool are intended to guide state leaders as they address the complexities of preparing students for college, career and civic life. They allow for adaptation to state- and site-specific circumstances and may be adopted in whole or in piecemeal fashion, according to states' individual…

  10. The SAFE FOODS Risk Analysis Framework suitable for GMOs? A case study

    NARCIS (Netherlands)

    Kuiper, H.A.; Davies, H.V.

    2010-01-01

    This paper describes the current EU regulatory framework for risk analysis of genetically modified (GM) crop cultivation and market introduction of derived food/feed. Furthermore the risk assessment strategies for GM crops and derived food/feed as designed by the European Food Safety Authority

  11. Understanding Universities in Ontario, Canada: An Industry Analysis Using Porter's Five Forces Framework

    Science.gov (United States)

    Pringle, James; Huisman, Jeroen

    2011-01-01

    In analyses of higher education systems, many models and frameworks are based on governance, steering, or coordination models. Although much can be gained by such analyses, we argue that the language used in the present-day policy documents (knowledge economy, competitive position, etc.) calls for an analysis of higher education as an industry. In…

  12. Analysis of shape isomer yields of 237 Pu in the framework of ...

    Indian Academy of Sciences (India)

Data on shape isomer yield for + 235U reaction at E_lab = 20–29 MeV are analysed in the framework of a combined dynamical–statistical model. From this analysis, information on the double-humped fission barrier parameters for some Pu isotopes has been obtained and it is shown that the depth of the second potential ...

  13. Examining Differential Item Functioning: IRT-Based Detection in the Framework of Confirmatory Factor Analysis

    Science.gov (United States)

    Dimitrov, Dimiter M.

    2017-01-01

    This article offers an approach to examining differential item functioning (DIF) under its item response theory (IRT) treatment in the framework of confirmatory factor analysis (CFA). The approach is based on integrating IRT- and CFA-based testing of DIF and using bias-corrected bootstrap confidence intervals with a syntax code in Mplus.
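The bias-corrected bootstrap interval mentioned in the abstract can be sketched outside of Mplus. A minimal stdlib-only version for a generic statistic (a simplified bias-corrected interval, not the full BCa variant, and not the article's Mplus syntax):

```python
import random
from statistics import NormalDist, mean

def bc_bootstrap_ci(data, stat=mean, n_boot=2000, alpha=0.05, seed=1):
    """Bias-corrected (BC) bootstrap confidence interval for stat(data)."""
    rng = random.Random(seed)
    obs = stat(data)
    boots = sorted(stat([rng.choice(data) for _ in data]) for _ in range(n_boot))
    nd = NormalDist()
    # Bias correction: where the observed value sits in the bootstrap distribution.
    prop = sum(b < obs for b in boots) / n_boot
    z0 = nd.inv_cdf(min(max(prop, 1 / n_boot), 1 - 1 / n_boot))
    # Shift the percentile endpoints by twice the bias term.
    lo = nd.cdf(2 * z0 + nd.inv_cdf(alpha / 2))
    hi = nd.cdf(2 * z0 + nd.inv_cdf(1 - alpha / 2))
    return boots[int(lo * (n_boot - 1))], boots[int(hi * (n_boot - 1))]

ci = bc_bootstrap_ci([4.1, 5.0, 5.4, 4.7, 5.9, 6.2, 4.4, 5.1, 5.6, 4.9])
```

In a DIF setting the statistic would be the between-group difference in an item parameter rather than a mean; an interval excluding zero then flags the item as functioning differentially.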

  14. An iterative requirements engineering framework based on formal concept analysis and C-K theory

    NARCIS (Netherlands)

    Poelmans, J.; Dedene, G.; Snoeck, M.; Viaene, S.

    2012-01-01

    In this paper, we propose an expert system for iterative requirements engineering using Formal Concept Analysis. The requirements engineering approach is grounded in the theoretical framework of C-K theory. An essential result of this approach is that we obtain normalized class models. Compared to
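Formal Concept Analysis, the core machinery here, can be sketched in a few lines: a formal concept is a fixed point of the two derivation operators on a binary object-attribute context. The requirements and features below are hypothetical examples, not from the paper:

```python
from itertools import combinations

# Toy formal context: requirements (objects) x features (attributes).
context = {
    "login":  {"ui", "security"},
    "audit":  {"security", "logging"},
    "export": {"ui", "logging"},
}
all_attrs = set.union(*context.values())

def attrs_of(objs):
    """Attributes shared by every object in `objs` (one derivation operator)."""
    shared = set(all_attrs)
    for o in objs:
        shared &= context[o]
    return shared

def objs_of(attrs):
    """Objects that have every attribute in `attrs` (the other derivation operator)."""
    return {o for o, a in context.items() if attrs <= a}

# Enumerate concepts by closing every subset of objects (fine for toy contexts).
concepts = []
for r in range(len(context) + 1):
    for objs in combinations(sorted(context), r):
        intent = attrs_of(set(objs))
        extent = objs_of(intent)
        if (extent, intent) not in concepts:
            concepts.append((extent, intent))

print(len(concepts))  # 8, including ({'login', 'audit'}, {'security'})
```

Ordering these concepts by extent inclusion yields the concept lattice, which is what makes the derived class models "normalized" in the FCA sense.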

  15. Mississippi Curriculum Framework for Automotive Mechanics (Program CIP: 47.0604--Auto/Automotive Mechanic/Tech). Secondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for automotive mechanics I and II. Presented first are a program description…

  16. Mississippi Curriculum Framework for Horticulture (Program CIP: 01.0601--Horticulture Serv. Op. & Mgmt., Gen.). Secondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for horticulture I and II. Presented first are a program description and…

  17. Model-based Computer Aided Framework for Design of Process Monitoring and Analysis Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

In the manufacturing industry, for example, the pharmaceutical industry, a thorough understanding of the process is necessary in addition to a properly designed monitoring and analysis system (PAT system) to consistently obtain the desired end-product properties. A model-based computer-aided framework, including the methods and tools through which the design of monitoring and analysis systems for product quality control can be generated, analyzed and/or validated, has been developed. Two important supporting tools developed as part of the framework are a knowledge base and a model library. The knowledge base provides the necessary information/data during the design of the PAT system while the model library generates additional or missing data needed for design and analysis. Optimization of the PAT system design is achieved in terms of product data analysis time and/or cost of monitoring equipment.

  18. A framework for the economic analysis of data collection methods for vital statistics.

    Science.gov (United States)

    Jimenez-Soto, Eliana; Hodge, Andrew; Nguyen, Kim-Huong; Dettrick, Zoe; Lopez, Alan D

    2014-01-01

Over recent years there has been a strong movement towards the improvement of vital statistics and other types of health data that inform evidence-based policies. Collecting such data is not cost free. To date there is no systematic framework to guide investment decisions on methods of data collection for vital statistics or health information in general. We developed a framework to systematically assess the comparative costs and outcomes/benefits of the various data collection methods (DCMs) for vital statistics. The proposed framework is four-pronged and utilises two major economic approaches to systematically assess the available data collection methods: cost-effectiveness analysis and efficiency analysis. We built a stylised example of a hypothetical low-income country to perform a simulation exercise in order to illustrate an application of the framework. Using simulated data, the results from the stylised example show that the rankings of the data collection methods are not affected by the use of either cost-effectiveness or efficiency analysis. However, the rankings are affected by how quantities are measured. There have been several calls for global improvements in collecting usable data, including vital statistics, from health information systems to inform public health policies. Ours is the first study that proposes a systematic framework to assist countries in undertaking an economic evaluation of DCMs. Despite numerous challenges, we demonstrate that a systematic assessment of outputs and costs of DCMs is not only necessary, but also feasible. The proposed framework is general enough to be easily extended to other areas of health information.
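The cost-effectiveness arm of such a framework reduces to ranking methods by cost per unit of outcome. A stylized sketch in the spirit of the paper's hypothetical low-income country (all method names and figures are invented):

```python
def rank_by_cost_effectiveness(methods):
    """Rank data collection methods by cost per unit of outcome (lower is better).
    `methods` maps name -> (total_cost, outcome_units), e.g. deaths registered."""
    ratio = {name: cost / units for name, (cost, units) in methods.items()}
    return sorted(ratio, key=ratio.get)

methods = {
    "civil_registration":       (120_000, 950),
    "sample_surveys":           (40_000, 500),
    "demographic_surveillance": (90_000, 600),
}
print(rank_by_cost_effectiveness(methods))
# ['sample_surveys', 'civil_registration', 'demographic_surveillance']
```

As the abstract notes, the ranking is sensitive to how the outcome units are measured: swapping "deaths registered" for, say, "cause-of-death records of usable quality" can reorder the list even though the costs are unchanged.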

  19. Along the way to developing a theory of the program: a re-examination of the conceptual framework as an organizing strategy.

    Science.gov (United States)

    Helitzer, Deborah L; Sussman, Andrew L; Hoffman, Richard M; Getrich, Christina M; Warner, Teddy D; Rhyne, Robert L

    2014-08-01

    Conceptual frameworks (CF) have historically been used to develop program theory. We re-examine the literature about the role of CF in this context, specifically how they can be used to create descriptive and prescriptive theories, as building blocks for a program theory. Using a case example of colorectal cancer screening intervention development, we describe the process of developing our initial CF, the methods used to explore the constructs in the framework and revise the framework for intervention development. We present seven steps that guided the development of our CF: (1) assemble the "right" research team, (2) incorporate existing literature into the emerging CF, (3) construct the conceptual framework, (4) diagram the framework, (5) operationalize the framework: develop the research design and measures, (6) conduct the research, and (7) revise the framework. A revised conceptual framework depicted more complicated inter-relationships of the different predisposing, enabling, reinforcing, and system-based factors. The updated framework led us to generate program theory and serves as the basis for designing future intervention studies and outcome evaluations. A CF can build a foundation for program theory. We provide a set of concrete steps and lessons learned to assist practitioners in developing a CF. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Understanding decentralized forest governance: an application of the institutional analysis and development framework

    OpenAIRE

    Krister Andersson

    2006-01-01

    This paper analyzes how local institutional arrangements shape outcomes in the increasingly decentralized policy regimes of the non-industrialized world. The goal is to evaluate local institutional strategies associated with effective forest governance. I use the Institutional Analysis and Development (IAD) framework to study the institutional conditions conducive to effective decentralized forest governance and how these relate to sustainability. The IAD-guided analysis allows me to formulat...

  1. An Integrated Strategy Framework (ISF) for Combining Porter's 5-Forces, Diamond, PESTEL, and SWOT Analysis

    OpenAIRE

    Anton, Roman

    2015-01-01

INTRODUCTION Porter's Five-Forces, Porter's Diamond, PESTEL, the 6th-Forths, and Humphrey's SWOT analysis are among the most important and popular concepts taught in business schools around the world. A new integrated strategy framework (ISF) combines all major concepts.

  2. Development of a CAD Model Simplification Framework for Finite Element Analysis

    Science.gov (United States)

    2012-01-01

homotopy preserving medial axis transform by Sud et al. [38], and an extension on the common MAT method known as Θ-MAT for polyhedral mesh models by... enhancement, ergonomic aspect, or part attachment method; can facilitate the classification of the part's criticality for analysis. The PART NAME and... ABSTRACT Title of thesis: DEVELOPMENT OF A CAD MODEL SIMPLIFICATION FRAMEWORK FOR FINITE ELEMENT ANALYSIS. Brian Henry Russ, Master of Science, 2012.

  3. ROOT: A C++ framework for petabyte data storage, statistical analysis and visualization

    Energy Technology Data Exchange (ETDEWEB)

Antcheva, I. (CERN); Ballintijn, M. (CERN); Bellenot, B. (CERN); Biskup, M. (CERN); Brun, R. (CERN); Buncic, N. (CERN); Canal, Ph. (Fermilab); Casadei, D. (New York U.); Couet, O. (CERN); Fine, V. (Brookhaven); Franco, L. (CERN)

    2009-01-01

ROOT is an object-oriented C++ framework conceived in the high-energy physics (HEP) community, designed for storing and analyzing petabytes of data in an efficient way. Any instance of a C++ class can be stored into a ROOT file in a machine-independent compressed binary format. In ROOT the TTree object container is optimized for statistical data analysis over very large data sets by using vertical data storage techniques. These containers can span a large number of files on local disks, the web or a number of different shared file systems. In order to analyze this data, the user can choose from a wide set of mathematical and statistical functions, including linear algebra classes, numerical algorithms such as integration and minimization, and various methods for performing regression analysis (fitting). In particular, the RooFit package allows the user to perform complex data modeling and fitting while the RooStats library provides abstractions and implementations for advanced statistical tools. Multivariate classification methods based on machine learning techniques are available via the TMVA package. A central piece in these analysis tools is the set of histogram classes which provide binning of one- and multi-dimensional data. Results can be saved in high-quality graphical formats like Postscript and PDF or in bitmap formats like JPG or GIF. The result can also be stored into ROOT macros that allow a full recreation and rework of the graphics. Users typically create their analysis macros step by step, making use of the interactive C++ interpreter CINT, while running over small data samples. Once the development is finished, they can run these macros at full compiled speed over large data sets, using on-the-fly compilation, or by creating a stand-alone batch program. Finally, if processing farms are available, the user can reduce the execution time of intrinsically parallel tasks - e.g. data mining in HEP - by using PROOF, which will take care of optimally
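The "vertical data storage" that makes TTree fast for statistics is columnar layout: each variable is stored contiguously, so histogramming one variable never touches the others. A language-agnostic sketch of the idea in Python (illustrative only; ROOT's actual containers are C++ and the event fields here are invented):

```python
# Row-wise storage: one record per event; reading a single variable
# still drags every record through memory.
events = [{"pt": 10 + i % 50, "eta": (i % 7) - 3, "phi": i % 4} for i in range(1000)]

# Column-wise ("vertical") storage: one array per variable, so an
# analysis that histograms pt reads only the pt column.
columns = {key: [e[key] for e in events] for key in ("pt", "eta", "phi")}

def histogram(values, nbins, lo, hi):
    """Fixed-width binning of one column, the core job of a histogram class."""
    counts = [0] * nbins
    width = (hi - lo) / nbins
    for v in values:
        if lo <= v < hi:
            counts[int((v - lo) / width)] += 1
    return counts

print(histogram(columns["pt"], nbins=5, lo=10, hi=60))  # [200, 200, 200, 200, 200]
```

Columnar layout also compresses better, since values of one variable are more self-similar than interleaved records, which is part of why the format scales to petabytes.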

  4. A human-centered framework for innovation in conservation incentive programs.

    Science.gov (United States)

    Sorice, Michael G; Donlan, C Josh

    2015-12-01

    The promise of environmental conservation incentive programs that provide direct payments in exchange for conservation outcomes is that they enhance the value of engaging in stewardship behaviors. An insidious but important concern is that a narrow focus on optimizing payment levels can ultimately suppress program participation and subvert participants' internal motivation to engage in long-term conservation behaviors. Increasing participation and engendering stewardship can be achieved by recognizing that participation is not simply a function of the payment; it is a function of the overall structure and administration of the program. Key to creating innovative and more sustainable programs is fitting them within the existing needs and values of target participants. By focusing on empathy for participants, co-designing program approaches, and learning from the rapid prototyping of program concepts, a human-centered approach to conservation incentive program design enhances the propensity for discovery of novel and innovative solutions to pressing conservation issues.

  5. Qualitative analysis of a discrete thermostatted kinetic framework modeling complex adaptive systems

    Science.gov (United States)

    Bianca, Carlo; Mogno, Caterina

    2018-01-01

    This paper deals with the derivation of a new discrete thermostatted kinetic framework for the modeling of complex adaptive systems subjected to external force fields (nonequilibrium system). Specifically, in order to model nonequilibrium stationary states of the system, the external force field is coupled to a dissipative term (thermostat). The well-posedness of the related Cauchy problem is investigated thus allowing the new discrete thermostatted framework to be suitable for the derivation of specific models and the related computational analysis. Applications to crowd dynamics and future research directions are also discussed within the paper.

  6. Analysis and design of Raptor codes using a multi-edge framework

    OpenAIRE

    Jayasooriya, Sachini; Shirvanimoghaddam, Mahyar; Ong, Lawrence; Johnson, Sarah J.

    2017-01-01

    The focus of this paper is on the analysis and design of Raptor codes using a multi-edge framework. In this regard, we first represent the Raptor code as a multi-edge type low-density parity-check (METLDPC) code. This MET representation gives a general framework to analyze and design Raptor codes over a binary input additive white Gaussian noise channel using MET density evolution (MET-DE). We consider a joint decoding scheme based on the belief propagation (BP) decoding for Raptor codes in t...

  7. Integrating Poverty and Environmental Concerns into Value-Chain Analysis: A Strategic Framework and Practical Guide

    DEFF Research Database (Denmark)

    Riisgaard, Lone; Bolwig, Simon; Ponte, Stefano

    2010-01-01

    This article aims to guide the design and implementation of action-research projects in value-chain analysis by presenting a strategic framework focused on small producers and trading and processing firms in developing countries. Its stepwise approach – building on the conceptual framework set out...... in a companion article – covers in detail what to do, questions to be asked and issues to be considered, and integrates poverty, gender, labour and environmental concerns.'Upgrading' strategies potentially available for improving value-chain participation for small producers are identified, with the ultimate...

  8. Utility green pricing programs: a statistical analysis of program effectiveness

    International Nuclear Information System (INIS)

    Ryan, W.; Scott, O.; Lori, B.; Blair, S.

    2005-01-01

    Utility green pricing programs represent one way in which consumers can voluntarily support the development of renewable energy. The design features and effectiveness of these programs vary considerably. Based on a survey of utility program managers in the United States, this article provides insight into which program features might help maximize both customer participation in green pricing programs and the amount of renewable energy purchased by customers in those programs. We find that program length has a substantial impact on customer participation and purchases; to achieve higher levels of success, utilities will need to remain committed to their product offering for some time. Our findings also suggest that utilities should consider higher renewable energy purchase thresholds for residential customers in order to maximize renewable energy sales. Smaller utilities are found to be more successful than larger utilities, and we find some evidence that providing private benefits to nonresidential participants can enhance success. Interestingly, we find little evidence that the cost of the green pricing product greatly impacts customer participation and renewable energy sales, at least over the narrow range of premiums embedded in our data set, and for the initial set of green power purchasers. (author)

  9. Modelling and Analysis of Real Time Systems with Logic Programming and Constraints

    DEFF Research Database (Denmark)

    Banda, Gourinath

    to verify the reactive behaviour of concurrent systems. Computation Tree Logic (CTL) is a temporal property specification language. Logic programming is a general purpose programming language based on predicate logic. In this dissertation, the LHA models are verified by encoding them as constraint logic...... programs. The constraint logic program (CLP) encoding an LHA model is first specialised and then a concrete minimal model (or possibly an abstract minimal model) for the residual program is computed. The abstract minimal model is computed by applying the theory of abstract interpretation. The computed...... by a compiler. To facilitate forward and backward reasoning, two different ways to model an LHA are defined. A framework consisting of general purpose constraint logic program tools is presented to accomplish the reachability analysis to verify a class of safety and liveness properties. A tool to compute...

  10. A program wide framework for evaluating data driven teaching and learning - earth analytics approaches, results and lessons learned

    Science.gov (United States)

    Wasser, L. A.; Gold, A. U.

    2017-12-01

    There is a deluge of earth systems data available to address cutting edge science problems, yet specific skills are required to work with these data. The Earth analytics education program, a core component of Earth Lab at the University of Colorado - Boulder - is building a data intensive program that provides training in realms including 1) interdisciplinary communication and collaboration, 2) earth science domain knowledge including geospatial science and remote sensing, and 3) reproducible, open science workflows ("earth analytics"). The earth analytics program includes an undergraduate internship, undergraduate and graduate level courses, and a professional certificate / degree program. All programs share the goals of preparing a STEM workforce for successful earth analytics driven careers. We are developing a program-wide evaluation framework that assesses the effectiveness of data intensive instruction combined with domain science learning, to better understand and improve data-intensive teaching approaches using blends of online, in situ, asynchronous and synchronous learning. We are using targeted online search engine optimization (SEO) to increase visibility and in turn program reach. Finally, our design targets longitudinal program impacts on participant career tracks over time. Here we present results from evaluation of both an interdisciplinary undergraduate / graduate level earth analytics course and an undergraduate internship. Early results suggest that a blended approach to learning and teaching, combining synchronous in-person teaching and active hands-on classroom learning with asynchronous learning in the form of online materials, leads to student success. Further, we will present our model for longitudinal tracking of participants' career focus over time to better understand long-term program impacts. We also demonstrate the impact of SEO optimization on online content reach and program visibility.

  11. A farm-level precision land management framework based on integer programming.

    Science.gov (United States)

    Li, Qi; Hu, Guiping; Jubery, Talukder Zaki; Ganapathysubramanian, Baskar

    2017-01-01

    Farmland management involves several planning and decision making tasks, including seed selection and irrigation management. A farm-level precision farmland management model based on mixed integer linear programming is proposed in this study. Optimal decisions are designed for pre-season planning of crops and irrigation water allocation. The model captures the effect of the size and shape of the decision scale as well as special irrigation patterns. The authors illustrate the model with a case study on a farm in the state of California in the U.S. and show the model can capture the impact of precision farm management on profitability. The results show that a threefold increase in annual net profit for farmers could be achieved through careful irrigation and seed selection. Although farmers could increase profits by applying precision management to seed or irrigation alone, the profit increase is more significant if farmers apply precision management to seed and irrigation simultaneously. The proposed model can also serve as a risk analysis tool for farmers facing seasonal irrigation water limits, as well as a quantitative tool to explore the impact of precision agriculture.
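
    The flavor of such an integer program can be conveyed by a deliberately tiny toy problem (all crops, profits, and water figures below are invented for illustration; the paper's model is far richer and would be handled by an MILP solver rather than enumeration):

```python
from itertools import product

# Hypothetical toy farm plan: for each of three fields, pick a crop
# (0 = corn, 1 = wheat) and whether to irrigate it, maximizing profit
# subject to a total irrigation-water budget.
PROFIT = {                       # (crop, irrigated) -> profit per field ($k)
    (0, False): 3, (0, True): 8,
    (1, False): 4, (1, True): 6,
}
WATER_USE = {0: 2, 1: 1}         # acre-feet needed if the crop is irrigated
WATER_BUDGET = 3

def plan_fields(n_fields=3):
    """Brute-force the 0-1 program: argmax profit s.t. water <= budget."""
    best = (float("-inf"), None)
    for crops in product([0, 1], repeat=n_fields):
        for irrig in product([False, True], repeat=n_fields):
            water = sum(WATER_USE[c] for c, w in zip(crops, irrig) if w)
            if water <= WATER_BUDGET:
                profit = sum(PROFIT[(c, w)] for c, w in zip(crops, irrig))
                best = max(best, (profit, (crops, irrig)))
    return best

best_profit, best_plan = plan_fields()
# best_profit == 18: precision choices beat the all-dry baseline of 12
```

    The exhaustive search is only viable because the toy has 64 candidate plans; a realistic farm-level model with many fields, crops, and irrigation patterns is exactly why the authors formulate it as a mixed integer linear program.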

  12. An unsupervised feature learning framework for basal cell carcinoma image analysis.

    Science.gov (United States)

    Arevalo, John; Cruz-Roa, Angel; Arias, Viviana; Romero, Eduardo; González, Fabio A

    2015-06-01

    The paper addresses the problem of automatic detection of basal cell carcinoma (BCC) in histopathology images. In particular, it proposes a framework to both learn the image representation in an unsupervised way and visualize discriminative features supported by the learned model. This paper presents an integrated unsupervised feature learning (UFL) framework for histopathology image analysis that comprises three main stages: (1) local (patch) representation learning using different strategies (sparse autoencoders, reconstruction independent component analysis, and topographic independent component analysis (TICA)), (2) global (image) representation learning using a bag-of-features representation or a convolutional neural network, and (3) a visual interpretation layer to highlight the most discriminant regions detected by the model. The integrated unsupervised feature learning framework was exhaustively evaluated on a histopathology image dataset for BCC diagnosis. The experimental evaluation produced a classification performance of 98.1%, in terms of the area under the receiver-operating-characteristic curve, for the proposed framework, outperforming by 7% the state-of-the-art discrete cosine transform patch-based representation. The proposed UFL-representation-based approach outperforms state-of-the-art methods for BCC detection. Thanks to its visual interpretation layer, the method is able to highlight discriminative tissue regions, providing better diagnosis support. Among the different UFL strategies tested, TICA-learned features exhibited the best performance thanks to their ability to capture low-level invariances, which are inherent to the nature of the problem. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Supportive supervision and constructive relationships with healthcare workers support CHW performance: Use of a qualitative framework to evaluate CHW programming in Uganda.

    Science.gov (United States)

    Ludwick, Teralynn; Turyakira, Eleanor; Kyomuhangi, Teddy; Manalili, Kimberly; Robinson, Sheila; Brenner, Jennifer L

    2018-02-13

    While evidence supports community health worker (CHW) capacity to improve maternal and newborn health in less-resourced countries, key implementation gaps remain. Tools for assessing CHW performance and evidence on what programmatic components affect performance are lacking. This study developed and tested a qualitative evaluative framework and tool to assess CHW team performance in a district program in rural Uganda. A new assessment framework was developed to collect and analyze qualitative evidence based on CHW perspectives on seven program components associated with effectiveness (selection; training; community embeddedness; peer support; supportive supervision; relationship with other healthcare workers; retention and incentive structures). Focus groups were conducted with four high/medium-performing CHW teams and four low-performing CHW teams selected through random, stratified sampling. Content analysis involved organizing focus group transcripts according to the seven program effectiveness components, and assigning scores to each component per focus group. Four components, 'supportive supervision', 'good relationships with other healthcare workers', 'peer support', and 'retention and incentive structures' received the lowest overall scores. Variances in scores between 'high'/'medium'- and 'low'-performing CHW teams were largest for 'supportive supervision' and 'good relationships with other healthcare workers.' Our analysis suggests that in the Bushenyi intervention context, CHW team performance is highly correlated with the quality of supervision and relationships with other healthcare workers. CHWs identified key performance-related issues of absentee supervisors, referral system challenges, and lack of engagement/respect by health workers. Other less-correlated program components warrant further study and may have been impacted by relatively consistent program implementation within our limited study area. Applying process-oriented measurement tools are

  14. DATA MONITORING AND ANALYSIS PROGRAM MANUAL

    Energy Technology Data Exchange (ETDEWEB)

    Gravois, Melanie

    2007-07-06

    This procedure provides guidelines and techniques for analyzing and trending data using statistical methods for Lawrence Berkeley National Laboratory (LBNL). This procedure outlines the steps used in data analysis and trending. It includes guidelines for performing data analysis and for monitoring (or controlling) processes using performance indicators. This procedure is used when trending and analyzing item characteristics and reliability, process implementation, and other quality-related information to identify items, services, activities, and processes needing improvement, in accordance with 10 CFR Part 830, Subpart A, U.S. Department of Energy (DOE) Order 414.1C, and University of California (UC) Assurance Plan for LBNL. Trend codes, outlined in Attachment 4, are assigned to issues at the time of initiation and entry into the Corrective Action Tracking System (CATS) database in accordance with LBNL/PUB-5519 (1), Issues Management Program Manual. Throughout this procedure, the term performance is used to encompass all aspects of performance including quality, timeliness, efficiency, effectiveness, and reliability. Data analysis tools are appropriate whenever quantitative information describing the performance of an item, service, or process can be obtained.
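
    Monitoring a performance indicator against control limits, as this kind of procedure describes, can be sketched generically (a standard mean ± 3-sigma rule on invented data; this is a textbook illustration, not a method taken from the LBNL manual):

```python
def control_limits(history):
    """Lower/upper control limits: mean +/- 3 sigma over baseline data."""
    n = len(history)
    mean = sum(history) / n
    variance = sum((x - mean) ** 2 for x in history) / n  # population sigma
    sigma = variance ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

def flag_out_of_control(history, new_points):
    """Return the new observations falling outside the control limits."""
    lo, hi = control_limits(history)
    return [x for x in new_points if x < lo or x > hi]

# Hypothetical indicator: weekly count of a tracked issue category.
baseline = [10, 12, 11, 9, 10, 11, 10, 12, 9, 11]
flagged = flag_out_of_control(baseline, [10, 11, 25, 8])
# flagged == [25]: only the spike exceeds the 3-sigma band
```

    A flagged point would then feed the trending and corrective-action workflow (e.g. entry into a tracking system), while in-band variation is treated as noise.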

  15. Critical analysis of e-health readiness assessment frameworks: suitability for application in developing countries.

    Science.gov (United States)

    Mauco, Kabelo Leonard; Scott, Richard E; Mars, Maurice

    2018-02-01

    Introduction e-Health is an innovative way to make health services more effective and efficient and application is increasing worldwide. e-Health represents a substantial ICT investment and its failure usually results in substantial losses in time, money (including opportunity costs) and effort. Therefore it is important to assess e-health readiness prior to implementation. Several frameworks have been published on e-health readiness assessment, under various circumstances and geographical regions of the world. However, their utility for the developing world is unknown. Methods A literature review and analysis of published e-health readiness assessment frameworks or models was performed to determine if any are appropriate for broad assessment of e-health readiness in the developing world. A total of 13 papers described e-health readiness in different settings. Results and Discussion Eight types of e-health readiness were identified and no paper directly addressed all of these. The frameworks were based upon varying assumptions and perspectives. There was no underlying unifying theory underpinning the frameworks. Few assessed government and societal readiness, and none cultural readiness; all are important in the developing world. While the shortcomings of existing frameworks have been highlighted, most contain aspects that are relevant and can be drawn on when developing a framework and assessment tools for the developing world. What emerged is the need to develop different assessment tools for the various stakeholder sectors. This is an area that needs further research before attempting to develop a more generic framework for the developing world.

  16. Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss

    7.1.7 Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials. Khara D. Grieger1, Igor Linkov2, Steffen Foss Hansen1, Anders Baun1. 1Technical University of Denmark, Kgs. Lyngby, Denmark; 2Environmental Laboratory, U.S. Army Corps of Engineers, Brookline, USA...... Email: kdg@env.dtu.dk. Scientists, organizations, governments, and policy-makers are currently involved in reviewing, adapting, and formulating risk assessment frameworks and strategies to understand and assess the potential environmental risks of engineered nanomaterials (NM). It is becoming...... increasingly apparent that approaches aimed at ultimately fulfilling standard, quantitative environmental risk assessment for NM are likely to be not only extremely challenging but also resource- and time-consuming. In response, a number of alternative or complementary frameworks and approaches...

  17. PageRank, HITS and a unified framework for link analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Chris; He, Xiaofeng; Husbands, Parry; Zha, Hongyuan; Simon, Horst

    2001-10-01

    Two popular webpage ranking algorithms are HITS and PageRank. HITS emphasizes mutual reinforcement between authority and hub webpages, while PageRank emphasizes hyperlink weight normalization and web surfing based on random walk models. We systematically generalize/combine these concepts into a unified framework. The ranking framework contains a large algorithm space; HITS and PageRank are two extreme ends in this space. We study several normalized ranking algorithms which are intermediate between HITS and PageRank, and obtain closed-form solutions. We show that, to first order approximation, all ranking algorithms in this framework, including PageRank and HITS, lead to the same ranking, which is highly correlated with ranking by indegree. These results support the notion that in web resource ranking indegree and outdegree are of fundamental importance. Rankings of webgraphs of different sizes and queries are presented to illustrate our analysis.
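
    The random-walk view of PageRank mentioned above can be illustrated with a short power-iteration routine on a toy graph (a generic textbook implementation, not the unified framework of the paper; the graph and damping value are illustrative):

```python
def pagerank(links, damping=0.85, iters=100):
    """Power iteration for PageRank on an adjacency dict {node: [out-links]}."""
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in links.items():
            if outs:
                # hyperlink weight normalization: split rank over out-links
                share = damping * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:
                # dangling node: redistribute its rank uniformly
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank

toy = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(toy)
# ranks sum to ~1; "c" collects the most rank (linked by both "a" and "b")
```

    Note how "c", with the highest indegree, also gets the highest rank, which is consistent with the paper's observation that rankings in this family correlate strongly with indegree.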

  18. Using the framework method for the analysis of qualitative data in multi-disciplinary health research.

    Science.gov (United States)

    Gale, Nicola K; Heath, Gemma; Cameron, Elaine; Rashid, Sabina; Redwood, Sabi

    2013-09-18

    The Framework Method is becoming an increasingly popular approach to the management and analysis of qualitative data in health research. However, there is confusion about its potential application and limitations. The article discusses when it is appropriate to adopt the Framework Method and explains the procedure for using it in multi-disciplinary health research teams, or those that involve clinicians, patients and lay people. The stages of the method are illustrated using examples from a published study. Used effectively, with the leadership of an experienced qualitative researcher, the Framework Method is a systematic and flexible approach to analysing qualitative data and is appropriate for use in research teams even where not all members have previous experience of conducting qualitative research.

  19. Vital analysis: field validation of a framework for annotating biological signals of first responders in action.

    Science.gov (United States)

    Gomes, P; Lopes, B; Coimbra, M

    2012-01-01

    First responders are professionals that are exposed to extreme stress and fatigue during extended periods of time. That is why it is necessary to research and develop technological solutions based on wearable sensors that can continuously monitor the health of these professionals in action, namely their stress and fatigue levels. In this paper we present the Vital Analysis smartphone-based framework, integrated into the broader Vital Responder project, that allows the annotation and contextualization of the signals collected during real action. After a contextual study we have implemented and deployed this framework in a firefighter team with 5 elements, from where we have collected over 3300 hours of annotations during 174 days, covering 382 different events. Results are analysed and discussed, validating the framework as a useful and usable tool for annotating biological signals of first responders in action.

  20. Big Data Analysis Framework for Healthcare and Social Sectors in Korea

    Science.gov (United States)

    Song, Tae-Min

    2015-01-01

    Objectives We reviewed applications of big data analysis of healthcare and social services in developed countries, and subsequently devised a framework for such an analysis in Korea. Methods We reviewed the status of implementing big data analysis of health care and social services in developed countries, and strategies used by the Ministry of Health and Welfare of Korea (Government 3.0). We formulated a conceptual framework of big data in the healthcare and social service sectors at the national level. As a specific case, we designed a process and method of social big data analysis on suicide buzz. Results Developed countries (e.g., the United States, the UK, Singapore, Australia, and even OECD and EU) are emphasizing the potential of big data, and using it as a tool to solve their long-standing problems. Big data strategies for the healthcare and social service sectors were formulated based on an ICT-based policy of current government and the strategic goals of the Ministry of Health and Welfare. We suggest a framework of big data analysis in the healthcare and welfare service sectors separately and assigned them tentative names: 'health risk analysis center' and 'integrated social welfare service network'. A framework of social big data analysis is presented by applying it to the prevention and proactive detection of suicide in Korea. Conclusions There are some concerns with the utilization of big data in the healthcare and social welfare sectors. Thus, research on these issues must be conducted so that sophisticated and practical solutions can be reached. PMID:25705552

  1. Design and architecture of the Mars relay network planning and analysis framework

    Science.gov (United States)

    Cheung, K. M.; Lee, C. H.

    2002-01-01

    In this paper we describe the design and architecture of the Mars Network planning and analysis framework that supports generation and validation of efficient planning and scheduling strategies. The goals are to minimize the transmitting time, minimize the delay time, and/or maximize the network throughput. The proposed framework would require (1) a client-server architecture to support interactive, batch, WEB, and distributed analysis and planning applications for the relay network analysis scheme, (2) a high-fidelity modeling and simulation environment that expresses spacecraft-to-spacecraft and spacecraft-to-Earth-station link capabilities as time-varying resources, and spacecraft activities, link priority, Solar System dynamic events, the laws of orbital mechanics, and other limiting factors such as spacecraft power and thermal constraints, and (3) an optimization methodology that casts the resource and constraint models into a standard linear and nonlinear constrained optimization problem that lends itself to commercial off-the-shelf (COTS) planning and scheduling algorithms.

  2. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    International Nuclear Information System (INIS)

    Agostini, M; Pandola, L; Zavarise, P; Volynets, O

    2011-01-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.

  3. Using an intervention mapping framework to develop an online mental health continuing education program for pharmacy staff.

    Science.gov (United States)

    Wheeler, Amanda; Fowler, Jane; Hattingh, Laetitia

    2013-01-01

    Current mental health policy in Australia recognizes that ongoing mental health workforce development is crucial to mental health care reform. Community pharmacy staff are well placed to assist people with mental illness living in the community; however, staff require the knowledge and skills to do this competently and effectively. This article presents the systematic planning and development process and content of an education and training program for community pharmacy staff, using a program planning approach called intervention mapping. The intervention mapping framework was used to guide development of an online continuing education program. Interviews with mental health consumers and carers (n = 285) and key stakeholders (n = 15), and a survey of pharmacy staff (n = 504) informed the needs assessment. Program objectives were identified specifying required attitudes, knowledge, skills, and confidence. These objectives were aligned with an education technique and delivery strategy. This was followed by development of an education program and comprehensive evaluation plan. The program was piloted face to face with 24 participants and then translated into an online program comprising eight 30-minute modules for pharmacists, four of which were also used for support staff. The evaluation plan provided for online participants (n ≅ 500) to be randomized into intervention (immediate access) or control groups (delayed training access). It included pre- and posttraining questionnaires and a reflective learning questionnaire for pharmacy staff, and telephone interviews post pharmacy visit for consumers and carers. An online education program was developed to address the mental health knowledge, attitudes, confidence, and skills required by pharmacy staff to work effectively with mental health consumers and carers. Intervention mapping provides a systematic and rigorous approach that can be used to develop a quality continuing education program for the health workforce.

  4. Innovation and entrepreneurship programs in US medical education: a landscape review and thematic analysis.

    Science.gov (United States)

    Niccum, Blake A; Sarker, Arnab; Wolf, Stephen J; Trowbridge, Matthew J

    2017-01-01

    Training in innovation and entrepreneurship (I&E) in medical education has become increasingly prevalent among medical schools to train students in complex problem solving and solution design. We aim to characterize I&E education in US allopathic medical schools to provide insight into the features and objectives of this growing field. I&E programs were identified in 2016 via structured searches of 158 US allopathic medical school websites. Program characteristics were identified from public program resources and structured phone interviews with program directors. Curricular themes were identified via thematic analysis of program resources, and themes referenced by >50% of programs were analyzed. Thirteen programs were identified. Programs had a median age of four years, and contained a median of 13 students. Programs were led by faculty from diverse professional backgrounds, and all awarded formal recognition to graduates. Nine programs spanned all four years of medical school and ten programs required a capstone project. Thematic analysis revealed seven educational themes (innovation, entrepreneurship, technology, leadership, healthcare systems, business of medicine, and enhanced adaptability) and two teaching method themes (active learning, interdisciplinary teaching) referenced by >50% of programs. The landscape of medical school I&E programs is rapidly expanding to address newfound skills needed by physicians due to ongoing changes in healthcare, but programs remain relatively few and small compared to class size. This landscape analysis is the first review of I&E in medical education and may contribute to development of a formal educational framework or competency model for current or future programs. AAMC: Association of American Medical Colleges; AMA: American Medical Association; I&E: Innovation and entrepreneurship.

  5. Building a framework for global health learning: an analysis of global health concentrations in Canadian medical schools.

    Science.gov (United States)

    Watterson, Rita; Matthews, David; Bach, Paxton; Kherani, Irfan; Halpine, Mary; Meili, Ryan

    2015-04-01

    This study set out to explore the current state of global health concentrations in Canadian medical schools and to solicit feedback on the barriers and challenges to implementing rigorous global health concentration programs. A set of consensus guidelines for global health concentrations was drafted through consultation with student and faculty leaders across Canada between May 2011 and May 2012. Drawing on these guidelines, a formal survey was sent to prominent faculty at each of the 14 English-speaking Canadian medical schools. A thematic analysis of the results was then conducted. Overall, the guidelines were strongly endorsed. A majority of Canadian medical schools have programs in place to offer global health course work, extracurricular learning opportunities, local community service-learning, low-resource-setting clinical electives, predeparture training, and postreturn debriefing. Although student evaluation, global health mentorship, and knowledge translation projects were endorsed as important components, few schools had been successful in implementing them. Language training for global health remains contested. Other common critiques included a lack of time and resources, and difficulties in setting standards for student evaluation. The results suggest that these guidelines are appropriate and, at least for the major criteria, achievable. Although many Canadian schools offer individual components, the majority of schools have yet to develop formally structured concentration programs. By better articulating guidelines, a standardized framework can aid in the establishment and refinement of future programs.

  6. Does the knowledge-to-action (KTA) framework facilitate physical demands analysis development for firefighter injury management and return-to-work planning?

    Science.gov (United States)

    Sinden, Kathryn; MacDermid, Joy C

    2014-03-01

    Employers are tasked with developing injury management and return-to-work (RTW) programs in response to occupational health and safety policies. Physical demands analyses (PDAs) are the cornerstone of injury management and RTW development. Synthesizing and contextualizing policy knowledge for use in occupational program development, including PDAs, is challenging due to multiple stakeholder involvement. Few studies have used a knowledge translation theoretical framework to facilitate policy-based interventions in occupational contexts. The primary aim of this case study was to identify how constructs of the knowledge-to-action (KTA) framework were reflected in employer stakeholder-researcher collaborations during development of a firefighter PDA. Four stakeholder meetings were conducted with employee participants who had experience using PDAs in their occupational role. Directed content analysis informed analyses of meeting minutes, stakeholder views and personal reflections recorded throughout the case. Existing knowledge sources including local data, stakeholder experiences, policies and priorities were synthesized and tailored to develop a PDA in response to the barriers and facilitators identified by the firefighters. The flexibility of the KTA framework and the synthesis of multiple knowledge sources were identified as strengths. The KTA Action cycle was useful in directing the overall process but insufficient for directing the specific aspects of PDA development. Integration of specific PDA guidelines into the process provided explicit direction on best practices in tailoring the PDA and knowledge synthesis. Although the themes of the KTA framework were confirmed in our analysis, reordering of the KTA components was required. Despite a complex context with divergent perspectives, successful implementation of a draft PDA was achieved. The KTA framework facilitated knowledge synthesis and PDA development but specific standards and modifications to the KTA

  7. Key Interactions for Online Programs between Faculty, Students, Technologies, and Educational Institutions: A Holistic Framework

    Science.gov (United States)

    Paul, Jomon Aliyas; Cochran, Justin Daniel

    2013-01-01

    Online education is becoming increasingly popular among both traditional and nontraditional students. Students gravitate to the flexibility of online courses, which allows them to work around jobs, family, and other responsibilities. While online program growth continues, these programs present several new challenges to educational institutions…

  8. An Analytical Framework for Internationalization through English-Taught Degree Programs: A Dutch Case Study

    Science.gov (United States)

    Kotake, Masako

    2017-01-01

    The growing importance of internationalization and the global dominance of English in higher education mean pressures on expanding English-taught degree programs (ETDPs) in non-English-speaking countries. Strategic considerations are necessary to successfully integrate ETDPs into existing programs and to optimize the effects of…

  9. Mapping Culturally Relevant Pedagogy into Teacher Education Programs: A Critical Framework

    Science.gov (United States)

    Allen, Ayana; Hancock, Stephen D.; W. Lewis, Chance; Starker-Glass, Tehia

    2017-01-01

    Background/Context: Teacher education programs are charged with the daunting task of preparing the next generation of teachers. However, the extant literature has documented that teacher education programs have struggled to effectively arm teacher candidates with effective pedagogies to meet the needs of our increasingly diverse student…

  10. Meteor studies in the framework of the JEM-EUSO program

    Science.gov (United States)

    Abdellaoui, G.; Abe, S.; Acheli, A.; Adams, J. H.; Ahmad, S.; Ahriche, A.; Albert, J.-N.; Allard, D.; Alonso, G.; Anchordoqui, L.; Andreev, V.; Anzalone, A.; Aouimeur, W.; Arai, Y.; Arsene, N.; Asano, K.; Attallah, R.; Attoui, H.; Ave Pernas, M.; Bacholle, S.; Bakiri, M.; Baragatti, P.; Barrillon, P.; Bartocci, S.; Batsch, T.; Bayer, J.; Bechini, R.; Belenguer, T.; Bellotti, R.; Belov, A.; Belov, K.; Benadda, B.; Benmessai, K.; Berlind, A. A.; Bertaina, M.; Biermann, P. L.; Biktemerova, S.; Bisconti, F.; Blanc, N.; Błȩcki, J.; Blin-Bondil, S.; Bobik, P.; Bogomilov, M.; Bonamente, M.; Boudaoud, R.; Bozzo, E.; Briggs, M. S.; Bruno, A.; Caballero, K. S.; Cafagna, F.; Campana, D.; Capdevielle, J.-N.; Capel, F.; Caramete, A.; Caramete, L.; Carlson, P.; Caruso, R.; Casolino, M.; Cassardo, C.; Castellina, A.; Castellini, G.; Catalano, C.; Catalano, O.; Cellino, A.; Chikawa, M.; Chiritoi, G.; Christl, M. J.; Connaughton, V.; Conti, L.; Cordero, G.; Crawford, H. J.; Cremonini, R.; Csorna, S.; Dagoret-Campagne, S.; De Donato, C.; de la Taille, C.; De Santis, C.; del Peral, L.; Di Martino, M.; Djemil, T.; Djenas, S. A.; Dulucq, F.; Dupieux, M.; Dutan, I.; Ebersoldt, A.; Ebisuzaki, T.; Engel, R.; Eser, J.; Fang, K.; Fenu, F.; Fernández-González, S.; Fernández-Soriano, J.; Ferrarese, S.; Finco, D.; Flamini, M.; Fornaro, C.; Fouka, M.; Franceschi, A.; Franchini, S.; Fuglesang, C.; Fujimoto, J.; Fukushima, M.; Galeotti, P.; García-Ortega, E.; Garipov, G.; Gascón, E.; Geary, J.; Gelmini, G.; Genci, J.; Giraudo, G.; Gonchar, M.; González Alvarado, C.; Gorodetzky, P.; Guarino, F.; Guehaz, R.; Guzmán, A.; Hachisu, Y.; Haiduc, M.; Harlov, B.; Haungs, A.; Hernández Carretero, J.; Hidber, W.; Higashide, K.; Ikeda, D.; Ikeda, H.; Inoue, N.; Inoue, S.; Isgrò, F.; Itow, Y.; Jammer, T.; Joven, E.; Judd, E. 
G.; Jung, A.; Jochum, J.; Kajino, F.; Kajino, T.; Kalli, S.; Kaneko, I.; Kang, D.; Kanouni, F.; Karadzhov, Y.; Karczmarczyk, J.; Karus, M.; Katahira, K.; Kawai, K.; Kawasaki, Y.; Kedadra, A.; Khales, H.; Khrenov, B. A.; Kim, Jeong-Sook; Kim, Soon-Wook; Kim, Sug-Whan; Kleifges, M.; Klimov, P. A.; Kolev, D.; Kreykenbohm, I.; Kudela, K.; Kurihara, Y.; Kusenko, A.; Kuznetsov, E.; Lacombe, M.; Lachaud, C.; Lahmar, H.; Lakhdari, F.; Larsson, O.; Lee, J.; Licandro, J.; Lim, H.; López Campano, L.; Maccarone, M. C.; Mackovjak, S.; Mahdi, M.; Maravilla, D.; Marcelli, L.; Marcos, J. L.; Marini, A.; Martens, K.; Martín, Y.; Martinez, O.; Masciantonio, G.; Mase, K.; Matev, R.; Matthews, J. N.; Mebarki, N.; Medina-Tanco, G.; Mehrad, L.; Mendoza, M. A.; Merino, A.; Mernik, T.; Meseguer, J.; Messaoud, S.; Micu, O.; Mimouni, J.; Miyamoto, H.; Miyazaki, Y.; Mizumoto, Y.; Modestino, G.; Monaco, A.; Monnier-Ragaigne, D.; Morales de los Ríos, J. A.; Moretto, C.; Morozenko, V. S.; Mot, B.; Murakami, T.; Nadji, B.; Nagano, M.; Nagata, M.; Nagataki, S.; Nakamura, T.; Napolitano, T.; Nardelli, A.; Naumov, D.; Nava, R.; Neronov, A.; Nomoto, K.; Nonaka, T.; Ogawa, T.; Ogio, S.; Ohmori, H.; Olinto, A. V.; Orleański, P.; Osteria, G.; Painter, W.; Panasyuk, M. I.; Panico, B.; Parizot, E.; Park, I. H.; Park, H. W.; Pastircak, B.; Patzak, T.; Paul, T.; Pennypacker, C.; Perdichizzi, M.; Pérez-Grande, I.; Perfetto, F.; Peter, T.; Picozza, P.; Pierog, T.; Pindado, S.; Piotrowski, L. W.; Piraino, S.; Placidi, L.; Plebaniak, Z.; Pliego, S.; Pollini, A.; Popescu, E. M.; Prat, P.; Prévôt, G.; Prieto, H.; Putis, M.; Rabanal, J.; Radu, A. A.; Rahmani, M.; Reardon, P.; Reyes, M.; Rezazadeh, M.; Ricci, M.; Rodríguez Frías, M. D.; Ronga, F.; Roth, M.; Rothkaehl, H.; Roudil, G.; Rusinov, I.; Rybczyński, M.; Sabau, M. D.; Sáez Cano, G.; Sagawa, H.; Sahnoune, Z.; Saito, A.; Sakaki, N.; Sakata, M.; Salazar, H.; Sanchez, J. C.; Sánchez, J. 
L.; Santangelo, A.; Santiago Crúz, L.; Sanz-Andrés, A.; Sanz Palomino, M.; Saprykin, O.; Sarazin, F.; Sato, H.; Sato, M.; Schanz, T.; Schieler, H.; Scotti, V.; Segreto, A.; Selmane, S.; Semikoz, D.; Serra, M.; Sharakin, S.; Shibata, T.; Shimizu, H. M.; Shinozaki, K.; Shirahama, T.; Siemieniec-Oziȩbło, G.; Sledd, J.; Słomińska, K.; Sobey, A.; Stan, I.; Sugiyama, T.; Supanitsky, D.; Suzuki, M.; Szabelska, B.; Szabelski, J.; Tahi, H.; Tajima, F.; Tajima, N.; Tajima, T.; Takahashi, Y.; Takami, H.; Takeda, M.; Takizawa, Y.; Talai, M. C.; Tenzer, C.; Tibolla, O.; Tkachev, L.; Tokuno, H.; Tomida, T.; Tone, N.; Toscano, S.; Traïche, M.; Tsenov, R.; Tsunesada, Y.; Tsuno, K.; Tymieniecka, T.; Uchihori, Y.; Unger, M.; Vaduvescu, O.; Valdés-Galicia, J. F.; Vallania, P.; Vankova, G.; Vigorito, C.; Villaseñor, L.; Vlcek, B.; von Ballmoos, P.; Vrabel, M.; Wada, S.; Watanabe, J.; Watanabe, S.; Watts, J., Jr.; Weber, M.; Weigand Muñoz, R.; Weindl, A.; Weiler, T. J.; Wibig, T.; Wiencke, L.; Wille, M.; Wilms, J.; Włodarczyk, Z.; Yamamoto, T.; Yamamoto, Y.; Yang, J.; Yano, H.; Yashin, I. V.; Yonetoku, D.; Yoshida, S.; Young, R.; Zgura, I. S.; Zotov, M. Yu.; Zuccaro Marchi, A.

    2017-09-01

    We summarize the state of the art of a program of UV observations from space of meteor phenomena, a secondary objective of the JEM-EUSO international collaboration. Our preliminary analysis indicates that JEM-EUSO, taking advantage of its large FOV and good sensitivity, should be able to detect meteors down to absolute magnitude close to 7. This means that JEM-EUSO should be able to record a statistically significant flux of meteors, including both sporadic ones, and events produced by different meteor streams. Being unaffected by adverse weather conditions, JEM-EUSO can also be a very important facility for the detection of bright meteors and fireballs, as these events can be detected even in conditions of very high sky background. In the case of bright events, moreover, exhibiting some persistence of the meteor train, preliminary simulations show that it should be possible to exploit the motion of the ISS itself and derive at least a rough 3D reconstruction of the meteor trajectory. Moreover, the observing strategy developed to detect meteors may also be applied to the detection of nuclearites, exotic particles whose existence has been suggested by some theoretical investigations. Nuclearites are expected to move at higher velocities than meteoroids, and to exhibit a wider range of possible trajectories, including particles moving upward after crossing the Earth. Some pilot studies, including the approved Mini-EUSO mission, a precursor of JEM-EUSO, are currently operational or in preparation. We are doing simulations to assess the performance of Mini-EUSO for meteor studies, while a few meteor events have been already detected using the ground-based facility EUSO-TA.

  11. Adaptive Fuzzy Consensus Clustering Framework for Clustering Analysis of Cancer Data.

    Science.gov (United States)

    Yu, Zhiwen; Chen, Hantao; You, Jane; Liu, Jiming; Wong, Hau-San; Han, Guoqiang; Li, Le

    2015-01-01

    Performing clustering analysis is one of the important research topics in cancer discovery using gene expression profiles, which is crucial in facilitating the successful diagnosis and treatment of cancer. While there are quite a number of research works which perform tumor clustering, few of them consider how to incorporate fuzzy theory together with an optimization process into a consensus clustering framework to improve the performance of clustering analysis. In this paper, we first propose a random double clustering based cluster ensemble framework (RDCCE) to perform tumor clustering based on gene expression data. Specifically, RDCCE generates a set of representative features using a randomly selected clustering algorithm in the ensemble, and then assigns samples to their corresponding clusters based on the grouping results. In addition, we also introduce the random double clustering based fuzzy cluster ensemble framework (RDCFCE), which is designed to improve the performance of RDCCE by integrating the newly proposed fuzzy extension model into the ensemble framework. RDCFCE adopts the normalized cut algorithm as the consensus function to summarize the fuzzy matrices generated by the fuzzy extension models, partition the consensus matrix, and obtain the final result. Finally, adaptive RDCFCE (A-RDCFCE) is proposed to optimize RDCFCE and improve its performance further by adopting a self-evolutionary process (SEPP) for the parameter set. Experiments on real cancer gene expression profiles indicate that RDCFCE and A-RDCFCE work well on these data sets and outperform most of the state-of-the-art tumor clustering algorithms.
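    The co-association idea behind such cluster ensembles, pooling many base clusterings into an agreement matrix and then partitioning that matrix, can be sketched generically. The sketch below is not the authors' RDCCE/RDCFCE code: it substitutes a simple thresholded connected-components cut for the normalized-cut consensus function, and all names and the toy ensemble are illustrative.

```python
import numpy as np

def coassociation_matrix(labelings):
    """Average agreement matrix over an ensemble of clusterings.
    Entry (i, j) = fraction of base clusterings placing i and j together."""
    n = len(labelings[0])
    M = np.zeros((n, n))
    for labels in labelings:
        labels = np.asarray(labels)
        M += (labels[:, None] == labels[None, :]).astype(float)
    return M / len(labelings)

def consensus_clusters(M, threshold=0.5):
    """Greedy consensus: connected components of the thresholded
    co-association graph (a simple stand-in for normalized cut)."""
    n = M.shape[0]
    adj = M >= threshold
    labels = -np.ones(n, dtype=int)
    current = 0
    for seed in range(n):
        if labels[seed] >= 0:
            continue
        stack = [seed]
        labels[seed] = current
        while stack:
            i = stack.pop()
            for j in np.flatnonzero(adj[i]):
                if labels[j] < 0:
                    labels[j] = current
                    stack.append(j)
        current += 1
    return labels

# Toy ensemble: three base clusterings of six samples.
ensemble = [
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1, 1],
    [0, 0, 0, 0, 1, 1],
]
M = coassociation_matrix(ensemble)
final = consensus_clusters(M, threshold=0.8)
```

    Samples that every base clustering groups together end up in the same consensus cluster; the threshold controls how much ensemble agreement is required to merge.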

  12. Establishing a framework for a physician assistant/bioethics dual degree program.

    Science.gov (United States)

    Carr, Mark F; Bergman, Brett A

    2014-01-01

    Numerous medical schools currently offer a master of arts (MA) in bioethics dual degree for physicians. A degree in bioethics enhances the care physicians provide to patients and prepares physicians to serve on ethics committees and consult services. Additionally, they may work on institutional and public policy issues related to ethics. Several physician assistant (PA) programs currently offer a master of public health (MPH) dual degree for PAs. A degree in public health prepares PAs for leadership roles in meeting community health needs. With the success of PA/MPH dual degree programs, we argue here that a PA/bioethics dual degree would be another opportunity to advance the PA profession and consider how such a program might be implemented. The article includes the individual perspectives of the authors, one of whom completed a graduate-level certificate in bioethics concurrently with his 2-year PA program, while the other served as a bioethics program director.

  13. Wide Area Recovery and Resiliency Program (WARRP) Attachment 2 - All-Hazards Regional Recovery Framework Template

    Science.gov (United States)

    2012-11-01

    Keywords: Chemical Terrorism, Biological Terrorism, Radiological Terrorism, Lessons Learned. Contents [Insert contents and lists of tables and figures once framework has been completed] Executive Summary [Summarize the purpose,

  14. Can Programming Frameworks Bring Smartphones into the Mainstream of Psychological Science?

    OpenAIRE

    Piwek, L; Ellis, DA; Andrews, S

    2016-01-01

    Smartphones continue to provide huge potential for psychological science and the advent of novel research frameworks brings new opportunities for researchers who have previously struggled to develop smartphone applications. However, despite this renewed promise, smartphones have failed to become a standard item within psychological research. Here we consider the key issues that continue to limit smartphone adoption within psychological science and how these barriers might be diminishing in li...

  15. Mississippi Curriculum Framework for Computer Information Systems Technology. Computer Information Systems Technology (Program CIP: 52.1201--Management Information Systems & Business Data). Computer Programming (Program CIP: 52.1201). Network Support (Program CIP: 52.1290--Computer Network Support Technology). Postsecondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…

  16. Recent advances in metal-organic frameworks and covalent organic frameworks for sample preparation and chromatographic analysis.

    Science.gov (United States)

    Wang, Xuan; Ye, Nengsheng

    2017-12-01

    In the field of analytical chemistry, sample preparation and chromatographic separation are two core procedures. The means by which to improve the sensitivity, selectivity and detection limit of a method have become a topic of great interest. Recently, porous organic frameworks, such as metal-organic frameworks (MOFs) and covalent organic frameworks (COFs), have been widely used in this research area because of their special features, and different methods have been developed. This review summarizes the applications of MOFs and COFs in sample preparation and chromatographic stationary phases. The MOF- or COF-based solid-phase extraction (SPE), solid-phase microextraction (SPME), gas chromatography (GC), high-performance liquid chromatography (HPLC) and capillary electrochromatography (CEC) methods are described. The excellent properties of MOFs and COFs have resulted in intense interest in exploring their performance and mechanisms for sample preparation and chromatographic separation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Programming Social Applications Building Viral Experiences with OpenSocial, OAuth, OpenID, and Distributed Web Frameworks

    CERN Document Server

    LeBlanc, Jonathan

    2011-01-01

    Social networking has made one thing clear: websites and applications need to provide users with experiences tailored to their preferences. This in-depth guide shows you how to build rich social frameworks, using open source technologies and specifications. You'll learn how to create third-party applications for existing sites, build engaging social graphs, and develop products to host your own socialized experience. Programming Social Apps focuses on the OpenSocial platform, along with Apache Shindig, OAuth, OpenID, and other tools, demonstrating how they work together to help you solve pra

  18. Development of an Artificial Intelligence Programming Course and Unity3d Based Framework to Motivate Learning in Artistic Minded Students

    DEFF Research Database (Denmark)

    Reng, Lars

    2012-01-01

    between technical and artistic minded students is, however, increased once the students reach the sixth semester. The complex algorithms of the artificial intelligence course seemed to demotivate the artistic minded students even before the course began. This paper will present the extensive changes made...... to the sixth semester artificial intelligence programming course, in order to provide a highly motivating direct visual feedback, and thereby remove the steep initial learning curve for artistic minded students. The framework was developed with close dialog to both the game industry and experienced master...

  19. Mississippi Curriculum Framework for Marketing and Fashion Merchandising (Program CIP: 08.0705--General Retailing Operations). Secondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for marketing I-II and fashion merchandising. Presented first are a program…

  20. Mississippi Curriculum Framework for Diesel Equipment Repair & Service (Program CIP: 47.0605--Diesel Engine Mechanic & Repairer). Secondary Programs.

    Science.gov (United States)

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which reflects Mississippi's statutory requirement that instructional programs be based on core curricula and performance-based assessment, contains outlines of the instructional units required in local instructional management plans and daily lesson plans for diesel engine mechanics I and II. Presented first are a program…

  1. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.

    2010-12-01

    In this paper, we address the problem of reducing the occurrence of Human Errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21% of interruptions were caused by human error. We focus on Change Management, the process with the largest risk of human error, and identify the main instances of human errors as the 4 Wrongs: request, time, configuration item, and command. Analysis of change records revealed that human-error prevention by partial automation is highly relevant. We propose the HEP Framework, a framework for execution of IT Service Delivery operations that reduces human error by addressing the 4 Wrongs using content integration, contextualization of operation patterns, partial automation of command execution, and controlled access to resources.
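    As an illustration of controlled execution against the 4 Wrongs, one can imagine a guard that vets an execution attempt against the approved request, its change window, the configuration-item inventory, and a per-CI command whitelist. This is a hypothetical sketch, not the HEP Framework's implementation; all field and function names are invented.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ChangeRequest:
    """A change record with the four attributes the 4 Wrongs concern."""
    request_id: str
    scheduled_start: datetime
    scheduled_end: datetime
    config_item: str
    command: str

def check_change(req, now, approved_requests, ci_inventory, allowed_commands):
    """Return the list of 'wrongs' detected for an execution attempt:
    wrong request, wrong time, wrong configuration item, wrong command."""
    wrongs = []
    if req.request_id not in approved_requests:
        wrongs.append("wrong request")
    if not (req.scheduled_start <= now <= req.scheduled_end):
        wrongs.append("wrong time")
    if req.config_item not in ci_inventory:
        wrongs.append("wrong configuration item")
    if req.command not in allowed_commands.get(req.config_item, set()):
        wrongs.append("wrong command")
    return wrongs

# An execution attempt inside the approved window with a whitelisted command.
req = ChangeRequest("CHG-1", datetime(2024, 1, 1, 22), datetime(2024, 1, 1, 23),
                    "db-server-01", "restart")
wrongs = check_change(
    req, datetime(2024, 1, 1, 22, 30),
    approved_requests={"CHG-1"},
    ci_inventory={"db-server-01"},
    allowed_commands={"db-server-01": {"restart", "patch"}},
)
```

    An empty result means no wrong was detected; an attempt outside the window or with an unlisted command would be flagged before any command runs, which is where partial automation pays off.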

  2. Multicriteria diversity analysis. A novel heuristic framework for appraising energy portfolios

    Energy Technology Data Exchange (ETDEWEB)

    Stirling, Andy [SPRU - Science and Technology Policy Research, Freeman Centre, University of Sussex, Sussex BN1 9QE (United Kingdom)

    2010-04-15

    This paper outlines a novel general framework for analysing energy diversity. A critical review of different reasons for policy interest reveals that diversity is more than a supply security strategy. There are particular synergies with strategies for transitions to sustainability. Yet - despite much important work - policy analysis tends to address only a subset of the properties of diversity and remains subject to ambiguity, neglect and special pleading. Developing earlier work, the paper proposes a more comprehensive heuristic framework, accommodating a wide range of different disciplinary and socio-political perspectives. It is argued that the associated multicriteria diversity analysis method provides a more systematic, complete and transparent way to articulate disparate perspectives and approaches and so help to inform more robust and accountable policymaking. (author)
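    The diversity heuristic described here scores a portfolio by summing, over pairs of options, their disparity weighted by both options' shares, so that variety, balance, and disparity all contribute. A minimal sketch; the three-option shares and the disparity matrix below are invented for illustration.

```python
import numpy as np

def stirling_diversity(shares, disparity):
    """Diversity heuristic: sum over option pairs (i != j) of
    d_ij * p_i * p_j, combining variety, balance and disparity."""
    p = np.asarray(shares, dtype=float)
    d = np.asarray(disparity, dtype=float)
    assert np.isclose(p.sum(), 1.0), "portfolio shares must sum to 1"
    off_diag = ~np.eye(len(p), dtype=bool)
    return float((d * np.outer(p, p))[off_diag].sum())

# Hypothetical three-option portfolio (e.g. gas, wind, nuclear) with an
# illustrative symmetric disparity matrix (0 = identical options).
shares = [0.5, 0.3, 0.2]
disparity = [
    [0.0, 0.8, 0.6],
    [0.8, 0.0, 0.7],
    [0.6, 0.7, 0.0],
]
balanced = stirling_diversity(shares, disparity)
concentrated = stirling_diversity([0.9, 0.05, 0.05], disparity)
```

    Spreading shares across disparate options raises the index, while concentrating on one option lowers it, which is the property that makes the heuristic useful for comparing energy portfolios.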

  3. A generic framework for the description and analysis of energy security in an energy system

    International Nuclear Information System (INIS)

    Hughes, Larry

    2012-01-01

    While many energy security indicators and models have been developed for specific jurisdictions or types of energy, few can be considered sufficiently generic to be applicable to any energy system. This paper presents a framework that attempts to meet this objective by combining the International Energy Agency's definition of energy security with structured systems analysis techniques to create three energy security indicators and a process-flow energy systems model. The framework is applicable to those energy systems which can be described in terms of processes converting or transporting flows of energy to meet the energy–demand flows from downstream processes. Each process affects the environment and is subject to jurisdictional policies. The framework can be employed to capture the evolution of energy security in an energy system by analyzing the results of indicator-specific metrics applied to the energy, demand, and environment flows associated with the system's constituent processes. Energy security policies are treated as flows to processes and classified into one of three actions affecting the process's energy demand or the process or its energy input, or both; the outcome is determined by monitoring changes to the indicators. The paper includes a detailed example of an application of the framework. - Highlights: ► The IEA's definition of energy security is parsed into three energy security indicators: availability, affordability, and acceptability. ► Data flow diagrams and other systems analysis tools can represent an energy system and its processes, flows, and chains. ► Indicator-specific metrics applied to a process's flow determine the state of energy security in an energy system, an energy chain, or process. ► Energy policy is considered as a flow and policy outcomes are obtained by measuring flows with indicator-specific metrics. ► The framework is applicable to most jurisdictions and energy types.
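    One simple way to make the process-flow idea concrete is a weakest-link metric over an energy chain's processes, scoring each of the three indicators per process and taking the chain minimum. The classes, scores, and min-metric below are illustrative assumptions, not the paper's actual formulation, which deliberately leaves the choice of metrics open.

```python
from dataclasses import dataclass

@dataclass
class Process:
    """One conversion or transport step in an energy chain (illustrative)."""
    name: str
    availability: float   # 0..1, fraction of downstream demand the process can meet
    affordability: float  # 0..1, 1 = fully affordable
    acceptability: float  # 0..1, 1 = fully acceptable environmentally

def chain_indicators(chain):
    """Chain-level security limited by the weakest process for each
    indicator (one simple metric choice among many)."""
    return {
        "availability": min(p.availability for p in chain),
        "affordability": min(p.affordability for p in chain),
        "acceptability": min(p.acceptability for p in chain),
    }

# A hypothetical three-process chain from source to demand.
chain = [
    Process("extraction", 0.9, 0.8, 0.6),
    Process("transport", 0.95, 0.9, 0.8),
    Process("refining", 0.7, 0.85, 0.7),
]
ind = chain_indicators(chain)
```

    A policy flow targeting, say, refining capacity would show up as a change in the chain's availability indicator, mirroring how the framework monitors policy outcomes through indicator-specific metrics.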

  4. Modeling Phase-transitions Using a High-performance, Isogeometric Analysis Framework

    KAUST Repository

    Vignal, Philippe

    2014-06-06

    In this paper, we present a high-performance framework for solving partial differential equations using Isogeometric Analysis, called PetIGA, and show how it can be used to solve phase-field problems. We specifically chose the Cahn-Hilliard equation and the phase-field crystal equation as test cases. These two models allow us to highlight some of the main advantages that PetIGA offers for scientific computing.

  5. Solving nonlinear, High-order partial differential equations using a high-performance isogeometric analysis framework

    KAUST Repository

    Cortes, Adriano Mauricio

    2014-01-01

    In this paper we present PetIGA, a high-performance implementation of Isogeometric Analysis built on top of PETSc. We show its use in solving nonlinear and time-dependent problems, such as phase-field models, by taking advantage of the high-continuity of the basis functions granted by the isogeometric framework. In this work, we focus on the Cahn-Hilliard equation and the phase-field crystal equation.
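    PetIGA itself is a C framework built on PETSc with B-spline/NURBS bases, so it cannot be reproduced in a few lines; the sketch below only illustrates the Cahn-Hilliard dynamics both abstracts discuss, using a 1D periodic, semi-implicit Fourier scheme in plain numpy rather than the isogeometric discretization. All parameter values are arbitrary illustrative choices.

```python
import numpy as np

# 1D Cahn-Hilliard: c_t = (c^3 - c - eps^2 c_xx)_xx on a periodic domain.
# Nonlinear chemical potential treated explicitly, stiff fourth-order term
# implicitly; the k = 0 Fourier mode is untouched, so total mass is conserved.
N, eps, dt, steps = 64, 0.1, 0.002, 500
x = 2 * np.pi * np.arange(N) / N
k = np.fft.fftfreq(N, d=1.0 / N)             # integer wavenumbers on [0, 2*pi)
k2, k4 = k**2, k**4

c = 0.1 * np.cos(x) + 0.05 * np.cos(3 * x)   # small initial perturbation
mass0 = c.mean()                             # conserved by the dynamics
for _ in range(steps):
    nonlin_hat = np.fft.fft(c**3 - c)
    c_hat = (np.fft.fft(c) - dt * k2 * nonlin_hat) / (1.0 + dt * eps**2 * k4)
    c = np.real(np.fft.ifft(c_hat))
```

    The semi-implicit split is what tames the fourth-order stiffness here; in PetIGA the same stiffness is instead handled by PETSc's implicit time integrators acting on the high-continuity isogeometric basis.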

  6. An Automated Bayesian Framework for Integrative Gene Expression Analysis and Predictive Medicine

    OpenAIRE

    Parikh, Neena; Zollanvari, Amin; Alterovitz, Gil

    2012-01-01

    Motivation: This work constructs a closed loop Bayesian Network framework for predictive medicine via integrative analysis of publicly available gene expression findings pertaining to various diseases. Results: An automated pipeline was successfully constructed. Integrative models were made based on gene expression data obtained from GEO experiments relating to four different diseases using Bayesian statistical methods. Many of these models demonstrated a high level of accuracy and predictive...

  7. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent from the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and planning and decision making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision making methodologies and facilitates scenario based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  8. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The case study revealed the relationship between water quality in the mainstream and its tributaries, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine the hidden and valuable knowledge from water quality historical time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
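    The granulate-then-compare pipeline described here can be sketched with a backward normal cloud generator (estimating Ex, En, He per window) and a similarity matrix over the resulting granules. This is an illustrative reconstruction, not the authors' code: the Gaussian kernel, window width, and synthetic series below are assumptions.

```python
import numpy as np

def backward_cloud(window):
    """Backward normal cloud generator (one common formulation):
    estimate (Ex, En, He) from the samples in a window."""
    x = np.asarray(window, dtype=float)
    ex = x.mean()
    en = np.sqrt(np.pi / 2.0) * np.abs(x - ex).mean()
    he = np.sqrt(abs(x.var(ddof=1) - en**2))
    return np.array([ex, en, he])

def granulate(series, width):
    """Split the series into non-overlapping windows, one cloud each."""
    return [backward_cloud(series[i:i + width])
            for i in range(0, len(series) - width + 1, width)]

def similarity_matrix(clouds):
    """Similarity between granules via a Gaussian kernel on the
    (Ex, En, He) feature vectors (an illustrative choice)."""
    C = np.vstack(clouds)
    d = np.linalg.norm(C[:, None, :] - C[None, :, :], axis=-1)
    return np.exp(-d)

# Synthetic weekly series: a stable regime with one anomalous low stretch.
rng = np.random.default_rng(0)
series = np.concatenate([
    8.0 + 0.2 * rng.standard_normal(24),   # normal DO-like values (mg/L)
    3.0 + 0.2 * rng.standard_normal(8),    # anomalous low-oxygen stretch
    8.0 + 0.2 * rng.standard_normal(24),
])
clouds = granulate(series, width=8)
S = similarity_matrix(clouds)
anomaly = int(np.argmin(S.mean(axis=1)))   # least similar granule
```

    Once the similarity matrix exists, similarity search, anomaly detection, and pattern discovery all reduce to queries against it, which is the appeal of doing the mining at the granule level rather than point by point.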

  9. A framework of analysis for field experiments with alternative materials in road construction.

    Science.gov (United States)

    François, D; Jullien, A

    2009-01-01

    In France, a wide variety of alternative materials is produced or exists in the form of stockpiles built up over time. Such materials are distributed over various regions of the territory depending on local industrial development and urbanisation trends. The use of alternative materials at a national scale implies sharing local knowledge and experience. Building a national database on alternative materials for road construction is useful in gathering and sharing information. An analysis of feedback from onsite experiences (back analysis) is essential to improve knowledge on alternative material use in road construction. Back analysis of field studies has to be conducted in accordance with a single common framework. This could enable drawing comparisons between alternative materials and between road applications. A framework for the identification and classification of data used in back analyses is proposed. Since the road structure is an open system, this framework has been based on a stress-response approach at both the material and structural levels and includes a description of external factors applying during the road service life. The proposal has been shaped from a review of the essential characteristics of road materials and structures, as well as from the state of knowledge specific to alternative material characterisation.

  10. A Novel Framework for Interactive Visualization and Analysis of Hyperspectral Image Data

    Directory of Open Access Journals (Sweden)

    Johannes Jordan

    2016-01-01

    Full Text Available Multispectral and hyperspectral images are well established in various fields of application like remote sensing, astronomy, and microscopic spectroscopy. In recent years, the availability of new sensor designs, more powerful processors, and high-capacity storage further opened this imaging modality to a wider array of applications like medical diagnosis, agriculture, and cultural heritage. This necessitates new tools that allow general analysis of the image data and are intuitive to users who are new to hyperspectral imaging. We introduce a novel framework that bundles new interactive visualization techniques with powerful algorithms and is accessible through an efficient and intuitive graphical user interface. We visualize the spectral distribution of an image via parallel coordinates with a strong link to traditional visualization techniques, enabling new paradigms in hyperspectral image analysis that focus on interactive raw data exploration. We combine novel methods for supervised segmentation, global clustering, and nonlinear false-color coding to assist in the visual inspection. Our framework, coined Gerbil, is open source and highly modular, building on established methods and being easily extensible for application-specific needs. It satisfies the need for a general, consistent software framework that tightly integrates analysis algorithms with an intuitive, modern interface to the raw image data and algorithmic results. Gerbil finds its worldwide use in academia and industry alike with several thousand downloads originating from 45 countries.

  11. UNC-Utah NA-MIC Framework for DTI Fiber Tract Analysis

    Directory of Open Access Journals (Sweden)

    Audrey Rose Verde

    2014-01-01

    Full Text Available Diffusion tensor imaging has become an important modality in the field of neuroimaging to capture changes in micro-organization and to assess white matter integrity or development. While there exists a number of tractography toolsets, these usually lack tools for preprocessing or for analyzing diffusion properties along the fiber tracts. Currently, the field is in critical need of a coherent end-to-end toolset for performing an along-fiber tract analysis, accessible to non-technical neuroimaging researchers. The UNC-Utah NA-MIC DTI framework represents a coherent, open source, end-to-end toolset for atlas fiber tract based DTI analysis encompassing DICOM data conversion, quality control, atlas building, fiber tractography, fiber parameterization, and statistical analysis of diffusion properties. Most steps utilize graphical user interfaces (GUIs) to simplify interaction and provide an extensive DTI analysis framework for non-technical researchers/investigators. We illustrate the use of our framework on a small sample, cross-sectional neuroimaging study of 8 healthy 1-year-old children from the Infant Brain Imaging Study (IBIS) Network. In this limited test study, we illustrate the power of our method by quantifying the diffusion properties at 1 year of age on the genu and splenium fiber tracts.

  12. A Comprehensive Database and Analysis Framework To Incorporate Multiscale Data Types and Enable Integrated Analysis of Bioactive Polyphenols.

    Science.gov (United States)

    Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M

    2018-03-05

    The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize response to treatments and (2) in depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanism of actions, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. 
Our integrative analytical approach provides novel means for a systematic integrative analysis.

  13. Crisis Reliability Indicators Supporting Emergency Services (CRISES): A Framework for Developing Performance Measures for Behavioral Health Crisis and Psychiatric Emergency Programs.

    Science.gov (United States)

    Balfour, Margaret E; Tanner, Kathleen; Jurica, Paul J; Rhoads, Richard; Carson, Chris A

    2016-01-01

    Crisis and emergency psychiatric services are an integral part of the healthcare system, yet there are no standardized measures for programs providing these services. We developed the Crisis Reliability Indicators Supporting Emergency Services (CRISES) framework to create measures that inform internal performance improvement initiatives and allow comparison across programs. The framework consists of two components-the CRISES domains (timely, safe, accessible, least-restrictive, effective, consumer/family centered, and partnership) and the measures supporting each domain. The CRISES framework provides a foundation for development of standardized measures for the crisis field. This will become increasingly important as pay-for-performance initiatives expand with healthcare reform.

  14. A decision analysis framework to support long-term planning for nuclear fuel cycle technology research, development, demonstration and deployment

    International Nuclear Information System (INIS)

    Sowder, A.G.; Machiels, A.J.; Dykes, A.A.; Johnson, D.H.

    2013-01-01

    To address challenges and gaps in nuclear fuel cycle option assessment and to support research, development, and demonstration programs oriented toward commercial deployment, EPRI (Electric Power Research Institute) is seeking to develop and maintain an independent analysis and assessment capability by building a suite of assessment tools based on a platform of software, simplified relationships, and explicit decision-making and evaluation guidelines. As a demonstration of the decision-support framework, EPRI examines a relatively near-term fuel cycle option, i.e., the use of reactor-grade mixed-oxide (MOX) fuel in U.S. light water reactors. The results appear as a list of significant concerns (such as spent fuel cooling, criticality risk, ...) that have to be taken into account in the final decision.

  15. Validation of a Framework for Measuring Hospital Disaster Resilience Using Factor Analysis

    Directory of Open Access Journals (Sweden)

    Shuang Zhong

    2014-06-01

    Full Text Available Hospital disaster resilience can be defined as “the ability of hospitals to resist, absorb, and respond to the shock of disasters while maintaining and surging essential health services, and then to recover to its original state or adapt to a new one.” This article aims to provide a framework which can be used to comprehensively measure hospital disaster resilience. An evaluation framework for assessing hospital resilience was initially proposed through a systematic literature review and Modified-Delphi consultation. Eight key domains were identified: hospital safety, command, communication and cooperation system, disaster plan, resource stockpile, staff capability, disaster training and drills, emergency services and surge capability, and recovery and adaptation. The data for this study were collected from 41 tertiary hospitals in Shandong Province in China, using a specially designed questionnaire. Factor analysis was conducted to determine the underpinning structure of the framework. It identified a four-factor structure of hospital resilience, namely, emergency medical response capability (F1), disaster management mechanisms (F2), hospital infrastructural safety (F3), and disaster resources (F4). These factors displayed good internal consistency. The overall level of hospital disaster resilience (F) was calculated using the scoring model: F = 0.615F1 + 0.202F2 + 0.103F3 + 0.080F4. This validated framework provides a new way to operationalise the concept of hospital resilience, and it is also a foundation for the further development of the measurement instrument in future studies.
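    The scoring model above can be applied directly once the four factor scores are available. The following Python sketch uses the weights reported in the abstract; the example factor values and the 0-100 scale are hypothetical.

```python
# Weighted scoring model for overall hospital disaster resilience,
# using the factor weights reported in the abstract:
# F = 0.615*F1 + 0.202*F2 + 0.103*F3 + 0.080*F4
WEIGHTS = {
    "F1_emergency_medical_response": 0.615,
    "F2_disaster_management_mechanisms": 0.202,
    "F3_infrastructural_safety": 0.103,
    "F4_disaster_resources": 0.080,
}

def resilience_score(factor_scores: dict) -> float:
    """Return overall resilience F as the weighted sum of the four factor scores."""
    return sum(WEIGHTS[name] * score for name, score in factor_scores.items())

# Hypothetical factor scores for one hospital (0-100 scale assumed).
example = {
    "F1_emergency_medical_response": 80.0,
    "F2_disaster_management_mechanisms": 70.0,
    "F3_infrastructural_safety": 90.0,
    "F4_disaster_resources": 60.0,
}
print(round(resilience_score(example), 2))  # prints 77.41
```

    Because the weights sum to 1.0, the overall score stays on the same scale as the factor scores, which makes hospitals directly comparable.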

  16. Development of an Analysis and Design Optimization Framework for Marine Propellers

    Science.gov (United States)

    Tamhane, Ashish C.

    In this thesis, a framework for the analysis and design optimization of ship propellers is developed. This framework can be utilized as an efficient synthesis tool in order to determine the main geometric characteristics of the propeller but also to provide the designer with the capability to optimize the shape of the blade sections based on their specific criteria. A hybrid lifting-line method with lifting-surface corrections to account for the three-dimensional flow effects has been developed. The prediction of the correction factors is achieved using Artificial Neural Networks and Support Vector Regression. This approach results in increased approximation accuracy compared to existing methods and allows for extrapolation of the correction factor values. The effect of viscosity is implemented in the framework via the coupling of the lifting-line method with the open-source RANSE solver OpenFOAM for the calculation of lift, drag and pressure distribution on the blade sections using a transition k-ω SST turbulence model. Case studies of benchmark high-speed propulsors are utilized in order to validate the proposed framework for propeller operation in open-water conditions but also in a ship's wake.

  17. A framework for understanding outcomes of integrated care programs for the hospitalized elderly

    Directory of Open Access Journals (Sweden)

    Jacqueline M. Hartgerink

    2013-11-01

    Full Text Available Introduction: Integrated care has emerged as a new strategy to enhance the quality of care for hospitalised elderly. Current models do not provide insight into the mechanisms underlying integrated care delivery. Therefore, we developed a framework to identify the underlying mechanisms of integrated care delivery. We should understand how they operate and interact, so that integrated care programmes can enhance the quality of care and eventually patient outcomes. Theory and methods: Interprofessional collaboration among professionals is considered to be critical in integrated care delivery due to many interdependent work requirements. A review of integrated care components brings to light a distinction between the cognitive and behavioural components of interprofessional collaboration. Results: Effective integrated care programmes combine the interacting components of care delivery. These components affect professionals’ cognitions and behaviour, which in turn affect quality of care. Insight is gained into how these components alter the way care is delivered through mechanisms such as combining individual knowledge and actively seeking new information. Conclusion: We expect that insight into the cognitive and behavioural mechanisms will contribute to the understanding of integrated care programmes. The framework can be used to identify the underlying mechanisms of integrated care responsible for producing favourable outcomes, allowing comparisons across programmes.

  18. Comparing, optimizing, and benchmarking quantum-control algorithms in a unifying programming framework

    International Nuclear Information System (INIS)

    Machnes, S.; Sander, U.; Glaser, S. J.; Schulte-Herbrueggen, T.; Fouquieres, P. de; Gruslys, A.; Schirmer, S.

    2011-01-01

    For paving the way to novel applications in quantum simulation, computation, and technology, increasingly large quantum systems have to be steered with high precision. It is a typical task amenable to numerical optimal control to turn the time course of pulses, i.e., piecewise constant control amplitudes, iteratively into an optimized shape. Here, we present a comparative study of optimal-control algorithms for a wide range of finite-dimensional applications. We focus on the most commonly used algorithms: GRAPE methods, which update all controls concurrently, and Krotov-type methods, which do so sequentially. Guidelines for their use are given and open research questions are pointed out. Moreover, we introduce a unifying algorithmic framework, DYNAMO (dynamic optimization platform), designed to provide the quantum-technology community with a convenient MATLAB-based tool set for optimal control. In addition, it gives researchers in optimal-control techniques a framework for benchmarking and comparing newly proposed algorithms with the state of the art. It allows a mix-and-match approach with various types of gradients, update and step-size methods, as well as subspace choices. Open-source code including examples is made available at http://qlib.info.
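    The concurrent-update (GRAPE-style) scheme described above can be illustrated with a minimal, self-contained Python sketch: a single qubit driven by a piecewise-constant amplitude on σx is steered toward a NOT gate by gradient ascent on the gate fidelity. The toy model, the finite-difference gradient, and all step sizes are illustrative assumptions, not the DYNAMO implementation (which provides exact gradients and several update methods).

```python
import math

def prop(u, dt):
    # Closed form of exp(-i*u*dt*sigma_x) for a two-level system:
    # cos(u*dt)*I - i*sin(u*dt)*sigma_x
    c, s = math.cos(u * dt), math.sin(u * dt)
    return [[c + 0j, -1j * s], [-1j * s, c + 0j]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def fidelity(controls, dt, target):
    # Time-ordered product of the step propagators
    U = [[1 + 0j, 0j], [0j, 1 + 0j]]
    for u in controls:
        U = matmul(prop(u, dt), U)
    # Gate fidelity |Tr(V^dagger U)| / 2 for a 2x2 target gate V
    tr = sum(target[k][i].conjugate() * U[k][i] for i in range(2) for k in range(2))
    return abs(tr) / 2

def grape(n_steps=10, dt=0.1, iters=200, lr=2.0, eps=1e-6):
    target = [[0j, 1 + 0j], [1 + 0j, 0j]]  # NOT (sigma_x) gate
    controls = [0.1] * n_steps              # initial piecewise-constant amplitudes
    for _ in range(iters):
        f0 = fidelity(controls, dt, target)
        # Finite-difference gradient with respect to every control amplitude
        grad = []
        for k in range(n_steps):
            bumped = list(controls)
            bumped[k] += eps
            grad.append((fidelity(bumped, dt, target) - f0) / eps)
        # Concurrent (GRAPE-style) update: all amplitudes move at once
        controls = [u + lr * g for u, g in zip(controls, grad)]
    return controls, fidelity(controls, dt, target)

controls, f = grape()
print(f"final fidelity: {f:.6f}")
```

    A Krotov-type variant would instead sweep through the time steps, updating one amplitude at a time with the others held fixed; the mix-and-match framework in the paper lets such update rules be swapped against each other on the same problem.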

  20. Prevention validation and accounting platform: a framework for establishing accountability and performance measures of substance abuse prevention programs.

    Science.gov (United States)

    Kim, S; McLeod, J H; Williams, C; Hepler, N

    2000-01-01

    The field of substance abuse prevention has neither an overarching conceptual framework nor a set of shared terminologies for establishing the accountability and performance outcome measures of substance abuse prevention services rendered. Hence, there is a wide gap between the data we currently have on one hand and the information that is required to meet the performance goals and accountability measures set by the Government Performance and Results Act of 1993 on the other. The task before us is: How can we establish the accountability and performance measures of substance abuse prevention programs and transform the field of prevention into prevention science? The intent of this volume is to serve that purpose and accelerate the processes of this transformation by identifying the requisite components of the transformation (i.e., theory, methodology, convention on terms, and data) and by introducing an open forum called the Prevention Validation and Accounting (PREVA) Platform. The entire PREVA Platform (for short, the Platform) is designed as an analytic framework, formulated from a collection of common concepts, terminologies, accounting units, protocols for counting the units, data elements, operationalizations of various constructs, and other summary measures intended to bring about an efficient and effective measurement of process input, program capacity, process output, performance outcome, and societal impact of substance abuse prevention programs. The measurement units and summary data elements are designed to be measured across time and across jurisdictions, i.e., from local to regional to state to national levels. In the Platform, the process input is captured by two dimensions of time and capital. Time is conceptualized in terms of service delivery time and time spent for research and development. Capital is measured by the monies expended for the delivery of program activities during a fiscal or reporting period.
Program capacity is captured

  1. From fatalism to mitigation: A conceptual framework for mitigating fetal programming of chronic disease by maternal obesity.

    Science.gov (United States)

    Boone-Heinonen, Janne; Messer, Lynne C; Fortmann, Stephen P; Wallack, Lawrence; Thornburg, Kent L

    2015-12-01

    Prenatal development is recognized as a critical period in the etiology of obesity and cardiometabolic disease. Potential strategies to reduce maternal obesity-induced risk later in life have been largely overlooked. In this paper, we first propose a conceptual framework for the role of public health and preventive medicine in mitigating the effects of fetal programming. Second, we review a small but growing body of research (through August 2015) that examines interactive effects of maternal obesity and two public health foci - diet and physical activity - in the offspring. Results of the review support the hypothesis that diet and physical activity after early life can attenuate disease susceptibility induced by maternal obesity, but human evidence is scant. Based on the review, we identify major gaps relevant for prevention research, such as characterizing the type and dose response of dietary and physical activity exposures that modify the adverse effects of maternal obesity in the offspring. Third, we discuss potential implications of interactions between maternal obesity and postnatal dietary and physical activity exposures for interventions to mitigate maternal obesity-induced risk among children. Our conceptual framework, evidence review, and future research directions offer a platform to develop, test, and implement fetal programming mitigation strategies for the current and future generations of children. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. A framework for 2-stage global sensitivity analysis of GastroPlus™ compartmental models.

    Science.gov (United States)

    Scherholz, Megerle L; Forder, James; Androulakis, Ioannis P

    2018-04-01

    Parameter sensitivity and uncertainty analysis for physiologically based pharmacokinetic (PBPK) models are becoming an important consideration for regulatory submissions, requiring further evaluation to establish the need for global sensitivity analysis. To demonstrate the benefits of an extensive analysis, global sensitivity analysis was implemented for the GastroPlus™ model, a well-known commercially available platform, using four example drugs: acetaminophen, risperidone, atenolol, and furosemide. The capabilities of GastroPlus were expanded by developing an integrated framework to automate the GastroPlus graphical user interface with AutoIt and to execute the sensitivity analysis in MATLAB®. Global sensitivity analysis was performed in two stages, using the Morris method to screen over 50 parameters for significant factors, followed by quantitative assessment of variability using Sobol's sensitivity analysis. The two-stage approach significantly reduced computational cost for the larger model without sacrificing interpretation of model behavior, showing that the sensitivity results were well aligned with the biopharmaceutical classification system. Both methods detected nonlinearities and parameter interactions that would have otherwise been missed by local approaches. Future work includes further exploration of how the input domain influences the calculated global sensitivity measures as well as extending the framework to consider a whole-body PBPK model.
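    The first (screening) stage of such a two-stage analysis can be sketched in pure Python with a simplified one-at-a-time variant of the Morris elementary-effects method. The three-parameter toy model, bounds, sample count, and screening threshold below are all illustrative assumptions; the second-stage Sobol analysis of the surviving parameters is not reproduced here.

```python
import random

# Toy model standing in for an expensive pharmacokinetic simulation:
# the output depends strongly on x1, weakly (and nonlinearly) on x2,
# and not at all on x3.
def model(x1, x2, x3):
    return 10.0 * x1 + 0.5 * x2 ** 2 + 0.0 * x3

BOUNDS = {"x1": (0.0, 1.0), "x2": (0.0, 1.0), "x3": (0.0, 1.0)}

def elementary_effects(model, bounds, n_samples=50, delta=0.1, seed=0):
    """Stage 1 (Morris-style screening): mean absolute elementary effect
    per parameter, estimated from random base points."""
    rng = random.Random(seed)
    names = list(bounds)
    totals = {name: 0.0 for name in names}
    for _ in range(n_samples):
        # Sample a base point, leaving room for the +delta perturbation.
        base = {n: rng.uniform(lo, hi - delta) for n, (lo, hi) in bounds.items()}
        y0 = model(**base)
        for name in names:
            bumped = dict(base)
            bumped[name] += delta
            totals[name] += abs(model(**bumped) - y0) / delta
    return {n: t / n_samples for n, t in totals.items()}

mu_star = elementary_effects(model, BOUNDS)
# Keep only parameters whose mean effect exceeds a (hypothetical) threshold;
# these survivors would then go on to the variance-based Sobol stage.
significant = [n for n, v in mu_star.items() if v > 0.05]
print(sorted(significant))  # prints ['x1', 'x2']
```

    The screening stage only ranks parameters; the Sobol stage then quantifies how much output variance each surviving parameter (and its interactions) explains, which is where most of the computational cost is saved by discarding inert inputs early.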

  3. Integrated predictive maintenance program vibration and lube oil analysis: Part I - history and the vibration program

    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, H.

    1996-12-01

    This paper is the first of two papers which describe the Predictive Maintenance Program for rotating machines at the Palo Verde Nuclear Generating Station. The organization has recently been restructured and significant benefits have been realized by the interaction, or “synergy”, between the Vibration Program and the Lube Oil Analysis Program. This paper starts with the oldest part of the program - the Vibration Program - and discusses the evolution of the program to its current state. The “Vibration” view of the combined program is then presented.

  4. Integrated predictive maintenance program vibration and lube oil analysis: Part I - history and the vibration program

    International Nuclear Information System (INIS)

    Maxwell, H.

    1996-01-01

    This paper is the first of two papers which describe the Predictive Maintenance Program for rotating machines at the Palo Verde Nuclear Generating Station. The organization has recently been restructured and significant benefits have been realized by the interaction, or “synergy”, between the Vibration Program and the Lube Oil Analysis Program. This paper starts with the oldest part of the program - the Vibration Program - and discusses the evolution of the program to its current state. The “Vibration” view of the combined program is then presented.

  5. Sustainability assessment of nuclear power: Discourse analysis of IAEA and IPCC frameworks

    International Nuclear Information System (INIS)

    Verbruggen, Aviel; Laes, Erik

    2015-01-01

    Highlights: • Sustainability assessments (SAs) are methodologically precarious. • Discourse analysis reveals how the meaning of sustainability is constructed in SAs. • Discourse analysis is applied on the SAs of nuclear power of IAEA and IPCC. • For IAEA ‘sustainable’ equals ‘complying with best international practices’. • The IAEA framework largely inspires IPCC Fifth Assessment Report. - Abstract: Sustainability assessments (SAs) are methodologically precarious. Value-based judgments inevitably play a role in setting the scope of the SA, selecting assessment criteria and indicators, collecting adequate data, and developing and using models of considered systems. Discourse analysis can reveal how the meaning and operationalization of sustainability is constructed in and through SAs. Our discourse-analytical approach investigates how sustainability is channeled from ‘manifest image’ (broad but shallow), to ‘vision’, to ‘policy targets’ (specific and practical). This approach is applied on the SA frameworks used by IAEA and IPCC to assess the sustainability of the nuclear power option. The essentially problematic conclusion is that both SA frameworks are constructed in order to obtain answers that do not conflict with prior commitments adopted by the two institutes. For IAEA ‘sustainable’ equals ‘complying with best international practices and standards’. IPCC wrestles with its mission as a provider of “policy-relevant and yet policy-neutral, never policy-prescriptive” knowledge to decision-makers. IPCC avoids the assessment of different visions on the role of nuclear power in a low-carbon energy future, and skips most literature critical of nuclear power. The IAEA framework largely inspires IPCC AR5.

  6. Secure and Efficient Regression Analysis Using a Hybrid Cryptographic Framework: Development and Evaluation.

    Science.gov (United States)

    Sadat, Md Nazmus; Jiang, Xiaoqian; Aziz, Md Momin Al; Wang, Shuang; Mohammed, Noman

    2018-03-05

    Machine learning is an effective data-driven tool that is being widely used to extract valuable patterns and insights from data. Specifically, predictive machine learning models are very important in health care for clinical data analysis. The machine learning algorithms that generate predictive models often require pooling data from different sources to discover statistical patterns or correlations among different attributes of the input data. The primary challenge is to fulfill one major objective: preserving the privacy of individuals while discovering knowledge from data. Our objective was to develop a hybrid cryptographic framework for performing regression analysis over distributed data in a secure and efficient way. Existing secure computation schemes are not suitable for processing the large-scale data that are used in cutting-edge machine learning applications. We designed, developed, and evaluated a hybrid cryptographic framework, which can securely perform regression analysis, a fundamental machine learning algorithm, using somewhat homomorphic encryption and a newly introduced secure hardware component of Intel Software Guard Extensions (Intel SGX) to ensure both privacy and efficiency at the same time. Experimental results demonstrate that our proposed method provides a better trade-off in terms of security and efficiency than solely secure hardware-based methods. Moreover, there is no approximation error: computed model parameters are identical to plaintext results. To the best of our knowledge, this kind of secure computation model using a hybrid cryptographic framework, which leverages both somewhat homomorphic encryption and Intel SGX, has not been proposed or evaluated to date. Our proposed framework ensures data security and computational efficiency at the same time. ©Md Nazmus Sadat, Xiaoqian Jiang, Md Momin Al Aziz, Shuang Wang, Noman Mohammed. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 05.03.2018.

  7. Capacity building program: Framework of Standards to secure and facilitate Global Trade

    International Nuclear Information System (INIS)

    Koech, H.K.

    2010-01-01

    Effective implementation of the capacity building program in Kenya will strengthen protection against terrorist activity worldwide, as participating countries meet the program's requirements through safety and security measures at land borders, seaports, and airports. It will also strengthen enforcement against illegal trade related to terrorist financing, money laundering, trade fraud, strategic cases including weapons of mass destruction, child pornography, intellectual property rights violations, document fraud, alien smuggling, drug smuggling, and general smuggling, while facilitating legitimate commerce.

  8. A prototype of the CMS Object Oriented Reconstruction and Analysis Framework for the Beam Test Data

    CERN Document Server

    Silvestris, L

    1998-01-01

    CMS software requirements and computing resources will by far exceed those of any existing high energy physics experiment, not only for the complexity of the detector and of the physics task but also for the size of the collaboration and the long time scale. Therefore, software should be developed keeping in mind not only performance but also modularity, flexibility, maintainability, quality assurance and documentation. Object Orientation has been identified as the enabling technology, since it directly addresses this kind of problems. We will report on the development of an Object Oriented Reconstruction and Analysis Framework for the CMS experiment and in particular on a prototype of a complete analysis chain for the CMS test-beam data. The analysis chain consists of three different components: data acquisition, reconstruction and analysis, and interactive analysis tools. In the online part the data, read from the VME, are stored into an Objectivity federated database. Later, using an automatic procedure, t...

  9. Back to the basics: identifying positive youth development as the theoretical framework for a youth drug prevention program in rural Saskatchewan, Canada amidst a program evaluation.

    Science.gov (United States)

    Dell, Colleen Anne; Duncan, Charles Randy; DesRoches, Andrea; Bendig, Melissa; Steeves, Megan; Turner, Holly; Quaife, Terra; McCann, Chuck; Enns, Brett

    2013-10-22

    Despite endorsement by the Saskatchewan government to apply empirically-based approaches to youth drug prevention services in the province, programs are sometimes delivered prior to the establishment of evidence-informed goals and objectives. This paper shares the 'preparatory' outcomes of our team's program evaluation of the Prince Albert Parkland Health Region Mental Health and Addiction Services' Outreach Worker Service (OWS) in eight rural, community schools three years following its implementation. Before our independent evaluation team could assess whether expectations of the OWS were being met, we had to assist with establishing its overarching program goals and objectives and 'at-risk' student population, alongside its alliance with an empirically-informed theoretical framework. A mixed-methods approach was applied, beginning with in-depth focus groups with the OWS staff to identify the program's goals and objectives and targeted student population. These were supplemented with OWS and school administrator interviews and focus groups with school staff. Alignment with a theoretical focus was determined through a review of the OWS's work to date and explored in focus groups between our evaluation team and the OWS staff and validated with the school staff and OWS and school administration. With improved understanding of the OWS's goals and objectives, our evaluation team and the OWS staff aligned the program with the Positive Youth Development theoretical evidence-base, emphasizing the program's universality, systems focus, strength base, and promotion of assets. Together we also gained clarity about the OWS's definition of and engagement with its 'at-risk' student population. It is important to draw on expert knowledge to develop youth drug prevention programming, but attention must also be paid to aligning professional health care services with a theoretically informed evidence-base for evaluation purposes. If time does not permit for the establishment of

  10. Exploring the ICF-CY as a framework to inform transition programs from pediatric to adult healthcare.

    Science.gov (United States)

    Hartman, Laura R; McPherson, Amy C; Maxwell, Joanne; Lindsay, Sally

    2017-05-23

    To explore the utility of the International Classification of Functioning, Disability and Health-Children and Youth Version (ICF-CY) for informing transition-related programs for youth with chronic conditions moving into adult healthcare settings, using an exemplar spina bifida program. Semi-structured in-depth interviews were conducted with 53 participants (9 youth and 11 parents who participated in a spina bifida transition program, 12 young adults who did not, 12 clinicians, and 9 key informants involved in development/implementation). Interview transcripts were thematically analyzed, and then further coded using ICF-CY domain codes. ICF-CY domains captured many key areas regarding individuals' transitions to adult care and adult functioning, but did not fully capture concepts of transition program experience, independence, and parents' role. The ICF-CY framework captures some experiences of transitions to adult care, but should be considered in conjunction with other models that address issues outside of the domains covered by the ICF-CY.

  11. Developing a holistic framework for analysis of destination management and/or marketing organizations

    DEFF Research Database (Denmark)

    Jørgensen, Matias Thuen

    2017-01-01

    This paper presents a holistic framework for analysis of destination management and/or marketing organizations (DMOs) and explores how these work in a highly complex tourism environment. Six destinations are investigated through 61 qualitative interviews with representatives from tourism businesses and organizations. The analysis reveals a number of important factors, including whether the DMOs are focused on survival or development, on experiences or communication, and on internally or externally oriented governance. Finally, it reveals that Danish DMOs constantly negotiate between their various roles…

  12. 2005 Mississippi Curriculum Framework: Secondary Cosmetology. (Program CIP: 12.0401 - Cosmetology/Cosmetologist, General)

    Science.gov (United States)

    Buchanon, Rouser; Farmer, Helen

    2005-01-01

    Secondary vocational-technical education programs in Mississippi are faced with many challenges resulting from sweeping educational reforms at the national and state levels. Schools and teachers are increasingly being held accountable for providing true learning activities to every student in the classroom. This accountability is measured through…

  13. The Development of a Collegiate Recovery Program: Applying Social Cognitive Theory within a Social Ecological Framework

    Science.gov (United States)

    Beeson, Eric T.; Whitney, Jennifer M.; Peterson, Holly M.

    2017-01-01

    Background: Collegiate recovery programs (CRPs) are emerging as a strategy to provide after-care support to students in recovery from substance use disorders (SUDs) at institutions of higher education. CRPs are an innovative strategy for Health Educators to support the personal, academic, and professional goals of students in recovery. Purpose:…

  14. Exploring Success-Based Learning as an Instructional Framework in Principal Preparatory Programs

    Science.gov (United States)

    Schecter, Chen

    2008-01-01

    The professional expertise of educational leaders has been defined through the lens of problem-solving processes. Problem-based learning has therefore become an increasingly popular instructional approach in principal preparatory programs. As such, this study represents an initial attempt to explore learning from success (i.e., success-based…

  15. Learning to Redesign Teacher Education: A Conceptual Framework to Support Program Change

    Science.gov (United States)

    Anagnostopoulos, Dorothea; Levine, Thomas; Roselle, Rene; Lombardi, Allison

    2018-01-01

    University-based teacher education faces intensifying pressure to prove its effectiveness. This has prompted renewed interest in program redesign. In this article, we argue that enacting meaningful redesign requires university-based teacher educators to learn new ways of thinking and acting not only with teacher candidates but also with their…

  16. Evaluation of a diabetes care program using the effective coverage framework.

    Science.gov (United States)

    López-López, Erika; Gutiérrez-Soria, David; Idrovo, Alvaro J

    2012-12-01

To measure the effective coverage of a program to control type 2 diabetes. Observational study combining multiple data sources, conducted in Hidalgo state, Mexico. Adults without social security health benefits and patients with a diagnosis of diabetes participating in the program. Detection of diabetes; glucose, cholesterol, triglyceride, and blood pressure control; education; prevention of diabetic retinopathy, diabetic foot, and nephropathy. Only 7.1% of individuals with diabetes participated in the control program. Fasting glucose and HbA1c values were available for 95.6% and 35.6% of patients, respectively. Total cholesterol (52.1%), triglycerides (50.6%), and blood pressure (99.6%) were also measured. Educational activities were carried out for 64.8% of patients. The most important gaps were related to detection of the illness, low-density lipoprotein cholesterol control, glucose control with HbA1c, and nephropathy prevention; effective coverage of these medical actions was 6.22%, 5.07%, 5.01%, and 0.34%, respectively. The greatest challenge to overcome is the detection of individuals with the illness, because many individuals with type 2 diabetes do not use health services and the health system does not systematically seek them out. Medical actions that require resources paid for by patients tend to be used less and to be of lower quality. Using effective coverage to measure the performance of a diabetes care program provides practical information for improving health services.

  17. A Framework for the Game-theoretic Analysis of Censorship Resistance

    Directory of Open Access Journals (Sweden)

    Elahi Tariq

    2016-10-01

We present a game-theoretic analysis of optimal solutions for interactions between censors and censorship resistance systems (CRSs), focusing on the data channel used by the CRS to smuggle clients’ data past the censors. This analysis leverages the inherent errors (false positives and negatives) made by the censor when trying to classify traffic as either non-circumvention traffic or CRS traffic, as well as the underlying rate of CRS traffic. We identify Nash equilibrium solutions for several simple censorship scenarios and then extend those findings to more complex scenarios, where we find that the deployment of a censorship apparatus does not qualitatively change the equilibrium solutions but only affects the amount of traffic a CRS can support before being blocked. Building on these findings, we describe a general framework for exploring and identifying optimal strategies for the censorship circumventor, in order to maximize the amount of CRS traffic not blocked by the censor. We use this framework to analyze several scenarios with multiple data-channel protocols used as cover for the CRS. We show that it is possible to gain insights through this framework even without perfect knowledge of the censor’s (secret) values for the parameters in their utility function.
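The interplay between the censor's classification errors and the base rate of CRS traffic can be sketched as a simple expected-utility calculation. The model below is an illustrative toy, not the paper's actual utility function; all parameter names and values are assumptions:

```python
# Toy censor model: a classifier flags flows as "CRS" with true-positive
# rate tp on CRS traffic and false-positive rate fp on benign traffic.
# Blocking flagged flows earns `benefit` per blocked CRS flow and pays
# `cost` per benign flow wrongly blocked (collateral damage).

def censor_utility_block(base_rate, tp, fp, benefit=1.0, cost=1.0):
    """Expected per-flow utility if the censor blocks all flagged traffic."""
    blocked_crs = base_rate * tp            # CRS flows correctly blocked
    blocked_benign = (1 - base_rate) * fp   # benign flows wrongly blocked
    return benefit * blocked_crs - cost * blocked_benign

def censor_blocks(base_rate, tp, fp, benefit=1.0, cost=1.0):
    """Best response: block only if expected utility is positive."""
    return censor_utility_block(base_rate, tp, fp, benefit, cost) > 0

# When CRS traffic is rare, even a good classifier is not worth deploying:
print(censor_blocks(0.001, tp=0.95, fp=0.01))  # False: collateral cost dominates
# At a high base rate, blocking becomes the censor's best response:
print(censor_blocks(0.20, tp=0.95, fp=0.01))   # True
```

This mirrors the abstract's observation that the base rate of CRS traffic, together with the censor's error rates, determines how much traffic a CRS can carry before blocking becomes worthwhile for the censor.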

  18. A Cyber-ITS Framework for Massive Traffic Data Analysis Using Cyber Infrastructure

    Directory of Open Access Journals (Sweden)

    Yingjie Xia

    2013-01-01

Traffic data is commonly collected from widely deployed sensors in urban areas. This brings up a new research topic, data-driven intelligent transportation systems (ITSs), which integrate heterogeneous traffic data from different kinds of sensors and apply it to ITS applications. Taking into consideration the significant increase in the amount of traffic data and the complexity of data analysis, this research focuses mainly on the challenge of solving data-intensive and computation-intensive problems. As a solution, this paper proposes a Cyber-ITS framework that performs data analysis on Cyber Infrastructure (CI), which by nature comprises parallel-computing hardware and software systems, in the context of ITS. The techniques of the framework include data representation, domain decomposition, resource allocation, and parallel processing. All these techniques are based on data-driven and application-oriented models and are organized in a component-and-workflow-based model in order to achieve technical interoperability and data reusability. A case study of the Cyber-ITS framework is then presented, based on a traffic state estimation application that fuses massive Sydney Coordinated Adaptive Traffic System (SCATS) data and GPS data. The results show that the Cyber-ITS-based implementation achieves a high accuracy rate of traffic state estimation and provides a significant computational speedup for the data fusion through parallel computing.
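The domain-decomposition and parallel-processing techniques named above can be illustrated with a toy example: partition traffic records by road segment, estimate each segment's state (here, just mean speed) in parallel worker processes, then merge. The record format and function names are hypothetical, not the Cyber-ITS API:

```python
# Sketch of domain decomposition + parallel processing for traffic data.
from collections import defaultdict
from multiprocessing import Pool

def decompose(records):
    """Group (segment_id, speed) records by road segment: one subdomain each."""
    domains = defaultdict(list)
    for seg, speed in records:
        domains[seg].append(speed)
    return domains

def estimate_state(item):
    """Per-subdomain work: here, the mean observed speed of one segment."""
    seg, speeds = item
    return seg, sum(speeds) / len(speeds)

if __name__ == "__main__":
    records = [("A", 50), ("A", 60), ("B", 30), ("B", 40), ("B", 35)]
    with Pool(2) as pool:  # subdomains are processed in parallel workers
        states = dict(pool.map(estimate_state, decompose(records).items()))
    print(states)  # {'A': 55.0, 'B': 35.0}
```

Because subdomains are independent, the map step scales with the number of workers, which is the source of the speedup the abstract reports for the SCATS/GPS fusion case study.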

  19. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

Dependability tools are becoming indispensable for modeling and analyzing (critical) systems. However, the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need not only to capture the complex dynamic behavior of the system components; they must also be easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings, including lack of modeling power, incapacity to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems, and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN-based reliability formalism is a powerful potential solution for modeling and analyzing various kinds of system component behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and it provides a basis for more advanced and useful analyses such as system diagnosis.
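The discrete-time BN idea can be sketched as follows: mission time is divided into n intervals, each component's node is the interval in which it fails (with index n meaning "survives the mission"), and a system node is a deterministic function of its parents. The distributions and the two-component parallel system below are illustrative assumptions, not the paper's case studies:

```python
# Toy discrete-time BN reliability computation by exhaustive enumeration.
from itertools import product

n = 4  # number of mission-time intervals
# P(component fails in interval i); index n means it survives the mission.
p_comp = [0.1, 0.1, 0.1, 0.1, 0.6]

def system_interval(i, j):
    """Parallel (redundant) system: fails only when the later component
    fails; returns n if either component survives the mission."""
    return n if (i == n or j == n) else max(i, j)

# Marginalize the joint over both (independent) components to get the
# system node's distribution over failure intervals.
p_sys = [0.0] * (n + 1)
for i, j in product(range(n + 1), repeat=2):
    p_sys[system_interval(i, j)] += p_comp[i] * p_comp[j]

unreliability = sum(p_sys[:n])  # P(system fails within the mission time)
print(round(unreliability, 4))  # 0.16, i.e. (1 - 0.6)**2
```

Enumeration is exponential in the number of components; real BN tools replace it with standard BN inference, but the discretized failure-interval encoding is the same.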

  20. Experiences of paradox: a qualitative analysis of living with cancer using a framework approach.

    Science.gov (United States)

    Leal, Isabel; Engebretson, Joan; Cohen, Lorenzo; Rodriguez, Alma; Wangyal, Tenzin; Lopez, Gabriel; Chaoul, Alejandro

    2015-02-01

Life-threatening diseases such as cancer represent unique traumas (compared with singular, time-limited traumatic events) given their multidimensional, uncertain, and continuing nature. However, few studies have examined the impact of cancer on patients as a persistent stressor. The aim of this qualitative study is to explore patients' ongoing experiences of living with cancer and the changes encountered in this experience over time. Written reflections on three open-ended questions, collected from 28 patients at two time points, were analyzed to explore participants' experiences and perspectives over time. Content analysis using a framework approach was employed to code, categorize, and summarize data into a thematic framework. Data analysis yielded the thematic framework of living with paradox, consisting of four interrelated themes: sources, experiences, resolution of paradox, and challenges with medical culture/treatment. The primary theme concerned moving through a dualistic and complex cancer experience of concurrently negative and positive emotional states across the course of cancer. Respondents indicated that cycling through this contradictory trajectory was neither linear, nor singular, nor conclusive in nature, but reiterative across time. Recognition that patients' cancer experience may be paradoxical and tumultuous throughout the cancer trajectory can influence how practitioners provide patients with needed support during diagnosis, treatment, and recovery. This also has implications for interventions, treatment, and care plans, and for adequately responding to the diversity of patients' psychosocial, physical, existential, and spiritual experience of illness. Copyright © 2014 John Wiley & Sons, Ltd.