WorldWideScience

Sample records for methods performance approach

  1. Total System Performance Assessment-License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2002-09-13

    ''Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach'' provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issue (KTI) agreements, the ''Yucca Mountain Review Plan'' (CNWRA 2002 [158449]), and 10 CFR Part 63. This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management in the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are utilized in this document.

  2. Total System Performance Assessment - License Application Methods and Approach

    International Nuclear Information System (INIS)

    McNeish, J.

    2003-01-01

    ''Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach'' provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the ''Yucca Mountain Review Plan'' (YMRP), ''Final Report'' (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management in the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document.

  3. Total System Performance Assessment - License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2003-12-08

    ''Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach'' provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the ''Yucca Mountain Review Plan'' (YMRP), ''Final Report'' (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management in the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document.

  4. Performance analysis of demodulation with diversity -- A combinatorial approach I: Symmetric function theoretical methods

    Directory of Open Access Journals (Sweden)

    Jean-Louis Dornstetter

    2002-12-01

    This paper is devoted to the presentation of a combinatorial approach, based on the theory of symmetric functions, for analyzing the performance of a family of demodulation methods used in mobile telecommunications.

  5. Performance analysis of demodulation with diversity -- A combinatorial approach I: Symmetric function theoretical methods

    OpenAIRE

    Jean-Louis Dornstetter; Daniel Krob; Jean-Yves Thibon; Ekaterina A. Vassilieva

    2002-01-01

    This paper is devoted to the presentation of a combinatorial approach, based on the theory of symmetric functions, for analyzing the performance of a family of demodulation methods used in mobile telecommunications.

  6. Technical Efficiency and Organ Transplant Performance: A Mixed-Method Approach

    Science.gov (United States)

    de-Pablos-Heredero, Carmen; Fernández-Renedo, Carlos; Medina-Merodio, Jose-Amelio

    2015-01-01

    Mixed methods research is useful for understanding complex processes. Organ transplants are complex processes in need of improved final performance in times of budgetary restrictions. The main objective of this article is to use a mixed-method approach to quantify the technical efficiency and the excellence achieved in organ transplant systems, and to demonstrate the influence of organizational structures and internal processes on the observed technical efficiency. The results show that it is possible to implement mechanisms for the measurement of the different components by making use of quantitative and qualitative methodologies. The analysis shows a positive relationship between the levels of the Baldrige indicators and the observed technical efficiency in the donation and transplant units of the 11 analyzed hospitals. Therefore, it is possible to conclude that high levels in the Baldrige indexes are a necessary condition for reaching an increased level of the service offered. PMID:25950653

  7. Cognitive Task Complexity Effects on L2 Writing Performance: An Application of Mixed-Methods Approaches

    Science.gov (United States)

    Abdi Tabari, Mahmoud; Ivey, Toni A.

    2015-01-01

    This paper provides a methodological review of previous research on cognitive task complexity, since the term emerged in 1995, and investigates why much of this research has been quantitative rather than qualitative. Moreover, it sheds light onto the studies which used the mixed-methods approach and determines which version of the mixed-methods designs…

  8. Do Robot Performance and Behavioral Style affect Human Trust?: A Multi-Method Approach

    NARCIS (Netherlands)

    van den Brule, Rik; Dotsch, Ron; Bijlstra, Gijsbert; Wigboldus, D.H.J.; Haselager, Pim

    2014-01-01

    An important aspect of a robot’s social behavior is to convey the right amount of trustworthiness. Task performance has been shown to be an important source for trustworthiness judgments. Here, we argue that factors such as a robot’s behavioral style can play an important role as well. Our approach to…

  9. Performance and separation occurrence of binary probit regression estimators using the maximum likelihood method and Firth's approach under different sample sizes

    Science.gov (United States)

    Lusiana, Evellin Dewi

    2017-12-01

    The parameters of a binary probit regression model are commonly estimated by the Maximum Likelihood Estimation (MLE) method. However, MLE has a limitation when the binary data contain separation. Separation is the condition in which one or several independent variables exactly group the categories of the binary response. It causes the MLE estimators to become non-convergent, so that they cannot be used in modeling. One way to resolve separation is to use Firth's approach instead. This research has two aims: first, to identify the chance of separation occurring in a binary probit regression model under the MLE method and Firth's approach; second, to compare the performance of the binary probit regression estimators obtained by the MLE method and Firth's approach using the RMSE criterion. Both are investigated by simulation under different sample sizes. The results showed that the chance of separation occurring under the MLE method for small sample sizes is higher than under Firth's approach. For larger sample sizes, the probability decreases and is nearly identical between the MLE method and Firth's approach. Meanwhile, Firth's estimators have smaller RMSE than the MLE estimators, especially for smaller sample sizes; for larger sample sizes, the RMSEs are not much different. This means that Firth's estimators outperform the MLE estimators.
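
    The separation condition described above is easy to check directly. Below is a minimal illustrative sketch (not from the paper) that tests whether a single continuous predictor completely separates a binary response, the situation in which the probit MLE fails to converge:

```python
# Illustrative sketch (not from the paper): test whether a single predictor
# completely separates a binary response. Under complete separation the
# probit MLE does not converge, which motivates Firth's approach.

def completely_separates(x, y):
    """True if some threshold on x perfectly splits y into 0s and 1s."""
    xs0 = [xi for xi, yi in zip(x, y) if yi == 0]
    xs1 = [xi for xi, yi in zip(x, y) if yi == 1]
    if not xs0 or not xs1:          # only one class present: not separation
        return False
    return max(xs0) < min(xs1) or max(xs1) < min(xs0)

# A separated sample: every x below 3 is a 0, every x above 3 is a 1.
x = [1.0, 2.0, 2.5, 3.5, 4.0, 5.0]
y = [0, 0, 0, 1, 1, 1]
print(completely_separates(x, y))        # True

# An overlapping sample: no threshold splits the classes perfectly.
y_mixed = [0, 1, 0, 1, 0, 1]
print(completely_separates(x, y_mixed))  # False
```

    In practice the check would be run per predictor (or on linear combinations of predictors); Firth's penalized likelihood sidesteps the problem by shrinking the estimates away from infinity.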

  10. An Investigation into Native and Non-Native Teachers' Judgments of Oral English Performance: A Mixed Methods Approach

    Science.gov (United States)

    Kim, Youn-Hee

    2009-01-01

    This study used a mixed methods research approach to examine how native English-speaking (NS) and non-native English-speaking (NNS) teachers assess students' oral English performance. The evaluation behaviors of two groups of teachers (12 Canadian NS teachers and 12 Korean NNS teachers) were compared with regard to internal consistency, severity,…

  11. Methods for implementing Building Information Modeling and Building Performance Simulation approaches

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø

    …(a) BIM as a platform for Architecture, Engineering, Construction, and Facility Management (AEC/FM) communication, and (b) BPS as a platform for early-stage building performance prediction. The second is to develop (a) relevant AEC/FM communication support instruments, and (b) standardized BIM and BPS execution guidelines and information exchange methodologies. Thesis studies showed that BIM approaches have the potential to improve AEC/FM communication and collaboration. BIM is by its nature multidisciplinary, bringing AEC/FM project participants together and creating constant communication. However, BIM adoption can lead to technical challenges, for example, getting BIM-compatible tools to communicate properly. Furthermore, BIM adoption requires organizational change, that is, changes in AEC/FM work practices and interpersonal dynamics. Consequently, to ensure that the adoption of BIM is successful, it is recommended that common IT regulations…

  12. A Mixed-Method Approach on Digital Educational Games for K12: Gender, Attitudes and Performance

    Science.gov (United States)

    Law, Effie Lai-Chong; Gamble, Tim; Schwarz, Daniel; Kickmeier-Rust, Michael D.; Holzinger, Andreas

    Research on the influence of gender on attitudes towards and performance in digital educational games (DEGs) has quite a long history. Generally, males tend to play such games more engagingly than females; consequently, the attitudes and performance of males using DEGs are presumed to be more positive than those of females. This paper reports an investigation of a DEG, developed to enhance the acquisition of geographical knowledge, carried out on British, German and Austrian K12 students aged between 11 and 14. Methods included a survey on initial design concepts, user tests on the system and two single-gender focus groups. Gender and cultural differences in gameplay habits, game type preferences and game character perceptions were observed. The results showed that both genders similarly improved their geographical knowledge, although boys tended to have a higher level of positive user experience than the girls. The qualitative data from the focus groups illustrated some interesting gender differences in the perception of various aspects of the game.

  13. A Multi-Methods Approach to HRA and Human Performance Modeling: A Field Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David I Gertman

    2012-06-01

    The Advanced Test Reactor (ATR), a research reactor at the Idaho National Laboratory, is primarily designed and used to test materials to be used in other, larger-scale and prototype reactors. The reactor offers various specialized systems and allows certain experiments to be run at their own temperature and pressure. The ATR Canal temporarily stores completed experiments and used fuel. It also has facilities to conduct underwater operations such as experiment examination or removal. In reviewing the ATR safety basis, a number of concerns were identified involving the ATR canal. A brief study identified ergonomic issues involving the manual handling of fuel elements in the canal that may increase the probability of human error and of possible unwanted acute physical outcomes to the operator. In response to this concern, an analysis was conducted that refined the previous HRA scoping analysis by determining the probability of the inadvertent exposure of a fuel element to the air during fuel movement and inspection. The HRA analysis employed the SPAR-H method and was supplemented by information gained from a detailed analysis of the fuel inspection and transfer tasks. This latter analysis included ergonomics, work cycles, task duration, and the workload imposed by tool and workplace characteristics, personal protective clothing, and operational practices that have the potential to increase physical and mental workload. Part of this analysis consisted of NASA-TLX analyses, combined with operational sequence analysis, computational human performance analysis (CHPA), and 3D graphical modeling to determine task failures and the precursors to such failures that have safety implications. Experience in applying multiple analysis techniques in support of HRA methods is discussed.
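
    The NASA-TLX workload assessment mentioned above produces an overall score as a weighted mean of six subscale ratings, with the weights obtained from 15 pairwise comparisons between the dimensions. A small sketch of that computation, with hypothetical ratings and weights:

```python
# Sketch of the NASA-TLX weighted workload computation; the ratings (0-100)
# and the pairwise-comparison tally weights below are hypothetical.

def nasa_tlx(ratings, weights):
    """Overall workload: weighted mean of six subscale ratings (0-100).

    weights are tally counts from the 15 pairwise comparisons (sum to 15).
    """
    assert set(ratings) == set(weights) and sum(weights.values()) == 15
    return sum(ratings[d] * weights[d] for d in ratings) / 15.0

ratings = {"mental": 70, "physical": 85, "temporal": 60,
           "performance": 40, "effort": 75, "frustration": 55}
weights = {"mental": 3, "physical": 5, "temporal": 2,
           "performance": 1, "effort": 3, "frustration": 1}
print(round(nasa_tlx(ratings, weights), 2))  # 71.67
```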

  14. A Multi-Methods Approach to HRA and Human Performance Modeling: A Field Assessment

    International Nuclear Information System (INIS)

    Hugo, Jacques; Gertman, David I.

    2012-01-01

    The Advanced Test Reactor (ATR), a research reactor at the Idaho National Laboratory, is primarily designed and used to test materials to be used in other, larger-scale and prototype reactors. The reactor offers various specialized systems and allows certain experiments to be run at their own temperature and pressure. The ATR Canal temporarily stores completed experiments and used fuel. It also has facilities to conduct underwater operations such as experiment examination or removal. In reviewing the ATR safety basis, a number of concerns were identified involving the ATR canal. A brief study identified ergonomic issues involving the manual handling of fuel elements in the canal that may increase the probability of human error and of possible unwanted acute physical outcomes to the operator. In response to this concern, an analysis was conducted that refined the previous HRA scoping analysis by determining the probability of the inadvertent exposure of a fuel element to the air during fuel movement and inspection. The HRA analysis employed the SPAR-H method and was supplemented by information gained from a detailed analysis of the fuel inspection and transfer tasks. This latter analysis included ergonomics, work cycles, task duration, and the workload imposed by tool and workplace characteristics, personal protective clothing, and operational practices that have the potential to increase physical and mental workload. Part of this analysis consisted of NASA-TLX analyses, combined with operational sequence analysis, computational human performance analysis (CHPA), and 3D graphical modeling to determine task failures and the precursors to such failures that have safety implications. Experience in applying multiple analysis techniques in support of HRA methods is discussed.

  15. Approaches to chronic disease management evaluation in use in Europe: a review of current methods and performance measures.

    Science.gov (United States)

    Conklin, Annalijn; Nolte, Ellen; Vrijhoef, Hubertus

    2013-01-01

    An overview was produced of approaches currently used to evaluate chronic disease management in selected European countries. The study aims to describe the methods and metrics used in Europe as a first step to help advance the methodological basis for their assessment. A common template for the collection of evaluation methods and performance measures was sent to key informants in twelve European countries; responses were summarized in tables based on the template's evaluation categories. Extracted data were descriptively analyzed. Approaches to the evaluation of chronic disease management vary widely in objectives, designs, metrics, observation periods, and data collection methods. Half of the reported studies used noncontrolled designs. The majority measured clinical process measures, patient behavior and satisfaction, and cost and utilization; several also used a range of structural indicators. Effects are usually observed over 1 or 3 years on patient populations with a single, commonly prevalent, chronic disease. There is wide variation within and between European countries in approaches to evaluating chronic disease management, in their objectives, designs, indicators, target audiences, and the actors involved. This study is the first extensive, international overview of the area reported in the literature.

  16. The impact of case specificity and generalisable skills on clinical performance: a correlated traits-correlated methods approach.

    Science.gov (United States)

    Wimmers, Paul F; Fung, Cha-Chi

    2008-06-01

    The finding of case or content specificity in medical problem solving moved the focus of research away from generalisable skills towards the importance of content knowledge. However, controversy about the content dependency of clinical performance and the generalisability of skills remains. This study aimed to explore the relative impact of both perspectives (case specificity and generalisable skills) on different components (history taking, physical examination, communication) of clinical performance within and across cases. Data from a clinical performance examination (CPX) taken by 350 Year 3 students were used in a correlated traits-correlated methods (CTCM) approach using confirmatory factor analysis, whereby 'traits' refers to generalisable skills and 'methods' to individual cases. The baseline CTCM model was analysed and compared with four nested models using structural equation modelling techniques. The CPX consisted of three skills components and five cases. Comparison of the four different models with the least-restricted baseline CTCM model revealed that a model with uncorrelated generalisable skills factors and correlated case-specific knowledge factors represented the data best. The generalisable processes found in history taking, physical examination and communication were responsible for half the explained variance, in comparison with the variance related to case specificity. In conclusion, pure knowledge-based and pure skill-based perspectives on clinical performance both seem too one-dimensional, and new evidence supports the idea that a substantial amount of variance is attributable to both aspects of performance. It could be concluded that generalisable skills and specialised knowledge go hand in hand: both are essential aspects of clinical performance.
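
    The intuition behind the correlated traits-correlated methods decomposition can be seen in a toy multitrait-multimethod simulation (all numbers below are fabricated for illustration and are not the study's data): a generalisable-skill factor drives the correlation of the same skill across different cases, while a case-specific knowledge factor additionally drives correlations within the same case.

```python
# Toy illustration (all numbers fabricated) of the multitrait-multimethod
# logic behind a CTCM analysis: "traits" are skills, "methods" are cases.
# Scores are simulated as general skill + case-specific knowledge + noise.

import random
import statistics

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

random.seed(0)
n = 500
general = [random.gauss(0, 1) for _ in range(n)]                    # skill factor
case_k = {c: [random.gauss(0, 1) for _ in range(n)] for c in "AB"}  # case factors

def scores(case):
    return [g + k + random.gauss(0, 0.5) for g, k in zip(general, case_k[case])]

history_A, history_B, exam_A = scores("A"), scores("B"), scores("A")

r_same_skill = pearson(history_A, history_B)  # shares only the general factor
r_same_case = pearson(history_A, exam_A)      # shares general AND case factors
print(round(r_same_skill, 2), round(r_same_case, 2))
```

    With these (arbitrary) variance settings, the same-case correlation comes out clearly higher than the cross-case one, which is the signature of case specificity that the CTCM model formalizes.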

  17. A Performance Prediction Method for Pumps as Turbines (PAT) Using a Computational Fluid Dynamics (CFD) Modeling Approach

    Directory of Open Access Journals (Sweden)

    Emma Frosina

    2017-01-01

    Small and micro hydropower systems represent an attractive solution for generating electricity at low cost and with low environmental impact. The pump-as-turbine (PAT) approach has promise in this application due to its low purchase and maintenance costs. In this paper, a new method to predict the inverse characteristic of industrial centrifugal pumps is presented. This method is based on the results of simulations performed with commercial three-dimensional Computational Fluid Dynamics (CFD) software. Model results have first been validated in pumping mode using data supplied by pump manufacturers. Then, the results have been compared to experimental data for a pump running in reverse. Experimentation has been performed on a dedicated test bench installed in the Department of Civil Construction and Environmental Engineering of the University of Naples Federico II. Three different pumps, with different specific speeds, have been analyzed. Using the model results, the inverse characteristic and the best efficiency point have been evaluated. Finally, the results have been compared to prediction methods available in the literature.
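
    One of the classical literature prediction methods of the kind the paper compares against is Stepanoff's relations, which estimate a pump's turbine-mode best efficiency point (BEP) from its pump-mode BEP. A minimal sketch with hypothetical pump data (the specific numbers are not from the paper):

```python
# Stepanoff's relations for predicting turbine-mode BEP from pump-mode BEP:
# Q_t = Q_p / sqrt(eta), H_t = H_p / eta. Pump data below are hypothetical.

import math

def stepanoff_pat_bep(q_pump, h_pump, eta_pump):
    """Turbine-mode BEP flow (m^3/s) and head (m) from pump-mode BEP."""
    q_turbine = q_pump / math.sqrt(eta_pump)
    h_turbine = h_pump / eta_pump
    return q_turbine, h_turbine

q_t, h_t = stepanoff_pat_bep(q_pump=0.05, h_pump=20.0, eta_pump=0.78)
print(f"Turbine BEP: Q = {q_t:.4f} m^3/s, H = {h_t:.2f} m")
```

    CFD-based approaches such as the one in the paper aim to improve on correlations of this kind, whose accuracy degrades away from the specific speeds they were fitted on.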

  18. Evaluation of a multi-methods approach to the collection and dissemination of feedback on OSCE performance in dental education.

    Science.gov (United States)

    Wardman, M J; Yorke, V C; Hallam, J L

    2018-05-01

    Feedback is an essential part of the learning process, and students expect their feedback to be personalised, meaningful and timely. Objective Structured Clinical Examination (OSCE) assessments allow examiners to observe students carefully over the course of a number of varied station types, across a number of clinical knowledge and skill domains. They therefore present an ideal opportunity to record detailed feedback which allows students to reflect on and improve their performance. This article outlines two methods by which OSCE feedback was collected and then disseminated to undergraduate dental students across 2-year groups in a UK dental school: (i) Individual written feedback comments made by examiners during the examination, (ii) General audio feedback recorded by groups of examiners immediately following the examination. Evaluation of the feedback was sought from students and staff examiners. A multi-methods approach utilising Likert questionnaire items (quantitative) and open-ended feedback questions (qualitative) was used. Data analysis explored student and staff perceptions of the audio and written feedback. A total of 131 students (response rate 68%) and 52 staff examiners (response rate 83%) completed questionnaires. Quantitative data analysis showed that the written and audio formats were reported as a meaningful source of feedback for learning by both students (93% written, 89% audio) and staff (96% written, 92% audio). Qualitative data revealed the complementary nature of both types of feedback. Written feedback gives specific, individual information whilst audio shares general observations and allows students to learn from others. The advantages, limitations and challenges of the feedback methods are discussed, leading to the development of an informed set of implementation guidelines. Written and audio feedback methods are valued by students and staff. 
It is proposed that these may be very easily applied to OSCEs running in other dental schools.

  19. Approaches to chronic disease management evaluation in use in Europe : A review of current methods and performance measures

    NARCIS (Netherlands)

    Conklin, A.; Nolte, E.; Vrijhoef, H.J.M.

    2013-01-01

    Objectives: An overview was produced of approaches currently used to evaluate chronic disease management in selected European countries. The study aims to describe the methods and metrics used in Europe as a first step to help advance the methodological basis for their assessment. Methods: A common…

  20. Performative Schizoid Method

    DEFF Research Database (Denmark)

    Svabo, Connie

    2016-01-01

    A performative schizoid method is developed as a method contribution to performance as research. The method is inspired by contemporary research in the human and social sciences urging experimentation and researcher engagement with creative and artistic practice. In the article, the method is presented and an example is provided of a first exploratory engagement with it. The method is used in a specific project, Becoming Iris, making inquiry into arts-based knowledge creation during a three-month visiting scholarship at a small, independent visual art academy. Using the performative schizoid method in Becoming Iris results in four audio-visual and performance-based productions, centered on an emergent theme of the scholartist as a bird in borrowed feathers. Interestingly, the moral lesson of the fable about the vain jackdaw, who dresses in borrowed peacock feathers and becomes a cast-out…

  1. Methods to stimulate national and sub-national benchmarking through international health system performance comparisons: a Canadian approach.

    Science.gov (United States)

    Veillard, Jeremy; Moses McKeag, Alexandra; Tipper, Brenda; Krylova, Olga; Reason, Ben

    2013-09-01

    This paper presents, discusses and evaluates methods used by the Canadian Institute for Health Information to present international comparisons of health system performance in ways that facilitate their understanding by the public and health system policy-makers and that can stimulate performance benchmarking. We used statistical techniques to normalize the results and present them on a standardized scale that facilitates understanding of the results. We compared results to the OECD average and to benchmarks. We also applied various data quality rules to ensure the validity of results. In order to evaluate the impact of the public release of these results, we used quantitative and qualitative methods and documented other types of impact. We were able to present results for performance indicators and dimensions at national and sub-national levels; develop performance profiles for each Canadian province; and show pan-Canadian performance patterns for specific performance indicators. The results attracted significant media attention at the national level and reactions from various stakeholders. Other impacts, such as requests for additional analysis and improvements in data timeliness, were observed. The methods used proved attractive to various audiences in the Canadian context and achieved the objectives originally defined. These methods could be refined and applied in different contexts. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
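
    A hedged sketch of the kind of normalization described: expressing a jurisdiction's indicator value as a standardized score relative to the OECD distribution, with the sign flipped for "lower is better" indicators so that higher always means better. The indicator values below are hypothetical, not CIHI data.

```python
# Standardized score of one value against a reference (e.g. OECD) distribution.
# For "lower is better" indicators the sign is flipped so higher = better.

import statistics

def standardized_score(value, oecd_values, higher_is_better=True):
    mean = statistics.mean(oecd_values)
    sd = statistics.stdev(oecd_values)
    z = (value - mean) / sd
    return z if higher_is_better else -z

# Hypothetical 30-day AMI mortality rates (lower is better) across countries.
oecd = [5.1, 6.3, 7.8, 4.9, 8.4, 6.7, 5.5, 7.1]
print(round(standardized_score(6.0, oecd, higher_is_better=False), 2))  # 0.37
```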

  2. Performance evaluation and ranking of direct sales stores using BSC approach and fuzzy multiple attribute decision-making methods

    Directory of Open Access Journals (Sweden)

    Mojtaba Soltannezhad Dizaji

    2017-07-01

    In an environment where markets go through a volatile process and rapid fundamental changes occur due to technological advances, it is important to ensure and maintain good performance measurement. Organizations, in their performance evaluation, should consider different types of financial and non-financial indicators. In systems like direct sales stores, in which decision units have multiple inputs and outputs, all criteria influencing performance must be combined and examined in one system simultaneously. The purpose of this study is to evaluate the performance of different products sold through the direct sales stores of a firm named Shirin Asal with a combination of the Balanced Scorecard, fuzzy AHP and TOPSIS, so that the weaknesses of subjectivity and the selective consideration of evaluators in evaluating the performance indicators are reduced, and an integrated evaluation is provided by considering the contribution of each indicator and each indicator group of the balanced scorecard. This case study uses an applied research method. Data were collected through a questionnaire drawn from previous studies, the use of experts' opinions and the study of documents in the organization. MATLAB and SPSS were used to analyze the data. The study finds that the customer and financial perspectives are the most important for assessing the company's branches. Among the sub-criteria, the rate of new customer acquisition in the customer dimension and the net-income-to-sales ratio in the financial dimension are the most important.
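
    The final ranking step in such studies is typically TOPSIS: alternatives are scored by their relative closeness to an ideal solution. A minimal crisp (non-fuzzy) sketch of that step; the decision matrix and weights below are hypothetical, all criteria are treated as benefit criteria, and the paper's actual weights would come from its fuzzy AHP stage:

```python
# Minimal crisp TOPSIS: vector-normalize, weight, then rank alternatives by
# closeness to the ideal solution. Matrix and weights are hypothetical.

def topsis(matrix, weights):
    """Return a closeness score in [0, 1] per alternative (row)."""
    ncol = len(weights)
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(ncol)]
    v = [[weights[j] * row[j] / norms[j] for j in range(ncol)] for row in matrix]
    ideal = [max(col) for col in zip(*v)]   # best value per criterion
    anti = [min(col) for col in zip(*v)]    # worst value per criterion
    scores = []
    for row in v:
        d_pos = sum((x - i) ** 2 for x, i in zip(row, ideal)) ** 0.5
        d_neg = sum((x - a) ** 2 for x, a in zip(row, anti)) ** 0.5
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three hypothetical stores scored on three benefit criteria.
stores = [[7, 9, 6], [8, 7, 8], [6, 8, 7]]
weights = [0.5, 0.3, 0.2]   # e.g. produced by a fuzzy AHP step
scores = topsis(stores, weights)
ranking = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
print(scores, ranking)
```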

  3. Approach to the assessment of the performance of nondestructive test methods in the manufacture of nuclear power station equipment

    International Nuclear Information System (INIS)

    Michaut, J.P.

    1996-01-01

    The safety of a nuclear power station lies largely in the possibility of ensuring, at the time of in-service inspections of major equipment, that the extent of faults which may appear or develop is not greater than that of faults detrimental to behavior in service. This assurance is based on performance demonstration of the nondestructive test methods used for inspecting the equipment in service, which is the subject of numerous studies in various countries. To ensure that manufacturing faults likely to downgrade the safety of the equipment are not discovered in service, it seems desirable to make sure that the performance of the nondestructive test (NDT) methods which are going to be used in manufacture will be at least as high as that of those used in service, and that they are therefore capable of guaranteeing the detection of faults clearly less important than really harmful faults. The performance of NDT methods, and their consistency with those which can be used in service, is evaluated before the start of manufacture on a mock-up representative of the equipment itself. Information is given on research in progress on the bimetal welding of a pressurizer spray nozzle.

  4. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    Science.gov (United States)

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
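
    A hedged sketch relating the two metrics. For qPCR, amplification efficiency is conventionally derived from the standard-curve slope (Cq vs. log10 input). For an isothermal assay, an analogous doubling time can be derived from the slope of time-to-positivity vs. log10 input: each doubling adds log10(2) ≈ 0.301 logs, so doubling time = |slope| × log10(2). This formulation is our illustrative reading of an IDT-style metric, not necessarily the paper's exact definition, and the slopes below are hypothetical.

```python
# qPCR efficiency from a standard-curve slope, and an analogous isothermal
# doubling time from a time-to-positive standard curve. Slopes hypothetical.

import math

def pcr_efficiency(slope_cq_per_log10):
    """Amplification efficiency from a qPCR standard-curve slope."""
    return 10 ** (-1.0 / slope_cq_per_log10) - 1.0

def isothermal_doubling_time(slope_min_per_log10):
    """Minutes per target doubling from a time-to-positive standard curve."""
    return abs(slope_min_per_log10) * math.log10(2)

print(round(pcr_efficiency(-3.32), 3))         # close to 1.0, i.e. ~100%
print(round(isothermal_doubling_time(-2.5), 2))  # 0.75 min per doubling
```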

  5. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods

    Directory of Open Access Journals (Sweden)

    Gavin J. Nixon

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These ‘isothermal’ methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.

  6. Performance evaluation and ranking of direct sales stores using BSC approach and fuzzy multiple attribute decision-making methods

    OpenAIRE

    Mojtaba Soltannezhad Dizaji; Mohammad Mahdavi Mazdeh; Ahmad Makui

    2017-01-01

    In an environment where markets go through a volatile process, and rapid fundamental changes occur due to technological advances, it is important to ensure and maintain a good performance measurement. Organizations, in their performance evaluation, should consider different types of financial and non-financial indicators. In systems like direct sales stores in which decision units have multiple inputs and outputs, all criteria influencing on performance must be combined and examined in a syst...

  7. Weighting Performance Evaluation Criteria Base in Balanced Score Card Approach with Use of Combination Method Shapley value & Bull's-eye

    Directory of Open Access Journals (Sweden)

    Mohammad Hassan Kamfiroozi

    2014-05-01

    Full Text Available Performance evaluation as a control tool is considered by managers in organizations and manufacturing firms. In this paper we present a new model for performance evaluation and ranking of industrial companies under uncertain conditions. We implemented performance evaluation based on the balanced score card (BSC) method, and used three-parameter interval grey numbers in lieu of linguistic variables. Evaluation and weighting of the four perspectives is then done with a combined Bull's-eye-Shapley method, which is the new approach presented in this article. Three-parameter interval grey numbers and the combination method were used to reduce the effect of environmental uncertainty on the data and the model. This combination weighting method can be used as a new method in decision-making science. Finally, a case study was carried out on industrial companies (nail makers), whose ranking is obtained using the grey-TOPSIS method (a generalization of classic TOPSIS to three-parameter interval grey numbers).
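
    The ranking step can be illustrated with a crisp TOPSIS sketch. This is a deliberate simplification of grey-TOPSIS (the paper uses three-parameter interval grey numbers; here each score is a single number) with hypothetical BSC scores:

```python
# Crisp TOPSIS sketch: a simplification of grey-TOPSIS, with single-number
# scores instead of three-parameter interval grey numbers. Hypothetical data.
import math

def topsis(matrix, weights):
    """Score alternatives (rows) on benefit criteria (cols); higher closeness = better."""
    # vector-normalise each column, then apply criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(len(weights))]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)] for row in matrix]
    ideal = [max(col) for col in zip(*v)]   # positive ideal solution
    anti = [min(col) for col in zip(*v)]    # negative ideal solution
    scores = []
    for row in v:
        dp = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ideal)))
        dn = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, anti)))
        scores.append(dn / (dp + dn))       # relative closeness to the ideal
    return scores

# Three companies scored on the four BSC perspectives, equal weights
scores = topsis([[7, 8, 6, 9], [5, 6, 7, 6], [9, 7, 8, 8]], [0.25] * 4)
print(max(range(3), key=scores.__getitem__))  # index of the top-ranked company
```

    The grey variant replaces each entry with a (lower, most-likely, upper) triple and distances are computed over all three parameters, but the normalise/ideal/closeness skeleton is the same.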

  8. Systemic Approach to Architectural Performance

    Directory of Open Access Journals (Sweden)

    Marie Davidova

    2017-04-01

    Full Text Available First-hand experiences in several design projects that were based on media richness and collaboration are described in this article. Although complex design processes are usually considered as socio-technical systems, they are deeply involved with natural systems. My collaborative research in the field of performance-oriented design combines digital and physical conceptual sketches, simulations and prototyping, with GIGA-mapping applied to organise the data. The design process uses the most suitable tools for the subtasks at hand, and the use of media is mixed according to particular requirements. These tools include digital and physical GIGA-mapping, parametric computer-aided design (CAD), digital simulation of analyses, as well as sampling and 1:1 prototyping. Also discussed in this article are the methodologies used in several design projects to strategize these tools, and the developments and trends in the tools employed. The paper argues that digital tools tend to produce similar results through given pre-sets that often do not correspond to real needs; thus, there is a significant need for mixed methods, including prototyping, in the creative design process. Media mixing and cooperation across disciplines are unavoidable in a holistic approach to contemporary design, which includes the consideration of diverse biotic and abiotic agents. I argue that physical and digital GIGA-mapping is a crucial tool for coping with this complexity. Furthermore, I propose integrating physical and digital outputs in one GIGA-map, and bringing the participation and co-design of biotic and abiotic agents into one rich design research space, resulting in an ever-evolving, time-based research-design process.

  9. Experimental Study Comparing a Traditional Approach to Performance Appraisal Training to a Whole-Brain Training Method at C.B. Fleet Laboratories

    Science.gov (United States)

    Selden, Sally; Sherrier, Tom; Wooters, Robert

    2012-01-01

    The purpose of this study is to examine the effects of a new approach to performance appraisal training. Motivated by split-brain theory and existing studies of cognitive information processing and performance appraisals, this exploratory study examined the effects of a whole-brain approach to training managers for implementing performance…

  10. Analytical method development of nifedipine and its degradants binary mixture using high performance liquid chromatography through a quality by design approach

    Science.gov (United States)

    Choiri, S.; Ainurofiq, A.; Ratri, R.; Zulmi, M. U.

    2018-03-01

    Nifedipine (NIF) is a photo-labile drug that degrades easily when exposed to sunlight. This research aimed to develop an analytical method using high-performance liquid chromatography, implementing a quality by design approach to obtain an effective, efficient, and validated analytical method for NIF and its degradants. A 2² full factorial design with a curvature as a center point was applied to optimize the analytical conditions for NIF and its degradants, with mobile phase composition (MPC) and flow rate (FR) as the factors examined against the system suitability parameters. The selected condition was validated by cross-validation using a leave-one-out technique. Alteration of MPC significantly affected retention time, while an increase of FR reduced the tailing factor; in addition, the interaction of both factors increased the theoretical plates and the resolution of NIF and its degradants. The selected analytical condition for NIF and its degradants was validated over the range 1–16 µg/mL, showing good linearity, precision and accuracy, and it is efficient, with an analysis time within 10 min.
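
    Effect estimation in a 2² full factorial design of this kind can be sketched in a few lines. The responses below are hypothetical (e.g. retention time at each coded MPC/FR setting), not values from the paper:

```python
# Sketch of effect estimation in a 2^2 full factorial design, factors coded -1/+1.
# Hypothetical responses at each MPC/FR combination; the center point (curvature
# check) is omitted for brevity.
def factorial_effects(responses):
    """responses: dict keyed by (MPC level, FR level) at coded -1/+1 settings."""
    e_a = sum(r * a for (a, _), r in responses.items()) / 2       # MPC main effect
    e_b = sum(r * b for (_, b), r in responses.items()) / 2       # FR main effect
    e_ab = sum(r * a * b for (a, b), r in responses.items()) / 2  # MPC x FR interaction
    return e_a, e_b, e_ab

runs = {(-1, -1): 10.0, (+1, -1): 8.0, (-1, +1): 7.0, (+1, +1): 5.0}
print(factorial_effects(runs))
```

    Each effect is the average response change when moving a factor from its low to high level; a non-zero interaction term is what signals that the factors cannot be tuned independently.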

  11. Approaches towards airport economic performance measurement

    Directory of Open Access Journals (Sweden)

    Ivana STRYČEKOVÁ

    2011-01-01

    Full Text Available The paper aims to assess how economic benchmarking is used by airports as a means of performance measurement and comparison of major international airports in the world. The study focuses on current benchmarking practices and methods, taking into account the different factors according to which it is efficient to benchmark airport performance. The methods considered are mainly data envelopment analysis and stochastic frontier analysis; other approaches used by airports for economic benchmarking are also discussed. The main objective of this article is to evaluate the efficiency of airports and answer some open questions involving economic benchmarking of airports.
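
    In general, data envelopment analysis solves one linear programme per decision-making unit, but in the one-input/one-output special case the CCR efficiency collapses to a ratio against the best performer, which makes the idea easy to sketch. The airport figures below are hypothetical:

```python
# Sketch: CCR efficiency in the one-input/one-output special case, where the
# DEA linear programme reduces to a simple ratio against the best performer.
def dea_single(inputs, outputs):
    """Efficiency of each unit relative to the best output/input ratio observed."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical airports: operating cost (input) vs passengers handled (output)
print(dea_single([2.0, 4.0, 5.0], [1.0, 1.0, 2.0]))
```

    With multiple inputs and outputs the same comparison requires an optimiser that chooses the most favourable weights for each airport, which is what the full DEA formulation does.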

  12. Quality by design approach in the development of an ultra-high-performance liquid chromatography method for Bexsero meningococcal group B vaccine.

    Science.gov (United States)

    Nompari, Luca; Orlandini, Serena; Pasquini, Benedetta; Campa, Cristiana; Rovini, Michele; Del Bubba, Massimo; Furlanetto, Sandra

    2018-02-01

    Bexsero is the first approved vaccine for active immunization of individuals from 2 months of age and older to prevent invasive disease caused by Neisseria meningitidis serogroup B. The active components of the vaccine are Neisseria Heparin Binding Antigen, factor H binding protein and Neisseria adhesin A, produced in Escherichia coli cells by recombinant DNA technology, and Outer Membrane Vesicles (expressing Porin A and Porin B), produced by fermentation of Neisseria meningitidis strain NZ98/254. All the Bexsero active components are adsorbed on aluminum hydroxide, and the unadsorbed antigen content is a critical quality attribute of the product. In this paper the development of a fast, selective and sensitive ultra-high-performance liquid chromatography (UHPLC) method for the determination of the Bexsero antigens in the vaccine supernatant is presented. For the first time in the literature, Quality by Design (QbD) principles were applied to the development of an analytical method aimed at the quality control of a vaccine product. The UHPLC method was fully developed within the QbD framework, the new paradigm of quality outlined in International Conference on Harmonisation guidelines. Critical method attributes (CMAs) were identified as the capacity factor of Neisseria Heparin Binding Antigen, antigen resolution and peak areas. After a scouting phase, aimed at selecting a suitable and fast UHPLC operative mode for the vaccine antigen separation, risk assessment tools were employed to define the critical method parameters to be considered in the screening phase. Screening designs were applied to investigate first the effects of vial type and sample concentration, and then the effects of injection volume, column type, organic phase starting concentration, ramp time and temperature. Response Surface Methodology pointed out the presence of several significant interaction effects, and with the support of Monte-Carlo simulations led to map out the design space, at

  13. Freight performance measures : approach analysis.

    Science.gov (United States)

    2010-05-01

    This report reviews the existing state of the art and also the state of the practice of freight performance measurement. Most performance measures at the state level have aimed at evaluating highway or transit infrastructure performance with an empha...

  14. COMPANY PERFORMANCE MEASUREMENT AND REPORTING METHODS

    Directory of Open Access Journals (Sweden)

    Nicu Ioana Elena

    2012-12-01

    Full Text Available One of the priorities of economic research has been, and remains, the re-evaluation of the notion of performance, and especially exploring and finding indicators that would reflect as accurately as possible the subtleties of the economic entity. The main purpose of this paper is to highlight the main company performance measurement and reporting methods. Performance is a concept that raises many questions concerning the most accurate or the best method of reporting performance at the company level. The research methodology has consisted of studying the Romanian and foreign specialized literature dealing with the analyzed field, and studying journals specialized in company performance measurement. While financial performance measurement indicators are considered to offer an accurate image of the situation of the company, the modern approach through non-financial indicators offers a new perspective upon performance measurement, one based on simplicity. In conclusion, after the theoretical study, I have noticed that the methods of performance measurement, reporting and interpretation are varied, that opinions regarding the best performance measurement methods are contradictory, and that companies prefer resorting to financial indicators, which still play a more important role in company performance measurement than non-financial indicators do.

  15. Development and validation of a rapid ultra-high performance liquid chromatography method for the assay of benzalkonium chloride using a quality-by-design approach.

    Science.gov (United States)

    Mallik, Rangan; Raman, Srividya; Liang, Xiaoli; Grobin, Adam W; Choudhury, Dilip

    2015-09-25

    A rapid, robust reversed-phase UHPLC method has been developed for the analysis of total benzalkonium chloride in a preserved drug formulation. A systematic Quality-by-Design (QbD) method development approach using commercial, off-the-shelf software (Fusion AE®) has been used to optimize the column, mobile phases, gradient time, and other HPLC conditions. Total benzalkonium chloride analysis involves simple sample preparation. The method uses gradient elution from an ACE Excel 2 C18-AR column (50 mm × 2.1 mm, 2.0 μm particle size), ammonium phosphate buffer (pH 3.3; 10 mM) as the aqueous mobile phase and methanol/acetonitrile (85/15, v/v) as the organic mobile phase, with UV detection at 214 nm. Using these conditions, the major homologs of benzalkonium chloride (C12 and C14) are separated in less than 2.0 min. The validation results confirmed that the method is precise, accurate and linear at concentrations ranging from 0.025 mg/mL to 0.075 mg/mL for total benzalkonium chloride, with recoveries ranging from 99% to 103% over the same concentration range. The validation results also confirmed the robustness of the method as predicted by Fusion AE®. Copyright © 2015 Elsevier B.V. All rights reserved.
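
    Two of the system-suitability quantities that drive optimisations like this can be sketched directly. The formulas are the standard USP-style definitions; the peak data for the C12 and C14 homologs are hypothetical, not values from the paper:

```python
# USP-style system-suitability metrics, with hypothetical peak data.
def resolution(t1, w1, t2, w2):
    """Rs = 2*(t2 - t1)/(w1 + w2); retention times and baseline widths in minutes."""
    return 2 * (t2 - t1) / (w1 + w2)

def tailing(w05, f):
    """Tailing factor T = W0.05/(2*f): peak width and front half-width at 5% height."""
    return w05 / (2 * f)

print(round(resolution(1.10, 0.10, 1.60, 0.10), 1))  # C12 vs C14, hypothetical
print(round(tailing(0.12, 0.05), 1))
```

    An Rs well above 1.5 indicates baseline separation of the two homologs, and a tailing factor near 1 indicates a symmetric peak; both are typical acceptance criteria in a QbD method's design space.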

  16. Enhanced Portfolio Performance Using a Momentum Approach to Annual Rebalancing

    Directory of Open Access Journals (Sweden)

    Michael D. Mattei

    2018-02-01

    Full Text Available After diversification, periodic portfolio rebalancing has become one of the most widely practiced methods for reducing portfolio risk and enhancing returns. Most of the rebalancing strategies found in the literature are regarded as contrarian approaches to rebalancing. A recent article proposed a rebalancing approach that incorporates momentum; it achieved a better risk-adjusted return than either the traditional approach or a buy-and-hold approach. This article identifies an improvement to the momentum approach and then examines the impact of transaction costs and taxes on the portfolio performance of four active rebalancing approaches.
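
    The contrast between the two rebalancing philosophies can be shown with a toy two-asset simulation. Everything here is hypothetical (made-up yearly returns, fixed 70/30 tilt), not the article's actual rules:

```python
# Toy comparison of contrarian vs momentum annual rebalancing on two assets.
# Hypothetical yearly returns; "contrarian" resets to equal weights each year,
# "momentum" tilts toward last year's winner. Not the article's actual rules.
def grow(returns_a, returns_b, style):
    wealth, w_a = 1.0, 0.5
    for ra, rb in zip(returns_a, returns_b):
        wealth *= w_a * (1 + ra) + (1 - w_a) * (1 + rb)
        if style == "contrarian":
            w_a = 0.5                        # rebalance back to equal weights
        else:
            w_a = 0.7 if ra > rb else 0.3    # momentum: overweight the winner
    return wealth

trend_a = [0.10, 0.10, 0.10, -0.05]          # asset A trends, then reverses
trend_b = [0.02, 0.02, 0.02, 0.08]
print(grow(trend_a, trend_b, "momentum") > grow(trend_a, trend_b, "contrarian"))
```

    In a trending market the momentum rule compounds the winner's gains; in a mean-reverting market the ordering flips, which is why transaction costs and taxes matter when comparing the approaches over real data.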

  17. Improved power performance assessment methods

    Energy Technology Data Exchange (ETDEWEB)

    Frandsen, S; Antoniou, I; Dahlberg, J A [and others]

    1999-03-01

    The uncertainty of presently used methods for retrospective assessment of the productive capacity of wind farms is unacceptably large. The possibilities of improving the accuracy have been investigated and are reported. A method is presented that includes an extended power curve and site calibration. In addition, blockage effects with respect to reference wind speed measurements are analysed. It is found that significant accuracy improvements are possible by introducing more input variables, such as turbulence and wind shear, in addition to mean wind speed and air density. Also, testing several or all machines in the wind farm, instead of only one or two, may provide a better estimate of the average performance. (au)

  18. Validation of an ultra-high-performance liquid chromatography-tandem mass spectrometry method to quantify illicit drug and pharmaceutical residues in wastewater using accuracy profile approach.

    Science.gov (United States)

    Hubert, Cécile; Roosen, Martin; Levi, Yves; Karolak, Sara

    2017-06-02

    The analysis of biomarkers in wastewater has become a common approach to assessing community behavior. It is an interesting way to estimate illicit drug consumption in a given population: using a back-calculation method, it is possible to quantify the amount of a specific drug used in a community and to assess the consumption variation at different times and locations. Such a method needs reliable analytical data, since the determination of a concentration in the ng L-1 range in a complex matrix is difficult and not easily reproducible. The best analytical method is liquid chromatography-mass spectrometry coupling after solid-phase extraction or on-line pre-concentration. Quality criteria are not specifically defined for this kind of determination. In this context, it was decided to develop an UHPLC-MS/MS method to analyze 10 illicit drugs and pharmaceuticals in wastewater treatment plant influent or effluent using an on-line pre-concentration system. A validation process was then carried out using the accuracy profile concept as an innovative tool to estimate the probability of getting prospective results within specified acceptance limits. Influent and effluent samples were spiked with known amounts of the 10 compounds and analyzed three times a day for three days in order to estimate intra-day and inter-day variations. The matrix effect was estimated for each compound. The developed method provides at least 80% of results within ±25% limits, except for compounds that are degraded in influent. Copyright © 2017 Elsevier B.V. All rights reserved.
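
    The recovery check underlying an accuracy profile can be sketched as a comparison of spiked against measured concentrations, flagged against the ±25% acceptance limits mentioned above. The spike levels below are hypothetical:

```python
# Sketch of the recovery check behind an accuracy profile: each measurement's
# recovery (measured/spiked) is flagged against acceptance limits (here +/-25%).
def within_limits(spiked, measured, tol=0.25):
    recoveries = [m / s for s, m in zip(spiked, measured)]
    return [abs(r - 1.0) <= tol for r in recoveries]

# Hypothetical ng/L-level spikes in a wastewater influent sample
print(within_limits([50.0, 100.0, 200.0], [44.0, 96.0, 260.0]))
```

    A full accuracy profile goes further, building a tolerance interval from the intra-day and inter-day variance components and checking that the whole interval, not just the point recovery, falls inside the limits.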

  19. A task-based parallelism and vectorized approach to 3D Method of Characteristics (MOC) reactor simulation for high performance computing architectures

    Science.gov (United States)

    Tramm, John R.; Gunow, Geoffrey; He, Tim; Smith, Kord S.; Forget, Benoit; Siegel, Andrew R.

    2016-05-01

    In this study we present and analyze a formulation of the 3D Method of Characteristics (MOC) technique applied to the simulation of full core nuclear reactors. Key features of the algorithm include a task-based parallelism model that allows independent MOC tracks to be assigned to threads dynamically, ensuring load balancing, and a wide vectorizable inner loop that takes advantage of modern SIMD computer architectures. The algorithm is implemented in a set of highly optimized proxy applications in order to investigate its performance characteristics on CPU, GPU, and Intel Xeon Phi architectures. Speed, power, and hardware cost efficiencies are compared. Additionally, performance bottlenecks are identified for each architecture in order to determine the prospects for continued scalability of the algorithm on next generation HPC architectures.
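
    The task-based pattern described above, where independent tracks are pulled dynamically by worker threads so that load balances itself, can be sketched as follows. The workload is a toy stand-in, not an actual transport sweep:

```python
# Sketch of task-based parallelism over independent MOC "tracks": worker threads
# pull track tasks dynamically, giving automatic load balancing. Toy workload.
from concurrent.futures import ThreadPoolExecutor

def sweep_track(track_id):
    # stand-in for attenuating the angular flux along one characteristic track
    return sum((track_id + 1) * 0.001 for _ in range(100))

with ThreadPoolExecutor(max_workers=4) as pool:
    partial_sums = list(pool.map(sweep_track, range(8)))  # dynamic task assignment
scalar_flux = sum(partial_sums)  # reduction over per-track contributions
print(round(scalar_flux, 1))
```

    In the real algorithm each task also carries a wide inner loop over energy groups that vectorizes with SIMD; in a production code the per-track work would differ enough that this dynamic assignment is what keeps all threads busy.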

  20. An applicable approach for performance auditing in ERP

    Directory of Open Access Journals (Sweden)

    Wan Jian Guo

    2016-01-01

    Full Text Available This paper addresses the practical problem of performance auditing in an ERP environment. Traditional performance auditing methods and existing approaches for evaluating ERP implementations do not work well, because they are either difficult to apply or contain subjective elements. This paper proposes an applicable performance auditing approach for SAP ERP based on quantitative analysis. The approach consists of three parts: system utilization, data quality and the effectiveness of system control. For each part, we provide the main process for conducting the operation, especially how to calculate the online settlement rate of the SAP system. This approach has played an important role in practical auditing work, and a practical case is provided at the end of this paper to describe its effectiveness. Implementation of this approach also has some significance for the performance auditing of other ERP products.

  1. Characterization of the Mechanical Stress-Strain Performance of Aerospace Alloy Materials Using Frequency-Domain Photoacoustic Ultrasound and Photothermal Methods: An FEM Approach

    Science.gov (United States)

    Huan, Huiting; Mandelis, Andreas; Liu, Lixian

    2018-04-01

    Determining and keeping track of a material's mechanical performance is very important for safety in the aerospace industry. The mechanical strength of alloy materials is precisely quantified in terms of its stress-strain relation. It has been proven that frequency-domain photothermoacoustic (FD-PTA) techniques are effective methods for characterizing the stress-strain relation of metallic alloys. PTA methodologies include photothermal (PT) diffusion and laser thermoelastic photoacoustic ultrasound (PAUS) generation which must be separately discussed because the relevant frequency ranges and signal detection principles are widely different. In this paper, a detailed theoretical analysis of the connection between thermoelastic parameters and stress/strain tensor is presented with respect to FD-PTA nondestructive testing. Based on the theoretical model, a finite element method (FEM) was further implemented to simulate the PT and PAUS signals at very different frequency ranges as an important analysis tool of experimental data. The change in the stress-strain relation has an impact on both thermal and elastic properties, verified by FEM and results/signals from both PT and PAUS experiments.

  2. Methodological approach to organizational performance improvement process

    OpenAIRE

    Buble, Marin; Dulčić, Želimir; Pavić, Ivan

    2017-01-01

    Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.

  3. Methodological approach to organizational performance improvement process

    Directory of Open Access Journals (Sweden)

    Marin Buble

    2001-01-01

    Full Text Available Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.

  4. Approach to performance based regulation development

    International Nuclear Information System (INIS)

    Spogen, L.R.; Cleland, L.L.

    1977-06-01

    An approach to the development of performance based regulations (PBRs) is described. Initially, a framework is constructed that consists of a function hierarchy and associated measures. The function at the top of the hierarchy is described in terms of societal objectives. Decomposition of this function into subordinate functions and their subsequent decompositions yield the function hierarchy. ''Bottom'' functions describe the roles of system components. When measures are identified for the performance of each function and means of aggregating performances to higher levels are established, the framework may be employed for developing PBRs. Consideration of system flexibility and performance uncertainty guides the determination of the hierarchical level at which regulations are formulated. Ease of testing compliance is also a factor. To show the viability of the approach, the framework developed by Lawrence Livermore Laboratory for the Nuclear Regulatory Commission for evaluation of material control systems at fixed facilities is presented

  5. Methodological approach to strategic performance optimization

    OpenAIRE

    Hell, Marko; Vidačić, Stjepan; Garača, Željko

    2009-01-01

    This paper presents a matrix approach to the measuring and optimization of organizational strategic performance. The proposed model is based on the matrix presentation of strategic performance, which follows the theoretical notions of the balanced scorecard (BSC) and strategy map methodologies, initially developed by Kaplan and Norton. Development of a quantitative record of strategic objectives provides an arena for the application of linear programming (LP), which is a mathematical tech...

  6. Method to perform radioimmunological analyses

    International Nuclear Information System (INIS)

    Friedel, R.

    1976-01-01

    The invention concerns a method for the radioimmunological detection of antigens. According to the invention, antibodies are adsorbed onto water-insoluble high-polymeric compounds on the inner surfaces of a capillary device; a labelled antigen is then added and, following incubation, suction removal of the test mixture and washing of the coated surfaces, the latter are measured for radioactivity. (VJ) [de

  7. Performance improvement integration: a whole systems approach.

    Science.gov (United States)

    Page, C K

    1999-02-01

    Performance improvement integration in health care organizations is a challenge for health care leaders. Required for accreditation by the Joint Commission on Accreditation of Healthcare Organizations (Joint Commission), performance improvement (PI) can be designed as a sustainable model for performance to survive in a turbulent period. Central Baptist Hospital developed a model for PI that focused on strategy established by the leadership team, delineated responsibility through the organizational structure of shared governance, and accountability for outcomes evidenced through the organization's profitability. Such an approach integrated into the culture of the organization can produce positive financial margins, positive customer satisfaction, and commendations from the Joint Commission.

  8. Intelligent automation of high-performance liquid chromatography method development by means of a real-time knowledge-based approach.

    Science.gov (United States)

    I, Ting-Po; Smith, Randy; Guhan, Sam; Taksen, Ken; Vavra, Mark; Myers, Douglas; Hearn, Milton T W

    2002-09-27

    We describe the development, attributes and capabilities of a novel type of artificial intelligence system, called LabExpert, for automation of HPLC method development. Unlike other computerised method development systems, LabExpert operates in real-time, using an artificial intelligence system and design engine to provide experimental decision outcomes relevant to the optimisation of complex separations as well as the control of the instrumentation, column selection, mobile phase choice and other experimental parameters. LabExpert manages every input parameter to a HPLC data station and evaluates each output parameter of the HPLC data station in real-time as part of its decision process. Based on a combination of inherent and user-defined evaluation criteria, the artificial intelligence system programs use a reasoning process, applying chromatographic principles and acquired experimental observations to iteratively provide a regime for a priori development of an acceptable HPLC separation method. Because remote monitoring and control are also functions of LabExpert, the system allows full-time utilisation of analytical instrumentation and associated laboratory resources. Based on our experience with LabExpert with a wide range of analyte mixtures, this artificial intelligence system consistently identified in a similar or faster time-frame preferred sets of analytical conditions that are equal in resolution, efficiency and throughput to those empirically determined by highly experienced chromatographic scientists. An illustrative example, demonstrating the potential of LabExpert in the process of method development of drug substances, is provided.

  9. A neuroanatomical approach to exploring organizational performance

    Directory of Open Access Journals (Sweden)

    Gillingwater, D.

    2009-01-01

    Full Text Available Insights gained from studying the human brain have begun to open up promising new areas of research in the behavioural and social sciences. Neuroscience-based principles have been incorporated into areas such as business management, economics and marketing, leading to the development of artificial neural networks, neuroeconomics, neuromarketing and, most recently, organizational cognitive neuroscience. Similarly, the brain has been used as a powerful metaphor for thinking about and analysing the nature of organizations. However, no existing approach to organizational analysis has taken advantage of contemporary neuroanatomical principles, thereby missing the opportunity to translate core neuroanatomical knowledge into other, non-related areas of research. In this essentially conceptual paper, we propose several ways in which neuroanatomical approaches could be used to enhance organizational theory, practice and research. We suggest that truly interdisciplinary and collaborative research between neuroanatomists and organizational analysts is likely to provide novel approaches to exploring and improving organizational performance.

  10. Software performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2009-01-01

    Praise from the Reviewers: "The practicality of the subject in a real-world situation distinguishes this book from others available on the market."—Professor Behrouz Far, University of Calgary. "This book could replace the computer organization texts now in use that every CS and CpE student must take. . . . It is much needed, well written, and thoughtful."—Professor Larry Bernstein, Stevens Institute of Technology. A distinctive, educational text on software performance and scalability. This is the first book to take a quantitative approach to the subject of software performance and scalability

  11. Performance Optimization in Sport: A Psychophysiological Approach

    Directory of Open Access Journals (Sweden)

    Selenia di Fronso

    2017-11-01

    Full Text Available In the last 20 years, there has been growing interest in the study of theoretical and applied issues surrounding the psychophysiological processes underlying performance. Psychophysiological monitoring, which enables the study of these processes, consists of assessing the activation and functioning level of the organism using a multidimensional approach. In sport, it can be used to attain a better understanding of the processes underlying athletic performance and to improve it. The most frequently used ecological techniques include electromyography (EMG), electrocardiography (ECG), electroencephalography (EEG), and the assessment of electrodermal activity and breathing rhythm. The purpose of this paper is to offer an overview of the use of these techniques in applied interventions in sport and physical exercise and to give athletes, coaches and sport psychology experts new insights for performance improvement.

  12. Maintenance Approaches for Different Production Methods

    Directory of Open Access Journals (Sweden)

    Mungani, Dzivhuluwani Simon

    2013-11-01

    Full Text Available Various production methods are used in industry to manufacture or produce a variety of products needed by industry and consumers. The nature of a product determines which production method is most suitable or cost-effective. A continuous process is typically used to produce large volumes of liquids or gases, while batch processing is often used for small volumes, such as pharmaceutical products. This paper discusses a research project to determine the relationship between maintenance approaches and production methods. A survey was done to determine to what extent three maintenance approaches, reliability-centred maintenance (RCM), total productive maintenance (TPM), and business-centred maintenance (BCM), are used for three different processing methods (continuous process, batch process, and production line).

  13. Socratic Method as an Approach to Teaching

    Directory of Open Access Journals (Sweden)

    Haris Delić

    2016-10-01

    Full Text Available In this article we present a theoretical view of Socrates' life and his method of teaching. After the biographical facts of Socrates' life, we explain the method he used in teaching and its two main types, the Classic and the Modern Socratic Method. Since the core of Socrates' approach is dialogue as a form of teaching, we explain how exactly a Socratic dialogue proceeds. Besides that, we present two examples of dialogues that Socrates led, Meno and Gorgias. The Socratic circle, a form of seminar that is crucial for group discussion of a given theme, is also presented. At the end, some disadvantages of the method are explained. With this paper, the reader can get a conception of this approach to teaching and can use Socrates as an example of how a successful teacher leads his students towards a goal.

  14. A Production Approach to Performance of Banks with Microfinance Operations

    OpenAIRE

    Emilyn Cabanda; Eleanor C. Domingo

    2014-01-01

    Banking institutions nowadays serve as intermediaries of funds to a variety of clients, including micro-entrepreneurs. This study analyzes and measures the performance of rural and thrift banks with microfinance operations in the Philippines, using combined measures of data envelopment analysis and traditional financial performance indicators. The data envelopment analysis (DEA) method is employed to measure the productive efficiency of these banks under the production approach. The variable...

  15. Differentiating Performance Approach Goals and Their Unique Effects

    Science.gov (United States)

    Edwards, Ordene V.

    2014-01-01

    The study differentiates between two types of performance approach goals (competence demonstration performance approach goal and normative performance approach goal) by examining their unique effects on self-efficacy, interest, and fear of failure. Seventy-nine students completed questionnaires that measure performance approach goals,…

  16. Permutation statistical methods an integrated approach

    CERN Document Server

    Berry, Kenneth J; Johnston, Janis E

    2016-01-01

    This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
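As a minimal sketch of the permutation idea this monograph develops, a two-sample permutation test on the difference of means can be written in a few lines of plain Python, with no normality or homogeneity-of-variance assumption; the sample data are invented.

```python
import random

def _mean(xs):
    return sum(xs) / len(xs)

def permutation_test(sample_a, sample_b, n_permutations=10000, seed=42):
    """Two-sample permutation test on the absolute difference of means.

    The null distribution is built by repeatedly shuffling the pooled
    observations and re-splitting them into groups of the original sizes,
    so the test depends only on the data at hand.
    """
    rng = random.Random(seed)
    observed = abs(_mean(sample_a) - _mean(sample_b))
    pooled = list(sample_a) + list(sample_b)
    n_a = len(sample_a)
    exceed = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)
        if abs(_mean(pooled[:n_a]) - _mean(pooled[n_a:])) >= observed:
            exceed += 1
    return (exceed + 1) / (n_permutations + 1)  # add-one smoothing

# Well-separated groups yield a small p-value; overlapping groups do not.
p_separated = permutation_test([1.0, 1.2, 0.9, 1.1, 1.05],
                               [2.0, 2.1, 1.9, 2.2, 2.05])
p_similar = permutation_test([1.0, 1.2, 0.9, 1.1],
                             [1.05, 1.15, 0.95, 1.1])
```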

  17. Proposed modifications of Environmental Protection Agency Method 1601 for detection of coliphages in drinking water, with same-day fluorescence-based detection and evaluation by the performance-based measurement system and alternative test protocol validation approaches.

    Science.gov (United States)

    Salter, Robert S; Durbin, Gregory W; Conklin, Ernestine; Rosen, Jeff; Clancy, Jennifer

    2010-12-01

    Coliphages are microbial indicators specified in the Ground Water Rule that can be used to monitor for potential fecal contamination of drinking water. The Total Coliform Rule specifies coliform and Escherichia coli indicators for municipal water quality testing; thus, coliphage indicator use is less common and advances in detection methodology are less frequent. Coliphages are viral structures and, compared to bacterial indicators, are more resistant to disinfection and diffuse further distances from pollution sources. Therefore, coliphage presence may serve as a better predictor of groundwater quality. This study describes Fast Phage, a 16- to 24-h presence/absence modification of U.S. Environmental Protection Agency (EPA) Method 1601 for detection of coliphages in 100 ml water. The objective of the study is to demonstrate that the somatic and male-specific coliphage modifications provide results equivalent to those of Method 1601. Five laboratories compared the modifications, featuring same-day fluorescence-based prediction, to Method 1601 by using the performance-based measurement system (PBMS) criterion. This requires a minimum 50% positive response in 10 replicates of 100-ml water samples at coliphage contamination levels of 1.3 to 1.5 PFU/100 ml. The laboratories showed that Fast Phage meets PBMS criteria with 83.5 to 92.1% correlation of the same-day rapid fluorescence-based prediction with the next-day result. Somatic coliphage PBMS data are compared to manufacturer development data that followed the EPA alternative test protocol (ATP) validation approach. Statistical analysis of the data sets indicates that PBMS utilizes fewer samples than does the ATP approach but with similar conclusions. Results support testing the coliphage modifications by using an EPA-approved national PBMS approach with collaboratively shared samples.

  18. Statistical learning methods: Basics, control and performance

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de

    2006-04-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms.

  19. Statistical learning methods: Basics, control and performance

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2006-01-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms

  20. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

    Full Text Available This work presents an efficient solution using a computer algebra system to perform linear temporal properties verification for synchronous digital systems. The method is essentially based on both Groebner bases approaches and symbolic simulation. A mechanism for constructing canonical polynomial-set-based symbolic representations for both circuit descriptions and assertions is studied. We then present a complete checking algorithm framework based on these algebraic representations by using Groebner bases. The computational results in this work show that the algebraic approach is a quite competitive checking method and will be a useful supplement to existing verification methods based on simulation.
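The core algebraic step, deciding whether an assertion polynomial lies in the ideal generated by the circuit constraints, can be sketched with SymPy's Groebner-basis routines (assumed available); the constraint polynomials below are a toy example, not the paper's actual encoding.

```python
from sympy import symbols, groebner, reduced

x, y = symbols('x y')

# Toy "circuit description": a signal x satisfying x**2 - 1 == 0 and a
# mirrored signal y == x, encoded as polynomial constraints (an ideal).
constraints = [x**2 - 1, y - x]
G = groebner(constraints, x, y, order='lex')

# "Assertion" to check: y**2 - 1 == 0 in every state satisfying the
# constraints. It holds iff the polynomial reduces to 0 modulo the basis.
_, rem_ok = reduced(y**2 - 1, list(G.exprs), x, y, order='lex')
holds = (rem_ok == 0)

# A property that does NOT follow from the constraints leaves a nonzero
# remainder, signalling a failed assertion.
_, rem_bad = reduced(y - 1, list(G.exprs), x, y, order='lex')
fails = (rem_bad != 0)
```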

  1. Gradient High Performance Liquid Chromatography Method ...

    African Journals Online (AJOL)

    Purpose: To develop a gradient high performance liquid chromatography (HPLC) method for the simultaneous determination of phenylephrine (PHE) and ibuprofen (IBU) in solid ... nimesulide, phenylephrine hydrochloride, chlorpheniramine maleate and caffeine anhydrous in pharmaceutical dosage form. Acta Pol.

  2. Approaching direct optimization of as-built lens performance

    Science.gov (United States)

    McGuire, James P.; Kuper, Thomas G.

    2012-10-01

    We describe a method approaching direct optimization of the rms wavefront error of a lens, including tolerances. By including the effect of tolerances in the error function, the designer can choose to improve the as-built performance with a fixed set of tolerances and/or reduce the cost of production lenses with looser tolerances. The method relies on the speed of differential tolerance analysis and has recently become practical due to the combination of continuing increases in computer hardware speed and multi-core processing. We illustrate the method's use on a Cooke triplet, a double Gauss, and two plastic mobile phone camera lenses.

  3. Validated High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

    Purpose: To develop a simple, rapid and sensitive high performance liquid chromatography (HPLC) method for the determination of cefadroxil monohydrate in human plasma. Methods: Schimadzu HPLC with LC solution software was used with Waters Spherisorb, C18 (5 μm, 150mm × 4.5mm) column. The mobile phase ...

  4. Comparing the performance of biomedical clustering methods

    DEFF Research Database (Denmark)

    Wiwie, Christian; Baumbach, Jan; Röttger, Richard

    2015-01-01

    Identifying groups of similar objects is a popular first step in biomedical data analysis, but it is error-prone and impossible to perform manually. Many computational methods have been developed to tackle this problem. Here we assessed 13 well-known methods using 24 data sets ranging from gene expression to protein domains. Performance was judged on the basis of 13 common cluster validity indices. We developed a clustering analysis platform, ClustEval (http://clusteval.mpi-inf.mpg.de), to promote streamlined evaluation, comparison and reproducibility of clustering results in the future. This allowed us to objectively evaluate the performance of all tools on all data sets with up to 1,000 different parameter sets each, resulting in a total of more than 4 million calculated cluster validity indices. We observed that there was no universal best performer, but on the basis of this wide...

  5. Methods of Evaluating Performances for Marketing Strategies

    OpenAIRE

    Ioan Cucu

    2005-01-01

    There are specific methods for assessing and improving the effectiveness of a marketing strategy. A marketer should state in the marketing plan what a marketing strategy is supposed to accomplish. These statements should set forth performance standards, which usually are stated in terms of profits, sales, or costs. Actual performance must be measured in similar terms so that comparisons are possible. This paper describes sales analysis and cost analysis, two general ways of evaluating the act...

  6. A Simulation Approach for Performance Validation during Embedded Systems Design

    Science.gov (United States)

    Wang, Zhonglei; Haberl, Wolfgang; Herkersdorf, Andreas; Wechs, Martin

    Due to the time-to-market pressure, it is highly desirable to design hardware and software of embedded systems in parallel. However, hardware and software are developed mostly using very different methods, so that performance evaluation and validation of the whole system is not an easy task. In this paper, we propose a simulation approach to bridge the gap between model-driven software development and simulation based hardware design, by merging hardware and software models into a SystemC based simulation environment. An automated procedure has been established to generate software simulation models from formal models, while the hardware design is originally modeled in SystemC. As the simulation models are annotated with timing information, performance issues are tackled in the same pass as system functionality, rather than in a dedicated approach.

  7. Personality, Assessment Methods and Academic Performance

    Science.gov (United States)

    Furnham, Adrian; Nuygards, Sarah; Chamorro-Premuzic, Tomas

    2013-01-01

    This study examines the relationship between personality and two different academic performance (AP) assessment methods, namely exams and coursework. It aimed to examine whether the relationship between traits and AP was consistent across self-reported versus documented exam results, two different assessment techniques and across different…

  8. Validated high performance liquid chromatographic (HPLC) method ...

    African Journals Online (AJOL)

    STORAGESEVER

    2010-02-22

    Feb 22, 2010 ... specific and accurate high performance liquid chromatographic method for determination of ZER in micro-volumes ... tional medicine as a cure for swelling, sores, loss of appetite and ... Receptor Activator for Nuclear Factor κB Ligand ... The effect of ... be suitable for preclinical pharmacokinetic studies.

  9. Business Intelligence Approach In A Business Performance Context

    OpenAIRE

    Muntean, Mihaela; Cabau, Liviu Gabriel

    2011-01-01

    Subordinated to performance management, Business Intelligence approaches help firms optimize business performance. Key performance indicators are added to the multidimensional model grounding the performance perspectives. With respect to the Business Intelligence value chain, a theoretical approach is introduced and a practical example for the customer perspective, based on Microsoft SQL Server specific services, is implemented.

  10. Accounting Student's Learning Approaches And Impact On Academic Performance

    OpenAIRE

    Ismail, Suhaiza

    2009-01-01

    The objective of the study is threefold. Firstly, the study explores the learning approaches adopted by students in completing their Business Finance course. Secondly, it examines the impact that learning approaches have on students' academic performance. Finally, the study considers gender differences in the learning approaches adopted by students and in the relationship between learning approaches and academic performance. The Approaches and Study Skills Inventory for Students (ASSIST) was used...

  11. Using hybrid method to evaluate the green performance in uncertainty.

    Science.gov (United States)

    Tseng, Ming-Lang; Lan, Lawrence W; Wang, Ray; Chiu, Anthony; Cheng, Hui-Ping

    2011-04-01

    Green performance measurement is vital for enterprises making continuous improvements to maintain sustainable competitive advantages. Evaluation of green performance, however, is a challenging task due to the complexity of dependences among the aspects and criteria and the linguistic vagueness of some qualitative information mixed with quantitative data. To deal with this issue, this study proposes a novel approach to evaluating the interdependent aspects and criteria of a firm's green performance. The rationale of the proposed approach, namely the green network balanced scorecard, is to use the balanced scorecard to combine fuzzy set theory with the analytical network process (ANP) and importance-performance analysis (IPA), wherein fuzzy set theory accounts for the linguistic vagueness of qualitative criteria and ANP converts the relations among the interdependent aspects and criteria into an intelligible structural model used in IPA. For the empirical case study, four interdependent aspects and 34 green performance criteria for PCB firms in Taiwan were evaluated. The managerial implications are discussed.
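A minimal sketch of the fuzzy step, assuming a generic triangular-number scale rather than the paper's actual questionnaire: linguistic ratings are averaged as triangular fuzzy numbers, defuzzified by the centroid, and placed in the classic IPA grid.

```python
# Illustrative triangular fuzzy numbers (a, b, c) for linguistic ratings.
FUZZY_SCALE = {
    'very low':  (0.0, 0.0, 0.25),
    'low':       (0.0, 0.25, 0.5),
    'medium':    (0.25, 0.5, 0.75),
    'high':      (0.5, 0.75, 1.0),
    'very high': (0.75, 1.0, 1.0),
}

def aggregate(ratings):
    """Average the experts' triangular fuzzy numbers component-wise."""
    triples = [FUZZY_SCALE[r] for r in ratings]
    n = len(triples)
    return tuple(sum(t[i] for t in triples) / n for i in range(3))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number: (a + b + c) / 3."""
    return sum(tfn) / 3

def ipa_quadrant(importance, performance, threshold=0.5):
    """Place a criterion in the classic importance-performance grid."""
    if importance >= threshold and performance < threshold:
        return 'concentrate here'
    if importance >= threshold and performance >= threshold:
        return 'keep up the good work'
    if importance < threshold and performance < threshold:
        return 'low priority'
    return 'possible overkill'

# A hypothetical green criterion rated important but performing poorly.
imp = defuzzify(aggregate(['high', 'very high', 'high']))
perf = defuzzify(aggregate(['low', 'medium', 'low']))
quadrant = ipa_quadrant(imp, perf)
```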

  12. A multiparameter chaos control method based on OGY approach

    International Nuclear Information System (INIS)

    Souza de Paula, Aline; Amorim Savi, Marcelo

    2009-01-01

    Chaos control is based on the richness of responses of chaotic behavior and may be understood as the use of tiny perturbations for the stabilization of an unstable periodic orbit (UPO) embedded in a chaotic attractor. Since one of these UPOs can provide better performance than others in a particular situation, chaos control can make this kind of behavior desirable in a variety of applications. The OGY method is a discrete technique that applies small perturbations in the neighborhood of the desired orbit whenever the trajectory crosses a specific surface, such as a Poincaré section. This contribution proposes a multiparameter semi-continuous method based on the OGY approach in order to control chaotic behavior. Two different approaches are possible with this method: a coupled approach, where all control parameters influence system dynamics even when they are not active; and an uncoupled approach, a particular case where control parameters return to their reference values when they become passive. As an application of the general formulation, a two-parameter actuation of a nonlinear pendulum is investigated employing both the coupled and uncoupled approaches. Analyses are carried out considering signals generated by numerical integration of the mathematical model using experimentally identified parameters. Results show that the procedure can be a good alternative for chaos control, since it provides more effective UPO stabilization than the classical single-parameter approach.
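The OGY idea, tiny parameter perturbations applied only near the target orbit, can be sketched in its classical single-parameter form on the logistic map (a simple stand-in for the paper's pendulum; all numbers are illustrative).

```python
def logistic(x, r):
    """One step of the logistic map f(x) = r * x * (1 - x)."""
    return r * x * (1.0 - x)

def ogy_control(r0=3.8, x0=0.3, n_steps=600, max_delta=0.02):
    """Single-parameter OGY-style stabilization of the logistic map's
    nontrivial fixed point x* = 1 - 1/r0 (illustrative toy system).

    Linearizing about (x*, r0): e_{n+1} ~ f_x * e_n + f_r * dr, with
    f_x = 2 - r0 and f_r = x*(1 - x*). Choosing dr = -f_x * e_n / f_r
    cancels the deviation to first order; the perturbation is applied
    only while it stays within the small admissible range max_delta,
    i.e. only when the trajectory is already near the target orbit.
    """
    x_star = 1.0 - 1.0 / r0
    f_x = 2.0 - r0                  # df/dx at (x*, r0)
    f_r = x_star * (1.0 - x_star)   # df/dr at (x*, r0)
    x = x0
    for _ in range(n_steps):
        delta = -f_x * (x - x_star) / f_r
        if abs(delta) > max_delta:  # far from the orbit: control stays off
            delta = 0.0
        x = logistic(x, r0 + delta)
    return x, x_star

# Start inside the capture region to show the stabilization itself; from a
# generic chaotic start the trajectory must first wander near x* (ergodicity).
x_final, x_star = ogy_control(x0=0.7388, n_steps=200)
```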

  13. A hybrid approach for efficient anomaly detection using metaheuristic methods

    Directory of Open Access Journals (Sweden)

    Tamer F. Ghanem

    2015-07-01

    Full Text Available Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated by a multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative selection-based detector generation. The evaluation of this approach is performed using the NSL-KDD dataset, a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors, with an accuracy of 96.1% compared to competing machine learning algorithms.
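The negative-selection inspiration can be sketched as follows: random candidate detectors survive only if they match no 'self' (normal) sample, and anything a surviving detector later covers is flagged as anomalous. The feature space, radius and data below are invented; the paper's actual generator uses multi-start metaheuristics and genetic algorithms on NSL-KDD features.

```python
import random

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def generate_detectors(self_samples, n_detectors=50, radius=0.1,
                       dim=2, seed=7, max_tries=10000):
    """Negative selection: keep random candidates lying farther than
    `radius` from every self sample."""
    rng = random.Random(seed)
    detectors = []
    tries = 0
    while len(detectors) < n_detectors and tries < max_tries:
        tries += 1
        candidate = [rng.random() for _ in range(dim)]
        if all(euclidean(candidate, s) > radius for s in self_samples):
            detectors.append(candidate)
    return detectors

def is_anomalous(point, detectors, radius=0.1):
    """A point covered by any surviving detector is flagged as anomalous."""
    return any(euclidean(point, d) <= radius for d in detectors)

# Invented 'normal traffic' clustered in one corner of the unit square.
rng = random.Random(0)
normal = [[rng.uniform(0.0, 0.2), rng.uniform(0.0, 0.2)] for _ in range(30)]
detectors = generate_detectors(normal)
# By construction no self sample is ever flagged; points far from the
# self region have a good (but not guaranteed) chance of being covered.
```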

  14. Performance analysis, quality function deployment and structured methods

    Science.gov (United States)

    Maier, M. W.

    Quality function deployment (QFD), an approach to synthesizing several elements of system modeling and design into a single unit, is presented. Behavioral, physical, and performance modeling are usually considered separate aspects of system design without explicit linkages. Structured methodologies have developed linkages between behavioral and physical models before, but have not considered the integration of performance models. QFD integrates performance models with traditional structured models. In this method, performance requirements such as cost, weight, and detection range are partitioned into matrices. Partitioning is done by developing a performance model, preferably quantitative, for each requirement. The parameters of the model become the engineering objectives in a QFD analysis, and the models are embedded in a spreadsheet version of the traditional QFD matrices. The performance model and its parameters are used to derive part of the functional model by recognizing that a given performance model implies some structure to the functionality of the system.
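The matrix partitioning can be sketched as the usual QFD weighting: each engineering objective's technical importance is the sum of requirement weight times relationship strength. The requirements, objectives and strengths below are hypothetical.

```python
# Hypothetical customer requirements with importance weights (1-5).
requirements = {'long detection range': 5, 'low cost': 3, 'low weight': 4}

# Relationship matrix: strength (0, 1, 3, 9 in the common QFD convention)
# linking each requirement to each engineering objective.
relationships = {
    'long detection range': {'antenna aperture': 9, 'transmit power': 9, 'mass budget': 1},
    'low cost':             {'antenna aperture': 3, 'transmit power': 3, 'mass budget': 3},
    'low weight':           {'antenna aperture': 3, 'transmit power': 1, 'mass budget': 9},
}

def objective_priorities(reqs, rels):
    """Technical importance = sum over requirements of weight * strength."""
    scores = {}
    for req, weight in reqs.items():
        for objective, strength in rels[req].items():
            scores[objective] = scores.get(objective, 0) + weight * strength
    return scores

priorities = objective_priorities(requirements, relationships)
# antenna aperture: 5*9 + 3*3 + 4*3 = 66
# transmit power:   5*9 + 3*3 + 4*1 = 58
# mass budget:      5*1 + 3*3 + 4*9 = 50
```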

  15. Approaching Sentient Building Performance Simulation Systems

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer; Perkov, Thomas; Heller, Alfred

    2014-01-01

    Sentient BPS systems can combine one or more high-precision BPS and provide near-instantaneous performance feedback directly in the design tool, thus providing speed and precision of building performance in the early design stages. Sentient BPS systems essentially combine: 1) design tools, 2) parametric tools, 3) BPS tools, 4) dynamic databases, 5) interpolation techniques and 6) prediction techniques into a fast and valid simulation system for the early design stage.

  16. A statistical approach to nuclear fuel design and performance

    Science.gov (United States)

    Cunning, Travis Andrew

    As CANDU fuel failures can have significant economic and operational consequences for the Canadian nuclear power industry, it is essential that factors impacting fuel performance are adequately understood. Current industrial practice relies on deterministic safety analysis and the highly conservative "limit of operating envelope" approach, where all parameters are assumed to be at their limits simultaneously. This results in a conservative prediction of event consequences, with little consideration given to the high quality and precision of current manufacturing processes. This study employs a novel approach to the prediction of CANDU fuel reliability. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to form input for two industry-standard fuel performance codes: ELESTRES for the steady-state case and ELOCA for the transient case, a hypothesized 80% reactor outlet header break loss of coolant accident. Using a Monte Carlo technique for input generation, 10^5 independent trials are conducted and probability distributions are fitted to key model output quantities. Comparing model output against recognized industrial acceptance criteria, no fuel failures are predicted for either case. Output distributions are well removed from failure limit values, implying that margin exists in current fuel manufacturing and design. To validate the results and attempt to reduce the simulation burden of the methodology, two dimensional-reduction methods are assessed. Using just 36 trials, both methods are able to produce output distributions that agree strongly with those obtained via the brute-force Monte Carlo method, often to a relative discrepancy of less than 0.3% when predicting the first statistical moment, and a relative discrepancy of less than 5% when predicting the second statistical moment. In terms of global sensitivity, pellet density proves to have the greatest impact on fuel performance.
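The Monte Carlo input-generation step can be sketched as below. The distributions and the linear 'performance model' are illustrative stand-ins (ELESTRES/ELOCA are proprietary codes, and the real manufacturing parameters are not reproduced here); the point is that fitted input distributions propagate to output moments.

```python
import random
import statistics

def run_trials(n_trials, seed=1):
    """Monte Carlo sketch: sample manufacturing inputs from fitted normal
    distributions and propagate them through a toy performance model."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_trials):
        density = rng.gauss(10.6, 0.05)    # pellet density, g/cm^3 (illustrative)
        diameter = rng.gauss(12.2, 0.01)   # pellet diameter, mm (illustrative)
        # Toy linear response: rises with density, falls with diameter.
        outputs.append(3.0 * density - 1.5 * diameter + rng.gauss(0.0, 0.02))
    return statistics.mean(outputs), statistics.stdev(outputs)

mean_out, std_out = run_trials(100_000)
# For this linear toy model the moments are known analytically:
# mean = 3*10.6 - 1.5*12.2 = 13.5
# std  = sqrt((3*0.05)**2 + (1.5*0.01)**2 + 0.02**2) ~ 0.152
```

For a linear model the sampled moments can be checked against the analytic ones; for a code like ELESTRES, the Monte Carlo estimate is the only option, which is what motivates the dimension-reduction comparison in the study.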

  17. Human performance assessment: methods and measures

    International Nuclear Information System (INIS)

    Andresen, Gisle; Droeivoldsmo, Asgeir

    2000-10-01

    The Human Error Analysis Project (HEAP) was initiated in 1994. The aim of the project was to acquire insights on how and why cognitive errors occur when operators are engaged in problem solving in advanced integrated control rooms. Since human error had not been studied in the HAlden Man-Machine LABoratory (HAMMLAB) before, it was also necessary to carry out research in methodology. In retrospect, it is clear that much of the methodological work is relevant to human-machine research in general, and not only to research on human error. The purpose of this report is, therefore, to give practitioners and researchers an overview of the methodological parts of HEAP. The scope of the report is limited to methods used throughout the data acquisition process, i.e., data-collection methods, data-refinement methods, and measurement methods. The data-collection methods include various types of verbal protocols, simulator logs, questionnaires, and interviews. Data-refinement methods involve different applications of the Eyecon system, a flexible data-refinement tool, and small computer programs used for rearranging, reformatting, and aggregating raw data. Measurement methods involve assessment of diagnostic behaviour, erroneous actions, complexity, task/system performance, situation awareness, and workload. The report concludes that the data-collection methods are generally both reliable and efficient. The data-refinement methods, however, should be easier to use in order to facilitate explorative analyses. Although the series of experiments provided an opportunity for measurement validation, there are still uncertainties connected to several measures, due to their reliability still being unknown. (Author). 58 refs., 7 tabs.

  18. Efficacy of a Template Creation Approach for Performance Improvement

    Science.gov (United States)

    Lyons, Paul R.

    2011-01-01

    This article presents the training and performance improvement approach, performance templates (P-T), and provides empirical evidence to support the efficacy of P-T. This approach involves a partnership among managers, trainers, and employees in the creation, use, and improvement of guides to affect the performance of critical tasks in the…

  19. Fuzzy Logic Approach to Diagnosis of Feedwater Heater Performance Degradation

    International Nuclear Information System (INIS)

    Kang, Yeon Kwan; Kim, Hyeon Min; Heo, Gyun Young; Sang, Seok Yoon

    2014-01-01

    Since failure in, damage to, and performance degradation of power generation components operating under the harsh environment of high pressure and high temperature may cause both economic and human loss at power plants, highly reliable operation and control of these components are necessary. Therefore, a systematic method of diagnosing the condition of these components in the early stages is required. There has been much research related to the diagnosis of these components, but our group developed an approach using a regression model and a diagnosis table, specializing in diagnosis of the thermal efficiency degradation of a power plant. However, there was difficulty in applying the regression-model method to power plants with different operating conditions, because the model was sensitive to the input values. In the case of the method using a diagnosis table, it was difficult to determine the level at which each performance degradation factor affected the components. Therefore, fuzzy logic was introduced in order to diagnose performance degradation using both qualitative and quantitative results obtained from the components' operation data. The model makes a performance degradation assessment using various performance degradation variables according to an input rule constructed on fuzzy logic. The purpose of the model is to help the operator diagnose performance degradation of power plant components. This paper presents an analysis of a power plant feedwater heater using fuzzy logic. The feedwater heater is one of the core components that determine the life cycle of a power plant, and its performance degradation has a direct effect on power generation efficiency. Performance degradation of a feedwater heater is not easy to observe; on the other hand, troubles such as tube leakage may bring simultaneous damage to the tube bundle, and it is therefore an object of concern from an economic standpoint. This study explains the process of diagnosing and verifying typical

  20. Fuzzy Logic Approach to Diagnosis of Feedwater Heater Performance Degradation

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Yeon Kwan; Kim, Hyeon Min; Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of); Sang, Seok Yoon [Engineering and Technical Center, Korea Hydro, Daejeon (Korea, Republic of)

    2014-08-15

    Since failure in, damage to, and performance degradation of power generation components operating under the harsh environment of high pressure and high temperature may cause both economic and human loss at power plants, highly reliable operation and control of these components are necessary. Therefore, a systematic method of diagnosing the condition of these components in the early stages is required. There has been much research related to the diagnosis of these components, but our group developed an approach using a regression model and a diagnosis table, specializing in diagnosis of the thermal efficiency degradation of a power plant. However, there was difficulty in applying the regression-model method to power plants with different operating conditions, because the model was sensitive to the input values. In the case of the method using a diagnosis table, it was difficult to determine the level at which each performance degradation factor affected the components. Therefore, fuzzy logic was introduced in order to diagnose performance degradation using both qualitative and quantitative results obtained from the components' operation data. The model makes a performance degradation assessment using various performance degradation variables according to an input rule constructed on fuzzy logic. The purpose of the model is to help the operator diagnose performance degradation of power plant components. This paper presents an analysis of a power plant feedwater heater using fuzzy logic. The feedwater heater is one of the core components that determine the life cycle of a power plant, and its performance degradation has a direct effect on power generation efficiency. Performance degradation of a feedwater heater is not easy to observe; on the other hand, troubles such as tube leakage may bring simultaneous damage to the tube bundle, and it is therefore an object of concern from an economic standpoint. This study explains the process of diagnosing and verifying typical

  1. A new approach in performing microdiffraction analysis

    International Nuclear Information System (INIS)

    Winter, D.J.; Squires, B.A.

    1995-01-01

    Microdiffraction is defined as x-ray diffraction analysis performed on small samples or on microdiffraction (MD) areas of large samples. Since smallness is a relative term, microdiffraction is considered the technique of choice when samples are too small for the optics and precision of conventional instrumentation. The limit on the size of the sample is dependent upon the accuracy of the instrumentation, which is measured by such variables as the diameter of the incident beam and the sphere of confusion of the goniometer (accuracy of the circle centers). If the sample area of interest is part of a multiphase material, it is necessary for the diameter of the incident x-ray beam to be smaller than the sample area in order to assure that the diffraction pattern produced is from the sample area of interest only. Today, microdiffraction is being performed on samples as small as a few microns in diameter. Common applications for microdiffraction include composite materials such as wafers and pads used in the semiconductor industry, inclusions on laser disks, and forensic studies. The analysis is often complicated by the fact that the sample areas can be a few grains or even a single crystal. Conventional powder diffractometers are very well suited for analyzing large volumes of polycrystalline material; however, they require much longer counting times when the sample volume is very small. Ideally, what is needed is the optics of a single crystal diffractometer with the performance of a conventional powder diffractometer. 6 figs.

  2. EVALUATION OF WOOD PERFORMANCE IN BUILDING CONSTRUCTION THROUGH SYSTEM APPROACH

    Directory of Open Access Journals (Sweden)

    Ricardo Pedreschi

    2005-09-01

    Full Text Available Building construction is considered the leading market for the wood industry in developed and developing countries. The greatest amount of wood produced in Brazil is consumed as firewood and energy, followed by the production of cellulose, and third by machined wood. The use of wood from planted forests can be increased. This would lead to better use of natural resources, and consequently to increased sustainability of forest activity in many regions of the country. The performance of wood can be observed from many different perspectives: symbolic performance, technical performance and economic performance, examined here by the method of systems approach to architecture. Usages of wood related to the performance of the material were outlined, with a redefinition of parameters of use, elaborating a new culture linked to new technologies. This work diagnosed the usage of wood in building construction based on system analysis. Through an opinion survey on the acceptance of the use of wood, we observe the possibilities of utilization according to physical and mechanical properties, aesthetics and appearance performance, and post-occupation evaluation. According to the results obtained related to the culture and knowledge about the use of wood from planted forests, it can be concluded that there is not enough knowledge in this area, and it is therefore necessary to create an information system for professionals and for the general public.

  3. Optofluidic Approaches for Enhanced Microsensor Performances

    Directory of Open Access Journals (Sweden)

    Genni Testa

    2014-12-01

    Full Text Available Optofluidics is a relatively young research field that creates a tight synergy between optics and micro/nano-fluidics. The high level of integration between fluidic and optical elements achievable by means of optofluidic approaches makes it possible to realize an innovative class of sensors, which have been demonstrated to have improved sensitivity, adaptability and compactness. Many developments in this field have been made in recent years thanks to the availability of a new class of low-cost materials and new technologies. This review describes the Italian state of the art in optofluidic devices for sensing applications and offers a perspective on further future advances. We introduce the optofluidic concept and describe the advantages of merging photonic and fluidic elements, focusing on sensor developments for both environmental and biomedical monitoring.

  4. An exergy method for compressor performance analysis

    Energy Technology Data Exchange (ETDEWEB)

    McGovern, J A; Harte, S [Trinity Coll., Dublin (Ireland)]

    1995-07-01

    An exergy method for compressor performance analysis is presented. The purpose of this is to identify and quantify defects in the use of a compressor's shaft power. This information can be used as the basis for compressor design improvements. The defects are attributed to friction, irreversible heat transfer, fluid throttling, and irreversible fluid mixing. They are described, on a common basis, as exergy destruction rates and their locations are identified. The method can be used with any type of positive displacement compressor. It is most readily applied where a detailed computer simulation program is available for the compressor. An analysis of an open reciprocating refrigeration compressor that used R12 refrigerant is given as an example. The results that are presented consist of graphs of the instantaneous rates of exergy destruction according to the mechanisms involved, a pie chart of the breakdown of the average shaft power wastage by mechanism, and a pie chart with a breakdown by location. (author)

  5. Shaping the manufacturing industry performance: MIDAS approach

    International Nuclear Information System (INIS)

    Turhan, Ibrahim M.; Sensoy, Ahmet; Hacihasanoglu, Erk

    2015-01-01

    We aim to find out whether the exchange rate (against the US dollar) or the interest rate (in local currency) is a better variable for predicting the capacity utilization rate of the manufacturing industry (CUR) of Turkey after the 2008 global financial crisis. To that end, we implement a dynamic mixed data sampling (MIDAS) regression model to forecast monthly changes in CUR by using daily changes in the exchange rate and the interest rate separately. The results show that the exchange rate has better forecast performance, suggesting that it is a stronger determinant in shaping the manufacturing industry
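The core of a MIDAS regression is a parsimonious lag polynomial that collapses a high-frequency regressor into a low-frequency one. As a rough illustration of the weighting idea only (not the authors' estimated model), the sketch below aggregates hypothetical daily changes into a single monthly regressor using exponential Almon weights; the θ values and data are arbitrary assumptions.

```python
import math

def exp_almon_weights(theta1, theta2, n_lags):
    """Exponential Almon lag weights used in MIDAS regressions:
    w_k proportional to exp(theta1*k + theta2*k^2), normalized to sum to 1."""
    raw = [math.exp(theta1 * k + theta2 * k * k) for k in range(1, n_lags + 1)]
    total = sum(raw)
    return [r / total for r in raw]

def midas_aggregate(daily_changes, theta1=0.1, theta2=-0.05):
    """Collapse a daily (high-frequency) series into one monthly regressor
    value, weighting recent days more heavily (daily_changes[0] = most recent)."""
    w = exp_almon_weights(theta1, theta2, len(daily_changes))
    return sum(wi * xi for wi, xi in zip(w, daily_changes))

daily_fx = [0.4, -0.2, 0.1, 0.3, -0.1]   # hypothetical daily FX-rate changes
print(midas_aggregate(daily_fx))
```

The aggregated value would then enter an ordinary low-frequency regression; estimating θ1 and θ2 jointly with the regression coefficients is what distinguishes MIDAS from a fixed-weight average.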

  6. Enhanced Portfolio Performance Using a Momentum Approach to Annual Rebalancing

    OpenAIRE

    Michael D. Mattei

    2018-01-01

    After diversification, periodic portfolio rebalancing has become one of the most widely practiced methods for reducing portfolio risk and enhancing returns. Most of the rebalancing strategies found in the literature are generally regarded as contrarian approaches to rebalancing. A recent article proposed a rebalancing approach that incorporates momentum. The momentum approach had a better risk-adjusted return than either the traditional approach or a Buy-and-Hold app...

  7. Academic Performance: An Approach From Data Mining

    Directory of Open Access Journals (Sweden)

    David L. La Red Martinez

    2012-02-01

    Full Text Available The relatively low percentage of students promoted and regularized (academic success) in the Operating Systems course of the LSI (Bachelor's Degree in Information Systems) of FaCENA (Faculty of Exact and Natural Sciences and Surveying - Facultad de Ciencias Exactas, Naturales y Agrimensura) of UNNE prompted this work, whose objective is to determine the variables that affect academic performance, considering the final status of the student according to Res. 185/03 CD (scheme for evaluation and promotion: promoted, regular or free). The variables considered are: status of the student, educational level of parents, secondary education, socio-economic level, and others. Data warehouse (DW) and data mining (DM) techniques were used to search for student profiles and to detect potential situations of academic success or failure. Classifications were produced through clustering techniques according to different criteria, including: classification mining by academic program, by final status of the student, and by importance given to the study, as well as demographic clustering and Kohonen clustering by final status of the student. For each process, partition statistics, partition details, cluster details, field details and field frequencies, overall quality, and detailed quality measures (precision, classification, reliability, confusion matrices, gain/lift diagrams, trees, node distributions, field importance, field correspondence tables, and cluster statistics) were produced. Once the profiles of students with low academic performance are determined, actions can be taken to avert potential academic failures. This work provides a brief description of the data warehouse built and some of the data mining processes developed on it.
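The clustering mining described above can be illustrated with a minimal k-means sketch; the features (average grade, attendance percentage) and the student records are hypothetical, and the deterministic initialization is chosen for illustration only.

```python
def kmeans(points, k, iters=20):
    """Minimal k-means over tuples of numeric features."""
    centers = list(points[:k])  # deterministic init for illustration
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        # assignment step: each point goes to its nearest center
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # update step: each center moves to the mean of its cluster
        centers = [tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Hypothetical student records: (average grade, attendance %)
students = [(3, 40), (4, 35), (2, 50), (9, 95), (8, 90), (10, 85)]
centers, clusters = kmeans(students, 2)
print(clusters)
```

On this toy data the algorithm separates the low-performing from the high-performing profiles; in practice demographic and socio-economic fields would be added and the results validated with the quality measures listed in the abstract.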

  8. Performance of non-conventional factorization approaches for neutron kinetics

    International Nuclear Information System (INIS)

    Bulla, S.; Nervo, M.

    2013-01-01

    The use of factorization techniques provides an interesting option for simulating the time-dependent behavior of nuclear systems with reduced computational effort. While point kinetics neglects all spatial and spectral effects, quasi-statics and multipoint kinetics produce results with higher accuracy for transients involving relevant modifications of the neutron distribution. However, in some conditions these methods cannot work efficiently. In this paper, we discuss some possible alternative formulations of the factorization process for neutron kinetics, leading to mathematical models of reduced complexity that allow an accurate simulation of transients involving spatial and spectral effects. The performance of these innovative approaches is compared to standard techniques for some test cases, showing the benefits and shortcomings of the methods proposed. (authors)

  9. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an

  10. Compression-RSA: New approach of encryption and decryption method

    Science.gov (United States)

    Hung, Chang Ee; Mandangan, Arif

    2013-04-01

    Rivest-Shamir-Adleman (RSA) is a well-known asymmetric cryptosystem and has been applied in a very wide range of areas. Many studies with different approaches have been carried out to improve the security and performance of the RSA cryptosystem; enhancing its performance is our main interest. In this paper, we propose a new method to increase the efficiency of RSA by shortening the plaintext before it undergoes the encryption process, without affecting its original content. The concept of simple continued fractions and a new special relationship between them and the Euclidean algorithm are applied in this newly proposed method. By reducing the number of plaintext-ciphertext blocks, the encryption-decryption processing of a secret message can be accelerated.
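For context, the standard textbook RSA cycle that such a compression step would precede can be sketched with toy parameters (the well-known 61 × 53 example; real deployments use moduli of 2048 bits or more). This shows plain RSA only, not the paper's continued-fraction compression scheme.

```python
# Textbook RSA with toy primes -- for illustration only, never for real use.
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # Euler's totient: 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e (2753)

def encrypt(m):
    return pow(m, e, n)    # c = m^e mod n

def decrypt(c):
    return pow(c, d, n)    # m = c^d mod n

m = 65
c = encrypt(m)             # 2790
assert decrypt(c) == m
```

Because each block must be smaller than n, a long message becomes many plaintext-ciphertext blocks; shortening the plaintext beforehand, as the paper proposes, reduces how many modular exponentiations are needed.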

  11. Qualitative Approaches to Mixed Methods Practice

    Science.gov (United States)

    Hesse-Biber, Sharlene

    2010-01-01

    This article discusses how methodological practices can shape and limit how mixed methods is practiced and makes visible the current methodological assumptions embedded in mixed methods practice that can shut down a range of social inquiry. The article argues that there is a "methodological orthodoxy" in how mixed methods is practiced…

  12. Performance assessment plans and methods for the Salt Repository Project

    International Nuclear Information System (INIS)

    1984-08-01

    This document presents the preliminary plans and anticipated methods of the Salt Repository Project (SRP) for assessing the postclosure and radiological aspects of preclosure performance of a nuclear waste repository in salt. This plan is intended to be revised on an annual basis. The emphasis in this preliminary effort is on the method of conceptually dividing the system into three subsystems (the very near field, the near field, and the far field) and applying models to analyze the behavior of each subsystem and its individual components. The next revision will contain more detailed plans being developed as part of Site Characterization Plan (SCP) activities. After a brief system description, this plan presents the performance targets which have been established for nuclear waste repositories by regulatory agencies (Chapter 3). The SRP approach to modeling, including sensitivity and uncertainty techniques is then presented (Chapter 4). This is followed by a discussion of scenario analysis (Chapter 5), a presentation of preliminary data needs as anticipated by the SRP (Chapter 6), and a presentation of the SRP approach to postclosure assessment of the very near field, the near field, and the far field (Chapters 7, 8, and 9, respectively). Preclosure radiological assessment is discussed in Chapter 10. Chapter 11 presents the SRP approach to code verification and validation. Finally, the Appendix lists all computer codes anticipated for use in performance assessments. The list of codes will be updated as plans are revised

  13. Numerical Methods for Stochastic Computations A Spectral Method Approach

    CERN Document Server

    Xiu, Dongbin

    2010-01-01

    The first graduate-level textbook to focus on fundamental aspects of numerical methods for stochastic computations, this book describes the class of numerical methods based on generalized polynomial chaos (gPC). These fast, efficient, and accurate methods are an extension of the classical spectral methods to high-dimensional random spaces. Designed to simulate complex systems subject to random inputs, these methods are widely used in many areas of computer science and engineering. The book introduces polynomial approximation theory and probability theory; describes the basic theory of gPC meth

  14. Mixed method approaches to evaluate conservation impact

    DEFF Research Database (Denmark)

    Lund, Jens Friis; Burgess, Neil D.; Chamshama, Shabani A.O.

    2015-01-01

    Nearly 10% of the world's total forest area is formally owned by communities and indigenous groups, yet knowledge of the effects of decentralized forest management approaches on conservation (and livelihood) impacts remains elusive. In this paper, the conservation impact of decentralized forest m...

  15. Energy performance of three Airtight Drywall Approach houses

    Energy Technology Data Exchange (ETDEWEB)

    Howell, D.G.; Mayhew, W.J.

    1987-03-01

    The objective of this study was to assess a new construction technique, the Airtight Drywall Approach (ADA), as implemented in three test houses, and to compare the performance of these houses against three control houses built with residential construction techniques typical of Alberta. The study focussed on four aspects of house performance: integrity of the air barrier system, energy conservation, ventilation, and indoor air quality, as well as the development and demonstration of computer-based field monitoring techniques. Data were gathered through continuous computer-based measurements, regular site visits, manual measurements, homeowner interviews, and special site tests. The results of the air-leakage tests indicated that ADA is an effective method of reducing air infiltration in homes. The floor joist sealing technique used in the ADA houses was observed to deteriorate within a year of construction and is no longer recommended. The monitoring results showed a significant reduction in energy consumption in the homes with energy conservation features. Measurements of air-borne contaminants indicated that the ADA test homes performed similarly to other energy-efficient homes monitored across Canada and that pollutant levels were within accepted guidelines. 6 refs., 6 figs., 14 tabs.

  16. Approaches and methods of risk assessment

    International Nuclear Information System (INIS)

    Rowe, W.D.

    1983-01-01

    The classification system of risk assessment includes the categories: 1) risk comparisons, 2) cost-effectiveness of risk reduction, 3) balancing of costs, risks and benefits against one another, and 4) metasystems. An overview of methods and systems reveals that no single method can be applied to all cases and situations. The visibility of the process and the balanced consideration of all aspects of judging are, however, of foremost importance. (DG) [de

  17. DESIGNING COMPANY PERFORMANCE MEASUREMENT SYSTEM USING BALANCE SCORECARD APPROACH

    Directory of Open Access Journals (Sweden)

    Cecep Mukti Soleh

    2015-05-01

    Full Text Available This research aimed to design a company performance measurement system using the balanced scorecard approach in the coal transportation services industry. Depth interviews were used to obtain qualitative data for determining the strategic objectives, key performance indicators, strategic initiatives, and units in charge for each balanced scorecard perspective, while quantitative data were obtained from weighting through questionnaires and analyzed using paired comparison to identify the perspective that most affects the performance of the company. To measure the achievement of corporate performance, each KPI used (1) a scoring system with higher-is-better, lower-is-better and precise-is-better methods; and (2) a traffic light system with green, yellow and red for identification of target achievement. The results show that, among the balanced scorecard perspectives, the most influential on the overall performance of the company are the customer perspective (31%), the financial perspective (29%), internal business processes (21%), and learning and growth (19%). Keywords: balance scorecard, paired comparison, coal transportation service
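The scoring rules named above (higher is better, lower is better, precise is better) combined with traffic-light bands can be sketched as follows; the 100%/80% thresholds are assumptions for illustration, not values from the study.

```python
def score(actual, target, method):
    """KPI achievement as a percentage of target, per the three scoring rules."""
    if method == "higher_is_better":
        return 100.0 * actual / target
    if method == "lower_is_better":
        return 100.0 * target / actual
    if method == "precise_is_better":
        # penalize deviation from target in either direction
        return 100.0 * (1 - abs(actual - target) / target)
    raise ValueError(f"unknown scoring method: {method}")

def traffic_light(s, green=100.0, yellow=80.0):
    """Map a score to the green/yellow/red bands used for target tracking."""
    return "green" if s >= green else "yellow" if s >= yellow else "red"

print(traffic_light(score(95, 100, "higher_is_better")))  # prints "yellow"
```

A weighted sum of such scores, using the perspective weights from the paired comparison (31%, 29%, 21%, 19%), would then yield the overall corporate performance figure.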

  18. Performing Systematic Literature Reviews with Novices: An Iterative Approach

    Science.gov (United States)

    Lavallée, Mathieu; Robillard, Pierre-N.; Mirsalari, Reza

    2014-01-01

    Reviewers performing systematic literature reviews require understanding of the review process and of the knowledge domain. This paper presents an iterative approach for conducting systematic literature reviews that addresses the problems faced by reviewers who are novices in one or both levels of understanding. This approach is derived from…

  19. Telerobotic system performance measurement - Motivation and methods

    Science.gov (United States)

    Kondraske, George V.; Khoury, George J.

    1992-01-01

    A systems performance-based strategy for modeling and conducting experiments relevant to the design and performance characterization of telerobotic systems is described. A developmental testbed consisting of a distributed telerobotics network, and initial efforts to implement the strategy described, are presented. Consideration is given to general systems performance theory (GSPT), originally developed to tackle human performance problems, as a basis for: measurement of overall telerobotic system (TRS) performance; task decomposition; development of a generic TRS model; and the characterization of performance of subsystems comprising the generic model. GSPT employs a resource construct to model performance and resource economic principles to govern the interface of systems to tasks. It provides a comprehensive modeling/measurement strategy applicable to complex systems including both human and artificial components. Application is presented within the framework of a distributed telerobotics network as a testbed. Insight into the design of test protocols which elicit application-independent data is described.

  20. A CRITICAL APPROACH OF CSR RATING METHODS

    OpenAIRE

    Rãzvan Cãtãlin DOBREA; Felicia Alina DINU

    2011-01-01

    In a contemporary business environment characterized by permanent transformation, a proper attitude towards society and the environment is essential for the long-term development of organizations. The way companies synchronize their values and behaviour with the needs and requirements of shareholders, employees, suppliers, community, authorities and society overall is reflected in their performance. Balancing all these interests, the company's ability to respond to all expectations and...

  1. Bionic Design Methods - A practical approach

    DEFF Research Database (Denmark)

    Kepler, Jørgen Asbøll; Stokholm, Marianne Denise J.

    2004-01-01

    Nature has served as inspiration for product design throughout history. Applications range from poetic translations of form to utilization of primary functional principles. This paper describes a generally applicable design methodology for transforming natural functional principles to feasible product design. From a formulation of design demands, which need not necessarily be very precise, the approach continues with a study of natural objects (animals, plants) which are subject to the same demands. From this study, the working principle(s) are derived. These are then clarified through illustrative models, which should be simplified as much as possible. The simplified principle may now be evaluated and transformed into a practical design. The methodology is clarified through examples taken from a series of extended workshops held at Aalborg University.

  2. Enterprise Engineering Method supporting Six Sigma Approach

    OpenAIRE

    Jochem, Roland

    2007-01-01

    Enterprise Modeling (EM) is currently in operation either as a technique to represent and understand the structure and behavior of the enterprise, or as a technique to analyze business processes, and in many cases as a support technique for business process reengineering. However, EM architectures and methods for Enterprise Engineering can also be used to support new management techniques like Six Sigma, because these new techniques need a clear, transparent and integrated definition and descript...

  3. Unbiased Scanning Method and Data Banking Approach Using Ultra-High Performance Liquid Chromatography Coupled with High-Resolution Mass Spectrometry for Quantitative Comparison of Metabolite Exposure in Plasma across Species Analyzed at Different Dates.

    Science.gov (United States)

    Gao, Hongying; Deng, Shibing; Obach, R Scott

    2015-12-01

    An unbiased scanning methodology using ultra-high performance liquid chromatography coupled with high-resolution mass spectrometry was used to bank data and plasma samples so that data generated at different dates could be compared. This method was applied to bank data generated earlier from animal samples and then to compare the exposure to metabolites in animals versus humans for safety assessment. With neither authentic standards nor prior knowledge of the identities and structures of metabolites, full scans for precursor ions and all ion fragments (AIF) were employed with a generic gradient LC method to analyze plasma samples at positive and negative polarity, respectively. Of a total of 22 tested drugs and metabolites, 21 analytes were detected using this unbiased scanning method; naproxen was not detected due to low sensitivity at negative polarity and interference at positive polarity, and 4'- and 5-hydroxy diclofenac were not separated by the generic UPLC method. Statistical analysis of the peak area ratios of the analytes versus the internal standard in five repetitive analyses over approximately 1 year demonstrated that the analysis variation was significantly different from sample instability. The confidence limits for comparing exposure using peak area ratios of metabolites in animal plasma versus human plasma measured approximately 1 year apart were comparable to those from analyses undertaken side by side on the same days. These statistical results showed it was feasible to compare data generated at different dates with neither authentic standards nor prior knowledge of the analytes.

  4. A Control Variate Method for Probabilistic Performance Assessment. Improved Estimates for Mean Performance Quantities of Interest

    Energy Technology Data Exchange (ETDEWEB)

    MacKinnon, Robert J.; Kuhlman, Kristopher L.

    2016-05-01

    We present a method of control variates for calculating improved estimates of mean performance quantities of interest, E(PQI), computed from Monte Carlo probabilistic simulations. An example of a PQI is the concentration of a contaminant at a particular location in a problem domain computed from simulations of transport in porous media. To simplify the presentation, the method is described in the setting of a one-dimensional elliptical model problem involving a single uncertain parameter represented by a probability distribution. The approach can be easily implemented for more complex problems involving multiple uncertain parameters, and in particular for application to probabilistic performance assessment of deep geologic nuclear waste repository systems. Numerical results indicate the method can produce estimates of E(PQI) having superior accuracy on coarser meshes and reduce the number of simulations needed to achieve an acceptable estimate.
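A minimal one-parameter illustration of the control-variate idea (not the repository transport model): estimate E[e^U] for U ~ Uniform(0,1) using U itself, whose mean 1/2 is known exactly, as the control variate.

```python
import math
import random

def cv_estimate(n=10_000, seed=1):
    """Control-variate Monte Carlo estimate of E[e^U], U ~ Uniform(0,1).
    True value is e - 1 ~= 1.71828; E[U] = 1/2 is known exactly."""
    rng = random.Random(seed)
    u = [rng.random() for _ in range(n)]
    y = [math.exp(x) for x in u]
    my, mu = sum(y) / n, sum(u) / n
    # optimal coefficient b* = Cov(Y, U) / Var(U), estimated from the sample
    cov = sum((yi - my) * (ui - mu) for yi, ui in zip(y, u)) / n
    var = sum((ui - mu) ** 2 for ui in u) / n
    b = cov / var
    # control-variate estimator: plain mean corrected by the known mean of U
    return my - b * (mu - 0.5)

print(cv_estimate())  # close to e - 1
```

Because Y = e^U is strongly correlated with U, the corrected estimator has far smaller variance than the plain sample mean at the same n, which is exactly the mechanism that reduces the number of repository simulations needed for a given accuracy.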

  5. TRANSFER PRICES: MECHANISMS, METHODS AND INTERNATIONAL APPROACHES

    Directory of Open Access Journals (Sweden)

    Pop Cosmina

    2008-05-01

    Full Text Available Transfer prices are the prices paid for goods or services in a cross-border transaction between affiliated companies, often significantly reduced or increased in order to avoid the higher tax rates of one jurisdiction. Presently, over 60% of cross-border transfers are intra-group transfers. The paper presents the variety of methods and mechanisms used by companies to transfer funds from one tax jurisdiction to another in order to avoid over-taxation.

  6. Microscopic approach to the generator coordinate method

    International Nuclear Information System (INIS)

    Haider, Q.; Gogny, D.; Weiss, M.S.

    1989-01-01

    In this paper, we solve different theoretical problems associated with the calculation of the kernel occurring in the Hill-Wheeler integral equations within the framework of the generator coordinate method. In particular, we extend Wick's theorem to nonorthogonal Bogoliubov states. Expressions for the overlap between Bogoliubov states and for the generalized density matrix are also derived. These expressions are valid even when using an incomplete basis, as is the case in actual calculations. Finally, the Hill-Wheeler formalism is developed for a finite-range interaction and the Skyrme force, and evaluated for the latter. 20 refs., 1 fig., 4 tabs

  7. Performance Benchmarking of Fast Multipole Methods

    KAUST Repository

    Al-Harthi, Noha A.

    2013-06-01

    The current trends in computer architecture are shifting towards smaller byte/flop ratios, while available parallelism is increasing at all levels of granularity – vector length, core count, and MPI process. Intel’s Xeon Phi coprocessor, NVIDIA’s Kepler GPU, and IBM’s BlueGene/Q all have a byte/flop ratio close to 0.2, which makes it very difficult for most algorithms to extract a high percentage of the theoretical peak flop/s from these architectures. Popular algorithms in scientific computing such as FFT are continuously evolving to keep up with this trend in hardware. In the meantime it is also necessary to invest in novel algorithms that are more suitable for the computer architectures of the future. The fast multipole method (FMM) was originally developed as a fast algorithm for approximating the N-body interactions that appear in astrophysics, molecular dynamics, and vortex-based fluid dynamics simulations. The FMM possesses a unique combination of being an efficient O(N) algorithm while having an operational intensity that is higher than a matrix-matrix multiplication. In fact, the FMM can reduce the byte/flop requirement to around 0.01, which means that it will remain compute bound until 2020 even if the current trend in microprocessors continues. Despite these advantages, there have not been any benchmarks of FMM codes on modern architectures such as Xeon Phi, Kepler, and BlueGene/Q. This study aims to provide a comprehensive benchmark of a state-of-the-art FMM code “exaFMM” on the latest architectures, in hopes of providing a useful reference for deciding when the FMM will become useful as the computational engine in a given application code. It may also serve as a warning for certain problem-size domains where the FMM will exhibit insignificant performance improvements. Such issues depend strongly on the asymptotic constants rather than the asymptotics themselves, and therefore are strongly implementation and hardware
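The byte/flop argument can be made concrete with a simple roofline check; the peak-flops and bandwidth figures below are hypothetical, chosen only so that the machine balance comes out to 5 flop/byte (i.e., a byte/flop ratio of 0.2, like the architectures named above).

```python
def compute_bound(flop_per_byte, peak_gflops, bandwidth_gbs):
    """Roofline test: a kernel is compute-bound when its arithmetic
    intensity (flop/byte) exceeds the machine balance (peak flops / bandwidth)."""
    machine_balance = peak_gflops / bandwidth_gbs  # flop per byte
    return flop_per_byte > machine_balance

# Hypothetical accelerator: 1000 Gflop/s peak, 200 GB/s -> balance = 5 flop/byte
peak, bw = 1000.0, 200.0
print(compute_bound(1 / 0.01, peak, bw))  # FMM at ~0.01 byte/flop: True
print(compute_bound(1 / 0.5, peak, bw))   # a bandwidth-bound kernel: False
```

An FMM kernel at roughly 0.01 byte/flop (100 flop/byte) sits far above this balance point, which is why it stays compute bound even as the hardware byte/flop ratio keeps shrinking.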

  8. Linear Motion Systems. A Modular Approach for Improved Straightness Performance

    NARCIS (Netherlands)

    Nijsse, G.J.P.

    2001-01-01

    This thesis deals with straight motion systems. A modular approach has been applied in order to find ways to improve the performance. The main performance parameters that are considered are position accuracy, repeatability and, to a lesser extent, cost. Because of the increasing requirements to

  9. Approaches to Mixed Methods Dissemination and Implementation Research: Methods, Strengths, Caveats, and Opportunities.

    Science.gov (United States)

    Green, Carla A; Duan, Naihua; Gibbons, Robert D; Hoagwood, Kimberly E; Palinkas, Lawrence A; Wisdom, Jennifer P

    2015-09-01

    Limited translation of research into practice has prompted study of diffusion and implementation, and development of effective methods of encouraging adoption, dissemination and implementation. Mixed methods techniques offer approaches for assessing and addressing processes affecting implementation of evidence-based interventions. We describe common mixed methods approaches used in dissemination and implementation research, discuss strengths and limitations of mixed methods approaches to data collection, and suggest promising methods not yet widely used in implementation research. We review qualitative, quantitative, and hybrid approaches to mixed methods dissemination and implementation studies, and describe methods for integrating multiple methods to increase depth of understanding while improving reliability and validity of findings.

  10. A sequential mixed methods research approach to investigating HIV ...

    African Journals Online (AJOL)

    2016-09-03

    Sep 3, 2016 ... Sequential mixed methods research is an effective approach for ... show the effectiveness of the research method. ... qualitative data before quantitative datasets ... whereby both types of data are collected simultaneously.

  11. Nutrition and culture in professional football. A mixed method approach.

    Science.gov (United States)

    Ono, Mutsumi; Kennedy, Eileen; Reeves, Sue; Cronin, Linda

    2012-02-01

    An adequate diet is essential for the optimal performance of professional football (soccer) players. Existing studies have shown that players fail to consume such a diet, but have not interrogated the reasons for this. The aim of this study was to explore the difficulties professional football players experience in consuming a diet for optimal performance. It utilized a mixed method approach, combining nutritional intake assessment with qualitative interviews, to ascertain both what was consumed and the wider cultural factors that affect consumption. The study found a high variability in individual intake, which ranged widely from 2648 to 4606 kcal/day. In addition, the intake of carbohydrate was significantly lower than that recommended. The study revealed that the main food choices for carbohydrate and protein intake were pasta and chicken respectively. Interview results showed the importance of tradition within the world of professional football in structuring the players' approach to nutrition. In addition, the players' personal eating habits, derived from their class and national habitus, restricted their food choice by conflicting with the dietary choices promoted within the professional football clubs. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Performance Evaluation Methods for Assistive Robotic Technology

    Science.gov (United States)

    Tsui, Katherine M.; Feil-Seifer, David J.; Matarić, Maja J.; Yanco, Holly A.

    Robots have been developed for several assistive technology domains, including intervention for Autism Spectrum Disorders, eldercare, and post-stroke rehabilitation. Assistive robots have also been used to promote independent living through the use of devices such as intelligent wheelchairs, assistive robotic arms, and external limb prostheses. Work in the broad field of assistive robotic technology can be divided into two major research phases: technology development, in which new devices, software, and interfaces are created; and clinical, in which assistive technology is applied to a given end-user population. Moving from technology development towards clinical applications is a significant challenge. Developing performance metrics for assistive robots poses a related set of challenges. In this paper, we survey several areas of assistive robotic technology in order to derive and demonstrate domain-specific means for evaluating the performance of such systems. We also present two case studies of applied performance measures and a discussion regarding the ubiquity of functional performance measures across the sampled domains. Finally, we present guidelines for incorporating human performance metrics into end-user evaluations of assistive robotic technologies.

  13. Systems engineering approach towards performance monitoring of emergency diesel generator

    International Nuclear Information System (INIS)

    Nurhayati Ramli; Lee, Y.K.

    2013-01-01

    Full-text: Systems engineering is an interdisciplinary approach and means to enable the realization of successful systems. In this study, a systems engineering approach towards performance monitoring of the Emergency Diesel Generator (EDG) is presented. Performance monitoring is part and parcel of predictive maintenance, whereby system and component conditions can be detected before they result in failures. In an effort to identify proposals for addressing performance monitoring, the EDG boundary has been defined. Based on Probabilistic Safety Analysis (PSA) results and industry operating experience, the most critical component is identified. This paper proposes a systems engineering concept development framework for EDG performance monitoring. The expected output of this study is that EDG reliability can be improved through the performance monitoring alternatives identified by the systems engineering concept development effort. (author)

  14. Evaluating firms' R&D performance using best worst method.

    Science.gov (United States)

    Salimi, Negin; Rezaei, Jafar

    2018-02-01

Since research and development (R&D) is the most critical determinant of the productivity, growth and competitive advantage of firms, measuring R&D performance has become a core concern of R&D managers, and an extensive body of literature has examined and identified different R&D measurements and determinants of R&D performance. However, assigning the same level of importance to different R&D measures, which is the common approach in existing studies, can oversimplify the measurement process, which may result in misinterpretation of performance and, consequently, in flawed R&D strategies. The aim of this study is to measure R&D performance while taking into account the different levels of importance of R&D measures, using a multi-criteria decision-making method called the Best Worst Method (BWM) to identify the weights (importance) of R&D measures and to measure the R&D performance of 50 high-tech SMEs in the Netherlands, using data gathered in a survey among SMEs and from R&D experts. The results show how assigning different weights to different R&D measures (in contrast to a simple mean) results in a different ranking of the firms and allows R&D managers to formulate more effective strategies to improve their firm's R&D performance by applying knowledge regarding the importance of different R&D measures. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
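The linear formulation of the Best Worst Method named in the abstract can be sketched in a few lines. This is a generic illustration of the standard linear BWM (minimise the largest deviation xi subject to the best-to-others and others-to-worst comparison ratios); the criteria and comparison values below are invented, not taken from the study:

```python
import numpy as np
from scipy.optimize import linprog

def bwm_weights(best_to_others, others_to_worst, best, worst):
    """Linear Best Worst Method: find weights w minimising the largest
    deviation xi over |w_best - a_Bj * w_j| and |w_j - a_jW * w_worst|,
    subject to sum(w) = 1 and w >= 0."""
    n = len(best_to_others)
    c = np.zeros(n + 1)
    c[-1] = 1.0                       # objective: minimise xi (last variable)
    A_ub, b_ub = [], []

    def add_abs_constraint(i, coef_i, j, coef_j):
        # encode |coef_i*w_i + coef_j*w_j| <= xi as two "<= 0" rows
        for sign in (1.0, -1.0):
            row = np.zeros(n + 1)
            row[i], row[j], row[-1] = sign * coef_i, sign * coef_j, -1.0
            A_ub.append(row)
            b_ub.append(0.0)

    for j in range(n):
        if j != best:
            add_abs_constraint(best, 1.0, j, -best_to_others[j])
        if j != worst:
            add_abs_constraint(j, 1.0, worst, -others_to_worst[j])

    A_eq = [np.append(np.ones(n), 0.0)]   # weights sum to 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    return res.x[:n]

# Invented R&D measures: [patents, publications, time-to-market]
w = bwm_weights(best_to_others=[1, 2, 8],   # best criterion vs the others
                others_to_worst=[8, 4, 1],  # the others vs the worst
                best=0, worst=2)
```

For this (fully consistent) example the optimal deviation is zero and the weights come out as 8/13, 4/13 and 1/13, so the firm-level score is then a weighted rather than a simple mean of the measures.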

  15. Approaches for University Students and their Relationship to Academic Performance

    Directory of Open Access Journals (Sweden)

    Evelyn Fernández-Castillo

    2015-05-01

Full Text Available The way students perceive learning is influenced by multiple factors. The present study aimed at establishing relationships between learning approaches, academic performance, and the academic year in a sample of students from different courses at Universidad Central “Marta Abreu”, Las Villas. For this ex post facto study, a probabilistic sample was used, based on simple random sampling of 524 university students who completed the Study Process Questionnaire. The analyses of variance (MANOVA and ANOVA) and the cluster analysis reported associations between a deep approach to learning and better academic performance. These analyses showed differences in the learning approach across the courses, with a surface approach predominating.

  16. Performance Assessment Method for a Forged Fingerprint Detection Algorithm

    Science.gov (United States)

    Shin, Yong Nyuo; Jun, In-Kyung; Kim, Hyun; Shin, Woochang

    The threat of invasion of privacy and of the illegal appropriation of information both increase with the expansion of the biometrics service environment to open systems. However, while certificates or smart cards can easily be cancelled and reissued if found to be missing, there is no way to recover the unique biometric information of an individual following a security breach. With the recognition that this threat factor may disrupt the large-scale civil service operations approaching implementation, such as electronic ID cards and e-Government systems, many agencies and vendors around the world continue to develop forged fingerprint detection technology, but no objective performance assessment method has, to date, been reported. Therefore, in this paper, we propose a methodology designed to evaluate the objective performance of the forged fingerprint detection technology that is currently attracting a great deal of attention.

  17. Towards Multi-Method Research Approach in Empirical Software Engineering

    Science.gov (United States)

    Mandić, Vladimir; Markkula, Jouni; Oivo, Markku

This paper presents the results of a literature analysis of empirical research approaches in Software Engineering (SE). The analysis explores the reasons why traditional methods, such as statistical hypothesis testing and experiment replication, are weakly utilized in the field of SE. It appears that the basic assumptions and preconditions of the traditional methods contradict the actual situation in SE. Furthermore, we have identified the main issues that should be considered by the researcher when selecting a research approach. Given the reasons for the weak utilization of traditional methods, we propose stronger use of a Multi-Method approach with Pragmatism as the philosophical standpoint.

  18. Validated High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

Purpose: To develop a simple, rapid and sensitive high performance liquid ... response, tailing factor and resolution of six replicate injections was < 3 %. ... Cefadroxil monohydrate, Human plasma, Pharmacokinetics Bioequivalence ... Drug-free plasma was obtained from the local .... Influence of probenecid on the renal.

  19. Spectral method and its high performance implementation

    KAUST Repository

    Wu, Zedong

    2014-01-01

We have presented a new method that is dispersion-free and unconditionally stable, so the computational cost and memory requirement are greatly reduced. Based on this feature, we have implemented the algorithm in GPU-based CUDA for anisotropic reverse time migration, with almost no communication between CPU and GPU. For prestack wavefield extrapolation, all the shots can be combined for migration; however, this requires solving a larger-dimensional problem with more memory than fits on one GPU card. In this situation, we implement it with a domain decomposition method and MPI for distributed memory systems.

  20. Personality, Study Methods and Academic Performance

    Science.gov (United States)

    Entwistle, N. J.; Wilson, J. D.

    1970-01-01

    A questionnaire measuring four student personality types--stable introvert, unstable introvert, stable extrovert and unstable extrovert--along with the Eysenck Personality Inventory (Form A) were give to 72 graduate students at Aberdeen University and the results showed recognizable interaction between study methods, motivation and personality…

  1. Development of a high performance liquid chromatography method ...

    African Journals Online (AJOL)

    Development of a high performance liquid chromatography method for simultaneous ... Purpose: To develop and validate a new low-cost high performance liquid chromatography (HPLC) method for ..... Several papers have reported the use of ...

  2. Investigation on the performance of bridge approach slab

    Directory of Open Access Journals (Sweden)

    Abdelrahman Amr

    2018-01-01

    Full Text Available In Egypt, where highway bridges are to be constructed on soft cohesive soils, the bridge abutments are usually founded on rigid piles, whereas the earth embankments for the bridge approaches are directly founded on the natural soft ground. Consequently, excessive differential settlement frequently occurs between the bridge deck and the bridge approaches resulting in a “bump” at both ends of the bridge deck. Such a bump not only creates a rough and uncomfortable ride but also represents a hazardous condition to traffic. One effective technique to cope with the bump problem is to use a reinforced concrete approach slab to provide a smooth grade transition between the bridge deck and the approach pavement. Investigating the geotechnical and structural performance of approach slabs and revealing the fundamental affecting factors have become mandatory. In this paper, a 2-D finite element model is employed to investigate the performance of approach slabs. Moreover, an extensive parametric study is carried out to appraise the relatively optimum geometries of approach slab, i.e. slab length, thickness, embedded depth and slope, that can yield permissible bumps. Different geo-mechanical conditions of the cohesive foundation soil and the fill material of the bridge embankment are examined.

  3. THE MANAGEMENT METHODS IN PERFORMANCE SPORTS

    Directory of Open Access Journals (Sweden)

    Silvia GRĂDINARU

    2015-12-01

Full Text Available Sport is a widespread phenomenon, capable of mobilising human energies and financial and material resources on a scale difficult to match in other areas of social life. The management of sports organizations is shaped and determined by the compliance requirements arising from documents issued by international organizations with authority in the field. Organizational development is considered essentially a strategy for increasing organizational effectiveness through changes affecting both human resources and the organization itself. Across society as a whole, a sport industry with distinctive features is evolving at an accelerating pace, and its development is conditioned by macroeconomics and technology. The complexity of the activities of high-performance sports organizations, the main laboratories of national and international sporting performance, calls for more thorough investigation to reveal the complex mechanisms of their management and, at the same time, to identify optimisation solutions for their economic, financial and human resources.

  4. Diagnosis of Feedwater Heater Performance Degradation using Fuzzy Approach

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Kang, Yeon Kwan; Heo, Gyunyoung; Song, Seok Yoon

    2014-01-01

Degradation of a component that operates continuously for a long time in a harsh environment is unavoidable. Since such degradation causes economic losses and can endanger people, it is important to monitor and diagnose it, and diagnosis requires a systematic method for timely decisions. Prior to this article, methods using a regression model and a diagnosis table had been proposed for diagnosing thermal efficiency in Nuclear Power Plants (NPPs). Since the regression model was numerically unstable under changes of operating variables, it was difficult to obtain good results in operating plants. The diagnosis table, in turn, was hard to use because of ambiguous points and the difficulty of detecting how they affect degradation. To address the issues of the previous research, we propose fuzzy approaches and apply them to diagnosing Feedwater Heater (FWH) degradation to check their feasibility. The degradation of FWHs is not easy to observe, while a fault such as tube leakage may damage the whole tube bundle at once. This study explains the steps of diagnosing typical failure modes of FWHs: we adopt fuzzy logic to suggest a diagnosis algorithm for the degradation of FWHs and perform a feasibility study. In total, 7 FWH degradation modes are considered: high drain level, low shell pressure, tube pressure increase, tube fouling, pass partition plate leakage, tube leakage, and abnormal venting. From the literature survey and simulation, a diagnosis table for the FWH is constructed, and fuzzy logic is applied on top of it. The authors verify the fuzzy diagnosis of FWH degradation with random input sets synthesized from the diagnosis table. Compared with the regression model of the previous research, the suggested method is more stable under changes of operating variables, and, unlike the diagnosis table, it handles the ambiguous points and reveals how they affect degradation.
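The rule-based part of such a fuzzy diagnosis can be sketched as follows. This is a generic illustration, not the paper's actual diagnosis table: the single rule "high drain level AND low shell pressure suggests tube leakage", the membership breakpoints, and the normalised input ranges are all hypothetical.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def tube_leakage_degree(drain_level, shell_pressure):
    """One fuzzy rule: IF drain level is High AND shell pressure is Low
    THEN suspect tube leakage; AND is taken as min (Mamdani-style)."""
    high_drain = tri(drain_level, 0.5, 1.0, 1.5)       # normalised level
    low_pressure = tri(shell_pressure, 0.0, 0.2, 0.5)  # normalised pressure
    return min(high_drain, low_pressure)
```

Because the output is a degree in [0, 1] rather than a hard yes/no, borderline readings no longer fall into the "ambiguous points" of a crisp diagnosis table.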

  5. Diagnosis of Feedwater Heater Performance Degradation using Fuzzy Approach

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Kang, Yeon Kwan; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Song, Seok Yoon [Korea Hydro and Nuclear Power, Daejeon (Korea, Republic of)

    2014-05-15

Degradation of a component that operates continuously for a long time in a harsh environment is unavoidable. Since such degradation causes economic losses and can endanger people, it is important to monitor and diagnose it, and diagnosis requires a systematic method for timely decisions. Prior to this article, methods using a regression model and a diagnosis table had been proposed for diagnosing thermal efficiency in Nuclear Power Plants (NPPs). Since the regression model was numerically unstable under changes of operating variables, it was difficult to obtain good results in operating plants. The diagnosis table, in turn, was hard to use because of ambiguous points and the difficulty of detecting how they affect degradation. To address the issues of the previous research, we propose fuzzy approaches and apply them to diagnosing Feedwater Heater (FWH) degradation to check their feasibility. The degradation of FWHs is not easy to observe, while a fault such as tube leakage may damage the whole tube bundle at once. This study explains the steps of diagnosing typical failure modes of FWHs: we adopt fuzzy logic to suggest a diagnosis algorithm for the degradation of FWHs and perform a feasibility study. In total, 7 FWH degradation modes are considered: high drain level, low shell pressure, tube pressure increase, tube fouling, pass partition plate leakage, tube leakage, and abnormal venting. From the literature survey and simulation, a diagnosis table for the FWH is constructed, and fuzzy logic is applied on top of it. The authors verify the fuzzy diagnosis of FWH degradation with random input sets synthesized from the diagnosis table. Compared with the regression model of the previous research, the suggested method is more stable under changes of operating variables, and, unlike the diagnosis table, it handles the ambiguous points and reveals how they affect degradation.

  6. Parallelised Krylov subspace method for reactor kinetics by IQS approach

    International Nuclear Information System (INIS)

    Gupta, Anurag; Modak, R.S.; Gupta, H.P.; Kumar, Vinod; Bhatt, K.

    2005-01-01

    Nuclear reactor kinetics involves numerical solution of space-time-dependent multi-group neutron diffusion equation. Two distinct approaches exist for this purpose: the direct (implicit time differencing) approach and the improved quasi-static (IQS) approach. Both the approaches need solution of static space-energy-dependent diffusion equations at successive time-steps; the step being relatively smaller for the direct approach. These solutions are usually obtained by Gauss-Seidel type iterative methods. For a faster solution, the Krylov sub-space methods have been tried and also parallelised by many investigators. However, these studies seem to have been done only for the direct approach. In the present paper, parallelised Krylov methods are applied to the IQS approach in addition to the direct approach. It is shown that the speed-up obtained for IQS is higher than that for the direct approach. The reasons for this are also discussed. Thus, the use of IQS approach along with parallelised Krylov solvers seems to be a promising scheme
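As a concrete instance of the kind of static solve the abstract describes, here is a minimal one-group, one-dimensional diffusion system solved with conjugate gradients, a Krylov subspace method. The grid size, diffusion coefficient and absorption cross-section are illustrative, not taken from the paper, and the sketch is serial (no IQS, no parallelisation):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

# -D u'' + sigma_a u = S on a uniform mesh with zero boundary values:
# the discrete operator is symmetric positive definite, so CG applies.
n, h = 200, 1.0 / 201
D, sigma_a = 1.0, 0.5                     # illustrative one-group constants
main = 2.0 * D / h**2 + sigma_a           # diagonal entry
off = -D / h**2                           # off-diagonal coupling
A = diags([off, main, off], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)                            # uniform source term
u, info = cg(A, b)                        # Krylov iteration (info == 0 on success)
```

In a kinetics code, a system like this has to be re-solved at every time step, which is why the speed-up of (parallelised) Krylov solvers over Gauss-Seidel iteration matters so much for both the direct and the IQS approach.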

  7. Evaluation of micronozzle performance through DSMC, navier-stokes and coupled dsmc/navier-stokes approaches

    NARCIS (Netherlands)

    Torre, F. la; Kenjeres, S.; Kleijn, C.R.; Moerel, J.L.P.A.

    2009-01-01

    Both the particle based Direct Simulation Monte Carlo (DSMC) method and a compressible Navier-Stokes based continuum method are used to investigate the flow inside micronozzles and to predict the performance of such devices. For the Navier-Stokes approach, both slip and no-slip boundary conditions

  8. Helicopter Gas Turbine Engine Performance Analysis : A Multivariable Approach

    NARCIS (Netherlands)

    Arush, Ilan; Pavel, M.D.

    2017-01-01

    Helicopter performance relies heavily on the available output power of the engine(s) installed. A simplistic single-variable analysis approach is often used within the flight-testing community to reduce raw flight-test data in order to predict the available output power under different atmospheric

  9. Disentangling task and contextual performance : a multitrait-multimethod approach

    NARCIS (Netherlands)

    Demerouti, E.; Xanthopoulou, D.; Tsaousis, I.; Bakker, A.B.

    2014-01-01

    This study among 244 employees and their colleagues working in various sectors investigated the dimensionality of self-ratings and peer-ratings of task and contextual performance, using the scales of Goodman and Svyantek (1999). By applying the multitrait-multimethod approach, we examined the degree

  10. Understanding Performance Management in Schools: A Dialectical Approach

    Science.gov (United States)

    Page, Damien

    2016-01-01

    Purpose: The purpose of this paper is to provide a dialectical framework for the examination of performance management in schools. Design/Methodology/Approach: The paper is based upon a qualitative study of ten headteachers that involved in-depth semi-structured interviews. Findings: The findings identified four dialectical tensions that underpin…

  11. On analyzing colour constancy approach for improving SURF detector performance

    Science.gov (United States)

    Zulkiey, Mohd Asyraf; Zaki, Wan Mimi Diyana Wan; Hussain, Aini; Mustafa, Mohd. Marzuki

    2012-04-01

A robust key point detector plays a crucial role in obtaining good tracking features. The main challenge in outdoor tracking is illumination change due to causes such as weather fluctuation and occlusion. This paper approaches the illumination change problem by transforming the input image with a colour constancy algorithm before applying the SURF detector. The masked grey world approach is chosen because of its ability to perform well under local as well as global illumination change. Every image is transformed to imitate the canonical illuminant, and a Gaussian distribution is used to model the global change. The simulation results show that the average number of detected key points has increased by 69.92%. Moreover, the cases of improved performance far outweigh the degraded cases, with the former improved by 215.23%. The approach is suitable for tracking applications where sudden illumination changes occur frequently and robust key point detection is needed.
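The plain grey-world step underlying the paper's masked variant can be sketched in a few lines. This is the unmasked global version only; the masking and Gaussian modelling described in the abstract are not reproduced:

```python
import numpy as np

def grey_world(img):
    """Grey-world colour constancy: assume the average scene reflectance is
    achromatic, so rescale each channel toward the global mean intensity."""
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)   # per-channel mean (RGB)
    gains = channel_means.mean() / channel_means      # channel correction gains
    return np.clip(img * gains, 0.0, 255.0)
```

After correction the three channel means coincide, which damps the global illuminant shifts that would otherwise destabilise the SURF key points.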

  12. Cache and memory hierarchy design a performance directed approach

    CERN Document Server

    Przybylski, Steven A

    1991-01-01

    An authoritative book for hardware and software designers. Caches are by far the simplest and most effective mechanism for improving computer performance. This innovative book exposes the characteristics of performance-optimal single and multi-level cache hierarchies by approaching the cache design process through the novel perspective of minimizing execution times. It presents useful data on the relative performance of a wide spectrum of machines and offers empirical and analytical evaluations of the underlying phenomena. This book will help computer professionals appreciate the impact of ca

  13. Uncertainty evaluation methods for waste package performance assessment

    International Nuclear Information System (INIS)

    Wu, Y.T.; Nair, P.K.; Journel, A.G.; Abramson, L.R.

    1991-01-01

This report identifies and investigates methodologies to deal with uncertainties in assessing high-level nuclear waste package performance. Four uncertainty evaluation methods (probability-distribution approach, bounding approach, expert judgment, and sensitivity analysis) are suggested as the elements of a methodology that, without either diminishing or enhancing the input uncertainties, can evaluate performance uncertainty. Such a methodology can also help identify critical inputs as a guide to reducing uncertainty so as to provide reasonable assurance that the risk objectives are met. This report examines the current qualitative waste containment regulation and shows how, in conjunction with the identified uncertainty evaluation methodology, a framework for a quantitative probability-based rule can be developed that takes account of the uncertainties. Current US Nuclear Regulatory Commission (NRC) regulation requires that the waste packages provide ''substantially complete containment'' (SCC) during the containment period. The term ''SCC'' is ambiguous and subject to interpretation. This report, together with an accompanying report that describes the technical considerations that must be addressed to satisfy high-level waste containment requirements, provides a basis for a third report to develop recommendations for regulatory uncertainty reduction in the ''containment'' requirement of 10 CFR Part 60. 25 refs., 3 figs., 2 tabs
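Of the four methods, the probability-distribution approach is the most readily illustrated: propagate input distributions through the performance model by Monte Carlo sampling. The corrosion model, rates and thicknesses below are entirely hypothetical placeholders, not values from the report:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical waste-package model: the container fails when corrosion
# depth (rate * time) exceeds wall thickness; both inputs are uncertain.
rate = rng.lognormal(mean=np.log(0.02), sigma=0.5, size=n)  # mm/yr (assumed)
thickness = rng.normal(10.0, 0.5, size=n)                   # mm (assumed)
lifetime = thickness / rate                                 # years to breach
p_fail_300 = float(np.mean(lifetime < 300.0))               # P(failure by 300 yr)
```

The output is a full distribution of lifetimes, so a single run yields bounds, percentiles and sensitivity information rather than one point estimate, which is exactly what a quantitative probability-based rule would need.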

  14. Multicriterial ranking approach for evaluating bank branch performance

    NARCIS (Netherlands)

    Aleskerov, F; Ersel, H; Yolalan, R

14 ranking methods based on multiple criteria are suggested for evaluating the performance of bank branches. The methods are explained via an illustrative example, and some of them are applied to real-life data for 23 retail branches of a large-scale private Turkish commercial bank.

  15. A practical approach to perform graded verification and validation

    International Nuclear Information System (INIS)

    Terrado, Carlos; Woolley, J.

    2000-01-01

Modernization of instrumentation and control (I and C) systems in nuclear power plants often implies going from analog to digital systems. One condition for the upgrade to be successful is that the new systems achieve at least the same quality level as the analog systems they replace. The most important part of digital systems quality assurance (QA) is verification and validation (V and V). V and V is concerned with the process as much as the product; it is a systematic program of review and testing activities performed throughout the system development life cycle. Briefly, we can say that verification is building the product correctly, and validation is building the correct product. Since V and V is necessary but costly, it is helpful to tailor the effort to the quality goal of each particular case. To do this, an accepted practice is to establish different V and V levels, each with a proper degree of stringency or rigor. This paper shows a practical approach to estimate the appropriate level of V and V, and the resulting V and V techniques recommended for each specific system. The first step proposed is to determine 'What to do', that is, the selection of the V and V class. The main factors considered here are required integrity, functional complexity, defense in depth, and development environment. A guideline is presented for classifying the particular system using these factors, showing how they lead to the selection of the V and V class. The second step is to determine 'How to do it', that is, to choose an appropriate set of V and V methods according to the attributes of the system and the V and V class already selected. A list of possible V and V methods recommended for each V and V level during different stages of the development life cycle is included. As a result of the application of this procedure, solutions are found for generalists interested in 'What to do' as well as for specialists interested in 'How to do it'.

  16. Building communities through performance: emerging approaches to interculturality.

    Science.gov (United States)

    Parent, Roger

    2009-08-01

    Changing definitions of culture are modifying approaches to intercultural education and training. This paper outlines the principal features of these emerging models for innovation and capacity building in communities. Semiotics provides a theoretical frame for the interdisciplinary analysis of research on cultural competency, especially regarding recent studies on "cultural intelligence", performance and creativity. Interdisciplinary research on cultural literacy is shifting from cultural knowledge to intercultural know-how. This know-how translates into the individual's capacity to innovate and illustrates the influence of culture on individual and group performance. Research on cultural intelligence, performance and creativity provides promising new models for capacity building in communities. These approaches constitute a synthesis of previous research on cultural competency and provide new avenues for innovative social action through intercultural exchange.

  17. Agile Service Development: A Rule-Based Method Engineering Approach

    NARCIS (Netherlands)

    dr. Martijn Zoet; Stijn Hoppenbrouwers; Inge van de Weerd; Johan Versendaal

    2011-01-01

    Agile software development has evolved into an increasingly mature software development approach and has been applied successfully in many software vendors’ development departments. In this position paper, we address the broader agile service development. Based on method engineering principles we

  18. An integrated lean-methods approach to hospital facilities redesign.

    Science.gov (United States)

    Nicholas, John

    2012-01-01

    Lean production methods for eliminating waste and improving processes in manufacturing are now being applied in healthcare. As the author shows, the methods are appropriate for redesigning hospital facilities. When used in an integrated manner and employing teams of mostly clinicians, the methods produce facility designs that are custom-fit to patient needs and caregiver work processes, and reduce operational costs. The author reviews lean methods and an approach for integrating them in the redesign of hospital facilities. A case example of the redesign of an emergency department shows the feasibility and benefits of the approach.

  19. Stability over Time of Different Methods of Estimating School Performance

    Science.gov (United States)

    Dumay, Xavier; Coe, Rob; Anumendem, Dickson Nkafu

    2014-01-01

    This paper aims to investigate how stability varies with the approach used in estimating school performance in a large sample of English primary schools. The results show that (a) raw performance is considerably more stable than adjusted performance, which in turn is slightly more stable than growth model estimates; (b) schools' performance…

  20. Approaches to greenhouse gas accounting methods for biomass carbon

    International Nuclear Information System (INIS)

    Downie, Adriana; Lau, David; Cowie, Annette; Munroe, Paul

    2014-01-01

This investigation examines different approaches to GHG flux accounting for activities within a tight boundary of biomass C cycling, with the scope limited to exclude all other aspects of the lifecycle. Alternative approaches are examined that a) account for all emissions including biogenic CO2 cycling (the biogenic method); b) account for the quantity of C that is moved to and maintained in the non-atmospheric pool (the stock method); and c) assume that the net balance of C taken up by biomass is neutral over the short term, so that this C need not enter the calculation (the simplified method). This investigation demonstrates the inaccuracies in both emissions forecasting and abatement calculations that result from use of the simplified method, which is in common use. The stock method is found to be the most accurate and appropriate approach for calculating GHG inventories; however, its shortcomings emerge when it is applied to abatement projects, as it does not account for the increase in biogenic CO2 emissions generated when non-CO2 GHG emissions in the business-as-usual case are offset. Therefore the biogenic method, or a modified version of the stock method, should be used to accurately estimate the GHG emissions abatement achieved by a project. This investigation uses both the derivation of methodology equations from first principles and worked examples to explore the fundamental differences between the approaches. Examples are developed for three project scenarios: landfill, combustion, and slow pyrolysis (biochar) of biomass. -- Highlights: • Different approaches can be taken to account for the GHG emissions from biomass. • Simplification of GHG accounting methods is useful; however, it can lead to inaccuracies. • Approaches used currently are often inadequate for practises that store carbon. • Accounting methods for emissions forecasting can be inadequate for
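The three approaches can be contrasted with a toy flux calculation. The scenario below (one tonne of biomass carbon, half retained as biochar after slow pyrolysis) is invented for illustration and uses only the CO2/C mass ratio of 44/12; none of the numbers come from the paper:

```python
CO2_PER_C = 44.0 / 12.0   # t CO2 per t C

biomass_c = 1.0           # t C taken up from the atmosphere by the biomass
retained_c = 0.5          # t C locked in biochar after slow pyrolysis
emitted_c = biomass_c - retained_c

# a) biogenic method: count every flux, including biogenic CO2 uptake/release
biogenic = (emitted_c - biomass_c) * CO2_PER_C    # net flux, t CO2 (negative = removal)

# b) stock method: credit only the C held in the non-atmospheric pool
stock = -retained_c * CO2_PER_C

# c) simplified method: biogenic CO2 assumed neutral, so nothing is counted
simplified = 0.0
```

In this pure-CO2 scenario the biogenic and stock methods agree (a net removal of about 1.83 t CO2) while the simplified method reports zero, missing the stored carbon entirely; the two non-trivial methods diverge once non-CO2 gases such as landfill CH4 enter the balance, which is where the abstract recommends the biogenic or modified stock method.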

  1. Performance-approach and performance-avoidance classroom goals and the adoption of personal achievement goals.

    Science.gov (United States)

    Schwinger, Malte; Stiensmeier-Pelster, Joachim

    2011-12-01

Students' perceptions of classroom goals influence their adoption of personal goals. To assess different forms of classroom goals, recent studies have favoured an overall measure of performance classroom goals, compared to a two-dimensional assessment of performance-approach and performance-avoidance classroom goals (PAVCG). This paper considered the relationship between students' perceptions of classroom goals and their endorsement of personal achievement goals. We proposed that three (instead of only two) classroom goals need to be distinguished. We aimed to provide evidence for this hypothesis by confirmatory factor analysis (CFA) and also by divergent associations between the respective classroom goal and students' personal goal endorsement. A total of 871 (474 female) 10th grade students from several German high schools participated in this study. Students responded to items assessing their perception of mastery, performance-approach, and performance-avoidance goals in the classroom. Additionally, the students reported how much they personally pursue mastery, performance-approach, and performance-avoidance goals. All items referred to German as a specific school subject. Results: A CFA yielded empirical support for the proposed distinction of three (instead of only two) different kinds of classroom goals. Moreover, in hierarchical linear modelling (HLM) analyses all three classroom goals showed unique associations with students' personal goal adoption. The findings emphasized the need to distinguish performance-approach and PAVCG. Furthermore, our results suggest that multiple classroom goals have interactive effects on students' personal achievement strivings. ©2010 The British Psychological Society.

  2. Sustainable Supplier Performance Evaluation and Selection with Neofuzzy TOPSIS Method.

    Science.gov (United States)

    Chaharsooghi, S K; Ashrafi, Mehdi

    2014-01-01

Supplier selection plays an important role in supply chain management, and traditional criteria such as price, quality, and flexibility are considered for supplier performance evaluation in the literature. In recent years sustainability has received more attention in the supply chain management literature, with the triple bottom line (TBL) describing sustainability in supply chain management through social, environmental, and economic initiatives. This paper explores sustainability in supply chain management and examines the problem of identifying a new model for supplier selection based on an extended TBL approach, presenting a fuzzy multicriteria method. Linguistic values of experts' subjective preferences are expressed with fuzzy numbers, and Neofuzzy TOPSIS is proposed for finding the best solution of the supplier selection problem. Numerical results show that the proposed model is efficient for integrating sustainability into supplier selection. The importance of using complementary aspects of sustainability and the Neofuzzy TOPSIS concept in the sustainable supplier selection process is shown with a sensitivity analysis.
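For orientation, the crisp TOPSIS core that the Neofuzzy variant extends can be sketched as follows. This is the classic method only; the fuzzy handling of linguistic values described in the paper is not reproduced, and the supplier scores are invented:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Classic TOPSIS: rank alternatives by relative closeness to the ideal
    solution. benefit[j] is True when more of criterion j is better."""
    M = np.asarray(matrix, dtype=float)
    V = M / np.linalg.norm(M, axis=0) * weights        # normalise and weight
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)          # distance to ideal
    d_neg = np.linalg.norm(V - anti, axis=1)           # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                     # closeness: higher = better

# Invented supplier scores on [economic, environmental, social] criteria
scores = topsis([[9, 8, 9], [5, 5, 5], [2, 3, 1]],
                weights=[0.5, 0.3, 0.2],
                benefit=[True, True, True])
```

A dominating alternative gets closeness 1 and a dominated one gets 0, so the ranking follows directly from the closeness coefficients.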

  3. Adjusted permutation method for multiple attribute decision making with meta-heuristic solution approaches

    Directory of Open Access Journals (Sweden)

    Hossein Karimi

    2011-04-01

    Full Text Available The permutation method of multiple attribute decision making has two significant deficiencies: high computational time and incorrect priority output in some problem instances. In this paper, a novel permutation method called the adjusted permutation method (APM) is proposed to compensate for the deficiencies of the conventional permutation method. We propose Tabu search (TS) and particle swarm optimization (PSO) to find suitable solutions at a reasonable computational time for large problem instances. The proposed method is examined on several numerical examples to evaluate its performance. The preliminary results show that both approaches provide competent solutions in reasonable time, while TS performs better at solving APM.

  4. Energy Performance of Buildings - The European Approach to Sustainability

    DEFF Research Database (Denmark)

    Heiselberg, Per

    2006-01-01

    This paper presents the European approach to improving sustainability in the building sector, which has a very high potential for considerable reduction of energy consumption in the coming years. By approving the Energy Performance of Buildings Directive, the European Union has taken a strong leadership role in promoting energy efficiency in buildings in Europe; the Directive will be the most powerful instrument developed to date for the building sector in Europe.

  5. Methods and approaches to prediction in the meat industry

    Directory of Open Access Journals (Sweden)

    A. B. Lisitsyn

    2016-01-01

    Full Text Available The modern stage of the agro-industrial complex is characterized by the increasing complexity and intensification of the technological processes for processing materials of animal origin, as well as by the need for a systematic analysis of the variety of determining factors and the relationships between them, the complexity of the objective function of product quality, and severe restrictions on technological regimes. One of the main tasks facing the employees of agro-industrial enterprises engaged in processing biotechnological raw materials is, besides an increase in production volume, the further organizational improvement of work at all stages of the food chain. The meat industry, as a part of the agro-industrial complex, has to use biological raw materials with maximum efficiency, while reducing and even eliminating losses at all stages of processing; rationally use raw material when selecting a type of processing; steadily increase the quality and the biological and nutritional value of products; and broaden the assortment of manufactured products in order to satisfy increasing consumer requirements and extend the market for their sale under conditions of external uncertainty, caused by the uneven receipt of raw materials, variations in their properties and parameters, limited sales time, and fluctuations in demand. The challenges facing the meat industry cannot be solved without changes to the strategy for the scientific and technological development of the industry. To achieve these tasks, it is necessary to use prediction as a method of constant improvement of all technological processes and their performance under rational and optimal regimes, while constantly controlling the quality of raw material, semi-prepared products, and finished products at all stages of technological processing by physico-chemical, physico-mechanical (rheological), microbiological, and organoleptic methods.
The paper

  6. Comparison of two methods to determine fan performance curves using computational fluid dynamics

    Science.gov (United States)

    Onma, Patinya; Chantrasmi, Tonkid

    2018-01-01

    This work investigates a systematic numerical approach that employs Computational Fluid Dynamics (CFD) to obtain the performance curves of a backward-curved centrifugal fan. Generating the performance curves requires a number of three-dimensional simulations with varying system loads at a fixed rotational speed. Two methods were used and their results compared to experimental data. The first method incrementally changes the mass flow rate through the inlet boundary condition, while the second method utilizes a series of meshes representing the physical damper blade at various angles. The performance curves generated by both methods are compared with an experimental setup conducted in accordance with the AMCA fan performance testing standard.

  7. A Pattern-Oriented Approach to a Methodical Evaluation of Modeling Methods

    Directory of Open Access Journals (Sweden)

    Michael Amberg

    1996-11-01

    Full Text Available The paper describes a pattern-oriented approach to evaluating modeling methods and to comparing various methods with each other from a methodical viewpoint. A specific set of principles (the patterns) is defined by investigating the notations and the documentation of comparable modeling methods. Each principle helps to examine some parts of the methods from a specific point of view. All principles together lead to an overall picture of the method under examination. First the core ("method-neutral") meaning of each principle is described. Then the methods are examined with regard to the principle. Afterwards the method-specific interpretations are compared with each other and with the core meaning of the principle. By this procedure, the strengths and weaknesses of modeling methods regarding methodical aspects are identified. The principles are described uniformly using a principle description template, following the descriptions of object-oriented design patterns. The approach is demonstrated by evaluating a business process modeling method.

  8. Approaches and methods for econometric analysis of market power

    DEFF Research Database (Denmark)

    Perekhozhuk, Oleksandr; Glauben, Thomas; Grings, Michael

    2017-01-01

    This study discusses two widely used approaches in the New Empirical Industrial Organization (NEIO) literature and examines the strengths and weaknesses of the Production-Theoretic Approach (PTA) and the General Identification Method (GIM) for the econometric analysis of market power in agricultural and food markets. We provide a framework that may help researchers to evaluate and improve structural models of market power. Starting with the specification of the approaches in question, we compare published empirical studies of market power with respect to the choice of the applied approach, functional forms, estimation methods, and derived estimates of the degree of market power. Thereafter, we use our framework to evaluate several structural models based on PTA and GIM to measure oligopsony power in the Ukrainian dairy industry. The PTA-based results suggest that the estimated parameters…

  9. The balanced scorecard: an integrative approach to performance evaluation.

    Science.gov (United States)

    Oliveira, J

    2001-05-01

    In addition to strict financial outcomes, healthcare financial managers should assess intangible assets that affect the organization's bottom line, such as clinical processes, staff skills, and patient satisfaction and loyalty. The balanced scorecard, coupled with data-warehousing capabilities, offers a way to measure an organization's performance against its strategic objectives while focusing on building the capabilities to achieve those objectives. The balanced scorecard examines performance related to finance, human resources, internal processes, and customers. Because the balanced scorecard requires substantial amounts of data, it is necessary to establish an organizational data warehouse of clinical, operational, and financial data that can be used in decision support. Because it presents indicators that managers and staff can influence directly by their actions, the balanced-scorecard approach to performance measurement encourages behavioral changes aimed at achieving corporate strategies.

  10. Comparison of wind mill cluster performance: A multicriteria approach

    Energy Technology Data Exchange (ETDEWEB)

    Rajakumar, D.G.; Nagesha, N. [Visvesvaraya Technological Univ., Karnataka (India)

    2012-07-01

    Energy is a crucial input for the economic and social development of any nation. Both renewable and non-renewable energy contribute to meeting the total requirement of the economy. As an affordable and clean energy source, wind energy is among the world's fastest growing renewable energy forms. Though there are several wind-mill clusters producing energy in different geographical locations, evaluating their performance is a complex task and little literature is available in this area. Against this backdrop, the current paper attempts to estimate the performance of a wind-mill cluster through an index called the Cluster Performance Index (CPI), adopting a multi-criteria approach. The proposed CPI comprises four criteria, viz. Technical Performance Indicators (TePI), Economic Performance Indicators (EcPI), Environmental Performance Indicators (EnPI), and Sociological Performance Indicators (SoPI). Under each performance criterion a total of ten parameters are considered, with five subjective and five objective responses. The methodology is implemented by collecting empirical data from three wind-mill clusters located at Chitradurga, Davangere, and Gadag in the southern Indian state of Karnataka. In total, fifteen different stakeholders were consulted in each wind farm through a structured, researcher-administered questionnaire. The stakeholders included engineers working in the wind farms, wind farm developers, government officials from the energy department, and a few residents living near the wind farms. The results of the study revealed that the Chitradurga wind farm performed much better, with a CPI of 45.267, compared to the Gadag (CPI of 28.362) and Davangere (CPI of 19.040) wind farms. (Author)

  11. Assessing vocal performance in complex birdsong: a novel approach.

    Science.gov (United States)

    Geberzahn, Nicole; Aubin, Thierry

    2014-08-06

    Vocal performance refers to the ability to produce vocal signals close to physical limits. Such motor skills can be used by conspecifics to assess a signaller's competitive potential. For example, it is difficult for birds to produce repeated syllables both rapidly and with a broad frequency bandwidth. Deviation from an upper-bound regression of frequency bandwidth on trill rate has been widely used to assess vocal performance. This approach is, however, only applicable to simple trilled songs, and even then may be affected by differences in syllable complexity. Using skylarks (Alauda arvensis) as a birdsong model with a very complex song structure, we detected another performance trade-off: minimum gap duration between syllables was longer when the frequency ratio between the end of one syllable and the start of the next syllable (inter-syllable frequency shift) was large. This allowed us to apply a novel measure of vocal performance, vocal gap deviation: the deviation from a lower-bound regression of gap duration on inter-syllable frequency shift. We show that skylarks increase vocal performance in an aggressive context, suggesting that this trait might serve as a signal of competitive potential. We suggest using vocal gap deviation in future studies to assess vocal performance in songbird species with complex song structure.
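    The bound-regression idea behind vocal gap deviation can be illustrated with a small sketch. The data below are invented and the authors' exact bound-fitting procedure may differ; one common construction is to fit an ordinary least-squares line and translate it down until no observation falls below it, then score each gap by its distance above that lower bound.

```python
# Sketch of a lower-bound regression of gap duration on inter-syllable
# frequency shift.  Data are made up for illustration.

def ols(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

shift = [0.1, 0.3, 0.5, 0.8, 1.2, 1.5]        # inter-syllable frequency shift
gap = [21.0, 24.0, 30.0, 33.0, 42.0, 47.0]    # gap duration, ms

slope, intercept = ols(shift, gap)
# Translate the fitted line down so no observed gap falls below it.
offset = min(y - (slope * x + intercept) for x, y in zip(shift, gap))
lower_bound = lambda x: slope * x + intercept + offset

# Smaller deviation = gaps closer to the putative physical limit.
deviation = [y - lower_bound(x) for x, y in zip(shift, gap)]
```

By construction every deviation is non-negative and the most extreme observation sits exactly on the bound; an upper-bound regression (as used for trill rate vs. bandwidth) is the mirror image with `max` instead of `min`.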

  12. New approach to equipment quality evaluation method with distinct functions

    Directory of Open Access Journals (Sweden)

    Milisavljević Vladimir M.

    2016-01-01

    Full Text Available The paper presents a new approach for improving a method for the quality evaluation and selection of equipment (devices and machinery) by applying distinct functions. Quality evaluation and selection of devices and machinery is a multi-criteria problem which involves the consideration of numerous parameters of various origins. The original selection method with distinct functions is based on technical parameters with arbitrary evaluation of each parameter's importance (weighting). The improvement of this method presented in this paper addresses the weighting of parameters by using the Delphi method. Finally, two case studies are provided, covering the quality evaluation of standard boilers for heating and the evaluation of load-haul-dump (LHD) machines, to demonstrate the applicability of this approach. The Analytic Hierarchy Process (AHP) is used as a control method.

  13. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product's environmental impacts across the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and its results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach that can relate changes in design parameters to a product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have a significant influence on the product's environmental impacts can be identified by analyzing the relationship between the environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
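    A minimal one-at-a-time sensitivity analysis of the kind described above might look as follows. The impact model, its coefficients, and the parameter names are hypothetical stand-ins for a real LCA calculation; the point is only the mechanics of perturbing one design parameter at a time and ranking parameters by the relative change in the impact score.

```python
# One-at-a-time (OAT) sensitivity sketch over hypothetical PCB design
# parameters; the impact function is a toy stand-in for an LCA model.

def co2_impact(p):
    # Toy LCA-style model: emissions grow with panel area and copper mass.
    return 5.0 * p["panel_area"] + 2.0 * p["copper_mass"] + 0.1 * p["layers"]

baseline = {"panel_area": 1.2, "copper_mass": 0.8, "layers": 4}
base_score = co2_impact(baseline)

sensitivity = {}
for name in baseline:
    bumped = dict(baseline)
    bumped[name] *= 1.10                 # +10 % perturbation of one parameter
    sensitivity[name] = (co2_impact(bumped) - base_score) / base_score

# Design parameters sorted by their influence on the impact score.
ranking = sorted(sensitivity, key=sensitivity.get, reverse=True)
```

With these toy coefficients the panel area dominates the ranking, mirroring the abstract's finding that CO2 emission is most sensitive to panel area.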

  14. Regulatory approach to enhanced human performance during accidents

    International Nuclear Information System (INIS)

    Palla, R.L. Jr.

    1990-01-01

    It has become increasingly clear in recent years that the risk associated with nuclear power is driven by human performance. Although human errors have contributed heavily to the two core-melt events that have occurred at power reactors, effective performance during an event can also prevent a degraded situation from progressing to a more serious accident, as in the loss-of-feedwater event at Davis-Besse. Sensitivity studies in which the human error rates for various categories of errors in a probabilistic risk assessment (PRA) were varied confirm the importance of human performance. Moreover, these studies suggest that actions taken during an accident are at least as important as errors that occur prior to an initiating event. A program that will lead to enhanced accident management capabilities in the nuclear industry is being developed by the US Nuclear Regulatory Commission (NRC) and industry, and is a key element in the NRC's integration plan for closure of severe-accident issues. The focus of the accident management (AM) program is on human performance during accidents, with emphasis on in-plant response. The AM program extends the defense-in-depth principle to the plant operating staff. The goal is to take advantage of existing plant equipment and of operator skills and creativity to find ways to terminate accidents that are beyond the design basis. The purpose of this paper is to describe the NRC's objectives and approach in AM, as well as to discuss several human performance issues that are central to AM.

  15. A method for optimizing the performance of buildings

    DEFF Research Database (Denmark)

    Pedersen, Frank

    2007-01-01

    This thesis describes a method for optimizing the performance of buildings. Design decisions made in early stages of the building design process have a significant impact on the performance of buildings, for instance, the performance with respect to energy consumption, economical aspects, and the indoor environment. The method is intended to support design decisions for buildings by combining methods for calculating the performance of buildings with numerical optimization methods. The method is able to find optimum values of decision variables representing different features of the building … needed for solving the optimization problem. Furthermore, the algorithm uses so-called domain constraint functions in order to ensure that the input to the simulation software is feasible. Using this technique avoids performing time-consuming simulations for unrealistic design decisions. The algorithm…

  16. A Gold Standards Approach to Training Instructors to Evaluate Crew Performance

    Science.gov (United States)

    Baker, David P.; Dismukes, R. Key

    2003-01-01

    The Advanced Qualification Program requires that airlines evaluate crew performance in Line Oriented Simulation. For this evaluation to be meaningful, instructors must observe relevant crew behaviors and evaluate those behaviors consistently and accurately against standards established by the airline. The airline industry has largely settled on an approach in which instructors evaluate crew performance on a series of event sets, using standardized grade sheets on which the behaviors specific to each event set are listed. Typically, new instructors are given a class in which they learn to use the grade sheets and practice evaluating crew performance observed on videotapes. These classes emphasize reliability, providing detailed instruction and practice in scoring so that all instructors within a given class will give similar scores to similar performance. This approach has value but also has important limitations: (1) ratings within one class of new instructors may differ from those of other classes; (2) ratings may not be driven primarily by the specific behaviors on which the company wanted the crews to be scored; and (3) ratings may not be calibrated to company standards for the level of performance skill required. In this paper we provide a method that extends the existing method of training instructors to address these three limitations. We call this method the "gold standards" approach because it uses ratings from the company's most experienced instructors as the basis for training rater accuracy. This approach ties the training to the specific behaviors on which the experienced instructors based their ratings.

  17. Applying a social network analysis (SNA) approach to understanding radiologists' performance in reading mammograms

    Science.gov (United States)

    Tavakoli Taba, Seyedamir; Hossain, Liaquat; Heard, Robert; Brennan, Patrick; Lee, Warwick; Lewis, Sarah

    2017-03-01

    Rationale and objectives: Observer performance has been widely studied by examining the characteristics of individuals. From a systems perspective, however, understanding the system's output requires studying the interactions between observers. This research describes a mixed-methods approach that applies social network analysis (SNA) together with the more traditional approach of examining personal/individual characteristics to understand observer performance in mammography. Materials and methods: Using social network theories and measures to understand observer performance, we designed a social network survey instrument for collecting personal and network data about observers involved in mammography performance studies. We present the results of a study by our group in which 31 Australian breast radiologists reviewed 60 mammographic cases (20 abnormal and 40 normal) and then completed an online questionnaire about their social networks and personal characteristics. A jackknife free response operating characteristic (JAFROC) method was used to measure the performance of the radiologists. The JAFROC scores were tested against various personal and network measures to verify the theoretical model. Results: The results suggest a strong association between social networks and observer performance for Australian radiologists. Network factors accounted for 48% of the variance in observer performance, compared to 15.5% for the personal characteristics in this study group. Conclusion: This study suggests a strong new direction for research into improving observer performance. Future studies of observer performance should consider the influence of social networks as part of their research paradigm, with equal or greater vigour than the traditional constructs of personal characteristics.

  18. Innovative high-performance liquid chromatography method development for the screening of 19 antimalarial drugs based on a generic approach, using design of experiments, independent component analysis and design space.

    Science.gov (United States)

    Debrus, B; Lebrun, P; Kindenge, J Mbinze; Lecomte, F; Ceccato, A; Caliaro, G; Mbay, J Mavar Tayey; Boulanger, B; Marini, R D; Rozet, E; Hubert, Ph

    2011-08-05

    An innovative methodology based on design of experiments (DoE), independent component analysis (ICA) and design space (DS) was developed in previous works and was tested with a mixture of 19 antimalarial drugs. This global LC method development methodology (i.e. DoE-ICA-DS) was used to optimize the separation of the 19 antimalarial drugs to obtain a screening method. The DoE-ICA-DS methodology is fully compliant with the current trend of quality by design. DoE was used to define the set of experiments to model the retention times at the beginning, the apex and the end of each peak. Furthermore, ICA was used to numerically separate coeluting peaks and estimate their unbiased retention times. Gradient time, temperature and pH were selected as the factors of a full factorial design. The retention times were modelled by stepwise multiple linear regressions. A recently introduced critical quality attribute, namely the separation criterion (S), was used to assess the quality of separations rather than the resolution. Furthermore, the resulting mathematical models were also studied from a chromatographic point of view to understand and investigate the chromatographic behaviour of each compound. Good agreement was found between the mathematical models and the behaviours predicted by chromatographic theory. Finally, focusing on quality risk management, the DS was computed as the multidimensional subspace where the probability for the separation criterion to lie within the acceptance limits was higher than a defined quality level. The DS was computed by propagating the prediction error from the modelled responses to the quality criterion using Monte Carlo simulations. DoE-ICA-DS allowed identifying optimal operating conditions to obtain a robust screening method for the 19 considered antimalarial drugs in the framework of the fight against counterfeit medicines. Moreover, and only on the basis of the same data set, a dedicated method for the

  19. Distributed and parallel approach for handle and perform huge datasets

    Science.gov (United States)

    Konopko, Joanna

    2015-12-01

    Big Data refers to the dynamic, large and disparate volumes of data coming from many different sources (tools, machines, sensors, mobile devices) uncorrelated with each other. It requires new, innovative and scalable technology to collect, host and analytically process the vast amount of data. A proper architecture for a system that processes huge data sets is needed. In this paper, distributed and parallel system architectures are compared using the example of the MapReduce (MR) Hadoop platform and a parallel database platform (DBMS). The paper also analyzes the problem of extracting valuable information from petabytes of data. Both paradigms, MapReduce and parallel DBMS, are described and compared. A hybrid architecture approach is also proposed, which could be used to solve the analyzed problem of storing and processing Big Data.
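    The MapReduce paradigm compared above can be sketched in-process: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. Hadoop distributes these same three phases across machines; the word-count example below runs them in a single process purely as an illustration.

```python
# Minimal MapReduce-style word count in plain Python.
from collections import defaultdict
from itertools import chain

def map_phase(chunk):
    # Map: emit a (word, 1) pair for every word in the input chunk.
    return [(word, 1) for word in chunk.split()]

def shuffle(pairs):
    # Shuffle: group all emitted values by their key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each group of values into a single result.
    return {key: sum(values) for key, values in groups.items()}

chunks = ["big data big", "data platform"]      # one chunk per (simulated) mapper
counts = reduce_phase(shuffle(chain.from_iterable(map(map_phase, chunks))))
# counts == {'big': 2, 'data': 2, 'platform': 1}
```

A parallel DBMS would instead express the same aggregation declaratively (`SELECT word, COUNT(*) ... GROUP BY word`) and let the query planner distribute it, which is the crux of the MR-vs-DBMS comparison.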

  20. Corporate Social Responsibility and Financial Performance: A Two Least Regression Approach

    Directory of Open Access Journals (Sweden)

    Alexander Olawumi Dabor

    2017-12-01

    Full Text Available The objective of this study is to investigate the causality between corporate social responsibility and firm financial performance. The study employed a two-stage least squares regression approach. Fifty-two firms were selected using a scientific sampling method. The findings revealed that corporate social responsibility and firm performance in the manufacturing sector are mutually related at the 5% level. The study recommended that the management of manufacturing companies in Nigeria should spend on CSR to boost profitability and corporate image.

  1. Systematic approaches to data analysis from the Critical Decision Method

    Directory of Open Access Journals (Sweden)

    Martin Sedlár

    2015-01-01

    Full Text Available The aim of the present paper is to introduce how to analyse qualitative data from the Critical Decision Method. First, a characterization of the method provides a meaningful introduction to the issue. This method, used in naturalistic decision making research, is one of the cognitive task analysis methods; it is based on a retrospective semi-structured interview about a critical incident from work, and it may be applied in various domains such as emergency services, the military, transport, sport or industry. Researchers can make two types of methodological adaptation. Within-method adaptations modify the way the interviews are conducted, and cross-method adaptations combine this method with other related methods. There are many descriptions of how to conduct the interview, but descriptions of how the data should be analysed are rare. Some researchers use conventional approaches like content analysis, grounded theory, or individual procedures tailored to the objectives of the research project. Wong (2004) describes two approaches to data analysis proposed for this method of data collection, which are described and reviewed here in detail. They enable systematic work with a large amount of data. The structured approach organizes the data according to an a priori analysis framework and is suitable for a clearly defined object of research. Each incident is studied separately. First the decision chart showing the main decision points is made, and then the incident summary. These decision points are used to identify the relevant statements from the transcript, which are analysed in terms of the Recognition-Primed Decision Model. Finally, the results from all the analysed incidents are integrated. The limitation of the structured approach is that it may not reveal some interesting concepts. The emergent themes approach helps to identify these concepts while maintaining a systematic framework for analysis, and it is used for exploratory research designs.
It

  2. Experiential Approach to Teaching Statistics and Research Methods ...

    African Journals Online (AJOL)

    Statistics and research methods are among the more demanding topics for students of education to master at both the undergraduate and postgraduate levels. It is our conviction that teaching these topics should be combined with real practical experiences. We discuss an experiential teaching/ learning approach that ...

  3. Book Review: Comparative Education Research: Approaches and Methods

    Directory of Open Access Journals (Sweden)

    Noel Mcginn

    2014-10-01

    Full Text Available Book Review: Comparative Education Research: Approaches and Methods (2nd edition). By Mark Bray, Bob Adamson and Mark Mason (Eds.) (2014, 453p). ISBN: 978-988-17852-8-2. Hong Kong: Comparative Education Research Centre and Springer.

  4. A sequential mixed methods research approach to investigating HIV ...

    African Journals Online (AJOL)

    Sequential mixed methods research is an effective approach for investigating complex problems, but it has not been extensively used in construction management research. In South Africa, the HIV/AIDS pandemic has seen construction management taking on a vital responsibility since the government called upon the ...

  5. Teaching Psychological Research Methods through a Pragmatic and Programmatic Approach

    Science.gov (United States)

    Rosenkranz, Patrick; Fielden, Amy; Tzemou, Effy

    2014-01-01

    Research methods teaching in psychology is pivotal in preparing students for the transition from student as learner to independent practitioner. We took an action research approach to re-design, implement and evaluate a module guiding students through a programmatic and pragmatic research cycle. These revisions allow students to experience how…

  6. The Feldenkrais Method: A Dynamic Approach to Changing Motor Behavior.

    Science.gov (United States)

    Buchanan, Patricia A.; Ulrich, Beverly D.

    2001-01-01

    Describes the Feldenkrais Method of somatic education, noting parallels with a dynamic systems theory (DST) approach to motor behavior. Feldenkrais uses movement and perception to foster individualized improvement in function. DST explains that a human-environment system continually adapts to changing conditions and assembles behaviors…

  7. A Simulation Modeling Approach Method Focused on the Refrigerated Warehouses Using Design of Experiment

    Science.gov (United States)

    Cho, G. S.

    2017-09-01

    For the performance optimization of refrigerated warehouses, design parameters are selected based on physical parameters, such as the number of equipment units and aisles and the forklift speeds, for ease of modification. This paper provides a comprehensive framework for the system design of refrigerated warehouses. We propose a modeling approach aimed at simulation optimization so as to meet the required design specifications using Design of Experiments (DOE), and analyze a simulation model using an integrated aspect-oriented modeling approach (i-AOMA). As a result, the suggested method can evaluate the performance of a variety of refrigerated warehouse operations.
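    The DOE part of such a framework typically starts from a full factorial plan: every combination of the chosen design-parameter levels becomes one simulation run. The parameter names and levels below are hypothetical, chosen only to mirror the physical parameters mentioned in the abstract.

```python
# Generating a full-factorial experiment plan for a warehouse simulation.
# Factor names and levels are hypothetical.
from itertools import product

factors = {
    "num_forklifts": [2, 4, 6],
    "num_aisles": [8, 10],
    "forklift_speed": [1.5, 2.0],   # m/s
}

names = list(factors)
# One dict per simulation run, covering every level combination.
runs = [dict(zip(names, combo)) for combo in product(*factors.values())]

# 3 x 2 x 2 = 12 simulation runs in the plan.
assert len(runs) == 12
```

Each `runs[i]` would then be fed to the simulation model, and the responses analysed (e.g. by effects estimation) to pick the best-performing configuration; fractional designs cut the run count when full enumeration is too expensive.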

  8. Application of controllable unit approach (CUA) to performance-criterion-based nuclear material control and accounting

    International Nuclear Information System (INIS)

    Foster, K.W.; Rogers, D.R.

    1979-01-01

    The Nuclear Regulatory Commission is considering the use of maximum-loss performance criteria as a means of controlling SNM in nuclear plants. The Controllable Unit Approach to material control and accounting (CUA) was developed by Mound to determine the feasibility of controlling a plant to a performance criterion. The concept was tested with the proposed Anderson, SC, mixed-oxide plant, and it was shown that CUA is indeed a feasible method for controlling a complex process to a performance criterion. The application of CUA to an actual low-enrichment plant to assist the NRC in establishing performance criteria for uranium processes is discussed. 5 refs

  9. An Efficient Approach for Identifying Stable Lobes with Discretization Method

    Directory of Open Access Journals (Sweden)

    Baohai Wu

    2013-01-01

    Full Text Available This paper presents a new approach for the quick identification of chatter stability lobes with the discretization method. Firstly, three different kinds of stability regions are defined: the absolutely stable region, the valid region, and the invalid region. Secondly, while identifying the chatter stability lobes, the three different regions are identified with relatively large step intervals. Thirdly, the stability boundary within the valid regions is finely calculated to obtain the exact chatter stability lobes. The proposed method only needs to test a small portion of the spindle speed and cutting depth set; about 89% of the computation time is saved compared with the full discretization method, and it takes only about 10 minutes to obtain the exact chatter stability lobes. Since the proposed method is based on the discretization method, it can be used for different immersion cutting conditions, including low-immersion cutting, and can be directly implemented in the workshop to improve the efficiency of machining parameter selection.
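    The coarse-then-fine idea behind such a method can be illustrated with a toy criterion: classify a coarse grid of (spindle speed, cutting depth) points cheaply, then refine only the intervals where the classification flips, which is where the stability boundary lies. The stability function below is an invented stand-in for the real chatter model, so only the search strategy, not the physics, is represented.

```python
# Coarse-to-fine search for a stability boundary over (speed, depth).
# The stability criterion is a toy stand-in for the real chatter model.

def stable(speed, depth):
    # Toy criterion: the stability limit rises linearly with spindle speed.
    return depth < 0.5 + 0.001 * speed

speeds = range(1000, 5001, 500)                # coarse spindle-speed grid, rpm
depths = [0.5 + 0.5 * i for i in range(8)]     # coarse cutting-depth grid, mm

# Pass 1: cheap classification of every coarse grid point.
coarse = {(s, d): stable(s, d) for s in speeds for d in depths}

# Pass 2: refine only the depth intervals where the answer flips; points
# below the flip are surely stable, points above surely unstable.
boundary = {}
for s in speeds:
    for lo, hi in zip(depths, depths[1:]):
        if coarse[(s, lo)] != coarse[(s, hi)]:
            a, b = lo, hi
            for _ in range(10):                # bisect to localize the limit
                mid = (a + b) / 2
                if stable(s, mid):
                    a = mid
                else:
                    b = mid
            boundary[s] = (a + b) / 2
            break
```

Only the flipped intervals are ever bisected, which mirrors how the paper avoids evaluating the expensive stability test over the full speed-depth set.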

  10. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal combination of design parameters based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory with optimization, is the most commonly used approach for minimizing structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is therefore utilized to incorporate such incomplete information into the uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are reviewed.

  11. Performance evaluation methods and instrumentation for mine ventilation fans

    Institute of Scientific and Technical Information of China (English)

    LI Man; WANG Xue-rong

    2009-01-01

    Ventilation fans are among the most important pieces of equipment in coal mines. Their performance plays an important role in the safety of staff and production. Given the actual requirements of coal mine production, we instituted a research project on measurement methods for key performance parameters such as wind pressure, air volume, and power. Finally, a virtual instrument for evaluating mine ventilation fan performance was developed using a USB interface. The practical performance and analytical results of our experiments show that the proposed instrumentation is feasible, reliable, and effective for mine ventilation performance evaluation.

  12. An integrated approach to validation of safeguards and security program performance

    International Nuclear Information System (INIS)

    Altman, W.D.; Hunt, J.S.; Hockert, J.W.

    1988-01-01

    Department of Energy (DOE) requirements for safeguards and security programs are becoming increasingly performance oriented. Master Safeguards and Security Agreements specify performance levels for systems protecting DOE security interests. In order to measure and validate security system performance, Lawrence Livermore National Laboratory (LLNL) has developed cost-effective validation tools and a comprehensive validation approach that synthesizes information gained from different activities, such as force-on-force exercises, limited-scope performance tests, equipment testing, vulnerability analyses, and computer modeling, into an overall assessment of the performance of the protection system. The analytic approach employs logic diagrams adapted from the fault and event trees used in probabilistic risk assessment. The synthesis of the results from the various validation activities is accomplished using a method developed by LLNL based upon Bayes' theorem

  13. Influence of discretization method on the digital control system performance

    Directory of Open Access Journals (Sweden)

    Futás József

    2003-12-01

    The design of a control system can be divided into two steps. First, the process or plant has to be converted into mathematical model form so that its behavior can be analyzed. Then an appropriate controller has to be designed in order to obtain the desired response of the controlled system. In the continuous-time domain the system is represented by differential equations. Converting a continuous system into discrete-time form is always an approximation of the continuous system, and different discretization methods yield different digital controller performance. The methods presented in the paper are the step-invariant or zero-order-hold (ZOH) method, the matched pole-zero method, the backward-difference method, and the bilinear (Tustin) transformation. These discretization methods are used to develop a PI position controller for a DC motor; the motor model itself was converted by the ZOH method. The performances of the different methods are compared and the results are presented.
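    The recurrences produced by three of these discretizations can be illustrated on a first-order lag G(s) = 1/(τs + 1); this is a minimal sketch, not the paper's DC-motor model, and the time constant and sample time are arbitrary:

```python
import math

def simulate(method, tau=0.5, T=0.05, n=200):
    """Unit-step response of G(s) = 1/(tau*s + 1) discretized with sample
    time T by three standard methods."""
    y, yk, u_prev = [], 0.0, 0.0
    for _ in range(n):
        u = 1.0  # unit step input
        if method == "zoh":          # exact step-invariant (ZOH) equivalent
            a = math.exp(-T / tau)
            yk = a * yk + (1.0 - a) * u
        elif method == "backward":   # backward difference (implicit Euler)
            yk = (tau * yk + T * u) / (tau + T)
        elif method == "bilinear":   # Tustin / trapezoidal rule
            yk = ((2 * tau - T) * yk + T * (u + u_prev)) / (2 * tau + T)
        u_prev = u
        y.append(yk)
    return y

finals = {m: simulate(m)[-1] for m in ("zoh", "backward", "bilinear")}
```

    All three recurrences share the same steady-state gain of 1 but differ in transient shape and pole mapping, which is exactly the performance difference the paper compares.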

  14. A PRACTICAL APPROACH TO THE GROUND OSCILLATION VELOCITY MEASUREMENT METHOD

    Directory of Open Access Journals (Sweden)

    Siniša Stanković

    2017-01-01

    The use of an explosive's energy during blasting includes undesired effects on the environment. The seismic influence of a blast, the major undesired effect, is addressed by many national standards, recommendations, and calculations in which the main parameter is the ground oscillation velocity at the measurement location. There are a few approaches and methods for calculating the expected ground oscillation velocity from the charge weight per delay and the distance between the blast and the point of interest. These methods and formulas do not always provide satisfactory results: values measured at various distances from the blast field differ more or less from the values given by prior calculation. Since blasting works are executed in diverse geological conditions, the aim of this research is the development of a practical and reliable approach that yields a separate model for each construction site where blasting works have been or will be executed. The approach is based on a larger number of measuring points placed in a line from the blast field at predetermined distances. This new approach was compared with other generally used methods and formulas using measurements taken during the research, along with measurements from several previously executed projects. The results confirmed that the suggested model gives more accurate values.
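    One common family of such formulas is the scaled-distance attenuation law v = k·(R/√Q)^(−n), whose site constants are fitted to field measurements in log-log space. The sketch below fits the law by least squares to synthetic measurements; the coefficients and noise level are invented for illustration, not taken from the paper:

```python
import math
import random

# Hypothetical site law: v = k * (R / sqrt(Q))**(-n), v in mm/s
k_true, n_true = 400.0, 1.6
random.seed(1)
data = []
for R in (50, 100, 150, 200, 300, 400):   # distances from blast field [m]
    Q = 25.0                              # charge weight per delay [kg]
    sd = R / math.sqrt(Q)                 # scaled distance
    v = k_true * sd ** (-n_true) * math.exp(random.gauss(0.0, 0.05))
    data.append((sd, v))

# Log-linear least squares: ln v = ln k - n * ln sd
xs = [math.log(sd) for sd, _ in data]
ys = [math.log(v) for _, v in data]
xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
n_fit = -slope
k_fit = math.exp(ybar - slope * xbar)
```

    Placing several measuring points in a line from the blast field, as the paper proposes, gives exactly the spread of scaled distances this regression needs for a stable site-specific fit.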

  15. The RISMC approach to perform advanced PRA analyses - 15332

    International Nuclear Information System (INIS)

    Mandelli, D.; Smith, C.; Riley, T.; Nielsen, J.; Alfonsi, A.; Rabiti, C.; Cogliati, J.

    2015-01-01

    The existing fleet of nuclear power plants is in the process of extending its lifetime and increasing the power generated via power up-rates. In order to evaluate the impact of these two factors on plant safety, the RISMC (Risk Informed Safety Margin Characterization) Pathway aims to develop simulation-based tools and methods to assess risks for existing nuclear power plants in order to optimize safety. By developing new methods, this pathway extends the state-of-the-practice methods that have traditionally been based on logic structures such as event trees and fault trees. These static types of models mimic system response in inductive and deductive ways, respectively, yet are restrictive in how they can represent spatial and temporal constructs. RISMC analyses are performed using a combination of thermal-hydraulic codes and a stochastic analysis tool (RAVEN) currently under development at the Idaho National Laboratory. This paper presents a case study showing the capabilities of the RISMC methodology to assess the impact of a power up-rate on a boiling water reactor system during a station blackout accident scenario. We employ the system simulator code RELAP5-3D coupled with RAVEN, which performs the stochastic analysis. Our analysis is performed by: 1) sampling values of a set of parameters from the uncertainty space of interest, 2) simulating the system behavior for that specific set of parameter values, and 3) analyzing the set of simulation runs. The results provide a detailed investigation of the issues associated with a plant power up-rate, including the effects of station blackout accident scenarios. We are able to quantify how the timing of specific events is affected by a higher nominal reactor core power. Such safety insights can provide useful information to decision makers performing risk-informed margins management

  16. Budgetary Approach to Project Management by Percentage of Completion Method

    Directory of Open Access Journals (Sweden)

    Leszek Borowiec

    2011-07-01

    An efficient and effective project management process is made possible by the use of project management methods and techniques. The aim of this paper is to present the problems of project management using the Percentage of Completion (POC) method. The research material was gathered from experience in implementing this method at the Johnson Controls International Company. The article attempts to demonstrate the thesis that the POC project management method allows for effective implementation and monitoring of a project and is thus an effective tool for managing companies that use the budgetary approach. The study presents the planning process for the basic parameters affecting the effectiveness of a project (such as costs, revenue, and margin) and characterizes the primary measurements used to evaluate it. The theme is illustrated by numerous examples showing the essence of the problems raised, and the results are presented using descriptive, graphical, and tabular methods.

  17. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.

    Science.gov (United States)

    Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R

    2014-03-01

    A case report illustrates how multidisciplinary translational teams can be assessed with outcome, process, and developmental types of evaluation within a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.

  18. Environmental investment and firm performance: A network approach

    International Nuclear Information System (INIS)

    Bostian, Moriah; Färe, Rolf; Grosskopf, Shawna; Lundgren, Tommy

    2016-01-01

    This study examines the role of investment in environmental production practices for both environmental performance and energy efficiency over time. We employ a network DEA approach that links successive production technologies through intertemporal investment decisions, with period-by-period estimation. This allows us to estimate energy efficiency and environmental performance separately, as well as productivity change and its decomposition into efficiency change and technology change. Incorporating a network model also allows us to account for both short-term environmental management practices and long-term environmental investments in each of our productivity measures. We apply this framework to a panel of detailed plant-level production data for Swedish manufacturing firms covering the years 2002–2008. - Highlights: • We use a network DEA model to account for intertemporal environmental investment decisions in measures of firm productivity. • We apply our network technology model to a panel of firms in Sweden's pulp and paper industry for the years 2002–2008. • We model environmental investments and expenditures separately from other production-oriented inputs. • We find evidence of positive relationships between energy efficiency, environmental performance, and firm productivity.

  19. The use of mixed-methods research to diagnose the organisational performance of a local government

    Directory of Open Access Journals (Sweden)

    Benjamin H. Olivier

    2017-07-01

    Orientation: The majority of local governments in South Africa are underperforming; a first step towards improving their performance is to accurately diagnose their current functioning. Using a mixed-methods approach for this diagnosis, based on a valid model of organisational performance, yields a better and more holistic understanding of how a local government is performing. Research purpose: The aim of this study was to investigate the utility of mixed-methods research as a diagnostic approach for determining the organisational performance of a local government in South Africa. Motivation for the study: The use of either quantitative or qualitative data gathering in isolation as part of an organisational diagnosis can lead to biased information and failure to identify the root causes of problems. Mixed-methods research, in which both quantitative and qualitative data gathering methods are utilised, has been shown to produce numerous benefits, such as confirmation of gathered data, richer detail and the initiation of new lines of thinking. Such multiple methodologies are recognised as an essential component of any organisational diagnosis and can be an effective means of eliminating biases in singular data gathering methods. Research design, approach and method: A concurrent transformative mixed-methods strategy based on the Burke–Litwin model of organisational performance, with triangulation of results and findings to determine convergence validity, was used. A convenience sample of 116 (N = 203) permanent officials in a rural district municipality in South Africa completed a survey questionnaire and were also individually interviewed. Main findings: The results indicate that mixed-methods research is a valid technique for establishing the integrity of survey data and for providing a better and more holistic understanding of the functioning of an organisation. The results also indicate that the Burke–Litwin model is a useful and valid

  20. Performance prediction method for a multi-stage Knudsen pump

    Science.gov (United States)

    Kugimoto, K.; Hirota, Y.; Kizaki, Y.; Yamaguchi, H.; Niimi, T.

    2017-12-01

    In this study, a novel method to predict the performance of a multi-stage Knudsen pump is proposed. The prediction is carried out numerically in two steps, with the assistance of a simple experimental result. In the first step, the performance of a single-stage Knudsen pump was measured experimentally under various pressure conditions, and the mass flow rate was obtained as a function of the average pressure between the inlet and outlet of the pump and the pressure difference between them. In the second step, the performance of a multi-stage pump was analyzed using a one-dimensional model derived from the mass conservation law. The performances predicted by the 1D model for 1-stage, 2-stage, 3-stage, and 4-stage pumps were validated against the experimental results for the corresponding numbers of stages. It was concluded that the proposed prediction method works properly.
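    The mass-conservation argument can be illustrated with a deliberately simplified model: assume every stage has the same linearized characteristic ṁ(Δp) (the paper's real characteristic is a measured map over average pressure and pressure difference). With identical stages, the common steady-state mass flow forces each stage to carry an equal share of the total pressure rise; all coefficients below are invented:

```python
def stage_flow(dp, m0=2.0e-9, c=1.0e-12):
    """Hypothetical linearized single-stage characteristic [kg/s]:
    flow falls off linearly with the pressure rise dp [Pa] it sustains."""
    return m0 - c * dp

def multistage_flow(n_stages, dp_total, m0=2.0e-9, c=1.0e-12):
    """Steady state: mass conservation forces the same flow through every
    stage; with identical linear stages each carries dp_total / n_stages."""
    return stage_flow(dp_total / n_stages, m0, c)

# Flow delivered against a fixed 1 kPa total pressure rise vs. stage count.
flows = {n: multistage_flow(n, dp_total=1000.0) for n in (1, 2, 3, 4)}
```

    Even this toy model reproduces the qualitative result: adding stages reduces the pressure rise each stage must sustain, so the pump delivers more flow against the same total pressure difference.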

  1. A mixed-methods approach to systematic reviews.

    Science.gov (United States)

    Pearson, Alan; White, Heath; Bath-Hextall, Fiona; Salmond, Susan; Apostolo, Joao; Kirkpatrick, Pamela

    2015-09-01

    There are an increasing number of published single-method systematic reviews that focus on different types of evidence related to a particular topic. As policy makers and practitioners seek clear directions for decision-making from systematic reviews, it is likely that it will be increasingly difficult for them to identify 'what to do' if they are required to find and understand a plethora of syntheses related to a particular topic. Mixed-methods systematic reviews are designed to address this issue and have the potential to produce systematic reviews of direct relevance to policy makers and practitioners. On the basis of the recommendations of the Joanna Briggs Institute International Mixed Methods Reviews Methodology Group in 2012, the Institute adopted a segregated approach to mixed-methods synthesis as described by Sandelowski et al., which consists of separate syntheses of each component method of the review. The Joanna Briggs Institute's mixed-methods synthesis of the separate syntheses then uses a Bayesian approach to translate the findings of the initial quantitative synthesis into qualitative themes and pool these with the findings of the initial qualitative synthesis.

  2. A new approach for peat inventory methods; Turvetutkimusten menetelmaekehitystarkastelu

    Energy Technology Data Exchange (ETDEWEB)

    Laatikainen, M.; Leino, J.; Lerssi, J.; Torppa, J.; Turunen, J. Email: jukka.turunen@gtk.fi

    2011-07-01

    Development of the new peatland inventory method started in 2009. There was a need to investigate whether new methods and tools could be developed cost-effectively so that field inventory work would more completely cover the whole peatland area while the quality and reliability of the final results remained at a high level. The old inventory method at the Geological Survey of Finland (GTK) is based on main-transect and cross-transect surveys across a peatland area. The goal of this study was to find a practical grid-based method, linked to the geographic information system, suitable for field conditions. The triangle-grid method, with an even distance between study points, was found to be the most suitable approach. A new Ramac ground-penetrating radar was obtained by the GTK in 2009 and was included in the study of new peatland inventory methods. This radar model is relatively light and very suitable, for example, for forestry-drained peatlands, which are often difficult to cross because of the intensive ditch network. The goal was to investigate the best working methods for the ground-penetrating radar in order to optimize its use in large-scale peatland inventory. Together with the new field inventory methods, a novel interpolation-based method (MITTI) for modelling peat depths was developed. MITTI makes it possible to take advantage of all the available peat-depth data, at present including aerogeophysical and ground-penetrating-radar measurements, drilling data and the mire outline. The characteristic uncertainties of each data type are taken into account and, in addition to the depth model itself, an uncertainty map of the model is computed. Combined with the grid-based field inventory method, this multi-approach provides better tools for more accurately estimating peat depths, peat amounts and peat type distributions. The development of the new peatland inventory method was divided into four separate sections: (1) Development of new field

  3. Learning approaches as predictors of academic performance in first year health and science students.

    Science.gov (United States)

    Salamonson, Yenna; Weaver, Roslyn; Chang, Sungwon; Koch, Jane; Bhathal, Ragbir; Khoo, Cheang; Wilson, Ian

    2013-07-01

    To compare health and science students' demographic characteristics and learning approaches across different disciplines, and to examine the relationship between learning approaches and academic performance. While there is increasing recognition of a need to foster learning approaches that improve the quality of student learning, little is known about students' learning approaches across different disciplines and their relationship with academic performance. Prospective, correlational design. Using a survey design, a total of 919 first-year health and science students studying at a university in the western region of Sydney were recruited from the following disciplines: i) Nursing: n = 476, ii) Engineering: n = 75, iii) Medicine: n = 77, iv) Health Sciences: n = 204, and v) Medicinal Chemistry: n = 87. Although there was no statistically significant difference in the use of surface learning among the five discipline groups, there were wide variations in the use of the deep learning approach. Furthermore, older students and those with English as an additional language were more likely to use a deep learning approach. Controlling for hours spent in paid work during term-time and English language usage, both the surface learning approach (β = -0.13, p = 0.001) and the deep learning approach (β = 0.11, p = 0.009) emerged as independent and significant predictors of academic performance. Findings from this study provide further empirical evidence underscoring the importance for faculty of using teaching methods that foster deep rather than surface learning approaches, to improve the quality of student learning and academic performance. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Methodical approaches to development of classification state methods of regulation business activity in fishery

    OpenAIRE

    She Son Gun

    2014-01-01

    Approaches to developing a classification of state methods of regulating the economy are considered. On the basis of the review provided, a complex method of state regulation of business activity is substantiated. The principles offered allow public administration to be improved and can be used in industry concepts and state programmes to support small business in fishery.

  5. Creative Approaches to Teaching Graduate Research Methods Workshops

    OpenAIRE

    Peter Reilly

    2017-01-01

    Engagement and deeper learning were enhanced by several innovative teaching strategies delivered in Research Methods workshops to graduate business students, focusing primarily on students adopting a creative approach to formulating a valid research question for successfully undertaking a dissertation. These techniques are applicable to most subject domains to ensure student engagement, addressing the various multiple intelligences and learning styles existing within groups while...

  6. Methodical approach to financial stimulation of logistics managers

    OpenAIRE

    Melnykova Kateryna V.

    2014-01-01

    The article offers a methodical approach to the financial stimulation of logistics managers, which allows the incentive amount to be calculated with consideration of the profit obtained from introducing optimising logistics solutions. The author generalises measures that would allow enterprise top managers to increase the stimulation of the labour of logistics managers. The article identifies motivation factors that influence the attitude of logistics managers to the execution of optimisatio...

  7. PERFORMANCE EVALUATION OF TURKISH TYPE A MUTUAL FUNDS AND PENSION STOCK FUNDS BY USING TOPSIS METHOD

    Directory of Open Access Journals (Sweden)

    Nesrin ALPTEKIN

    2009-07-01

    In this paper, the performance of Turkish Type A mutual funds and pension stock funds is evaluated using the TOPSIS method, a multi-criteria decision-making approach. Both types of funds hold stocks in their portfolios, so they can be compared with each other. Generally, mutual and pension funds are evaluated according to their risk and return, using traditional fund performance measurement techniques such as the Sharpe ratio, Sortino ratio, Treynor index and Jensen's alpha. The TOPSIS method takes all of these performance measurement techniques into consideration and provides a more reasonable performance measurement.
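    A minimal TOPSIS implementation following the standard steps (vector normalization, weighting, ideal/anti-ideal points, relative closeness) is sketched below; the fund scores and weights are invented for illustration and are not the paper's data:

```python
import math

def topsis(matrix, weights, benefit):
    """matrix[i][j]: alternative i scored on criterion j (e.g. Sharpe,
    Sortino, Treynor, Jensen's alpha); benefit[j] is True if larger is
    better. Returns the relative-closeness score of each alternative."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column, then apply weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m)))
             for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    # Ideal and anti-ideal (negative-ideal) points, per criterion.
    best = [max(col) if benefit[j] else min(col)
            for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_best = math.sqrt(sum((x - b) ** 2 for x, b in zip(row, best)))
        d_worst = math.sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_worst / (d_best + d_worst))  # relative closeness
    return scores

# Illustrative data: 3 hypothetical funds x 4 performance measures.
funds = [[0.9, 1.2, 0.05, 0.02],
         [0.5, 0.7, 0.03, -0.01],
         [1.1, 1.5, 0.06, 0.03]]
scores = topsis(funds, weights=[0.25] * 4, benefit=[True] * 4)
```

    Ranking funds by their closeness score aggregates all four measures at once, which is the advantage over ranking by any single ratio.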

  8. Buffer-Free High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a simple, economical and reproducible high performance liquid chromatographic (HPLC) method for the determination of theophylline in pharmaceutical dosage forms. Method: Caffeine was used as the internal standard and reversed phase C-18 column was used to elute the drug and ...

  9. High-performance parallel approaches for three-dimensional light detection and ranging point clouds gridding

    Science.gov (United States)

    Rizki, Permata Nur Miftahur; Lee, Heezin; Lee, Minsu; Oh, Sangyoon

    2017-01-01

    With the rapid advance of remote sensing technology, the amount of three-dimensional point-cloud data has increased extraordinarily, requiring faster processing in the construction of digital elevation models. There have been several attempts to accelerate the computation using parallel methods; however, little attention has been given to investigating different approaches for selecting the parallel programming model best suited to a given computing environment. We present findings and insights identified by implementing three popular high-performance parallel approaches (message passing interface, MapReduce, and GPGPU) for the time-demanding but accurate kriging interpolation. The performances of the approaches are compared by varying the size of the grid and the input data. In our empirical experiments, we demonstrate significant acceleration by all three approaches compared with a C-implemented sequential-processing method. In addition, we discuss the pros and cons of each method in terms of usability, complexity, infrastructure, and platform limitations to give readers a better understanding of utilizing those parallel approaches for gridding purposes.
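    The per-cell structure that all three parallel approaches exploit can be sketched sequentially; inverse-distance weighting stands in here for the kriging solve (both share the same embarrassingly parallel loop over output cells), and the sample points are illustrative:

```python
import math

def idw(points, x, y, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate at grid node (x, y) from
    (px, py, pz) samples; a stand-in for the per-cell kriging solve."""
    num = den = 0.0
    for px, py, pz in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 < eps:
            return pz              # grid node coincides with a sample
        w = 1.0 / d2 ** (power / 2.0)
        num += w * pz
        den += w
    return num / den

def grid(points, nx, ny, x0, y0, dx, dy):
    """Sequential DEM gridding: every output cell is independent, so these
    loops are exactly what MPI ranks, map tasks, or GPU threads split up."""
    return [[idw(points, x0 + i * dx, y0 + j * dy) for i in range(nx)]
            for j in range(ny)]

pts = [(0.0, 0.0, 10.0), (4.0, 0.0, 20.0), (0.0, 4.0, 30.0), (4.0, 4.0, 40.0)]
dem = grid(pts, nx=5, ny=5, x0=0.0, y0=0.0, dx=1.0, dy=1.0)
```

    Because no cell depends on another, the choice between MPI, MapReduce, and GPGPU is purely one of data distribution and platform, which is the trade-off the paper examines.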

  10. An advanced probabilistic structural analysis method for implicit performance functions

    Science.gov (United States)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based second-moment method, which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method that is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computation than the second-moment method but is highly efficient relative to the alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
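    The mean-based second-moment baseline that AMV improves upon can be sketched as follows: linearize the response at the input means and propagate variances through the gradient, here checked against Monte Carlo. The response function, means, and deviations are illustrative; in practice each evaluation of g would be a full finite-element run:

```python
import math
import random

def g(x1, x2):
    # Cheap explicit stand-in for an implicit (e.g. finite-element) response.
    return x1 ** 2 + 2.0 * x2 - 5.0

mu = (3.0, 1.5)       # input means
sigma = (0.3, 0.2)    # input standard deviations (independent normals)

# Mean-value first-order approximation: linearize g at the means.
h = 1e-5
g0 = g(*mu)
grads = []
for i in range(2):
    xp = list(mu)
    xp[i] += h
    grads.append((g(*xp) - g0) / h)   # forward-difference gradient
mv_mean = g0
mv_std = math.sqrt(sum((gi * si) ** 2 for gi, si in zip(grads, sigma)))

# Monte Carlo check of the linearization.
random.seed(0)
samples = [g(random.gauss(mu[0], sigma[0]), random.gauss(mu[1], sigma[1]))
           for _ in range(20000)]
mc_mean = sum(samples) / len(samples)
```

    The two moments come from a handful of g evaluations versus thousands of samples; AMV then goes further than this linearization by re-evaluating g at the most probable points to recover the full distribution shape.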

  11. A Thermographic Measurement Approach to Assess Supercapacitor Electrical Performances

    Directory of Open Access Journals (Sweden)

    Stanislaw Galla

    2017-12-01

    This paper describes a proposal for the qualitative assessment of the condition of supercapacitors based on thermographic measurements. The measurement stand presented is accompanied by a proposed test methodology. The conditions needed to minimize the influence of disturbing factors on the thermal imaging measurements are also indicated; these factors result both from hardware limitations and from the necessity of preparing the samples. The algorithm used to determine the basic assessment parameters is presented, and the article suggests additional factors that may facilitate the analysis of the results obtained. The usefulness of the proposed methodology was tested on commercial supercapacitor samples. All of the tests were conducted in conjunction with the classical methods based on capacitance (C) and equivalent series resistance (ESR) measurements, which are also presented in the paper. Selected results illustrating the observed changes in both the basic parameters of the supercapacitors and the accompanying variations of the thermal fields are shown, along with their analysis. The observed limitations of the proposed assessment method and suggestions for its development are also described.

  12. METHODICAL APPROACHES TO THE COST MANAGEMENT OF INDUSTRIAL ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Trunina Iryna

    2018-03-01

    Introduction. The paper deals with topical issues of managing the costs of industrial enterprises, because in an unstable market environment the efficiency of the cost management system determines the financial performance, competitiveness, financial sustainability and investment attractiveness of any economic entity. Purpose. The purpose of the article is to analyse approaches to cost management and to provide theoretical substantiation and recommendations for the formation of strategic cost management. Results. The economic content of cost management is considered as treated by different authors and under different approaches: functional, process-oriented and system approaches. Their essence and features, their orientation towards operational or strategic management of enterprise costs, and the cost management techniques used in each approach are determined. It is noted that all the approaches considered are aimed at the optimal use of resources and at ensuring growth in enterprise efficiency. Conclusions. Based on the review of methodological approaches to cost management, recommendations are developed for extending the implementation of cost management to various levels of enterprise management and for forming strategic cost management within the framework of the strategic management of an enterprise. Strategic cost management is a complex category aimed at achieving a rational level of costs in the long run, which allows competitive cost advantages to be exploited and increases the competitiveness of an industrial enterprise. The implementation of cost reduction strategies should be a constant and important part of the company's work, and the cost reduction strategy should be integrated into the overall business strategy of the enterprise.

  13. A multi-method approach to evaluate health information systems.

    Science.gov (United States)

    Yu, Ping

    2010-01-01

    Systematic evaluation of the introduction and impact of health information systems (HIS) is a challenging task. As implementation is a dynamic process, with diverse issues emerging at various stages of system introduction, it is a challenge to weigh the contributions of various factors and differentiate the critical ones. A conceptual framework is helpful in guiding the evaluation effort; otherwise data collection may not be comprehensive and accurate, which may in turn lead to inadequate interpretation of the phenomena under study. Based on a comprehensive literature review and the author's own practice in evaluating health information systems, a multi-method approach is proposed that incorporates both quantitative and qualitative measurement and is centred on the DeLone and McLean Information System Success Model. This approach aims to quantify the performance of an HIS and its impact, and to provide comprehensive and accurate explanations of the causal relationships among the different factors. It will provide decision makers with accurate and actionable information for improving the performance of the introduced HIS.

  14. Approaches and Methods of Periodization in Literary History

    Directory of Open Access Journals (Sweden)

    Naser Gholi Sarli

    2013-10-01

    One of the most fundamental acts of historiography is to classify historical information along the diachronic axis. The method of this classification, or periodization, reveals the theoretical approach of the historian and determines the structure and form of his history. Because of the multiple criteria of analysis and the variety of literary genres, periodization in literary history is more complicated than in general history. We can distinguish two approaches to periodization in literary history, although they can be used together: the extrinsic or social-cultural approach (based on criteria extrinsic to literature) and the intrinsic or formalist approach (based on criteria intrinsic to literature). Periodization in literary history can then be formulated by different methods and based upon various criteria: chronological units such as the century, decade and year; organic patterns of evolution; great poets and writers; literary emblems and the evaluations of every period; events, concepts and periods of general or political history; the analogy of literary history with the history of ideas or the history of arts; approaches and styles of language; dominant literary norms. These methods are in fact used together, and each is adequate for a particular kind of literary history. In the periodization of Persian contemporary literature, some methods and models current in the periodization of poetry have been applied identically to the periodization of prose. Periodization based upon the century, decade and year is the simplest and most mechanical method, but certain centuries in some countries have symbolic and stylistic meaning, and decades are often used for subdivisions of literary history, especially nowadays with the fast rhythm of literary change.
    Periodization according to organic patterns of evolution equates the changes of literary history with the life phases of an organism, and offers an account of the birth, maturity and death (and sometimes re-birth) of literary genres, but this method have

  17. A suggested approach toward measuring sorption and applying sorption data to repository performance assessment

    International Nuclear Information System (INIS)

    Rundberg, R.S.

    1992-01-01

    The prediction of radionuclide migration for the purpose of assessing the safety of a nuclear waste repository will be based on collective knowledge of the hydrologic and geochemical properties of the surrounding rock and groundwater. This knowledge, along with assumptions about the interactions of radionuclides with groundwater and minerals, forms the scientific basis for a model capable of accurately predicting the repository's performance. Because the interaction of radionuclides in geochemical systems is known to be complicated, several fundamental and empirical approaches to measuring the interaction between radionuclides and the geologic barrier have been developed. The approaches applied to the measurement of sorption involve the use of pure minerals and intact or crushed rock in dynamic and static experiments. Each approach has its advantages and disadvantages. There is no single best method for providing sorption data for performance assessment models that can be applied without invoking information derived from multiple experiments. 53 refs., 12 figs

  18. Basis set approach in the constrained interpolation profile method

    International Nuclear Information System (INIS)

    Utsumi, T.; Koga, J.; Yabe, T.; Ogata, Y.; Matsunaga, E.; Aoki, T.; Sekine, M.

    2003-07-01

    We propose a simple polynomial basis-set that is easily extendable to any desired higher-order accuracy. This method is based on the Constrained Interpolation Profile (CIP) method and the profile is chosen so that the subgrid scale solution approaches the real solution by the constraints from the spatial derivative of the original equation. Thus the solution even on the subgrid scale becomes consistent with the master equation. By increasing the order of the polynomial, this solution quickly converges. 3rd and 5th order polynomials are tested on the one-dimensional Schroedinger equation and are proved to give solutions a few orders of magnitude higher in accuracy than conventional methods for lower-lying eigenstates. (author)
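The core of the CIP idea described above is a subgrid profile constrained to match both the value and the spatial derivative of the solution at each grid point. As a simplified illustration of that constrained-profile idea (not the authors' full scheme), the lowest-order case is a cubic matching f and f' at both ends of a cell; the coefficient formulas below are the standard cubic Hermite ones:

```python
import numpy as np

def cip_cubic(f0, f1, d0, d1, h):
    """Coefficients of p(x) = a x^3 + b x^2 + d0 x + f0 on [0, h],
    matching values f0, f1 and derivatives d0, d1 at the two cell ends."""
    a = (d0 + d1) / h**2 + 2.0 * (f0 - f1) / h**3
    b = 3.0 * (f1 - f0) / h**2 - (2.0 * d0 + d1) / h
    return a, b

# interpolate sin(x) on one cell of width h and evaluate the profile
h = 0.5
f0, f1 = np.sin(0.0), np.sin(h)
d0, d1 = np.cos(0.0), np.cos(h)
a, b = cip_cubic(f0, f1, d0, d1, h)
p = lambda x: a * x**3 + b * x**2 + d0 * x + f0
```

Because the profile honors the derivative constraints, the subgrid values (e.g., at the cell midpoint) stay close to the true solution, which is exactly the consistency property the abstract emphasizes.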

  19. Information operator approach and iterative regularization methods for atmospheric remote sensing

    International Nuclear Information System (INIS)

    Doicu, A.; Hilgers, S.; Bargen, A. von; Rozanov, A.; Eichmann, K.-U.; Savigny, C. von; Burrows, J.P.

    2007-01-01

    In this study, we present the main features of the information operator approach for solving linear inverse problems arising in atmospheric remote sensing. This method is superior to the stochastic version of Tikhonov regularization (or the optimal estimation method) due to its capability to filter out the noise-dominated components of the solution generated by an inappropriate choice of the regularization parameter. We extend this approach to iterative methods for nonlinear ill-posed problems and derive the truncated versions of the Gauss-Newton and Levenberg-Marquardt methods. Although the paper mostly focuses on the mathematical details of the inverse method, retrieval results are provided that exemplify the performance of the methods. These results correspond to NO2 retrieval from SCIAMACHY limb scatter measurements and were obtained using the retrieval processors developed at the German Aerospace Center Oberpfaffenhofen and the Institute of Environmental Physics of the University of Bremen.
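For readers unfamiliar with the baseline being improved upon: the Tikhonov-regularized least-squares solution x = (AᵀA + λI)⁻¹Aᵀy can be computed directly from the normal equations. A minimal numpy sketch with a synthetic ill-conditioned forward model (illustrative only, not the information-operator implementation discussed in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
# ill-conditioned polynomial design matrix as a stand-in forward model
A = np.vander(np.linspace(0.0, 1.0, 20), 8, increasing=True)
x_true = rng.standard_normal(8)
y = A @ x_true + 1e-4 * rng.standard_normal(20)   # noisy measurement

def tikhonov(A, y, lam):
    """Solve min ||Ax - y||^2 + lam ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

x_reg = tikhonov(A, y, 1e-6)
```

The regularization parameter λ damps the noise-dominated singular components; the information-operator approach of the paper instead filters such components out explicitly.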

  20. Multicore Performance of Block Algebraic Iterative Reconstruction Methods

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik B.; Hansen, Per Christian

    2014-01-01

    Algebraic iterative methods are routinely used for solving the ill-posed sparse linear systems arising in tomographic image reconstruction. Here we consider the algebraic reconstruction technique (ART) and the simultaneous iterative reconstruction techniques (SIRT), both of which rely on semiconvergence. Block versions of these methods, based on a partitioning of the linear system, are able to combine the fast semiconvergence of ART with the better multicore properties of SIRT. These block methods separate into two classes: those that, in each iteration, access the blocks in a sequential manner [...] a fixed relaxation parameter in each method, namely, the one that leads to the fastest semiconvergence. Computational results show that for multicore computers, the sequential approach is preferable.
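The ART iteration referred to here is the classical Kaczmarz sweep, which projects the current iterate onto each row hyperplane in turn. A minimal dense-matrix sketch (illustrative only, without the block partitioning or multicore scheduling discussed in the paper):

```python
import numpy as np

def art_sweep(A, b, x, relax=1.0):
    """One ART (Kaczmarz) sweep: project x onto each row hyperplane of Ax=b."""
    for i in range(A.shape[0]):
        ai = A[i]
        x = x + relax * (b[i] - ai @ x) / (ai @ ai) * ai
    return x

# small consistent system: repeated sweeps converge to the solution
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
x_true = rng.standard_normal(10)
b = A @ x_true
x = np.zeros(10)
for _ in range(200):
    x = art_sweep(A, b, x)
```

The block methods of the paper apply such row actions per block (sequentially or simultaneously), trading some of ART's fast semiconvergence for SIRT-like parallelism.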

  1. Development of exploratory approach for scenario analysis in the performance assessment of geological disposal

    International Nuclear Information System (INIS)

    Makino, Hitoshi; Ishiguro, Katsuhiko; Umeki, Hiroyuki; Oyamada, Kiyoshi; Takase, Hiroyasu; Grindrod, Peter

    1998-01-01

    It becomes difficult to apply ordinary methods for scenario analysis as the number of processes and the complexity of their interrelations increase. To address this problem, an exploratory approach that can perform scenario analysis on a wider range of problems was developed. The approach includes ensemble runs of a mass transport model, developed as a generic and flexible model that can cover the effects of various processes on mass transport, and analysis of the sensitivity structure of the input and output space of the ensemble runs. The techniques of clustering and principal component analysis were applied in the approach. A test application confirmed the applicability of the approach for identifying important processes, from among many processes, in a systematic and objective manner. (author)
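The principal component analysis used here to expose the sensitivity structure of an ensemble can be sketched with plain numpy: center the run matrix, take its SVD, and read off the dominant directions (an illustration with synthetic data, not the authors' code):

```python
import numpy as np

def pca(X, k):
    """Return the first k principal directions of X and the projected samples."""
    Xc = X - X.mean(axis=0)                    # center each column (parameter)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k], Xc @ Vt[:k].T               # components, scores

# ensemble of 100 runs over 5 input parameters; parameter 0 dominates variance
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 5))
X[:, 0] *= 10.0
components, scores = pca(X, 2)
```

Large loadings in the leading components single out the influential parameters, which is the systematic, objective identification of important processes the abstract describes.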

  2. Creative Approaches to Teaching Graduate Research Methods Workshops

    Directory of Open Access Journals (Sweden)

    Peter Reilly

    2017-06-01

    Full Text Available Engagement and deeper learning were enhanced by several innovative teaching strategies delivered in Research Methods workshops to graduate business students, focusing primarily on students adopting a creative approach to formulating a valid research question for successfully undertaking a dissertation. These techniques are applicable to most subject domains and ensure student engagement, addressing the various multiple intelligences and learning styles existing within groups while keeping the sessions student-centred and conducive to a collaborative learning environment. Blogs, interactive tutorials, online videos, games and posters are used to develop students' cognitive and metacognitive abilities. Using novelty images appeals to a group's intellectual curiosity, acting as an interpretive device to explain the value of adopting a holistic rather than analytic approach towards a topic.

  3. Methods for measuring denitrification: Diverse approaches to a difficult problem

    DEFF Research Database (Denmark)

    Groffman, Peter M.; Altabet, Mark A.; Böhlke, J. K.

    2006-01-01

    [...] and global scales. Unfortunately, this process is very difficult to measure, and existing methods are problematic for different reasons in different places at different times. In this paper, we review the major approaches that have been taken to measure denitrification in terrestrial and aquatic environments [...] based on stable isotopes, (8) in situ gradients with atmospheric environmental tracers, and (9) molecular approaches. Our review makes it clear that the prospects for improved quantification of denitrification vary greatly in different environments and at different scales. While current methodology allows for the production of accurate estimates of denitrification at scales relevant to water and air quality and ecosystem fertility questions in some systems (e.g., aquatic sediments, well-defined aquifers), methodology for other systems, especially upland terrestrial areas, still needs development. Comparison of mass [...]

  4. A method for performance assessment of medical radioisotope equipment

    International Nuclear Information System (INIS)

    Kerin, T.; Slavtchev, Ath.; Nedeltchev, M.; Kjurktchiev, T.

    1984-01-01

    A variety of tests and procedures exist for the performance assessment of radioisotope diagnostic equipment. The complex performance indices introduced to date are based on a heuristic approach. The present work attempts to interconnect algorithmically the most important factors: the influence of the measurement geometry, the statistical peculiarities at low activities, and the information loss at high count rates. All this is reflected in a criterion that integrates the spatial resolution, the effective detector field of view, the radionuclide sensitivity, the background count rate and the effective dead time of the system under investigation. (Auth.)
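The information loss at high count rates mentioned above is commonly modelled with the standard non-paralyzable dead-time correction, n = m / (1 - m·τ), where m is the observed count rate, n the true rate and τ the effective dead time. A quick sketch of that standard formula (an illustration, not the authors' composite criterion):

```python
def true_rate(measured_cps, dead_time_s):
    """Non-paralyzable dead-time correction: recover the true count rate
    from the observed rate and the system dead time."""
    loss = measured_cps * dead_time_s
    if loss >= 1.0:
        raise ValueError("measured rate saturates the detector")
    return measured_cps / (1.0 - loss)

# 5 us dead time: at 50 kcps observed, a quarter of true events were lost
n = true_rate(50_000, 5e-6)
```

The correction grows rapidly as m·τ approaches 1, which is why dead time must enter any performance criterion that spans high activities.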

  5. A Shot Number Based Approach to Performance Analysis in Table Tennis

    Directory of Open Access Journals (Sweden)

    Tamaki Sho

    2017-01-01

    Full Text Available The current study proposes a novel approach that improves conventional performance analysis in table tennis by introducing the concept of frequency, i.e. the number of shots, at each shot number. The improvements over the conventional method are as follows: better accuracy in the evaluation of players' skills and tactics, additional insights into scoring and returning skills, and ease of understanding the results with a single criterion. Performance analysis of matches played at the 2012 Summer Olympics in London was conducted using the proposed method. The results showed some effects of shot number and gender differences in table tennis. Furthermore, comparisons were made between Chinese players and players from other countries, which shed light on the skills and tactics of the Chinese players. The present findings demonstrate that the proposed method provides useful information and has some advantages over the conventional method.
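The frequency-weighted view proposed here can be illustrated by tallying, for each shot number, how often a rally ends on that shot and how often the server wins it (toy rally data for illustration, not the Olympic dataset used in the study):

```python
from collections import Counter

# each rally: (number of shots, winner) -- toy data
rallies = [(1, "server"), (3, "server"), (3, "receiver"), (4, "receiver"),
           (3, "server"), (5, "server"), (2, "receiver"), (3, "receiver")]

freq = Counter(shots for shots, _ in rallies)            # rallies ending at each shot number
wins = Counter(shots for shots, w in rallies if w == "server")
scoring_rate = {s: wins[s] / freq[s] for s in freq}      # server scoring rate per shot number
```

Weighting each shot number's scoring rate by its frequency is what lets the method compress skill and tactics into a single criterion.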

  6. PRODUCTIVITY PERFORMANCE OF ESTONIA IN A GROWTH ACCOUNTING APPROACH

    Directory of Open Access Journals (Sweden)

    Vivien MOLNAR

    2016-12-01

    Full Text Available This paper aims to contribute to a better understanding of economic growth tendencies in Estonia and other former socialist countries, and of the interaction between productivity growth and its determinants after the transition decades. The paper is structured as follows. We first introduce an alternative growth accounting method to determine the components of productivity growth based on this concept. We then provide empirical results showing how TFP (Total Factor Productivity) and physical and labour capital accumulation contributed to (increased or decreased) the economic performance of Estonia, Latvia, Lithuania and Hungary compared to the EU-15 countries between 1990 and 2011. Finally, we conclude that the relationship between labour and output growth per capita obviously, though temporarily, changed after the mid-1990s, which could be determined by the increasing role of such socio-economic factors as technological change, capital accumulation and demographic fluctuations.
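In a standard growth accounting decomposition, TFP growth is the Solow residual gA = gY - α·gK - (1 - α)·gL under a Cobb-Douglas assumption with capital share α. A minimal sketch with illustrative numbers (not the paper's data or its alternative method):

```python
def solow_residual(g_output, g_capital, g_labour, alpha=0.3):
    """TFP growth as the part of output growth not explained by
    capital and labour input growth (Cobb-Douglas, capital share alpha)."""
    return g_output - alpha * g_capital - (1.0 - alpha) * g_labour

# 4% output growth, 5% capital growth, 1% labour growth
g_tfp = solow_residual(0.04, 0.05, 0.01)   # ~0.018, i.e. 1.8 percentage points
```

Comparing the residual across countries and periods is what lets the authors attribute growth to TFP versus physical and labour capital accumulation.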

  7. The performances of R GPU implementations of the GMRES method

    Directory of Open Access Journals (Sweden)

    Bogdan Oancea

    2018-03-01

    Full Text Available Although the performance of commodity computers has improved drastically with the introduction of multicore processors and GPU computing, the standard R distribution is still based on a single-threaded model of computation, using only a small fraction of the computational power now available in most desktops and laptops. Modern statistical software packages rely on high-performance implementations of the linear algebra routines that are at the core of several important leading-edge statistical methods. In this paper we present a GPU implementation of the GMRES iterative method for solving linear systems. We compare the performance of this implementation with a pure single-threaded CPU version. We also investigate the performance of our implementation using the different GPU packages now available for R, such as gmatrix, gputools and gpuR, which are based on the CUDA or OpenCL frameworks.
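The authors benchmark R GPU packages; as a hedged CPU reference for the same algorithm, here is a compact numpy sketch of unrestarted GMRES via the Arnoldi process (the method itself, not their R or GPU implementation):

```python
import numpy as np

def gmres(A, b, tol=1e-10):
    """Plain (unrestarted) GMRES with x0 = 0, via the Arnoldi process."""
    n = b.size
    Q = np.zeros((n, n + 1))          # orthonormal Krylov basis
    H = np.zeros((n + 1, n))          # upper Hessenberg matrix
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta
    for k in range(n):
        v = A @ Q[:, k]
        for j in range(k + 1):        # modified Gram-Schmidt
            H[j, k] = Q[:, j] @ v
            v = v - H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(v)
        # small least-squares problem: min || beta*e1 - H y ||
        e1 = np.zeros(k + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)
        residual = np.linalg.norm(H[:k + 2, :k + 1] @ y - e1)
        if residual < tol or H[k + 1, k] < tol:
            break
        Q[:, k + 1] = v / H[k + 1, k]
    return Q[:, :k + 1] @ y

# diagonally dominant tridiagonal test system
n = 50
A = np.diag(np.full(n, 4.0)) + np.diag(np.full(n - 1, -1.0), 1) \
    + np.diag(np.full(n - 1, -1.0), -1)
b = np.ones(n)
x = gmres(A, b)
```

The matrix-vector products and inner products dominating each iteration are exactly the operations a GPU backend such as gmatrix or gpuR accelerates.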

  8. Application of Grey-TOPSIS approach to evaluate value chain performance of tea processing chains

    Directory of Open Access Journals (Sweden)

    Richard Nyaoga

    2016-03-01

    Full Text Available This study develops an effective method to measure value chain performance based on qualitative criteria and to determine the ranking order of the various forms of performance under study. The approach integrates the advantages of grey systems theory and TOPSIS to evaluate and rank value chain performance, and has been applied to measure and rank the value chain performance of various firms. The results indicate that the proposed model is useful for facilitating multi-criteria decision-making (MCDM) under uncertainty and vagueness, and that it provides an appropriate ranking order over the available alternatives. The Grey-TOPSIS approach is also discussed as a tool that managers can use in the future to solve similar decision-making problems in their firms. Although the problem of choosing a suitable performance option is often addressed in practice and research, very few Grey-TOPSIS decision models are available in the literature, and no application of Grey-TOPSIS to tea processing firms exists; hence this study is the first to apply the model to evaluating value chain performance in tea processing firms.
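The TOPSIS half of the hybrid method ranks alternatives by their closeness to an ideal solution. A compact numpy sketch for benefit criteria, with hypothetical firms, weights and scores rather than the study's grey data (the grey-number preprocessing is omitted):

```python
import numpy as np

def topsis(scores, weights):
    """Rank alternatives (rows) on benefit criteria (columns) by
    closeness to the ideal solution."""
    norm = scores / np.linalg.norm(scores, axis=0)   # vector-normalize columns
    v = norm * weights                               # weighted normalized matrix
    ideal, anti = v.max(axis=0), v.min(axis=0)       # best / worst per criterion
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                   # closeness coefficient in [0, 1]

scores = np.array([[7.0, 9.0, 8.0],    # firm A (hypothetical)
                   [8.0, 6.0, 7.0],    # firm B
                   [5.0, 5.0, 6.0]])   # firm C
closeness = topsis(scores, np.array([0.5, 0.3, 0.2]))
ranking = np.argsort(-closeness)       # best alternative first
```

In the Grey-TOPSIS variant of the paper, the crisp scores above are replaced by grey numbers before the distance computation, which is how qualitative uncertainty enters the ranking.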

  9. Evaluation Method for Low-Temperature Performance of Lithium Battery

    Science.gov (United States)

    Wang, H. W.; Ma, Q.; Fu, Y. L.; Tao, Z. Q.; Xiao, H. Q.; Bai, H.; Bai, H.

    2018-05-01

    In this paper, an evaluation method for the low-temperature performance of lithium batteries is established. Low-temperature performance levels were set up to determine the best operating temperature range of lithium batteries using different cathode materials. The results are shared with consumers to promote proper use of lithium batteries, giving them a longer service life and avoiding early rejection.

  10. Resource Isolation Method for Program's Performance on CMP

    Science.gov (United States)

    Guan, Ti; Liu, Chunxiu; Xu, Zheng; Li, Huicong; Ma, Qiang

    2017-10-01

    Data centers and cloud computing are increasingly popular, benefiting both customers and providers. However, in a data center or cluster there is commonly more than one program running on a server, and programs may interfere with each other. The interference may have little effect, but it can also cause a serious drop in performance. To avoid this performance interference problem, isolating resources for different programs is a better choice. In this paper we propose a low-cost resource isolation method to improve a program's performance. The method uses cgroups to set dedicated CPU and memory resources for a program, aiming to guarantee the program's performance. Three engines realize this method: the Program Monitor Engine tracks the program's CPU and memory usage and transfers this information to the Resource Assignment Engine; the Resource Assignment Engine calculates the amount of CPU and memory resources that should be allocated to the program; and the Cgroups Control Engine partitions resources with the Linux tool cgroups and places the program in a control group for execution. Experimental results show that, using the proposed resource isolation method, a program's performance can be improved.
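Under cgroups v2, the dedicated CPU share and memory cap described here come down to the strings written into the `cpu.max` and `memory.max` interface files of a control group. A small helper that formats those values (a hypothetical illustration of the interface, not the authors' engines; the helper name is invented):

```python
def cgroup_v2_limits(cpu_fraction, mem_bytes, period_us=100_000):
    """Format cgroup v2 limit values: cpu.max is '<quota_us> <period_us>',
    capping the group at cpu_fraction of one CPU; memory.max is a byte count."""
    quota_us = int(cpu_fraction * period_us)
    return {"cpu.max": f"{quota_us} {period_us}", "memory.max": str(mem_bytes)}

# cap a program at half a CPU and 256 MiB
limits = cgroup_v2_limits(0.5, 256 * 1024 * 1024)
# a controller would then write limits["cpu.max"] into
# /sys/fs/cgroup/<group>/cpu.max (requires privileges, omitted here)
```

A Cgroups Control Engine like the one in the paper would compute such values from the monitor's measurements and write them into the program's control group.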

  11. The New Performance Calculation Method of Fouled Axial Flow Compressor

    Directory of Open Access Journals (Sweden)

    Huadong Yang

    2014-01-01

    Full Text Available Fouling is the most important performance degradation factor, so it is necessary to accurately predict its effect on engine performance. In previous research it has been very difficult to accurately model a fouled axial flow compressor. This paper develops a new performance calculation method for fouled multistage axial flow compressors based on experimental results and operating data. The whole compressor is decomposed into two sections. The first section comprises the first 50% of stages, which reflect the fouling level; the second section comprises the last 50% of stages, which are treated as clean stages because of lighter deposits. In this model, the performance of the first section is obtained by combining the scaling law method and a linear progression model with the traditional stage stacking method, while ambient conditions and engine configuration are also considered. The performance of the second section is calculated by an averaged infinitesimal stage method based on Reynolds' law of similarity. Finally, the model is successfully applied to predict an 8-stage axial flow compressor and the 16-stage LM2500-30 compressor. The variation of thermodynamic parameters such as pressure ratio and efficiency with operating time and stage number is analyzed in detail.

  12. A Method To Modify/Correct The Performance Of Amplifiers

    Directory of Open Access Journals (Sweden)

    Rohith Krishnan R

    2015-01-01

    Full Text Available Abstract The actual response of an amplifier may vary when aged or damaged components are replaced, and this method compensates for that problem. Here we use the op-amp Fixator as the design tool. The tool helps us isolate the selected circuit component from the rest of the circuit, adjust its operating point to correct performance deviations, and modify the circuit without changing other parts of the circuit. A method to modify/correct the performance of amplifiers by properly redesigning the circuit is presented in this paper.

  13. Roles and methods of performance evaluation of hospital academic leadership.

    Science.gov (United States)

    Zhou, Ying; Yuan, Huikang; Li, Yang; Zhao, Xia; Yi, Lihua

    2016-01-01

    The rapidly advancing implementation of public hospital reform urgently requires the identification and classification of a pool of exceptional medical specialists, corresponding with incentives to attract and retain them, providing a nucleus of distinguished expertise to ensure public hospital preeminence. This paper examines the significance of academic leadership, from a strategic management perspective, including various tools, methods and mechanisms used in the theory and practice of performance evaluation, and employed in the selection, training and appointment of academic leaders. Objective methods of assessing leadership performance are also provided for reference.

  14. Investigation of Thermal Performance for Atria: a Method Overview

    Directory of Open Access Journals (Sweden)

    Moosavi Leila

    2016-01-01

    Full Text Available The importance of low-energy design in large buildings has encouraged researchers to implement different methods for predicting a building's thermal performance. Atria, as energy-efficient features, have been implemented to improve the indoor thermal environment in large modern buildings. Though widely implemented, thorough study of atrium performance is restricted by atria's large size, complex thermodynamic behavior, and the inaccuracies and limitations of available prediction tools. This study reviews the research tools most commonly employed in previous research on atria thermal performance, to explore the advantages and limitations of different methods for future studies. The methods reviewed are analytical, experimental and computer modelling, and combinations thereof. The findings showed that CFD (computational fluid dynamics) models have recently become the most popular tools due to their higher accuracy, capabilities and user-friendly modification. Although experimental methods are reliable for predicting atria thermal and ventilation performance, they have mostly been used to provide data for validation of CFD models. Furthermore, coupling CFD with other experimental models could increase the reliability and accuracy of the models and provide a more comprehensive analysis.

  15. Damage approach: A new method for topology optimization with local stress constraints

    DEFF Research Database (Denmark)

    Verbart, Alexander; Langelaar, Matthijs; van Keulen, Fred

    2016-01-01

    In this paper, we propose a new method for topology optimization with local stress constraints. In this method, material in which a stress constraint is violated is considered as damaged. Since damaged material will contribute less to the overall performance of the structure, the optimizer will promote a design with a minimal amount of damaged material. We tested the method on several benchmark problems, and the results show that the method is a viable alternative to conventional stress-based approaches based on constraint relaxation followed by constraint aggregation.

  16. Methodical Approaches to Communicative Providing of Retailer Branding

    Directory of Open Access Journals (Sweden)

    Andrey Kataev

    2017-07-01

    Full Text Available The thesis is devoted to rationalizing methodical approaches to the branding of retail trade enterprises. The article considers the features of brand perception by retail consumers and clarifies the specifics of customers' views of stores for the procedures accompanying brand management. It is argued that, besides the traditional communication mix, the most important tool of communicative influence on buyers is the store itself as a place for comfortable shopping. The shop should have a stimulating effect on all five human senses, including sight, smell, hearing, touch and taste, which helps maximize consumer integration into the buying process.

  17. Reasoning methods in medical consultation systems: artificial intelligence approaches.

    Science.gov (United States)

    Shortliffe, E H

    1984-01-01

    It has been argued that the problem of medical diagnosis is fundamentally ill-structured, particularly during the early stages when the number of possible explanations for presenting complaints can be immense. This paper discusses the process of clinical hypothesis evocation, contrasts it with the structured decision making approaches used in traditional computer-based diagnostic systems, and briefly surveys the more open-ended reasoning methods that have been used in medical artificial intelligence (AI) programs. The additional complexity introduced when an advice system is designed to suggest management instead of (or in addition to) diagnosis is also emphasized. Example systems are discussed to illustrate the key concepts.

  18. Operator performance evaluation using multi criteria decision making methods

    Science.gov (United States)

    Rani, Ruzanita Mat; Ismail, Wan Rosmanira; Razali, Siti Fatihah

    2014-06-01

    Operator performance evaluation is a very important operation in labor-intensive manufacturing industry because a company's productivity depends on the performance of its operators. The aims of operator performance evaluation are to give operators feedback on their performance, to increase the company's productivity, and to identify the strengths and weaknesses of each operator. In this paper, six multi-criteria decision making methods, namely Analytical Hierarchy Process (AHP), fuzzy AHP (FAHP), ELECTRE, PROMETHEE II, Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) and VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR), are used to evaluate the operators' performance and to rank the operators. The performance evaluation is based on six main criteria: competency; experience and skill; teamwork and time punctuality; personal characteristics; capability; and outcome. The study was conducted at one of the SME food manufacturing companies in Selangor. It was found that AHP and FAHP both identified "outcome" as the most important criterion. The results of the operator performance evaluation showed that the same operator is ranked first by all six methods.
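The AHP step in such studies derives criterion weights from a pairwise comparison matrix, typically as its principal eigenvector. A small numpy sketch using power iteration, with hypothetical judgments rather than the study's data:

```python
import numpy as np

def ahp_weights(pairwise, iters=100):
    """Criterion weights as the principal eigenvector of a pairwise
    comparison matrix, computed by power iteration and normalized to sum 1."""
    w = np.ones(pairwise.shape[0])
    for _ in range(iters):
        w = pairwise @ w
        w /= w.sum()
    return w

# hypothetical judgments: criterion 0 is 3x criterion 1 and 5x criterion 2
P = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
weights = ahp_weights(P)   # sums to 1; criterion 0 dominates
```

In a full AHP study the consistency ratio of the pairwise matrix would also be checked before the weights are used to score and rank the operators.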

  19. Critical factors in the empirical performance of temporal difference and evolutionary methods for reinforcement learning

    NARCIS (Netherlands)

    Whiteson, S.; Taylor, M.E.; Stone, P.

    2010-01-01

    Temporal difference and evolutionary methods are two of the most common approaches to solving reinforcement learning problems. However, there is little consensus on their relative merits and there have been few empirical studies that directly compare their performance. This article aims to address

  20. Common genetic variants associated with cognitive performance identified using the proxy-phenotype method

    NARCIS (Netherlands)

    C.A. Rietveld (Niels); T. Esko (Tõnu); G. Davies (Gail); T.H. Pers (Tune); P. Turley (Patrick); B. Benyamin (Beben); C.F. Chabris (Christopher F.); V. Emilsson (Valur); A.D. Johnson (Andrew); J.J. Lee (James J.); C. de Leeuw (Christiaan); R.E. Marioni (Riccardo); S.E. Medland (Sarah Elizabeth); M. Miller (Mike); O. Rostapshova (Olga); S.J. van der Lee (Sven); A.A.E. Vinkhuyzen (Anna A.); N. Amin (Najaf); D. Conley (Dalton); J. Derringer; C.M. van Duijn (Cornelia); R.S.N. Fehrmann (Rudolf); L. Franke (Lude); E.L. Glaeser (Edward L.); N.K. Hansell (Narelle); C. Hayward (Caroline); W.G. Iacono (William); C.A. Ibrahim-Verbaas (Carla); V.W.V. Jaddoe (Vincent); J. Karjalainen (Juha); D. Laibson (David); P. Lichtenstein (Paul); D.C. Liewald (David C.); P.K. Magnusson (Patrik); N.G. Martin (Nicholas); M. McGue (Matt); G. Mcmahon (George); N.L. Pedersen (Nancy); S. Pinker (Steven); D.J. Porteous (David J.); D. Posthuma (Danielle); F. Rivadeneira Ramirez (Fernando); B.H. Smith (Blair H.); J.M. Starr (John); H.W. Tiemeier (Henning); N.J. Timpson (Nicholas J.); M. Trzaskowski (Maciej); A.G. Uitterlinden (André); F.C. Verhulst (Frank); M.E. Ward (Mary); M.J. Wright (Margaret); G.D. Smith; I.J. Deary (Ian J.); M. Johannesson (Magnus); R. Plomin (Robert); P.M. Visscher (Peter); D.J. Benjamin (Daniel J.); D. Cesarini (David); Ph.D. Koellinger (Philipp)

    2014-01-01

    We identify common genetic variants associated with cognitive performance using a two-stage approach, which we call the proxy-phenotype method. First, we conduct a genome-wide association study of educational attainment in a large sample (n = 106,736), which produces a set of 69

  1. A geologic approach to field methods in fluvial geomorphology

    Science.gov (United States)

    Fitzpatrick, Faith A.; Thornbush, Mary J; Allen, Casey D; Fitzpatrick, Faith A.

    2014-01-01

    A geologic approach to field methods in fluvial geomorphology is useful for understanding causes and consequences of past, present, and possible future perturbations in river behavior and floodplain dynamics. Field methods include characterizing river planform and morphology changes and floodplain sedimentary sequences over long periods of time along a longitudinal river continuum. Techniques include topographic and bathymetric surveying of fluvial landforms in valley bottoms and describing floodplain sedimentary sequences through coring, trenching, and examining pits and exposures. Historical sediment budgets that include floodplain sedimentary records can characterize past and present sources and sinks of sediment along a longitudinal river continuum. Describing paleochannels and floodplain vertical accretion deposits, estimating long-term sedimentation rates, and constructing historical sediment budgets can assist in management of aquatic resources, habitat, sedimentation, and flooding issues.

  2. Clustering based gene expression feature selection method: A computational approach to enrich the classifier efficiency of differentially expressed genes

    KAUST Repository

    Abusamra, Heba; Bajic, Vladimir B.

    2016-01-01

    decrease the computational time and cost, but also improve the classification performance. Among the different feature selection approaches, however, most suffer from several problems, such as a lack of robustness and validation issues. Here, we

  3. METHODICAL APPROACH TO AN ESTIMATION OF PROFESSIONALISM OF AN EMPLOYEE

    Directory of Open Access Journals (Sweden)

    Татьяна Александровна Коркина

    2013-08-01

    Analysis of definitions of «professionalism», reflecting the different viewpoints of scientists and practitioners, has shown that it is interpreted as a specific property of people who effectively and reliably carry out labour activity in a variety of conditions. The article presents a methodical approach to the estimation of an employee's professionalism, both from the standpoint of the external manifestations of the reliability and effectiveness of the work and from that of the personal characteristics of the employee that determine the results of his work. This approach includes the assessment of the level of qualification and motivation of the employee for each key job function, as well as of the final results of its implementation against the criteria of efficiency and reliability. The proposed methodological approach to the estimation of an employee's professionalism makes it possible to identify «bottlenecks» in the structure of his labour functions and to define directions for developing the professional qualities of the worker, so as to ensure the required level of reliability and efficiency of the obtained results. DOI: http://dx.doi.org/10.12731/2218-7405-2013-6-11

  4. Performance Analysis of Unsupervised Clustering Methods for Brain Tumor Segmentation

    Directory of Open Access Journals (Sweden)

    Tushar H Jaware

    2013-10-01

    Medical image processing is among the most challenging and emerging fields of neuroscience. The ultimate goal of medical image analysis in brain MRI is to extract important clinical features that would improve methods of diagnosis & treatment of disease. This paper focuses on methods to detect & extract brain tumors from brain MR images. MATLAB is used to design a software tool for locating brain tumors, based on unsupervised clustering methods. The K-Means clustering algorithm is implemented & tested on a database of 30 images. A performance evaluation of the unsupervised clustering methods is presented.
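
The K-means idea behind such a tool can be illustrated in a few lines of plain NumPy on voxel intensities. This is a generic sketch, not the MATLAB tool described in the record; centers are initialized from intensity percentiles so the toy run is deterministic:

```python
import numpy as np

def kmeans_intensity(intensities, k=3, iters=25):
    """Minimal 1-D K-means on voxel intensities."""
    x = np.asarray(intensities, dtype=float)
    # Deterministic initialization: spread centers across the intensity range.
    centers = np.percentile(x, np.linspace(0, 100, k))
    for _ in range(iters):
        # Assign each voxel to its nearest cluster center.
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        # Move each center to the mean of its cluster (skip empty clusters).
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return labels, centers

# Synthetic "slice": background, healthy tissue, and bright lesion intensities.
img = np.concatenate([np.full(100, 10.0), np.full(100, 120.0), np.full(50, 230.0)])
labels, centers = kmeans_intensity(img, k=3)
```

On real MR data the feature vector would typically include more than raw intensity (e.g. spatial coordinates or texture), but the assignment/update loop is the same.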

  5. Theoretical analysis of integral neutron transport equation using collision probability method with quadratic flux approach

    International Nuclear Information System (INIS)

    Shafii, Mohammad Ali; Meidianti, Rahma; Wildian; Fitriyani, Dian; Tongkukut, Seni H. J.; Arkundato, Artoto

    2014-01-01

    A theoretical analysis of the integral neutron transport equation using the collision probability (CP) method with a quadratic flux approach has been carried out. In general, the solution of the neutron transport equation by the CP method is performed with a flat flux approach. In this research, the CP method is implemented for a cylindrical nuclear fuel cell, with the spatial mesh treated under a non-flat flux approach: the neutron flux at any point in the nuclear fuel cell is considered to differ from point to point, following a quadratic flux distribution. The result is presented here in the form of the quadratic flux, which gives a better understanding of the real conditions in the cell calculation and serves as a starting point for computational calculation

  6. Theoretical analysis of integral neutron transport equation using collision probability method with quadratic flux approach

    Energy Technology Data Exchange (ETDEWEB)

    Shafii, Mohammad Ali, E-mail: mashafii@fmipa.unand.ac.id; Meidianti, Rahma, E-mail: mashafii@fmipa.unand.ac.id; Wildian, E-mail: mashafii@fmipa.unand.ac.id; Fitriyani, Dian, E-mail: mashafii@fmipa.unand.ac.id [Department of Physics, Andalas University Padang West Sumatera Indonesia (Indonesia); Tongkukut, Seni H. J. [Department of Physics, Sam Ratulangi University Manado North Sulawesi Indonesia (Indonesia); Arkundato, Artoto [Department of Physics, Jember University Jember East Java Indonesia (Indonesia)

    2014-09-30

    A theoretical analysis of the integral neutron transport equation using the collision probability (CP) method with a quadratic flux approach has been carried out. In general, the solution of the neutron transport equation by the CP method is performed with a flat flux approach. In this research, the CP method is implemented for a cylindrical nuclear fuel cell, with the spatial mesh treated under a non-flat flux approach: the neutron flux at any point in the nuclear fuel cell is considered to differ from point to point, following a quadratic flux distribution. The result is presented here in the form of the quadratic flux, which gives a better understanding of the real conditions in the cell calculation and serves as a starting point for computational calculation.
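
Schematically, the quadratic flux approach replaces the flat-flux assumption in each mesh region with a second-order spatial expansion. The following is a generic sketch of the idea, not the authors' exact discretization; the coefficient names a_i, b_i, c_i are illustrative:

```latex
\underbrace{\phi_i(r) = \phi_i}_{\text{flat flux}}
\qquad\longrightarrow\qquad
\underbrace{\phi_i(r) \approx a_i + b_i\, r + c_i\, r^{2}}_{\text{quadratic flux}}
```

The collision-probability equations then couple the three expansion coefficients of each region, instead of a single scalar flux per region.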

  7. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has so far been limited to qualitative analysis; the current trend is to develop the method through the introduction of quantitative analysis, which attempts to characterize in detail the defect examined, and to design for a range of object sizes to be examined. The growing commercial demand for quantitative analysis in NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources which are a function of the interferometer. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the factor of divergent illumination and other geometrical factors. The difference between the measurement systems could be attributed to these error factors. (Author)

  8. On dynamical systems approaches and methods in f(R) cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Alho, Artur [Center for Mathematical Analysis, Geometry and Dynamical Systems, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisboa (Portugal); Carloni, Sante [Centro Multidisciplinar de Astrofisica – CENTRA, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisboa (Portugal); Uggla, Claes, E-mail: aalho@math.ist.utl.pt, E-mail: sante.carloni@tecnico.ulisboa.pt, E-mail: claes.uggla@kau.se [Department of Physics, Karlstad University, S-65188 Karlstad (Sweden)

    2016-08-01

    We discuss dynamical systems approaches and methods applied to flat Robertson-Walker models in f(R) gravity. We argue that a complete description of the solution space of a model requires a global state space analysis that motivates globally covering state space adapted variables. This is shown explicitly by an illustrative example, f(R) = R + αR², α > 0, for which we introduce new regular dynamical systems on global compactly extended state spaces for the Jordan and Einstein frames. This example also allows us to illustrate several local and global dynamical systems techniques involving, e.g., blow ups of nilpotent fixed points, center manifold analysis, averaging, and use of monotone functions. As a result of applying dynamical systems methods to globally state space adapted dynamical systems formulations, we obtain pictures of the entire solution spaces in both the Jordan and the Einstein frames. This shows, e.g., that due to the domain of the conformal transformation between the Jordan and Einstein frames, not all the solutions in the Jordan frame are completely contained in the Einstein frame. We also make comparisons with previous dynamical systems approaches to f(R) cosmology and discuss their advantages and disadvantages.

  9. Performance of trim coils made by a novel method

    International Nuclear Information System (INIS)

    Wanderer, P.; Anerella, M.; Cottingham, J.; Ganetis, G.; Garber, M.; Ghosh, A.; Goodzeit, C.; Greene, A.; Gupta, R.; Herrera, J.; Kahn, S.; Kelly, E.; Meade, A.; Morgan, G.; Muratore, J.; Prodell, A.; Rehak, M.; Rohrer, E.P.; Sampson, W.; Shutt, R.; Skaritka, J.; Thompson, P.; Willen, E.

    1991-01-01

    A precision, automated method of manufacturing trim coils based on printed circuit technology has been developed. Excellent quench performance and increased radiation resistance have been achieved in recently-tested models of sextupole trim coils developed for operation inside 40 mm-aperture SSC Main Collider dipoles. 6 refs., 2 figs

  10. Sensitive high performance liquid chromatographic method for the ...

    African Journals Online (AJOL)

    A new simple, sensitive, cost-effective and reproducible high performance liquid chromatographic (HPLC) method for the determination of proguanil (PG) and its metabolites, cycloguanil (CG) and 4-chlorophenylbiguanide (4-CPB) in urine and plasma is described. The extraction procedure is a simple three-step process ...

  11. Visual art teachers and performance assessment methods in ...

    African Journals Online (AJOL)

    This paper examines the competencies of visual arts teachers in using performance assessment methods, and to ascertain the extent to which the knowledge, skills and experiences of teachers affect their competence in using assessment strategies in their classroom. The study employs a qualitative research design; ...

  12. Methods of evaluating performance in controlling marketing activities

    OpenAIRE

    Codruţa Dura

    2002-01-01

    There are specific methods for assessing and improving the effectiveness of a marketing strategy. A marketer should state in the marketing plan what a marketing strategy is supposed to accomplish. These statements should set forth performance standards, which usually are stated in terms of profits, sales, or costs.

  13. Improvement on the Performance of Canal Network and Method of ...

    African Journals Online (AJOL)

    This paper presents the required improvement in the performance of the canal network and the method of on-farm water application systems at the Tunga-Kawo irrigation scheme, Wushishi, Niger State. The problems of poor delivery of water to the farmland were identified to include erosion of canal embankment, lack of water ...

  14. Data-driven performance evaluation method for CMS RPC trigger ...

    Indian Academy of Sciences (India)

    2012-10-06

    Oct 6, 2012 ... hardware-implemented algorithm, which performs the task of combining and merging information from muon ... Figure 1 shows the comparison of efficiencies obtained with the two methods containing .... [3] The CMS Collaboration, The trigger and data acquisition project, Volume 1, The Level 1. Trigger ...

  15. Performance Poetry as a Method to Understand Disability

    Directory of Open Access Journals (Sweden)

    Lee-Ann Fenge

    2016-03-01

    The Seen but Seldom Heard project was a performative social science (PSS) project which used performance poetry to illuminate the experiences of young people with physical impairments. Two performance poets, a group of young people with physical impairments, and academics from social science and media/communication backgrounds worked together to explore various aspects of the lived experience of disability, including issues associated with identity, stereotypes, stigma and representation. In this article, we present an overview of the project, consider how PSS offers a method to engage seldom heard voices, and illustrate this through two poems which shed light on the lived experience of disability. The article considers the impact of these poems as PSS, and how this method allows the audience to develop a deeper understanding of the "lived" experience of disability and to reflect upon their own understandings of disability and discrimination. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1602118

  16. Ultra-high performance liquid chromatography coupled with photo-diode array and quadrupole/time-of-flight mass spectrometry based chemical profiling approach to evaluate the influence of preparation methods on the holistic quality of Qiong-Yu-Gao, a traditional complex herbal medicine.

    Science.gov (United States)

    Xu, Jin-Di; Mao, Qian; Shen, Hong; Zhu, Ling-Ying; Li, Song-Lin; Yan, Ru

    2013-08-23

    Qiong-Yu-Gao (QYG), consisting of Rehmanniae Radix (RR), Poriae (PO) and Ginseng Radix (GR), is a commonly used tonic traditional complex herbal medicine (CHM). So far, three different methods have been documented for the preparation of QYG, i.e. method 1 (M1): mixing powders of GR and PO with a decoction of RR; method 2 (M2): combining the decoction of RR and PO with the decoction of GR; method 3 (M3): decocting the mixture of RR, GR and PO. In the present study, an ultra-high performance liquid chromatography coupled with photo-diode array and quadrupole/time-of-flight mass spectrometry (UHPLC-PDA-QTOF-MS/MS) based chemical profiling approach was developed to investigate the influence of the three preparation methods on the holistic quality of QYG. All detected peaks were unambiguously identified by comparing UV spectra, accurate mass data/characteristic mass fragments and retention times with those of reference compounds, and/or tentatively assigned by matching empirical molecular formulae with those of known compounds, and/or by elucidating quasi-molecular ions and fragment ions with reference to information available in the literature. A total of 103 components, mainly belonging to ginsenosides, phenethylalcohol glycosides, iridoid glycosides and triterpenoid acids, were identified, of which 5 degraded ginsenosides were putatively determined to be newly generated during the preparation procedures of the QYG samples. Triterpenoid acids and malonyl-ginsenosides were detected only in M1 samples, while degraded ginsenosides were detectable merely in M2/M3 samples. The possible reasons for the differences among the chemical profiles of QYG samples prepared with the three methods are also discussed. It can be concluded that the preparation method does significantly affect the holistic quality of QYG. The influence of the altered chemical profiles on the bioactivity of QYG needs further investigation. The present study demonstrated that the UHPLC-PDA-QTOF-MS/MS based chemical profiling approach is efficient and

  17. Evaluation method for the drying performance of enzyme containing formulations

    DEFF Research Database (Denmark)

    Sloth, Jakob; Bach, P.; Jensen, Anker Degn

    2008-01-01

    A method is presented for fast and cheap evaluation of the performance of enzyme-containing formulations in terms of preserving the highest enzyme activity during spray drying. The method is based on modeling the kinetics of the thermal inactivation reaction which occurs during the drying process. Relevant kinetic parameters are determined from differential scanning calorimeter (DSC) experiments, and the model is used to simulate the severity of the inactivation reaction for temperatures and moisture levels relevant to spray drying. After conducting experiments and subsequent simulations for a number of different formulations, it may be deduced which formulation performs best. This is illustrated by a formulation design study in which 4 different enzyme-containing formulations are evaluated. The method is validated by comparison to pilot-scale spray dryer experiments.
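
The kinetic idea can be sketched as follows. This is a hedged illustration only: first-order inactivation with an Arrhenius rate constant is a common model choice for thermal enzyme inactivation, and the parameter values and temperature histories below are invented, not the paper's DSC-fitted values:

```python
import math

R_GAS = 8.314  # J/(mol*K)

def residual_activity(k0, Ea, temps_K, dt):
    """Integrate first-order thermal inactivation dA/dt = -k(T)*A along a
    discretized temperature history, with Arrhenius rate k(T) = k0*exp(-Ea/(R*T))."""
    ln_a = 0.0
    for T in temps_K:
        k = k0 * math.exp(-Ea / (R_GAS * T))  # rate constant at this instant
        ln_a -= k * dt                        # exact step for first-order decay
    return math.exp(ln_a)  # fraction of initial enzyme activity retained

# Two illustrative 30-second drying histories; the hotter one loses more activity.
mild = residual_activity(k0=1e12, Ea=1.0e5, temps_K=[330.0] * 30, dt=1.0)
hot = residual_activity(k0=1e12, Ea=1.0e5, temps_K=[360.0] * 30, dt=1.0)
```

In the paper's framework the temperature (and moisture) history would come from a drying model, and k0 and Ea from the DSC fits.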

  18. Comparing performances of Clements, Box-Cox, Johnson methods with Weibull distributions for assessing process capability

    Energy Technology Data Exchange (ETDEWEB)

    Senvar, O.; Sennaroglu, B.

    2016-07-01

    This study examines Clements' Approach (CA), the Box-Cox transformation (BCT), and the Johnson transformation (JT) for process capability assessment of Weibull-distributed data with different parameters, to figure out the effects of tail behaviour on process capability, and compares their estimation performances in terms of accuracy and precision. Design/methodology/approach: The process performance index (PPI) Ppu is used for process capability analysis (PCA), because the comparisons are performed on generated Weibull data without subgroups. Box plots, descriptive statistics, the root-mean-square deviation (RMSD), which is used as a measure of error, and a radar chart are utilized together to evaluate the performances of the methods. In addition, the bias of the estimated values is as important as the efficiency, measured by the mean square error. In this regard, the Relative Bias (RB) and the Relative Root Mean Square Error (RRMSE) are also considered. Findings: The results reveal that the performance of a method depends on its capability to fit the tail behaviour of the Weibull distribution and on the targeted values of the PPIs. It is observed that the effect of tail behaviour is more significant when the process is more capable. Research limitations/implications: Some other methods, such as the Weighted Variance method, which also give good results, were also conducted. However, we later realized that it would be confusing in terms of comparison issues between the methods for consistent interpretations... (Author)
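
The Box-Cox route to a Ppu estimate can be sketched with SciPy. This is a minimal illustration of one of the three methods compared; the Weibull parameters, sample size, and specification limit below are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Weibull-distributed quality characteristic and an upper specification limit.
data = rng.weibull(1.5, size=2000) * 10.0
usl = 30.0

# Box-Cox transform the data toward normality; apply the same lambda to USL.
z, lam = stats.boxcox(data)
usl_t = (usl**lam - 1.0) / lam if lam != 0 else np.log(usl)

# Upper process performance index on the transformed (approximately normal) scale:
# Ppu = (USL - mean) / (3 * overall standard deviation).
ppu = (usl_t - z.mean()) / (3.0 * z.std(ddof=1))
```

Clements' approach would instead keep the original scale and replace the 3-sigma distances with fitted distribution percentiles, which is exactly why the methods diverge in the distribution tails.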

  19. Chemical profiling approach to evaluate the influence of traditional and simplified decoction methods on the holistic quality of Da-Huang-Xiao-Shi decoction using high-performance liquid chromatography coupled with diode-array detection and time-of-flight mass spectrometry.

    Science.gov (United States)

    Yan, Xuemei; Zhang, Qianying; Feng, Fang

    2016-04-01

    Da-Huang-Xiao-Shi decoction, consisting of Rheum officinale Baill, Mirabilitum, Phellodendron amurense Rupr. and Gardenia jasminoides Ellis, is a traditional Chinese medicine used for the treatment of jaundice. As described in "Jin Kui Yao Lue", a traditional multistep decoction of Da-Huang-Xiao-Shi decoction was required, while a simplified one-step decoction was used in recent reports. To investigate the chemical difference between the decoctions obtained by the traditional and simplified preparations, a sensitive and reliable approach of high-performance liquid chromatography coupled with diode-array detection and electrospray ionization time-of-flight mass spectrometry was established. As a result, a total of 105 compounds were detected and identified. Analysis of the chromatogram profiles of the two decoctions showed that many compounds in the decoction of the simplified preparation had changed markedly compared with those in the traditional preparation. The changes in constituents are bound to cause differences in the therapeutic effects of the two decoctions. The present study demonstrated that preparation methods significantly affect the holistic quality of traditional Chinese medicines and that the use of a suitable preparation method is crucial for these medicines to produce their specific clinical curative effect. These results elucidate the scientific basis of traditional preparation methods in Chinese medicines. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Approaches to Studying and Academic Performance in Short Essay Exams

    Science.gov (United States)

    Minbashian, Amirali; Huon, Gail F.; Bird, Kevin D.

    2004-01-01

    Previous research has generally failed to find a relation between the way students approach the task of studying and their exam grades. The present study investigated why it is that a deep approach to studying, which has been shown to result in a higher quality of learning, does not consistently result in higher exam grades. The participants in…

  1. IS ENVIRONMENTAL ALIGNMENT AND BUSINESS PERFORMANCE: A CONCEPTUAL APPROACH

    Directory of Open Access Journals (Sweden)

    K. Garg

    2012-01-01

    ENGLISH ABSTRACT: This paper proposes a conceptual model termed "IS environmental alignment" that focuses on the support provided by IS strategy to minimize the gap between perceived environmental uncertainty and realized/objective environmental conditions. The model uses the Chan et al [9] alignment measurement method to measure IS strategic alignment, as it provides a quantitative measure. In due course the proposed model will be tested in industry to examine the effect of IS environmental alignment on business performance. The implication of the model lies in the effective use of deployed IS systems by organizations.

    AFRIKAANS SUMMARY: A conceptual model is presented dealing with "IS environmental alignment": IS strategy support in minimizing the gap between perceived environmental uncertainty and realized/objective environmental conditions. The model uses the Chan et al [9] alignment measurement method to determine IS strategic alignment on a quantitative basis. In due course the conceptual model will be tested in practice to show how IS strategic alignment affects business performance. The effective deployment of IS systems in enterprises is advocated.

  2. Proposal for an Evaluation Method for the Performance of Work Procedures.

    Science.gov (United States)

    Mohammed, Mouda; Mébarek, Djebabra; Wafa, Boulagouas; Makhlouf, Chati

    2016-12-01

    Noncompliance of operators with work procedures is a recurrent problem. This human behavior has been said to be situational and has been studied through many different approaches (ergonomic and others), which take the noncompliance with work procedures as a given and seek to analyze its causes as well as its consequences. The object of the proposed method is to address this problem by focusing on the performance of work procedures and ensuring their improvement on a continuous basis. This study has multiple results: (1) assessment of the work procedures' performance by a multicriteria approach; (2) the use of a continuous improvement approach as a framework for the sustainability of the assessment method of work procedures' performance; and (3) adaptation of the Stop-Card as a facilitating support for continuous improvement of work procedures. In contrast with the conventional approaches, which accept the noncompliance with working procedures as evident and seek to analyze the cause-effect relationships related to this unacceptable phenomenon, especially in strategic industries, the proposed method emphasizes the value of the inputs of continuous improvement of the work procedures.

  3. Electrostatic Discharge Current Linear Approach and Circuit Design Method

    Directory of Open Access Journals (Sweden)

    Pavlos K. Katsivelis

    2010-11-01

    The Electrostatic Discharge phenomenon is a great threat to all electronic devices and ICs. An electric charge passing rapidly from a charged body to another can seriously harm the latter. However, there has been a lack of a linear mathematical approach that would make it possible to design a circuit capable of producing such a sophisticated current waveform. The commonly accepted Electrostatic Discharge current waveform is the one set by IEC 61000-4-2. However, the over-simplified circuit included in the same standard is incapable of producing such a waveform. Treating the Electrostatic Discharge current waveform of IEC 61000-4-2 as reference, an approximation method based on Prony's method is developed and applied in order to obtain a linear system's response. Considering a known input, a method to design a circuit able to generate this ESD current waveform is presented. The circuit synthesis assumes ideal active elements. A simulation is carried out using the PSpice software.
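
The Prony approximation step can be sketched generically: fit a sampled waveform as a sum of damped exponentials via linear prediction. This is the textbook procedure, not the authors' exact implementation, and the two-exponential test signal below merely stands in for a discharge-like template:

```python
import numpy as np

def prony(x, p):
    """Classical Prony fit: model x[n] ~ sum_k A_k * z_k**n with p modes."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    # 1) Linear prediction: x[n] + a1*x[n-1] + ... + ap*x[n-p] = 0 for n >= p.
    T = np.column_stack([x[p - 1 - j : N - 1 - j] for j in range(p)])
    a = np.linalg.lstsq(T, -x[p:], rcond=None)[0]
    # 2) The modes z_k are roots of the characteristic polynomial.
    z = np.roots(np.concatenate(([1.0], a)))
    # 3) Amplitudes A_k by least squares on the Vandermonde system V[n, k] = z_k**n.
    V = np.vander(z.astype(complex), N, increasing=True).T
    A = np.linalg.lstsq(V, x.astype(complex), rcond=None)[0]
    return A, z

# Double-exponential test signal, qualitatively like a discharge current template.
n = np.arange(60)
x = 3.0 * 0.9**n - 2.0 * 0.7**n
A, z = prony(x, p=2)
```

Once the exponents and amplitudes are known, each mode maps to a pole of a linear network, which is what enables the circuit synthesis described in the abstract.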

  4. Performance evaluation of sea surface simulation methods for target detection

    Science.gov (United States)

    Xia, Renjie; Wu, Xin; Yang, Chen; Han, Yiping; Zhang, Jianqi

    2017-11-01

    With the fast development of sea surface target detection by optoelectronic sensors, machine learning has been adopted to improve detection performance. Many features can be learned from training images by machines automatically. However, field images of sea surface targets are not sufficient as training data. 3D scene simulation is a promising method to address this problem. For ocean scene simulation, sea surface height field generation is the key to achieving high fidelity. In this paper, two spectra-based height field generation methods are evaluated. A comparison between the linear superposition and linear filter methods is made quantitatively with a statistical model. 3D ocean scene simulation results show the different features of the methods, which can serve as a reference for synthesizing sea surface target images under different ocean conditions.
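
The linear filter idea can be sketched in one dimension: Gaussian white noise is shaped in the frequency domain by the square root of a power spectrum. This is a heavily simplified illustration, not the paper's method; the k⁻⁴-type spectrum below is an invented stand-in for a real ocean-wave spectrum such as Phillips or JONSWAP:

```python
import numpy as np

rng = np.random.default_rng(7)
N, L = 256, 512.0                             # samples, domain length in meters
k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)  # spatial wavenumbers

# Invented Phillips-like spectrum: ~ |k|^-4 with the lowest wavenumbers cut off.
S = np.zeros(N)
nonzero = np.abs(k) > 2.0 * np.pi / L
S[nonzero] = np.abs(k[nonzero]) ** -4.0

# Linear filter method: shape complex white noise by sqrt(S), transform back.
noise = rng.normal(size=N) + 1j * rng.normal(size=N)
height = np.fft.ifft(np.sqrt(S) * noise).real  # one sea-surface height sample
```

The linear superposition method would instead sum discrete sinusoids with spectrum-derived amplitudes and random phases; both target the same second-order statistics, which is why the paper compares them with a statistical model.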

  5. Modeling the Performance of Fast Multipole Method on HPC platforms

    KAUST Repository

    Ibeid, Huda

    2012-04-06

    The current trend in high performance computing is pushing towards exascale computing. To achieve this exascale performance, future systems will have between 100 million and 1 billion cores, assuming gigahertz cores. Currently, there are many efforts studying the hardware and software bottlenecks for building an exascale system. It is important to understand and meet these bottlenecks in order to attain 10 PFLOPS performance. On the applications side, there is an urgent need to model application performance and to understand what changes need to be made to ensure continued scalability at this scale. Fast multipole methods (FMM) were originally developed for accelerating N-body problems for particle-based methods. Nowadays, FMM is more than an N-body solver; recent trends in HPC have been to use FMMs in unconventional application areas. FMM is likely to be a main player at exascale due to its hierarchical nature and the techniques used to access the data via a tree structure, which allow many operations to happen simultaneously at each level of the hierarchy. In this thesis, we discuss the challenges for FMM on current parallel computers and future exascale architectures. Furthermore, we develop a novel performance model for FMM. Our ultimate aim in this thesis is to ensure the scalability of FMM on future exascale machines.

  6. System and Method for Monitoring Piezoelectric Material Performance

    Science.gov (United States)

    Moses, Robert W. (Inventor); Fox, Christopher L. (Inventor); Fox, Melanie L. (Inventor); Chattin, Richard L. (Inventor); Shams, Qamar A. (Inventor); Fox, Robert L. (Inventor)

    2007-01-01

    A system and method are provided for monitoring performance capacity of a piezoelectric material that may form part of an actuator or sensor device. A switch is used to selectively electrically couple an inductor to the piezoelectric material to form an inductor-capacitor circuit. Resonance is induced in the inductor-capacitor circuit when the switch is operated to create the circuit. The resonance of the inductor-capacitor circuit is monitored with the frequency of the resonance being indicative of performance capacity of the device's piezoelectric material.
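
The monitoring principle rests on the standard LC resonance relation: switching a known inductor across the piezoelectric element forms a resonant loop whose frequency reveals the element's effective capacitance. A small sketch (component values are illustrative, not from the patent):

```python
import math

def resonant_frequency(L, C):
    """f = 1 / (2*pi*sqrt(L*C)) for an ideal inductor-capacitor loop."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

def capacitance_from_resonance(L, f):
    """Invert the relation: a shift in measured f reveals a change in the
    piezo's effective capacitance, i.e. in its performance capacity."""
    return 1.0 / (L * (2.0 * math.pi * f) ** 2)

# A 10 mH inductor switched across a ~100 nF piezo element.
f0 = resonant_frequency(10e-3, 100e-9)          # roughly 5 kHz
c_back = capacitance_from_resonance(10e-3, f0)  # recovers ~100 nF
```

A degrading piezo element (cracks, depoling) changes C, so tracking f0 over time gives a simple in-situ health indicator without removing the device.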

  7. Peak Detection Method Evaluation for Ion Mobility Spectrometry by Using Machine Learning Approaches

    DEFF Research Database (Denmark)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna

    2013-01-01

    machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region......-merging with VisualNow, and peak model estimation (PME).We manually generated Metabolites 2013, 3 278 a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods...

  8. A general method for assessing brain-computer interface performance and its limitations

    Science.gov (United States)

    Hill, N. Jeremy; Häuser, Ann-Katrin; Schalk, Gerwin

    2014-04-01

    Objective. When researchers evaluate brain-computer interface (BCI) systems, we want quantitative answers to questions such as: How good is the system’s performance? How good does it need to be? and: Is it capable of reaching the desired level in future? In response to the current lack of objective, quantitative, study-independent approaches, we introduce methods that help to address such questions. We identified three challenges: (I) the need for efficient measurement techniques that adapt rapidly and reliably to capture a wide range of performance levels; (II) the need to express results in a way that allows comparison between similar but non-identical tasks; (III) the need to measure the extent to which certain components of a BCI system (e.g. the signal processing pipeline) not only support BCI performance, but also potentially restrict the maximum level it can reach. Approach. For challenge (I), we developed an automatic staircase method that adjusted task difficulty adaptively along a single abstract axis. For challenge (II), we used the rate of information gain between two Bernoulli distributions: one reflecting the observed success rate, the other reflecting chance performance estimated by a matched random-walk method. This measure includes Wolpaw’s information transfer rate as a special case, but addresses the latter’s limitations including its restriction to item-selection tasks. To validate our approach and address challenge (III), we compared four healthy subjects’ performance using an EEG-based BCI, a ‘Direct Controller’ (a high-performance hardware input device), and a ‘Pseudo-BCI Controller’ (the same input device, but with control signals processed by the BCI signal processing pipeline). Main results. Our results confirm the repeatability and validity of our measures, and indicate that our BCI signal processing pipeline reduced attainable performance by about 33% (21 bits min-1). Significance. Our approach provides a flexible basis
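
Wolpaw's information transfer rate, the special case mentioned in the abstract, can be computed directly. The trial rate and accuracy below are illustrative, and the clamping of the degenerate cases is a simplification, not part of the original definition:

```python
import math

def wolpaw_bits_per_trial(n_choices, accuracy):
    """Wolpaw ITR per selection for an n-choice task with success rate p:
    B = log2(n) + p*log2(p) + (1-p)*log2((1-p)/(n-1))."""
    n, p = n_choices, accuracy
    if p >= 1.0:
        return math.log2(n)  # perfect accuracy carries the full log2(n) bits
    if p <= 0.0:
        return 0.0           # degenerate case, clamped for simplicity
    return (math.log2(n) + p * math.log2(p)
            + (1.0 - p) * math.log2((1.0 - p) / (n - 1)))

# 4-target task at 80% accuracy, 12 selections per minute.
bits_per_min = wolpaw_bits_per_trial(4, 0.80) * 12.0  # ~11.5 bits/min
```

As the abstract notes, this formula assumes an item-selection task with uniform priors and symmetric errors, which is exactly the restriction the paper's information-gain measure is designed to lift.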

  9. Composite Measures of Health Care Provider Performance: A Description of Approaches

    Science.gov (United States)

    Shwartz, Michael; Restuccia, Joseph D; Rosen, Amy K

    2015-01-01

    Context Since the Institute of Medicine’s 2001 report Crossing the Quality Chasm, there has been a rapid proliferation of quality measures used in quality-monitoring, provider-profiling, and pay-for-performance (P4P) programs. Although individual performance measures are useful for identifying specific processes and outcomes for improvement and tracking progress, they do not easily provide an accessible overview of performance. Composite measures aggregate individual performance measures into a summary score. By reducing the amount of data that must be processed, they facilitate (1) benchmarking of an organization’s performance, encouraging quality improvement initiatives to match performance against high-performing organizations, and (2) profiling and P4P programs based on an organization’s overall performance. Methods We describe different approaches to creating composite measures, discuss their advantages and disadvantages, and provide examples of their use. Findings The major issues in creating composite measures are (1) whether to aggregate measures at the patient level through all-or-none approaches or the facility level, using one of the several possible weighting schemes; (2) when combining measures on different scales, how to rescale measures (using z scores, range percentages, ranks, or 5-star categorizations); and (3) whether to use shrinkage estimators, which increase precision by smoothing rates from smaller facilities but also decrease transparency. Conclusions Because provider rankings and rewards under P4P programs may be sensitive to both context and the data, careful analysis is warranted before deciding to implement a particular method. A better understanding of both when and where to use composite measures and the incentives created by composite measures are likely to be important areas of research as the use of composite measures grows. PMID:26626986
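
One of the rescaling-plus-weighting schemes described (z scores aggregated at the facility level) can be sketched as follows; the rates are invented and equal weights are assumed:

```python
import numpy as np

def composite_scores(rates, weights=None):
    """Composite per facility: z-score each measure across facilities
    (rows = facilities, columns = measures), then take a weighted mean."""
    rates = np.asarray(rates, dtype=float)
    z = (rates - rates.mean(axis=0)) / rates.std(axis=0, ddof=1)
    if weights is None:
        weights = np.full(rates.shape[1], 1.0 / rates.shape[1])
    return z @ np.asarray(weights, dtype=float)

# Three facilities scored on two measures where higher is better.
scores = composite_scores([[0.90, 0.75],
                           [0.80, 0.85],
                           [0.70, 0.65]])
ranking = np.argsort(-scores)  # facilities ordered best to worst
```

Swapping the rescaling (ranks, range percentages, star categories) or the weights changes the ranking, which is the sensitivity the article warns about before using composites in P4P programs.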

  10. Reforming High School Science for Low-Performing Students Using Inquiry Methods and Communities of Practice

    Science.gov (United States)

    Bolden, Marsha Gail

    Some schools fall short of the high demand to increase science scores on state exams because low-performing students enter high school unprepared for high school science. Low-performing students are not successful in high school for many reasons; however, using inquiry methods has improved students' understanding of science concepts. The purpose of this qualitative research study was to investigate teachers' lived experiences with using inquiry methods to motivate low-performing high school science students in an inquiry-based program called Xtreem Science. Fifteen teachers were selected from the Xtreem Science program, a program designed to assist teachers in motivating struggling science students. The research questions involved understanding (a) teachers' experiences in using inquiry methods, (b) challenges teachers face in using inquiry methods, and (c) how teachers describe students' responses to inquiry methods. The strategy for data collection and analysis included capturing and understanding the teachers' feelings, perceptions, and attitudes in their lived experience of teaching with inquiry methods and of motivating struggling students. Analysis of interview responses revealed that teachers had good experiences with inquiry, expressed that inquiry shaped their teaching style and approach to topics, and felt that inquiry methods improved student learning. Inquiry gave low-performing students opportunities to catch up and learn information that moved them to the next level of science courses. Implications for positive social change include providing teachers and school district leaders with information to help improve the performance of low-performing science students.

  11. Methodical approach to financial stimulation of logistics managers

    Directory of Open Access Journals (Sweden)

    Melnykova Kateryna V.

    2014-01-01

    The article offers a methodical approach to the financial stimulation of logistics managers that allows the incentive amount to be calculated with consideration of the profit obtained from introducing optimisation logistics solutions. The author generalises measures that would allow enterprise top managers to strengthen incentives for the work of logistics managers. The article identifies motivation factors that influence the attitude of logistics managers towards executing optimisation logistical solutions that minimise logistical costs. The author builds a scale of financial encouragement for the introduction of optimisation logistical solutions proposed by logistics managers. This scale is the basis for the functioning of the encouragement system and contributes to increasing the efficiency of logistics managers' work and to optimising the enterprise's logistical solutions.

  12. Clinical Performance of a Combined Approach for the Esthetic ...

    African Journals Online (AJOL)

    2017-09-14

    Sep 14, 2017 ... leads to mild to severe esthetic problems requiring esthetic ... esthetic management of dental fluorosis, ranging from bleaching ... approaches such involving the use of composite or ceramic .... smoking or poor dental health.

  13. An approach for evaluating expert performance in emergency situations

    International Nuclear Information System (INIS)

    Ujita, Hiroshi; Kawano, Ryutaro; Yoshimura, Sandanori

    1995-01-01

    To understand expert behavior and define what constitutes good performance in emergency situations in large and complex plants, human performance evaluation should consider not only errors but also various cognitive, psychological, and behavioral characteristics. Quantitative and qualitative measures of human performance are proposed for both individual operators and crews, based on an operator performance analysis experiment, among which the cognitive and behavioral aspects are the most important. Operator performance should be further analyzed experimentally from the cognitive and behavioral viewpoints, using an evaluation based on various gross indexes that consider the tasks operators should perform in response to plant situations

  14. An efficient method for evaluating RRAM crossbar array performance

    Science.gov (United States)

    Song, Lin; Zhang, Jinyu; Chen, An; Wu, Huaqiang; Qian, He; Yu, Zhiping

    2016-06-01

    An efficient method is proposed in this paper to mitigate the computational burden of resistive random access memory (RRAM) array simulation. In the worst-case scenario, the cost of simulating a 4 Mb RRAM array with line resistance is greatly reduced using this method. For 1S1R RRAM array structures, static and statistical parameters in both the reading and writing processes are simulated. Error analysis is performed to prove the reliability of the algorithm when the line resistance is extremely small compared with the junction resistance. Results show that high precision is maintained even if the size of the RRAM array is reduced by a factor of one thousand, which indicates significant improvements in both computational efficiency and memory requirements.

  15. Employment of kernel methods on wind turbine power performance assessment

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun S.

    2015-01-01

    A power performance assessment technique is developed for the detection of power production discrepancies in wind turbines. The method employs a widely used nonparametric pattern recognition technique, the kernel methods. The evaluation is based on the trending of an extracted feature from...... the kernel matrix, called similarity index, which is introduced by the authors for the first time. The operation of the turbine and consequently the computation of the similarity indexes is classified into five power bins offering better resolution and thus more consistent root cause analysis. The accurate...
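
    Because the record above is truncated, the exact definition of the similarity index is not available; the sketch below is one plausible reading: a mean RBF-kernel value between a new operating point and healthy reference points within one power bin. All data and the `sigma` bandwidth are invented for illustration.

```python
import math

def gaussian_kernel(x, y, sigma=1.0):
    # RBF kernel between two feature vectors (e.g., wind speed, power output).
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2 / (2 * sigma ** 2))

def similarity_index(sample, references, sigma=1.0):
    # Assumed form of a "similarity index": the mean kernel value between a
    # new observation and healthy reference observations in the same power bin.
    return sum(gaussian_kernel(sample, r, sigma) for r in references) / len(references)

# Reference (healthy) operating points in one power bin: (wind speed m/s, power kW).
refs = [(8.0, 1500.0), (8.2, 1550.0), (7.9, 1480.0)]
print(similarity_index((8.1, 1520.0), refs, sigma=50.0))  # near the references
print(similarity_index((8.1, 1200.0), refs, sigma=50.0))  # power deficit, lower index
```

    Trending such an index over time within each power bin would flag power production discrepancies as a sustained drop in similarity.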

  16. An integrated approach for facilities planning by ELECTRE method

    Science.gov (United States)

    Elbishari, E. M. Y.; Hazza, M. H. F. Al; Adesta, E. Y. T.; Rahman, Nur Salihah Binti Abdul

    2018-01-01

    Facility planning is concerned with the design, layout, and accommodation of people, machines, and activities of a system. Most researchers investigate the production-area layout and the related facilities; however, few investigate the relationship between the production space and the service departments. The aim of this research is to integrate different approaches in order to evaluate, analyse, and select the best facilities planning method able to explain the relationship between the production area and other supporting departments and its effect on human effort. To achieve this objective, two approaches were integrated: Apple's layout procedure, as one of the effective tools in planning factories, and the ELECTRE method, as one of the multi-criteria decision-making (MCDM) methods, to minimise the risk of poor facilities planning. Dalia Industries was selected as a case study for this integration; the factory was divided into two main areas: the whole facility (layout A) and the manufacturing area (layout B). This article is concerned with the manufacturing-area layout (layout B). After analysing the gathered data, the manufacturing area was divided into 10 activities. The alternatives were compared on five factors: interdepartment satisfaction level, total distance travelled by workers, total distance travelled by the product, total travel time for workers, and total travel time for the product. Three layout alternatives were developed in addition to the original layouts. Apple's layout procedure was used to study and evaluate the alternative layouts by calculating scores for each of the factors. After obtaining the scores, the ELECTRE method was used to compare the proposed alternatives with each other and with
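
    The ELECTRE comparison step can be sketched with its concordance index: for each ordered pair of layouts, the share of criterion weight supporting "alternative a is at least as good as b". The weights and scores below are invented; cost-type criteria (distances, times) are negated so that larger is always better.

```python
weights = [0.3, 0.2, 0.2, 0.15, 0.15]  # hypothetical criterion weights summing to 1
# Hypothetical scores on the five factors (satisfaction; negated distances/times).
alternatives = {
    "layout1": [7, -120, -90, -30, -25],
    "layout2": [6, -100, -95, -28, -27],
    "layout3": [8, -130, -85, -32, -24],
}

def concordance(a, b):
    # Share of total weight on criteria where alternative a is at least as good as b.
    return sum(w for w, x, y in zip(weights, alternatives[a], alternatives[b]) if x >= y)

for a in alternatives:
    for b in alternatives:
        if a != b:
            print(a, b, round(concordance(a, b), 2))
```

    A full ELECTRE analysis would also compute discordance indices and apply concordance/discordance thresholds to build the outranking relation; only the concordance step is shown here.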

  17. Approaches for Making High Performance Polymer Materials from Commodity Polymers

    Institute of Scientific and Technical Information of China (English)

    Xu Xi

    2004-01-01

    A brief survey of ongoing research work by the author and the author's colleagues on improving and enhancing the properties of commodity polymers is given in this paper. A series of high-performance polymers and polymer nanomaterials were successfully prepared through irradiation, stress-induced reactions of polymers, and hydrogen bonding. The methods proposed are viable, easy to operate, clean, and efficient. 1. The effect of the irradiation source (UV light, electron beam, γ-ray, and microwave), irradiation dose, irradiation time, atmosphere, etc. on the molecular structure of polyolefins during irradiation was studied. The basic rules governing oxidation, degradation, and cross-linking reactions were established. Under controlled conditions, cross-linking reactions are prevented and some oxygen-containing groups are introduced on the polyolefin molecular chain to facilitate the interface compatibility of their blends. A series of high-performance polymer materials were prepared: u-HDPE/PA6, u-HDPE/CaCO3, u-iPP/STC, γ-HDPE/STC, γ-LLDPE/ATH, e-HDPE, e-LLDPE, and m-HDPE filled systems (u- ultraviolet light irradiated, γ- γ-ray irradiated, e- electron beam irradiated, m- microwave irradiated). 2. The effects of ultrasonic irradiation, jet milling, and pan-milling on the structure and properties of polymers were studied. Imposing a critical stress on a polymer chain can cause bond scission and form macroradicals. The macroradicals formed in this way may recombine or react with monomers or other radicals to form linear, branched, or cross-linked polymers or copolymers. About 20 kinds of block/graft copolymers have been synthesized from polymer-polymer or polymer-monomer systems through ultrasonic irradiation. Through jet milling, the molecular weight of PVC is somewhat decreased and the intensity of its crystalline absorption bands becomes indistinct. The processability, yield strength, strength at break, and elongation at break of PVC increase considerably after

  18. ACCOUNTING STUDENT’S LEARNING APPROACHES AND IMPACT ON ACADEMIC PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Suhaiza Ismail

    2009-12-01

    The objective of the study is threefold. Firstly, the study explores the learning approaches adopted by students in completing their Business Finance course. Secondly, it examines the impact that learning approaches have on students' academic performance. Finally, the study considers gender differences in the learning approaches adopted by students and in the relationship between learning approaches and academic performance. The Approaches and Study Skills Inventory for Students (ASSIST) was used to assess the approaches to learning adopted by students, whilst the students' final examination results were used to examine their performance. The results indicate that the majority of the accounting students, in both the male and female groups, prefer the deep approach when studying Business Finance. The findings also reveal significant relationships between learning approaches and academic performance: positive for the deep and strategic approaches and negative for the surface approach.

  19. Optimization of cooling tower performance analysis using Taguchi method

    Directory of Open Access Journals (Sweden)

    Ramkumar Ramakrishnan

    2013-01-01

    This study discusses the application of the Taguchi method in assessing maximum cooling tower effectiveness for a counter-flow cooling tower using expanded wire mesh packing. The experiments were planned based on Taguchi's L27 orthogonal array. The trials were performed under different inlet conditions of water flow rate, air flow rate, and water temperature. Signal-to-noise (S/N) ratio analysis, analysis of variance (ANOVA), and regression were carried out in order to determine the effects of the process parameters on cooling tower effectiveness and to identify optimal factor settings. Finally, confirmation tests verified the reliability of the Taguchi method for optimising counter-flow cooling tower performance with sufficient accuracy.
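
    Cooling tower effectiveness is a larger-the-better response, so the corresponding Taguchi S/N ratio applies. A minimal sketch, with invented replicate values for two factor settings:

```python
import math

def sn_larger_is_better(values):
    # Taguchi larger-the-better signal-to-noise ratio, in dB:
    # S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )
    n = len(values)
    return -10 * math.log10(sum(1 / y ** 2 for y in values) / n)

# Hypothetical cooling-tower effectiveness replicates for two trial conditions.
trial_a = [0.62, 0.60, 0.63]
trial_b = [0.70, 0.71, 0.69]
print(sn_larger_is_better(trial_a))
print(sn_larger_is_better(trial_b))  # higher effectiveness gives a higher S/N ratio
```

    In a Taguchi analysis, the S/N ratio is computed for every row of the L27 array, and factor levels maximizing the mean S/N are selected as the optimal settings.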

  20. The application of advanced rotor (performance) methods for design calculations

    Energy Technology Data Exchange (ETDEWEB)

    Bussel, G.J.W. van [Delft Univ. of Technology, Inst. for Wind Energy, Delft (Netherlands)

    1997-08-01

    The calculation of loads and performance of wind turbine rotors has been a topic for research over the last century. The principles for the calculation of loads on rotor blades with a given specific geometry, as well as the development of optimal shaped rotor blades have been published in the decades that significant aircraft development took place. Nowadays advanced computer codes are used for specific problems regarding modern aircraft, and application to wind turbine rotors has also been performed occasionally. The engineers designing rotor blades for wind turbines still use methods based upon global principles developed in the beginning of the century. The question what to expect in terms of the type of methods to be applied in a design environment for the near future is addressed here. (EG) 14 refs.

  1. Advanced methods in NDE using machine learning approaches

    Science.gov (United States)

    Wunderlich, Christian; Tschöpe, Constanze; Duckhorn, Frank

    2018-04-01

    Machine learning (ML) methods and algorithms have recently been applied with great success in quality control and predictive maintenance. Their goal, to build new algorithms or leverage existing ones that learn from training data and give accurate predictions or find patterns, particularly in new and unseen but similar data, fits perfectly with non-destructive evaluation. The advantages of ML in NDE are obvious in tasks such as pattern recognition in acoustic signals or automated processing of images from X-ray, ultrasonic, or optical methods. Fraunhofer IKTS uses machine learning algorithms in acoustic signal analysis. The approach has been applied to a variety of quality-assessment tasks. The principal approach is based on acoustic signal processing with primary and secondary analysis steps, followed by a cognitive system to create model data. Already in the secondary analysis step, unsupervised learning algorithms such as principal component analysis are used to simplify data structures. In the cognitive part of the software, further unsupervised and supervised learning algorithms are trained. The sensor signals from unknown samples can then be recognized and classified automatically by the previously trained algorithms. Recently the IKTS team was able to transfer the software for signal processing and pattern recognition to a small printed circuit board (PCB). Algorithms are still trained on an ordinary PC; however, the trained algorithms run on the digital signal processor and the FPGA chip. The identical approach will be used for pattern recognition in image analysis of OCT pictures. Some key requirements have to be fulfilled, however: a sufficiently large set of training data, a high signal-to-noise ratio, and an optimized and exact fixation of components. The automated testing can be done subsequently by the machine. By integrating the test data of many components along the value chain further optimization including lifetime and durability
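
    The use of principal component analysis to simplify data structures, as mentioned for the secondary analysis step, can be sketched as follows. The feature matrix is synthetic and merely stands in for extracted acoustic features.

```python
import numpy as np

# Toy stand-in for acoustic features extracted from test signals
# (rows = signals, columns = features); values are illustrative only.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])

# Principal component analysis: center the data, eigen-decompose the covariance.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort components by variance explained
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project onto the first principal component to simplify the data structure.
scores = Xc @ eigvecs[:, 0]
explained = eigvals[0] / eigvals.sum()
print(f"variance explained by PC1: {explained:.2f}")
```

    Downstream supervised classifiers would then be trained on these low-dimensional scores rather than the raw signal features.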

  2. Optimization of cooling tower performance analysis using Taguchi method

    OpenAIRE

    Ramkumar Ramakrishnan; Ragupathy Arumugam

    2013-01-01

    This study discusses the application of the Taguchi method in assessing maximum cooling tower effectiveness for a counter-flow cooling tower using expanded wire mesh packing. The experiments were planned based on Taguchi's L27 orthogonal array. The trials were performed under different inlet conditions of water flow rate, air flow rate, and water temperature. Signal-to-noise ratio (S/N) analysis, analysis of variance (ANOVA), and regression were carried out in order to determine the effects of process...

  3. Method discussion of the performance evaluation on nuclear plant cable

    International Nuclear Information System (INIS)

    Lu Yongfang; Zhong Weixia; Sun Jiansheng; Liu Jingping

    2014-01-01

    A stock cable of the same type as an in-service nuclear plant cable was subjected to thermal aging. The mechanical property, the flame-retardant property, and the anti-oxidation property were then measured, and the relationships between them under thermal aging were established. These analyses make it possible to evaluate the performance of in-service cable in a nuclear plant and to estimate its remaining life. Furthermore, the feasibility of this method is discussed. (authors)

  4. A method for optimizing the performance of buildings

    Energy Technology Data Exchange (ETDEWEB)

    Pedersen, Frank

    2006-07-01

    This thesis describes a method for optimizing the performance of buildings. Design decisions made in early stages of the building design process have a significant impact on the performance of buildings, for instance, the performance with respect to the energy consumption, economical aspects, and the indoor environment. The method is intended for supporting design decisions for buildings, by combining methods for calculating the performance of buildings with numerical optimization methods. The method is able to find optimum values of decision variables representing different features of the building, such as its shape, the amount and type of windows used, and the amount of insulation used in the building envelope. The parties who influence design decisions for buildings, such as building owners, building users, architects, consulting engineers, contractors, etc., often have different and to some extent conflicting requirements to buildings. For instance, the building owner may be more concerned about the cost of constructing the building, rather than the quality of the indoor climate, which is more likely to be a concern of the building user. In order to support the different types of requirements made by decision-makers for buildings, an optimization problem is formulated, intended for representing a wide range of design decision problems for buildings. The problem formulation involves so-called performance measures, which can be calculated with simulation software for buildings. For instance, the annual amount of energy required by the building, the cost of constructing the building, and the annual number of hours where overheating occurs, can be used as performance measures. The optimization problem enables the decision-makers to specify many different requirements to the decision variables, as well as to the performance of the building. 
Performance measures can for instance be required to assume their minimum or maximum value, they can be subjected to upper or
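
    A minimal sketch of the kind of optimization problem described above, with invented cost and energy models and a coarse grid search standing in for the numerical optimizer: minimize construction cost over two decision variables, subject to an upper bound on an annual-energy performance measure.

```python
# Illustrative decision problem: choose insulation thickness (m) and window area
# (m2) to minimize construction cost, subject to a cap on a simplified annual
# energy performance measure. All model coefficients are made up for this sketch.

def construction_cost(t_ins, a_win):
    return 20000 + 15000 * t_ins + 300 * a_win

def annual_energy(t_ins, a_win):
    # Less insulation and more glazing mean more energy; purely illustrative.
    return 120 / (1 + 10 * t_ins) + 2.5 * a_win

best = None
for t in [i / 100 for i in range(5, 41)]:       # insulation 0.05 .. 0.40 m
    for a in [j / 2 for j in range(10, 61)]:    # windows 5 .. 30 m2
        if annual_energy(t, a) <= 60.0:         # performance-measure constraint
            cost = construction_cost(t, a)
            if best is None or cost < best[0]:
                best = (cost, t, a)

cost, t, a = best
print(f"cost={cost:.0f}, insulation={t:.2f} m, windows={a:.1f} m2")
```

    A real implementation would replace the grid search with a numerical optimization method coupled to building simulation software, as the thesis describes, but the structure of decision variables, performance measures, and constraints is the same.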

  5. ACCOUNTING STUDENT’S LEARNING APPROACHES AND IMPACT ON ACADEMIC PERFORMANCE

    OpenAIRE

    Suhaiza Ismail

    2009-01-01

    The objective of the study is threefold. Firstly, the study explores the learning approaches adopted by students in completing their Business Finance. Secondly, it examines the impact that learning approaches has on the student’s academic performance. Finally, the study considers gender differences in the learning approaches adopted by students and in the relationship between learning approaches and academic performance. The Approaches and Study Skills Inventory for Students (ASSIST) was used...

  6. Total error components - isolation of laboratory variation from method performance

    International Nuclear Information System (INIS)

    Bottrell, D.; Bleyler, R.; Fisk, J.; Hiatt, M.

    1992-01-01

    The consideration of total error across the sampling and analytical components of environmental measurements is relatively recent. The U.S. Environmental Protection Agency (EPA), through the Contract Laboratory Program (CLP), provides complete analyses and documented reports on approximately 70,000 samples per year. The quality assurance (QA) functions of the CLP procedures provide an ideal database, the CLP Automated Results Data Base (CARD), for evaluating program performance relative to quality control (QC) criteria and for evaluating the analysis of blind samples. Repetitive analyses of blind samples within each participating laboratory provide a mechanism to separate laboratory performance from method performance. Isolation of error sources is necessary to identify effective options, to establish performance expectations, and to improve procedures. In addition, optimized method performance is necessary to identify significant effects that result from the selection among alternative procedures in the data collection process (e.g., sampling device, storage container, mode of sample transit, etc.). This information is necessary to evaluate data quality, to understand overall quality, and to provide appropriate, cost-effective information required to support a specific decision

  7. PERFORMANCE MEASURES OF STUDENTS IN EXAMINATIONS: A STOCHASTIC APPROACH

    OpenAIRE

    Goutam Saha; GOUTAM SAHA

    2013-01-01

    Data on Secondary and Higher Secondary examination (science stream) results from Tripura (North-East India) schools are analyzed to measure the performance of students based on tests, and performance measures of schools based on final results and continuous assessment processes are also obtained. The variation in results, in terms of grade points in the Secondary and Higher Secondary examinations, is analysed using different sets of performance measures. The transition probabilities from one g...
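
    The transition probabilities the abstract begins to describe are, in the simplest stochastic model, estimated by row-normalizing a matrix of observed grade-to-grade counts. The counts below are invented for illustration.

```python
# Hypothetical counts of students moving between grade bands from the Secondary
# to the Higher Secondary examination (rows: from-grade, columns: to-grade).
grades = ["A", "B", "C"]
counts = {"A": [30, 15, 5], "B": [10, 40, 10], "C": [2, 18, 20]}

# Maximum-likelihood estimate of each transition probability: row-normalize.
transition = {g: [c / sum(counts[g]) for c in counts[g]] for g in grades}
for g in grades:
    print(g, [round(p, 2) for p in transition[g]])
```

    Each row of the resulting matrix sums to one, so it can be read directly as the probability distribution of a student's next grade given the current one.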

  8. Mixing Methods in Organizational Ethics and Organizational Innovativeness Research : Three Approaches to Mixed Methods Analysis

    OpenAIRE

    Riivari, Elina

    2015-01-01

    This chapter discusses three categories of mixed methods analysis techniques: variable-oriented, case-oriented, and process/experience-oriented. All three categories combine qualitative and quantitative approaches to research methodology. The major differences among the categories are the focus of the study, the available analysis techniques, and the timing aspect of the study. In variable-oriented analysis, the study focus is the relationships between the research phenomena. In case-oriente...

  9. Methodical approaches in the Norwegian Master Plan for Water Resources

    International Nuclear Information System (INIS)

    Bowitz, Einar

    1997-01-01

    The Norwegian Master Plan for Water Resources instructs the management not to consider applications for concession to develop hydroelectric projects in the so called category II of the plan. These are the environmentally most controversial projects or the most expensive projects. This report discusses the methods used in this Master Plan to classify the projects. The question whether the assessments of the environmental disadvantages of hydropower development are reasonable is approached in two ways: (1) Compare the environmental costs imbedded in the Plan with direct assessments, and (2) Discuss the appropriateness of the methodology used for environmental evaluations in the Plan. The report concludes that (1) the environmental costs that can be derived from the ranking in the Plan are significantly greater than those following from direct evaluations, (2) the differences are generally so great that one may ask whether the methods used in the Plan overestimate the real environmental costs, (3) it seems to have been difficult to make a unified assessment of the environmental disadvantages, (4) the Plan has considered the economic impact on agriculture and forestry very roughly and indirectly, which may have contributed to overestimated environmental costs of hydropower development. 20 refs., 6 figs., 7 tabs

  10. Oracle database performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2011-01-01

    A data-driven, fact-based, quantitative text on Oracle performance and scalability With database concepts and theories clearly explained in Oracle's context, readers quickly learn how to fully leverage Oracle's performance and scalability capabilities at every stage of designing and developing an Oracle-based enterprise application. The book is based on the author's more than ten years of experience working with Oracle, and is filled with dependable, tested, and proven performance optimization techniques. Oracle Database Performance and Scalability is divided into four parts that enable reader

  11. Non-animal methods to predict skin sensitization (II): an assessment of defined approaches *.

    Science.gov (United States)

    Kleinstreuer, Nicole C; Hoffmann, Sebastian; Alépée, Nathalie; Allen, David; Ashikaga, Takao; Casey, Warren; Clouet, Elodie; Cluzel, Magalie; Desprez, Bertrand; Gellatly, Nichola; Göbel, Carsten; Kern, Petra S; Klaric, Martina; Kühnl, Jochen; Martinozzi-Teissier, Silvia; Mewes, Karsten; Miyazawa, Masaaki; Strickland, Judy; van Vliet, Erwin; Zang, Qingda; Petersohn, Dirk

    2018-05-01

    Skin sensitization is a toxicity endpoint of widespread concern, for which the mechanistic understanding and concurrent necessity for non-animal testing approaches have evolved to a critical juncture, with many available options for predicting sensitization without using animals. Cosmetics Europe and the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods collaborated to analyze the performance of multiple non-animal data integration approaches for the skin sensitization safety assessment of cosmetics ingredients. The Cosmetics Europe Skin Tolerance Task Force (STTF) collected and generated data on 128 substances in multiple in vitro and in chemico skin sensitization assays selected based on a systematic assessment by the STTF. These assays, together with certain in silico predictions, are key components of various non-animal testing strategies that have been submitted to the Organization for Economic Cooperation and Development as case studies for skin sensitization. Curated murine local lymph node assay (LLNA) and human skin sensitization data were used to evaluate the performance of six defined approaches, comprising eight non-animal testing strategies, for both hazard and potency characterization. Defined approaches examined included consensus methods, artificial neural networks, support vector machine models, Bayesian networks, and decision trees, most of which were reproduced using open source software tools. Multiple non-animal testing strategies incorporating in vitro, in chemico, and in silico inputs demonstrated equivalent or superior performance to the LLNA when compared to both animal and human data for skin sensitization.

  12. Advanced non-destructive methods for an efficient service performance

    International Nuclear Information System (INIS)

    Rauschenbach, H.; Clossen-von Lanken Schulz, M.; Oberlin, R.

    2015-01-01

    Due to the power generation industry's desire to decrease outage time and extend inspection intervals for highly stressed turbine parts, advanced and reliable Non-destructive methods were developed by Siemens Non-destructive laboratory. Effective outage performance requires the optimized planning of all outage activities as well as modern Non-destructive examination methods, in order to examine the highly stressed components (turbine rotor, casings, valves, generator rotor) reliably and in short periods of access. This paper describes the experience of Siemens Energy with an ultrasonic Phased Array inspection technique for the inspection of radial entry pinned turbine blade roots. The developed inspection technique allows the ultrasonic inspection of steam turbine blades without blade removal. Furthermore advanced Non-destructive examination methods for joint bolts will be described, which offer a significant reduction of outage duration in comparison to conventional inspection techniques. (authors)

  13. Performance prediction of electrohydrodynamic thrusters by the perturbation method

    International Nuclear Information System (INIS)

    Shibata, H.; Watanabe, Y.; Suzuki, K.

    2016-01-01

    In this paper, we present a novel method for analyzing electrohydrodynamic (EHD) thrusters. The method is based on a perturbation technique applied to a set of drift-diffusion equations, similar to the one introduced in our previous study on estimating breakdown voltage. The thrust-to-current ratio is generalized to represent the performance of EHD thrusters. We have compared the thrust-to-current ratio obtained theoretically with that obtained from the proposed method under atmospheric air conditions, and we have obtained good quantitative agreement. Also, we have conducted a numerical simulation in more complex thruster geometries, such as the dual-stage thruster developed by Masuyama and Barrett [Proc. R. Soc. A 469, 20120623 (2013)]. We quantitatively clarify the fact that if the magnitude of a third electrode voltage is low, the effective gap distance shortens, whereas if the magnitude of the third electrode voltage is sufficiently high, the effective gap distance lengthens.
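
    For context, the idealized one-dimensional relation often used for such thrusters (e.g., in Masuyama and Barrett's analysis) gives a thrust-to-current ratio of d/μ, with d the electrode gap and μ the ion mobility. The numerical values below are illustrative only.

```python
# Idealized thrust-to-current ratio for a single-stage EHD thruster: T / I = d / mu.
mu = 2.0e-4     # ion mobility in air, m^2/(V*s) (typical order of magnitude)
d = 0.03        # effective electrode gap, m (illustrative)
current = 5e-4  # corona current, A (illustrative)

thrust = current * d / mu  # T = I * d / mu
print(f"thrust-to-current ratio: {d / mu:.0f} N/A, thrust: {thrust * 1e3:.1f} mN")
```

    The paper's point is that multi-electrode geometries change the *effective* gap distance in this relation, which is what the perturbation method quantifies.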

  14. Continuing education for performance improvement: a creative approach.

    Science.gov (United States)

    Collins, Patti-Ann; Hardesty, Ilana; White, Julie L; Zisblatt, Lara

    2012-10-01

    In an effort to improve patient safety and health care outcomes, continuing medical education has begun to focus on performance improvement initiatives for physician practices. Boston University School of Medicine's (BUSM) Continuing Nursing Education Accredited Provider Unit has begun a creative project to award nursing contact hours for nurses' participation in performance improvement activities. This column highlights its initial efforts. Copyright 2012, SLACK Incorporated.

  15. Cooperation, competition, and team performance : Toward a contingency approach

    NARCIS (Netherlands)

    Beersma, Bianca; Hollenbeck, John R.; Humphrey, Stephen E.; Moon, Henry; Conlon, Donald E.; Ilgen, Daniel R.

    2003-01-01

    This study examined whether the relationship between reward structure and team performance is contingent upon task dimension, team composition, and individual performance level. Seventy-five four-person teams engaged in a simulated interactive task in which reward structure was manipulated. A

  16. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics, such as accuracy and precision, are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternative "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity while taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect against the risk of accepting unsuitable methods, and thus have the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on a β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, the choice between the generalized pivotal quantity and β-content (0.9) methods for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current

  17. GRAPH THEORY APPROACH TO QUANTIFY UNCERTAINTY OF PERFORMANCE MEASURES

    Directory of Open Access Journals (Sweden)

    Sérgio D. Sousa

    2015-03-01

    Full Text Available In this work, the performance measurement process is studied to quantify the uncertainty induced in the resulting performance measure (PM). To that end, the causes of uncertainty are identified by analysing the activities undertaken in the three stages of the performance measurement process: design and implementation, data collection and recording, and determination and analysis. A quantitative methodology based on graph theory and on the sources of uncertainty of the performance measurement process is used to calculate an uncertainty index that evaluates the level of uncertainty of a given PM or key performance indicator (KPI). An application example is presented. The quantification of PM uncertainty could contribute to better representing the risk associated with a given decision, and also to improving the PM to increase its precision and reliability.
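
    The abstract does not state the formula for the index. A common graph-theoretic-approach (GTA) construction, shown here purely as an assumed illustration, computes an index as the permanent of a matrix whose diagonal holds the uncertainty contribution of each source (e.g. design, data collection, analysis) and whose off-diagonal entries hold the interdependencies between sources:

```python
def permanent(m):
    """Permanent of a square matrix by Laplace expansion (fine for small n)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += m[0][j] * permanent(minor)
    return total

def uncertainty_index(severity, interaction):
    """GTA-style index: diagonal entries are per-source uncertainty levels,
    off-diagonal entries are interdependence strengths between sources."""
    n = len(severity)
    m = [[severity[i] if i == j else interaction[i][j] for j in range(n)]
         for i in range(n)]
    return permanent(m)
```

    Unlike the determinant, the permanent adds all interaction terms with positive sign, so stronger interdependencies always raise the index.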

  18. The intervals method: a new approach to analyse finite element outputs using multivariate statistics

    Directory of Open Access Journals (Sweden)

    Jordi Marcé-Nogué

    2017-10-01

    Full Text Available Background In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches.
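
    The core variable-generation step of the intervals' method can be sketched as follows: per-element stress values from a finite element model are binned into equal-width stress intervals, and each variable is the percentage of total area falling in that interval. Function and parameter names are illustrative, not taken from the paper; the resulting rows (one per specimen) would then feed a PCA or other multivariate analysis.

```python
def interval_variables(stress, area, n_intervals=10, upper=None):
    """Turn per-element stress values into the intervals-method variables:
    the percentage of total area whose stress falls in each of n_intervals
    equal-width stress bands between 0 and `upper`."""
    if upper is None:
        upper = max(stress)
    width = upper / n_intervals
    total = sum(area)
    shares = [0.0] * n_intervals
    for s, a in zip(stress, area):
        k = min(int(s / width), n_intervals - 1)  # clamp values at/above upper
        shares[k] += a
    return [100.0 * v / total for v in shares]
```

    Because each variable is an area percentage rather than a per-node value, the representation does not depend on the particular finite element mesh, which is the property the authors highlight.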

  19. The intervals method: a new approach to analyse finite element outputs using multivariate statistics

    Science.gov (United States)

    De Esteban-Trivigno, Soledad; Püschel, Thomas A.; Fortuny, Josep

    2017-01-01

    Background In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches. PMID:29043107

  20. Approaching complexity by stochastic methods: From biological systems to turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Friedrich, Rudolf [Institute for Theoretical Physics, University of Muenster, D-48149 Muenster (Germany); Peinke, Joachim [Institute of Physics, Carl von Ossietzky University, D-26111 Oldenburg (Germany); Sahimi, Muhammad [Mork Family Department of Chemical Engineering and Materials Science, University of Southern California, Los Angeles, CA 90089-1211 (United States); Reza Rahimi Tabar, M., E-mail: mohammed.r.rahimi.tabar@uni-oldenburg.de [Department of Physics, Sharif University of Technology, Tehran 11155-9161 (Iran, Islamic Republic of); Institute of Physics, Carl von Ossietzky University, D-26111 Oldenburg (Germany); Fachbereich Physik, Universitaet Osnabrueck, Barbarastrasse 7, 49076 Osnabrueck (Germany)

    2011-09-15

    This review addresses a central question in the field of complex systems: given a fluctuating (in time or space), sequentially measured set of experimental data, how should one analyze the data, assess their underlying trends, and discover the characteristics of the fluctuations that generate the experimental traces? In recent years, significant progress has been made in addressing this question for a class of stochastic processes that can be modeled by Langevin equations, including additive as well as multiplicative fluctuations or noise. Important results have emerged from the analysis of temporal data for such diverse fields as neuroscience, cardiology, finance, economy, surface science, turbulence, seismic time series and epileptic brain dynamics, to name but a few. Furthermore, it has been recognized that a similar approach can be applied to the data that depend on a length scale, such as velocity increments in fully developed turbulent flow, or height increments that characterize rough surfaces. A basic ingredient of the approach to the analysis of fluctuating data is the presence of a Markovian property, which can be detected in real systems above a certain time or length scale. This scale is referred to as the Markov-Einstein (ME) scale, and has turned out to be a useful characteristic of complex systems. We provide a review of the operational methods that have been developed for analyzing stochastic data in time and scale. We address in detail the following issues: (i) reconstruction of stochastic evolution equations from data in terms of the Langevin equations or the corresponding Fokker-Planck equations and (ii) intermittency, cascades, and multiscale correlation functions.
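
    Step (i) above, reconstructing the Langevin drift and diffusion from data, can be sketched with the standard conditional-moment (Kramers-Moyal) estimator: bin the state variable, then average the increments and squared increments in each bin. This minimal illustration omits the Markov-Einstein scale checks and finite-sampling corrections the review discusses; the Ornstein-Uhlenbeck test data are synthetic.

```python
import math
import random

def kramers_moyal(x, dt, n_bins=20):
    """Estimate drift D1(x) and diffusion D2(x) of a Langevin process from a
    time series via conditional moments of the increments:
    D1 = <dX | x> / dt,  D2 = <dX^2 | x> / (2 dt)."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / n_bins
    stats = [[0.0, 0.0, 0] for _ in range(n_bins)]
    for a, b in zip(x[:-1], x[1:]):
        k = min(int((a - lo) / width), n_bins - 1)
        d = b - a
        stats[k][0] += d
        stats[k][1] += d * d
        stats[k][2] += 1
    result = []
    for k, (s1, s2, n) in enumerate(stats):
        if n > 0:
            centre = lo + (k + 0.5) * width
            result.append((centre, s1 / (n * dt), s2 / (2 * n * dt)))
    return result

# Synthetic Ornstein-Uhlenbeck path dX = -X dt + sqrt(2) dW, whose true
# coefficients are D1(x) = -x and D2(x) = 1.
rng = random.Random(0)
dt, xs = 0.01, [0.0]
for _ in range(200000):
    xs.append(xs[-1] - xs[-1] * dt + math.sqrt(2.0 * dt) * rng.gauss(0.0, 1.0))
coeffs = kramers_moyal(xs, dt)
```

    On the synthetic path, the estimated drift is close to -x and the estimated diffusion close to 1 in the well-sampled central bins, recovering the generating equation directly from the data.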

  1. Approaching complexity by stochastic methods: From biological systems to turbulence

    International Nuclear Information System (INIS)

    Friedrich, Rudolf; Peinke, Joachim; Sahimi, Muhammad; Reza Rahimi Tabar, M.

    2011-01-01

    This review addresses a central question in the field of complex systems: given a fluctuating (in time or space), sequentially measured set of experimental data, how should one analyze the data, assess their underlying trends, and discover the characteristics of the fluctuations that generate the experimental traces? In recent years, significant progress has been made in addressing this question for a class of stochastic processes that can be modeled by Langevin equations, including additive as well as multiplicative fluctuations or noise. Important results have emerged from the analysis of temporal data for such diverse fields as neuroscience, cardiology, finance, economy, surface science, turbulence, seismic time series and epileptic brain dynamics, to name but a few. Furthermore, it has been recognized that a similar approach can be applied to the data that depend on a length scale, such as velocity increments in fully developed turbulent flow, or height increments that characterize rough surfaces. A basic ingredient of the approach to the analysis of fluctuating data is the presence of a Markovian property, which can be detected in real systems above a certain time or length scale. This scale is referred to as the Markov-Einstein (ME) scale, and has turned out to be a useful characteristic of complex systems. We provide a review of the operational methods that have been developed for analyzing stochastic data in time and scale. We address in detail the following issues: (i) reconstruction of stochastic evolution equations from data in terms of the Langevin equations or the corresponding Fokker-Planck equations and (ii) intermittency, cascades, and multiscale correlation functions.

  2. Integrated healthcare networks' performance: a growth curve modeling approach.

    Science.gov (United States)

    Wan, Thomas T H; Wang, Bill B L

    2003-05-01

    This study examines the effects of integration on the performance ratings of the top 100 integrated healthcare networks (IHNs) in the United States. A strategic-contingency theory is used to identify the relationship of IHNs' performance to their structural and operational characteristics and integration strategies. To create a database for the panel study, the top 100 IHNs selected by the SMG Marketing Group in 1998 were followed up in 1999 and 2000. The data were merged with the Dorenfest data on information system integration. A growth curve model was developed and validated by the Mplus statistical program. Factors influencing the top 100 IHNs' performance in 1998 and their subsequent rankings in the consecutive years were analyzed. IHNs' initial performance scores were positively influenced by network size, number of affiliated physicians and profit margin, and were negatively associated with average length of stay and technical efficiency. The continuing high performance, judged by maintaining higher performance scores, tended to be enhanced by the use of more managerial or executive decision-support systems. Future studies should include time-varying operational indicators to serve as predictors of network performance.
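
    A latent growth curve model, as fitted in Mplus, jointly estimates an initial level (intercept) and rate of change (slope) for each unit along with their predictors. As a rough, assumed illustration of that idea, a per-network least-squares growth line over the yearly performance scores recovers those two quantities:

```python
def linear_growth(scores):
    """Per-unit least-squares growth line over equally spaced time points:
    returns (initial level, change per period). A full latent growth model
    estimates these jointly with predictors; this OLS fit only sketches it."""
    n = len(scores)
    ts = list(range(n))
    tbar = sum(ts) / n
    ybar = sum(scores) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(ts, scores))
             / sum((t - tbar) ** 2 for t in ts))
    return ybar - slope * tbar, slope
```

    For example, performance scores of 1, 3, 5 over three years give an initial level of 1 and a yearly change of 2; the study's predictors (network size, length of stay, decision-support use) would then be regressed on these two growth factors.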

  3. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
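
    The clustering step described above operates on 0/1 code vectors, one vector per participant and one entry per qualitative code. As an illustrative stand-in for the three methods compared in the study, here is a plain k-means over such binary vectors, with deterministic farthest-point seeding so small samples cluster reproducibly:

```python
def kmeans_binary(data, k, iters=20):
    """Plain k-means over 0/1 code vectors from qualitative coding, with
    deterministic farthest-point seeding of the initial centres."""
    def sqdist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))

    # Seed: first point, then repeatedly the point farthest from all centres.
    centers = [list(data[0])]
    while len(centers) < k:
        far = max(data, key=lambda v: min(sqdist(v, c) for c in centers))
        centers.append(list(far))

    assign = [0] * len(data)
    for _ in range(iters):
        for i, v in enumerate(data):
            assign[i] = min(range(k), key=lambda c: sqdist(v, centers[c]))
        for c in range(k):
            members = [data[i] for i in range(len(data)) if assign[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign
```

    With two well-separated code profiles, even a sample of 20 participants is partitioned cleanly, consistent with the simulation finding that accuracy holds up for samples as small as 50.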

  4. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  5. An hierarchical approach to performance evaluation of expert systems

    Science.gov (United States)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1985-01-01

    The number and size of expert systems are growing rapidly. Formal evaluation of these systems - which is not performed for many systems - increases their acceptability by the user community and hence their success. Hierarchical evaluation, previously conducted for computer systems, is applied here to expert system performance evaluation. Expert systems are also evaluated by treating them as software systems (or programs). This paper reports many of the basic concepts and ideas in the Performance Evaluation of Expert Systems Study being conducted at the University of Southwestern Louisiana.

  6. A high performance parallel approach to medical imaging

    International Nuclear Information System (INIS)

    Frieder, G.; Frieder, O.; Stytz, M.R.

    1988-01-01

    Research into medical imaging using general-purpose parallel processing architectures is described, and a review of the performance of previous medical imaging machines is provided. Results demonstrating that general-purpose parallel architectures can achieve performance comparable to other, specialized, medical imaging machine architectures are presented. A new back-to-front hidden-surface removal algorithm is described. Results demonstrating the computational savings obtained by using the modified back-to-front hidden-surface removal algorithm are presented. Performance figures for forming a full-scale medical image on a mesh-interconnected multiprocessor are presented.

  7. Adopting a blended learning approach to teaching evidence based medicine: a mixed methods study

    Science.gov (United States)

    2013-01-01

    Background Evidence Based Medicine (EBM) is a core unit delivered across many medical schools. Few studies have investigated the most effective method of teaching a course in EBM to medical students. The objective of this study was to identify whether a blended-learning approach to teaching EBM is more effective than a didactic-based approach at increasing medical student competency in EBM. Methods A mixed-methods study was conducted, consisting of a controlled trial and focus groups with second-year graduate medical students. Students received the EBM course delivered using either a didactic approach (DID) to learning EBM or a blended-learning approach (BL). Student competency in EBM was assessed using the Berlin tool and a criterion-based assessment task, with student perceptions of the interventions assessed qualitatively. Results A total of 61 students (85.9%) participated in the study. Competency in EBM did not differ between the groups when assessed using the Berlin tool (p = 0.29). Students using the BL approach performed significantly better in one of the criterion-based assessment tasks (p = 0.01) and reported significantly higher self-perceived competence in critical appraisal skills. Qualitative analysis identified that students had a preference for the EBM course to be delivered using the BL approach. Conclusions Implementing a blended-learning approach to EBM teaching promotes greater student appreciation of EBM principles within the clinical setting. Integrating a variety of teaching modalities and approaches can increase student self-confidence and assist in bridging the gap between the theory and practice of EBM. PMID:24341502

  8. Blended Approach to Occupational Performance (BAOP): Guidelines Enabling Children with Autism

    Directory of Open Access Journals (Sweden)

    Jordan M. Skowronski

    2017-01-01

    Full Text Available The performance of daily activities is impacted by motor impairments in children with autism spectrum disorders (ASD). Research has recently demonstrated the prevalence and specificity of motor impairments in people with ASD. The motor learning of individuals with ASD is partially intact, and evidence suggests that a method to alter skill learning and repeated practice of motor sequences might be beneficial. Aiming to use this knowledge to guide occupational therapy interventions, initial guidelines for children with ASD blending Cognitive Orientation to daily Occupational Performance (CO-OP) with virtual reality (VR) were created. An expert panel reviewed the initial guidelines. The results from the semi-structured expert panel discussion were to (a) increase the number of sessions, (b) provide more visuals to children, and (c) use VR as a reinforcer. Guidelines were revised accordingly. The revised guidelines, called the Blended Approach to Occupational Performance (BAOP), are ready for further testing.

  9. Multipolar Ewald methods, 1: theory, accuracy, and performance.

    Science.gov (United States)

    Giese, Timothy J; Panteva, Maria T; Chen, Haoyuan; York, Darrin M

    2015-02-10

    The Ewald, Particle Mesh Ewald (PME), and Fast Fourier–Poisson (FFP) methods are developed for systems composed of spherical multipole moment expansions. A unified set of equations is derived that takes advantage of a spherical tensor gradient operator formalism in both real space and reciprocal space to allow extension to arbitrary multipole order. The implementation of these methods into a novel linear-scaling modified “divide-and-conquer” (mDC) quantum mechanical force field is discussed. The evaluation times and relative force errors are compared between the three methods, as a function of multipole expansion order. Timings and errors are also compared within the context of the quantum mechanical force field, which encounters primary errors related to the quality of reproducing electrostatic forces for a given density matrix and secondary errors resulting from the propagation of the approximate electrostatics into the self-consistent field procedure, which yields a converged, variational, but nonetheless approximate density matrix. Condensed-phase simulations of an mDC water model are performed with the multipolar PME method and compared to an electrostatic cutoff method, which is shown to artificially increase the density of water and heat of vaporization relative to full electrostatic treatment.

  10. A kernel plus method for quantifying wind turbine performance upgrades

    KAUST Repository

    Lee, Giwhyun

    2014-04-21

    Power curves are commonly estimated using the binning method recommended by the International Electrotechnical Commission, which primarily incorporates wind speed information. When such power curves are used to quantify a turbine's upgrade, the results may not be accurate because many other environmental factors in addition to wind speed, such as temperature, air pressure, turbulence intensity, wind shear and humidity, all potentially affect the turbine's power output. Wind industry practitioners are aware of the need to filter out effects from environmental conditions. Toward that objective, we developed a kernel plus method that allows incorporation of multivariate environmental factors in a power curve model, thereby controlling the effects from environmental factors while comparing power outputs. We demonstrate that the kernel plus method can serve as a useful tool for quantifying a turbine's upgrade because it is sensitive to small and moderate changes caused by certain turbine upgrades. Although we demonstrate the utility of the kernel plus method in this specific application, the resulting method is a general, multivariate model that can connect other physical factors, as long as their measurements are available, with a turbine's power output, which may allow us to explore new physical properties associated with wind turbine performance. © 2014 John Wiley & Sons, Ltd.
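
    The "kernel plus" estimator itself is not specified in the abstract. To illustrate the underlying idea, the sketch below uses a plain multivariate Nadaraya-Watson kernel regression (an assumed stand-in, not the authors' exact method): fit a covariate-conditioned power curve on pre-upgrade data, then compare post-upgrade output against what that model predicts under the same environmental conditions. All names and the toy power model are hypothetical.

```python
import math

def nw_predict(X, y, x0, bandwidths):
    """Nadaraya-Watson regression with a product Gaussian kernel: predicted
    power at an environmental-condition point x0 (e.g. wind speed, temperature)."""
    num = den = 0.0
    for xi, yi in zip(X, y):
        w = math.exp(-0.5 * sum(((a - b) / h) ** 2
                                for a, b, h in zip(xi, x0, bandwidths)))
        num += w * yi
        den += w
    return num / den

def upgrade_ratio(X_before, y_before, X_after, y_after, bandwidths):
    """Mean ratio of observed post-upgrade power to the power the pre-upgrade
    model predicts under the same environmental conditions."""
    ratios = [ya / nw_predict(X_before, y_before, xa, bandwidths)
              for xa, ya in zip(X_after, y_after)]
    return sum(ratios) / len(ratios)

# Synthetic check: a toy cubic power model with a small temperature effect,
# and an "upgrade" that lifts output by 5%.
def toy_power(v, t):
    return v ** 3 * (1.0 - 0.003 * (t - 15.0))

X_before = [(4.0 + 0.25 * i, t) for i in range(33) for t in (10.0, 15.0, 20.0)]
y_before = [toy_power(v, t) for v, t in X_before]
X_after = [(5.0 + 0.5 * i, 15.0) for i in range(13)]
y_after = [1.05 * toy_power(v, t) for v, t in X_after]
estimated = upgrade_ratio(X_before, y_before, X_after, y_after, (0.5, 5.0))
```

    Because the comparison conditions on the environmental covariates, the estimated ratio isolates the upgrade effect from weather differences between the two measurement periods, which is the point the abstract makes.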

  11. A new method to evaluate human-robot system performance

    Science.gov (United States)

    Rodriguez, G.; Weisbin, C. R.

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  12. Corporate Social Responsibility Performance during Crisis. A EU Approach

    Directory of Open Access Journals (Sweden)

    Adina Dornean

    2016-05-01

    Full Text Available This paper aims at analyzing the impact of the financial crisis on Corporate Social Responsibility (CSR) performance, emphasizing the case of companies from European Union (EU) countries. An empirical analysis is conducted using the database available from the Global Reporting Initiative (GRI). For this purpose, we use the Wilcoxon signed-rank test to examine the evolution of CSR performance over the period 2007-2015. Following the GRI reporting guidelines, we transform the application level of the reporting standards into a point-score system. The results indicate increased CSR performance before, during and after the financial crisis, except for 2015, which confirms the results obtained by other researchers. The present study is important both for managers and policymakers: for managers, to continue their CSR actions, given the demonstrated positive relationship between CSR and financial performance; and for authorities, who should adopt more incentives to support companies involved in CSR activities.

  13. Open Economy, Institutional Quality, and Environmental Performance: A Macroeconomic Approach

    Directory of Open Access Journals (Sweden)

    Amaryllis Mavragani

    2016-06-01

    Full Text Available As the subject of how economic development affects the quality of the natural environment has gained great momentum, this paper focuses on examining the extent to which the openness of a market economy and the quality of its institutions affect environmental performance. The majority of current studies focus on the Environmental Kuznets Curve and the level of economic growth. This paper addresses this question by relating an environmental indicator (the "Environmental Performance Index") to macroeconomic indicators (Gross Domestic Product per capita, the "Open Markets Index") and governance indicators (the "Worldwide Governance Indicators"). The sample consists of 75 countries, including all G20 and EU members, comprising "more than 90% of global trade and investment". Findings show that the Environmental Performance Index is positively correlated with each of the (institutional) indicators, confirming that the selected indices are consistent with previous studies and suggesting that environmental performance increases in line with economic development and that good governance increases a country's levels of environmental protection. By applying factor analysis, an empirical model of the Environmental Performance Index is estimated, suggesting that there is a significant positive correlation between a country's economic growth, the openness of its economy, high levels of effective governance, and its environmental performance.

  14. Evaluating Method Engineer Performance: an error classification and preliminary empirical study

    Directory of Open Access Journals (Sweden)

    Steven Kelly

    1998-11-01

    Full Text Available We describe an approach to empirically test the use of metaCASE environments to model methods. Both diagrams and matrices have been proposed as a means for presenting the methods. These different paradigms may have their own effects on how easily and well users can model methods. We extend Batra's classification of errors in data modelling to cover metamodelling, and use it to measure the performance of a group of metamodellers using either diagrams or matrices. The tentative results from this pilot study confirm the usefulness of the classification, and show some interesting differences between the paradigms.

  15. Evaluating operator performance on full-scope simulators: A pragmatic approach to an intractable measurement problem

    International Nuclear Information System (INIS)

    Fuld, R.

    1989-01-01

    Industry trends toward full-scope, plant-referenced control room simulators have accelerated. The cost of such training is high, but the cost of training ineffectiveness is even higher if it permits serious errors or operator disqualification to occur. Effective measures of operator performance are needed, but the complexity of the task environment and the many aspects of and requirements for operator performance conspire to make such measurement a challenging problem. Combustion Engineering (C-E) Owners' Group task No. 572 was undertaken to develop a tractable and effective methodology for evaluating team performance in a requalification context on full-scope simulator scenarios. The following concepts were pursued as design goals for the method: 1. validity; 2. sensitivity; 3. reliability; 4. usability. In addition, the resulting approach was to meet the requirements of ES-601, Implementation Guidance of the NRC for Administration of Requalifying Exams. A survey of existing evaluation tools and techniques was made to determine the strengths and weaknesses of each. Based on those findings, a multimethod approach was developed drawing on the combined strengths of several general methods. The paper discusses procedural milestones, comments as subjective ratings, failure criteria, and tracked plant parameters.

  16. Explaining African Growth Performance: A Production-Frontier Approach

    OpenAIRE

    Romain Houssa; Oleg Badunenko; Daniel J. Henderson

    2010-01-01

    This paper employs a production frontier approach that allows distinguishing technological progress from efficiency development. Data on 35 African countries in 1970-2007 show that efficiency losses have constrained growth in Africa, while technological progress has played a marginal growth-enhancing role in the region. Moreover, physical and human capital accumulation are the main factors that drive productivity growth at the country level. Examining the outcomes of successful countries suggests t...

  17. Diversification Strategies and Firm Performance: A Sample Selection Approach

    OpenAIRE

    Santarelli, Enrico; Tran, Hien Thu

    2013-01-01

    This paper is based upon the assumption that firm profitability is determined by its degree of diversification which in turn is strongly related to the antecedent decision to carry out diversification activities. This calls for an empirical approach that permits the joint analysis of the three interrelated and consecutive stages of the overall diversification process: diversification decision, degree of diversification, and outcome of diversification. We apply parametric and semiparametric ap...

  18. Evaluating a physician leadership development program - a mixed methods approach.

    Science.gov (United States)

    Throgmorton, Cheryl; Mitchell, Trey; Morley, Tom; Snyder, Marijo

    2016-05-16

    Purpose - With the extent of change in healthcare today, organizations need strong physician leaders. To compensate for the lack of physician leadership education, many organizations are sending physicians to external leadership programs or developing in-house leadership programs targeted specifically to physicians. The purpose of this paper is to outline the evaluation strategy and outcomes of the inaugural year of a Physician Leadership Academy (PLA) developed and implemented at a Michigan-based regional healthcare system. Design/methodology/approach - The authors applied the theoretical framework of Kirkpatrick's four levels of evaluation and used surveys, observations, activity tracking, and interviews to evaluate the program outcomes. The authors applied grounded theory techniques to the interview data. Findings - The program met targeted outcomes across all four levels of evaluation. Interview themes focused on the significance of increasing self-awareness, building relationships, applying new skills, and building confidence. Research limitations/implications - While only one example, this study illustrates the importance of developing the evaluation strategy as part of the program design. Qualitative research methods, often lacking from learning evaluation design, uncover rich themes of impact. The study supports how a PLA program can enhance physician learning, engagement, and relationship building throughout and after the program. Physician leaders' partnership with organization development and learning professionals yield results with impact to individuals, groups, and the organization. Originality/value - Few studies provide an in-depth review of evaluation methods and outcomes of physician leadership development programs. Healthcare organizations seeking to develop similar in-house programs may benefit applying the evaluation strategy outlined in this study.

  19. Experimental Methods for UAV Aerodynamic and Propulsion Performance Assessment

    Directory of Open Access Journals (Sweden)

    Stefan ANTON

    2015-06-01

    Full Text Available This paper presents an experimental method for assessing the performance and propulsion power of a UAV at several points based on telemetry. The points at which the estimations are made are chosen based on several criteria, and the following parameters are measured: airspeed, time-to-climb, altitude and horizontal distance. With the estimated propulsion power and the known shaft motor power, the propeller efficiency is determined at several speed values. The shaft motor power was measured in the lab using the propeller as a brake. Many flights with the same UAV configuration were performed before extracting flight data, in order to reduce instrumental or statistical errors. This paper highlights both the methodology of processing the data and the validation of the theoretical results.
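As a rough illustration of the kind of estimate the abstract describes, the sketch below computes required propulsion power from climb telemetry via a simple steady-state energy balance (climb power plus parasitic drag power) and derives propeller efficiency from the bench-measured shaft power. The formulation and every parameter value (air density, wing area, drag coefficient) are illustrative assumptions, not the paper's actual equations.

```python
# Illustrative energy-balance estimate of UAV propulsion power from climb
# telemetry. All default parameter values are placeholder assumptions.

def propulsion_power(mass_kg, airspeed, climb_rate,
                     rho=1.225, wing_area=0.5, cd=0.05, g=9.81):
    """Propulsive power in steady climb [W]: climb power + drag power."""
    drag = 0.5 * rho * airspeed**2 * wing_area * cd   # parasitic drag [N]
    return mass_kg * g * climb_rate + drag * airspeed

def propeller_efficiency(propulsion_w, shaft_w):
    """Propeller efficiency = useful propulsive power / shaft motor power."""
    return propulsion_w / shaft_w
```

With the shaft power measured on the bench (propeller as a brake), the efficiency at each telemetry point follows directly from the two numbers.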

  20. Exploration of submarine wake and powering performance using CFD method

    International Nuclear Information System (INIS)

    Huizhi, Y.; Hongcui, S.; Nan, Z.; Renyou, Y.; Liangmei, Y.

    2005-01-01

    In response to the need for better designs in less time, Computational Fluid Dynamics (CFD) methods have become an integral part of ship design, especially in the earlier design phases. In this paper the FLUENT software was used to predict the wake characteristics and powering performance of a submarine at model scale. Through an effective combination of block topology, grid, turbulence model and validation, a simulation scheme was developed and applied to predictions for multiple designs and optimizations in the early submarine design iterations. The incompressible RANS equations were solved with different turbulence models. To handle the block interface between the propeller and the submarine stern, sliding grids in multiple blocks were employed, and unstructured grids were used in the block around the propeller. The submarine with and without stator and/or propeller was studied. The flow features, forces and powering performance at various conditions were calculated. The results were compared with experimental data, and good agreement was obtained. (author)

  1. Numerical Methods Application for Reinforced Concrete Elements-Theoretical Approach for Direct Stiffness Matrix Method

    Directory of Open Access Journals (Sweden)

    Sergiu Ciprian Catinas

    2015-07-01

    Full Text Available A detailed theoretical and practical investigation of reinforced concrete elements is warranted by the recent techniques and methods being implemented in the construction market. Moreover, a theoretical study allows a better and faster approach nowadays, due to the rapid development of computational techniques. This paper presents a study on implementing the direct stiffness matrix method in a static calculus, in order to address phenomena related to different stages of loading, rapid changes of cross-section area and physical properties. The method is in demand because at present the FEM (Finite Element Method) is the only alternative for such a calculus, and FEM is considered expensive in terms of time and computational resources. The main goal of such a method is to create the moment-curvature diagram for the cross section being analyzed. The paper presents some of the most important techniques, as well as new ideas, for creating the moment-curvature diagram in the cross sections considered.
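To make the direct stiffness idea concrete, here is a minimal generic sketch for a 1D chain of axial elements, not the paper's reinforced concrete formulation: each element contributes a stiffness k = EA/l to the global matrix, and the system K u = F is solved after applying the fixed-end boundary condition. All names and values are illustrative.

```python
# Generic direct stiffness method sketch: axial bar fixed at the left end,
# point load at the tip. Illustrative only.

def assemble_global(n_elems, EA, L):
    """Assemble the global stiffness matrix for n_elems equal elements."""
    n = n_elems + 1
    K = [[0.0] * n for _ in range(n)]
    k = EA / (L / n_elems)                     # element stiffness EA / l_e
    for e in range(n_elems):
        K[e][e] += k; K[e + 1][e + 1] += k
        K[e][e + 1] -= k; K[e + 1][e] -= k
    return K

def solve_tip_load(n_elems, EA, L, F):
    """Tip displacement under end load F (analytic answer: F*L/(EA))."""
    K = assemble_global(n_elems, EA, L)
    n = n_elems
    # boundary condition u[0] = 0 -> drop first row and column
    A = [row[1:] for row in K[1:]]
    b = [0.0] * n; b[-1] = F
    # naive Gaussian elimination (no pivoting needed: SPD tridiagonal)
    for i in range(n):
        for j in range(i + 1, n):
            f = A[j][i] / A[i][i]
            for c in range(i, n):
                A[j][c] -= f * A[i][c]
            b[j] -= f * b[i]
    u = [0.0] * n
    for i in reversed(range(n)):
        u[i] = (b[i] - sum(A[i][c] * u[c] for c in range(i + 1, n))) / A[i][i]
    return u[-1]
```

The same assembly-then-solve pattern carries over to beam elements; only the element stiffness matrix changes.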

  2. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    Directory of Open Access Journals (Sweden)

    Margaret M. MacDonell

    2013-01-01

    Full Text Available The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities.

  3. A full digital approach to the TDCR method

    International Nuclear Information System (INIS)

    Mini, Giuliano; Pepe, Francesco; Tintori, Carlo; Capogni, Marco

    2014-01-01

    Current state-of-the-art solutions based on the Triple to Double Coincidence Ratio method are generally large, heavy and non-transportable systems. This is due, on one side, to large detectors and scintillation chambers and, on the other, to bulky analog electronics for data acquisition. CAEN developed a new, fully digital approach to the TDCR technique based on a portable, stand-alone, high-speed multichannel digitizer, on-board Digital Pulse Processing and dedicated DAQ software that emulates the well-known MAC3 analog board. - Highlights: • CAEN desktop digitizers used to emulate the MAC3 analog board in TDCR acquisition. • Spectroscopic application of the CAEN digitizers to the TDCR for charge spectra. • Development of two different software packages by CAEN and ENEA-INMRI for TDCR analysis. • Single-electron peak obtained by CAEN digitizer and ENEA-INMRI portable TDCR. • Measurements of 90Sr/90Y by the new TDCR device equipped with CAEN digitizers

  4. HUMAN ERROR QUANTIFICATION USING PERFORMANCE SHAPING FACTORS IN THE SPAR-H METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Harold S. Blackman; David I. Gertman; Ronald L. Boring

    2008-09-01

    This paper describes a cognitively based human reliability analysis (HRA) quantification technique for estimating the human error probabilities (HEPs) associated with operator and crew actions at nuclear power plants. The method described here, the Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) method, was developed to aid in characterizing and quantifying human performance at nuclear power plants. The intent was to develop a defensible method that would consider all factors that may influence performance. In the SPAR-H approach, calculation of HEP rates is especially straightforward, starting with pre-defined nominal error rates for cognitive vs. action-oriented tasks and applying performance shaping factor multipliers to those nominal error rates.
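The quantification scheme described above can be sketched in a few lines. The nominal HEPs (0.01 for diagnosis tasks, 0.001 for action tasks) and the adjustment applied when three or more performance shaping factors are negative follow the published SPAR-H scheme, but the function below is a simplified illustration, not the official SPAR-H worksheet.

```python
# Simplified SPAR-H-style HEP calculation: nominal error rate times the
# composite of the performance shaping factor (PSF) multipliers.

def spar_h_hep(task_type, psfs):
    """Return an HEP for a 'diagnosis' or 'action' task given PSF
    multipliers. With three or more negative PSFs (multiplier > 1), the
    SPAR-H adjustment factor keeps the result a valid probability."""
    nominal = {"diagnosis": 0.01, "action": 0.001}[task_type]
    composite = 1.0
    for m in psfs:
        composite *= m
    hep = nominal * composite
    if sum(1 for m in psfs if m > 1) >= 3:
        # adjustment: NHEP * PSF / (NHEP * (PSF - 1) + 1)
        hep = nominal * composite / (nominal * (composite - 1) + 1)
    return min(hep, 1.0)
```

For example, an action task with all-nominal PSFs keeps its 0.001 base rate, while strongly negative PSFs drive the estimate toward, but never past, 1.0.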

  5. Fuzzy Approach in Ranking of Banks according to Financial Performances

    Directory of Open Access Journals (Sweden)

    Milena Jakšić

    2016-01-01

    Full Text Available Evaluating bank performance on a yearly basis and making comparisons among banks over certain time intervals provide an insight into the general financial state of banks and their relative position with respect to the environment (creditors, investors, and stakeholders). The aim of this study is to propose a new fuzzy multicriteria model to evaluate banks with respect to the relative importance of financial performances and their values. The relative importance of each pair of financial performance groups is assessed by linguistic expressions, which are modeled by triangular fuzzy numbers. Fuzzy Analytic Hierarchy Process (FAHP) is applied to determine the relative weights of the financial performances. In order to rank the treated banks, a new model based on the Fuzzy Technique for Order Performance by Similarity to Ideal Solution (FTOPSIS) is deployed. The proposed model is illustrated by an example using real-life data from 12 banks holding an 80% share of the Serbian market. In order to verify the proposed FTOPSIS, different measures of separation are used. The presented solution enables the ranking of banks, gives stakeholders an insight into the banks' state, and provides a basis for successful improvement of strategy quality in bank business.
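Underlying the FTOPSIS ranking is the classical TOPSIS closeness coefficient; a crisp (non-fuzzy) sketch of that core is shown below, with the fuzzy weighting via FAHP omitted. All input values are illustrative.

```python
# Crisp TOPSIS: rank alternatives by closeness to the ideal solution.

def topsis(matrix, weights, benefit):
    """matrix[i][j] is the score of alternative i on criterion j;
    benefit[j] is True when larger is better. Returns the closeness
    coefficient of each alternative (higher = better)."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each criterion column, then apply the weights
    norm = [(sum(matrix[i][j] ** 2 for i in range(m)) ** 0.5) or 1.0
            for j in range(n)]
    V = [[weights[j] * matrix[i][j] / norm[j] for j in range(n)]
         for i in range(m)]
    ideal = [max(V[i][j] for i in range(m)) if benefit[j]
             else min(V[i][j] for i in range(m)) for j in range(n)]
    anti = [min(V[i][j] for i in range(m)) if benefit[j]
            else max(V[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_plus = sum((V[i][j] - ideal[j]) ** 2 for j in range(n)) ** 0.5
        d_minus = sum((V[i][j] - anti[j]) ** 2 for j in range(n)) ** 0.5
        scores.append(d_minus / (d_plus + d_minus))  # closeness coefficient
    return scores
```

The fuzzy variant replaces the crisp weights and scores with triangular fuzzy numbers and a fuzzy distance measure, but the ideal/anti-ideal logic is the same.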

  6. Delineating species with DNA barcodes: a case of taxon dependent method performance in moths.

    Directory of Open Access Journals (Sweden)

    Mari Kekkonen

    Full Text Available The accelerating loss of biodiversity has created a need for more effective ways to discover species. Novel algorithmic approaches for analyzing sequence data combined with rapidly expanding DNA barcode libraries provide a potential solution. While several analytical methods are available for the delineation of operational taxonomic units (OTUs), few studies have compared their performance. This study compares the performance of one morphology-based and four DNA-based (BIN, parsimony networks, ABGD, GMYC) methods on two groups of gelechioid moths. It examines 92 species of Finnish Gelechiinae and 103 species of Australian Elachistinae which were delineated by traditional taxonomy. The results reveal a striking difference in performance between the two taxa with all four DNA-based methods. OTU counts in the Elachistinae showed a wider range and a relatively low (ca. 65%) OTU match with reference species while OTU counts were more congruent and performance was higher (ca. 90%) in the Gelechiinae. Performance rose when only monophyletic species were compared, but the taxon-dependence remained. None of the DNA-based methods produced a correct match with non-monophyletic species, but singletons were handled well. A simulated test of morphospecies-grouping performed very poorly in revealing taxon diversity in these small, dull-colored moths. Despite the strong performance of analyses based on DNA barcodes, species delineated using single-locus mtDNA data are best viewed as OTUs that require validation by subsequent integrative taxonomic work.

  7. Methods of performing downhole operations using orbital vibrator energy sources

    Science.gov (United States)

    Cole, Jack H.; Weinberg, David M.; Wilson, Dennis R.

    2004-02-17

    Methods of performing downhole operations in a wellbore. A vibrational source is positioned within a tubular member such that an annulus is formed between the vibrational source and an interior surface of the tubular member. A fluid medium, such as high bulk modulus drilling mud, is disposed within the annulus. The vibrational source forms a fluid coupling with the tubular member through the fluid medium to transfer vibrational energy to the tubular member. The vibrational energy may be used, for example, to free a stuck tubular, consolidate a cement slurry and/or detect voids within a cement slurry prior to the curing thereof.

  8. High-performance liquid chromatographic method for guanylhydrazone compounds.

    Science.gov (United States)

    Cerami, C; Zhang, X; Ulrich, P; Bianchi, M; Tracey, K J; Berger, B J

    1996-01-12

    A high-performance liquid chromatographic method has been developed for a series of aromatic guanylhydrazones that have demonstrated therapeutic potential as anti-inflammatory agents. The compounds were separated using octadecyl or diisopropyloctyl reversed-phase columns, with an acetonitrile gradient in water containing heptane sulfonate, tetramethylammonium chloride, and phosphoric acid. The method was used to reliably quantify levels of analyte as low as 785 ng/ml, and the detector response was linear to at least 50 micrograms/ml using a 100 microliters injection volume. The assay system was used to determine the basic pharmacokinetics of a lead compound, CNI-1493, from serum concentrations following a single intravenous injection in rats.

  9. PERFORMANCE MANAGEMENT APPROACHES IN ECONOMIC ORGANIZATIONS USING INFORMATION TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Anca Mehedintu

    2012-03-01

    Full Text Available Performance management includes activities that ensure that goals are consistently being met in an effective and efficient manner. Performance management can focus on the performance of an organization, a department, an employee, or even the processes to build a product or service, as well as many other areas. In these days of globalization and intensive use of information technology, organizations must define and implement an appropriate strategy that supports their medium-term development, stability and competitiveness. This is achieved through a coherent and interrelated set of activities: understanding customer expectations and the level at which the organization's offer adds value to customers and satisfies their needs; defining the internal organization to allow a timely response to market demands without losing focus on the client; tracking the strategy and business model toward the accomplishment of the organization's mission; aligning IT project management, existing or under development, with the strategic management of the organization; etc. Strategic management determines the improvement of processes, the effective use of resources, focus on critical areas in terms of finance, the creation of opportunities for innovation and technological progress, improvement of the supply mechanism, the promotion of personal interaction and negotiation at all levels, continuous assessment of the organization and its technological trends, and analysis of the market potential and field of competence. A strategic management system will not give good results if the strategy is not defined by a set of operational objectives clearly stated at all levels. Business performance is based on a set of analytical business processes, supported by information technology, that define strategic goals measurable by performance indicators. Enterprise Performance Management creates a powerful and precise environment, characterized by data consistency, efficiency analysis

  10. A simplified method for evaluating thermal performance of unglazed transpired solar collectors under steady state

    International Nuclear Information System (INIS)

    Wang, Xiaoliang; Lei, Bo; Bi, Haiquan; Yu, Tao

    2017-01-01

    Highlights: • A simplified method for evaluating thermal performance of UTC is developed. • Experiments, numerical simulations, dimensional analysis and data fitting are used. • The correlation of absorber plate temperature for UTC is established. • The empirical correlation of heat exchange effectiveness for UTC is proposed. - Abstract: Due to the advantages of low investment and high energy efficiency, unglazed transpired solar collectors (UTC) have been widely used for heating in buildings. However, it is difficult for designers to quickly evaluate the thermal performance of UTC based on the conventional methods such as experiments and numerical simulations. Therefore, a simple and fast method to determine the thermal performance of UTC is indispensable. The objective of this work is to provide a simplified calculation method to easily evaluate the thermal performance of UTC under steady state. Different parameters are considered in the simplified method, including pitch, perforation diameter, solar radiation, solar absorptivity, approach velocity, ambient air temperature, absorber plate temperature, and so on. Based on existing design parameters and operating conditions, correlations for the absorber plate temperature and the heat exchange effectiveness are developed using dimensional analysis and data fitting, respectively. Results show that the proposed simplified method has a high accuracy and can be employed to evaluate the collector efficiency, the heat exchange effectiveness and the air temperature rise. The proposed method in this paper is beneficial to directly determine design parameters and operating status for UTC.
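Two of the summary quantities the simplified method targets have standard definitions that can be stated directly. The sketch below uses those textbook definitions of heat-exchange effectiveness and collector efficiency for an unglazed transpired collector (UTC); it does not reproduce the paper's fitted correlations for the absorber plate temperature, and all input values in the test are illustrative.

```python
# Standard UTC performance definitions (illustrative sketch).

def utc_effectiveness(t_out, t_amb, t_plate):
    """Heat-exchange effectiveness: fraction of the maximum possible
    air temperature rise (plate minus ambient) actually achieved."""
    return (t_out - t_amb) / (t_plate - t_amb)

def utc_efficiency(m_dot, cp, t_out, t_amb, area, solar_irradiance):
    """Collector efficiency: useful heat gain of the air stream
    (m_dot * cp * dT) per unit of incident solar power (area * G)."""
    return m_dot * cp * (t_out - t_amb) / (area * solar_irradiance)
```

Given a correlation for the plate temperature, these two relations are enough to back out the air temperature rise, which is the design quantity of interest.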

  11. A linear regression approach to evaluate the green supply chain management impact on industrial organizational performance.

    Science.gov (United States)

    Mumtaz, Ubaidullah; Ali, Yousaf; Petrillo, Antonella

    2018-05-15

    The increase in environmental pollution is one of the most important topics in today's world. In this context, industrial activities can pose a significant threat to the environment. To manage problems associated with industrial activities, several methods, techniques and approaches have been developed. Green supply chain management (GSCM) is considered one of the most important "environmental management approaches". In developing countries such as Pakistan, the implementation of GSCM practices is still in its initial stages. Lack of knowledge about their effects on economic performance is the reason why industries fear to implement these practices. The aim of this research is to assess the effects of GSCM practices on organizational performance in Pakistan. The GSCM practices considered are: internal practices, external practices, investment recovery and eco-design, while the performance parameters considered are: environmental pollution, operational cost and organizational flexibility. A set of hypotheses proposes the effect of each GSCM practice on the performance parameters. Factor analysis and linear regression are used to analyze the survey data of Pakistani industries, in order to test these hypotheses. The findings of this research indicate a decrease in environmental pollution and operational cost with the implementation of GSCM practices, whereas organizational flexibility has not improved for Pakistani industries. These results aim to help managers in their decisions on implementing GSCM practices in the industrial sector of Pakistan. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Time pressure undermines performance more under avoidance than approach motivation

    NARCIS (Netherlands)

    Roskes, M.; Elliot, A.J.; Nijstad, B.A.; de Dreu, C.K.W.

    2013-01-01

    Four experiments were designed to test the hypothesis that performance is particularly undermined by time pressure when people are avoidance motivated. The results supported this hypothesis across three different types of tasks, including those well suited and those ill suited to the type of

  13. A Constructive Conceptual Approach to Strategic Performance Measurement

    DEFF Research Database (Denmark)

    Mitchell, Falconer; Nielsen, Lars Bråd; Nørreklit, Hanne

    This paper focuses on identifying the key characteristics of a good strategic performance. It does this from a conceptual base founded in the paradigm of pragmatic constructivism. This involves analysing real world activities such as strategy setting and implementation in terms of the facts on wh...

  14. Time Pressure Undermines Performance More Under Avoidance Than Approach Motivation

    NARCIS (Netherlands)

    Roskes, Marieke; Elliot, Andrew J.; Nijstad, Bernard A.; De Dreu, Carsten K. W.

    Four experiments were designed to test the hypothesis that performance is particularly undermined by time pressure when people are avoidance motivated. The results supported this hypothesis across three different types of tasks, including those well suited and those ill suited to the type of

  15. A Performative Approach to Teaching Care Ethics: A Case Study

    Science.gov (United States)

    Hamington, Maurice

    2012-01-01

    This article describes a unique experiment in reconceptualizing the teaching of ethics as an embodied, performative activity rather than a purely intellectual, scholarly study. Although the inclusion of corporeal dimensions in the teaching of ethics makes intuitive sense, because morality is all about how one acts in the world, ethics education in…

  16. Individual match approach to Bowling performance measures in ...

    African Journals Online (AJOL)

    Match conditions can play a significant role in player performances in a cricket match. If the pitch is in a good condition, the batsmen can achieve good scores, making it difficult for the bowlers. In the case of an uneven pitch or adverse weather conditions, the bowlers may have the upper hand. In order to measure bowlers' ...

  17. Storytelling for Fluency and Flair: A Performance-Based Approach

    Science.gov (United States)

    Campbell, Terry; Hlusek, Michelle

    2015-01-01

    In the classroom experiences described in this article, grade three students were introduced to storytelling through the interactive read aloud of a mentor text and a storytelling demonstration, followed by daily collaborative activities involving listening, speaking, reading, and writing, culminating in dramatic storytelling performances. The…

  18. A perturbative approach for enhancing the performance of time series forecasting.

    Science.gov (United States)

    de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C

    2017-04-01

    This paper proposes a method to perform time series prediction based on perturbation theory. The approach is based on continuously adjusting an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecast for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecast. If that residual series is not white noise, it can be used to improve the accuracy of the initial model, and a new predictive model is fitted to the residual series. The whole process is repeated until convergence or until the residual series becomes white noise. The output of the method is then given by summing the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real-world time series. A comparison was made with six other methods evaluated in the same experiments and with ten other results found in the literature. The results show that not only is the performance of the initial model significantly improved, but the proposed method also outperforms the other previously published results. Copyright © 2017 Elsevier Ltd. All rights reserved.
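The iterative residual-modeling loop described above can be sketched as follows. The paper is agnostic about the predictive model used at each stage; here a least-squares AR(1) fit stands in as an illustrative choice, and a fixed number of rounds replaces the white-noise stopping test.

```python
# Perturbative-style forecasting sketch: model the series, then model the
# residual, then the residual of that, and sum the stage forecasts.

def fit_ar1(series):
    """Least-squares fit of y[t] ~ a*y[t-1] + b (illustrative stand-in
    for the paper's per-stage predictive model)."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    var = sum((xi - mx) ** 2 for xi in x) or 1e-12  # guard degenerate case
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / var
    return a, my - a * mx

def perturbative_forecast(series, rounds=3):
    """One-step-ahead forecast: sum of each stage's contribution."""
    residual = list(series)
    forecast = 0.0
    for _ in range(rounds):
        a, b = fit_ar1(residual)
        forecast += a * residual[-1] + b          # this stage's forecast
        fitted = [residual[0]] + [a * residual[t - 1] + b
                                  for t in range(1, len(residual))]
        residual = [r - f for r, f in zip(residual, fitted)]
    return forecast
```

On a series the first-stage model explains perfectly, later stages fit a near-zero residual and contribute nothing, mirroring the white-noise stopping condition.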

  19. Performance of some numerical Laplace inversion methods on American put option formula

    Science.gov (United States)

    Octaviano, I.; Yuniar, A. R.; Anisa, L.; Surjanto, S. D.; Putri, E. R. M.

    2018-03-01

    Numerical inversion approaches for the Laplace transform are used to obtain semianalytic solutions. Mathematical inversion methods such as Durbin-Crump, Widder, and Papoulis can be used to price American put options through the optimal exercise price in the Laplace space. The comparison of the methods on some simple functions aims to establish the accuracy and the parameters used in the calculation of American put options. The result obtained is the performance of each method regarding accuracy and computational speed. The Durbin-Crump method has an average relative error of 2.006e-004 with a computational speed of 0.04871 seconds, the Widder method has an average relative error of 0.0048 with a computational speed of 3.100181 seconds, and the Papoulis method has an average relative error of 9.8558e-004 with a computational speed of 0.020793 seconds.
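The abstract compares the Durbin-Crump, Widder, and Papoulis methods. As a runnable illustration of numerical Laplace inversion in general, the sketch below implements the related Gaver-Stehfest algorithm (not one of the three methods compared) and checks it on a transform with a known inverse, F(s) = 1/(s+1) ↔ f(t) = e^(-t).

```python
# Gaver-Stehfest numerical Laplace inversion (illustrative).
from math import factorial, log

def stehfest_coeffs(N):
    """Stehfest weights V_k for even N."""
    half = N // 2
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, half) + 1):
            s += (j ** half * factorial(2 * j)
                  / (factorial(half - j) * factorial(j)
                     * factorial(j - 1) * factorial(k - j)
                     * factorial(2 * j - k)))
        V.append((-1) ** (half + k) * s)
    return V

def invert_laplace(F, t, N=12):
    """Approximate f(t) = L^{-1}[F](t) by evaluating F on the real axis:
    f(t) ~ (ln 2 / t) * sum_k V_k * F(k ln 2 / t)."""
    V = stehfest_coeffs(N)
    ln2 = log(2.0)
    return ln2 / t * sum(V[k - 1] * F(k * ln2 / t) for k in range(1, N + 1))
```

Like the methods compared in the paper, the scheme only needs pointwise evaluations of F(s), so it applies directly to a transform of the optimal exercise price; accuracy degrades for large N due to cancellation in the weights.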

  20. Developing a multicriteria approach for the measurement of sustainable performance

    Energy Technology Data Exchange (ETDEWEB)

    Ding, G.

    2005-02-01

    In Australia, cost-benefit analysis (CBA) is one of the conventional tools used widely by the public and the private sectors in the appraisal of projects. It measures and compares the total costs and benefits of projects that are competing for scarce resources in monetary terms. Growing concerns that the values of environmental goods and services are often ignored or underestimated in the CBA approach have led to the overuse and depletion of environmental assets. A model of a sustainability index as an evaluation tool that combines economic, social and environmental criteria into an indexing algorithm is presented and described. The sustainability index uses monetary and non-monetary approaches to rank projects and facilities on their contribution to sustainability. This process enables the principle of trade-off to occur in the decision-making process and thereby allows environmental values to be considered when selecting a development option. This makes it possible to optimize financial return, minimize resource consumption and minimize detrimental effects to the natural and man-made world. A case study is used to demonstrate the model. (author)

  1. A hollow sphere soft lithography approach for long-term hanging drop methods.

    Science.gov (United States)

    Lee, Won Gu; Ortmann, Daniel; Hancock, Matthew J; Bae, Hojae; Khademhosseini, Ali

    2010-04-01

    In conventional hanging drop (HD) methods, embryonic stem cell aggregates or embryoid bodies (EBs) are often maintained in small inverted droplets. Gravity limits the volumes of these droplets to less than 50 microL, and hence such cell cultures can only be sustained for a few days without frequent media changes. Here we present a new approach to performing long-term HD culture (10-15 days) that provides larger media reservoirs in a HD format to maintain more consistent culture media conditions. To implement this approach, we fabricated hollow sphere (HS) structures by injecting liquid drops into noncured poly(dimethylsiloxane) mixtures. These structures served as cell culture chambers with large media volumes (500 microL in each sphere) where EBs could grow without media depletion. The results showed that the sizes of the EBs cultured in the HS structures in a long-term HD format were approximately twice those of conventional HD methods after 10 days in culture. Further, HS cultures showed multilineage differentiation, similar to EBs cultured in the HD method. Due to its ease of fabrication and enhanced features, this approach may be of benefit as a stem cell culture method for regenerative medicine.

  2. Performance Evaluation of the Spectral Centroid Downshift Method for Attenuation Estimation

    OpenAIRE

    Samimi, Kayvan; Varghese, Tomy

    2015-01-01

    Estimation of frequency-dependent ultrasonic attenuation is an important aspect of tissue characterization. Along with other acoustic parameters studied in quantitative ultrasound, the attenuation coefficient can be used to differentiate normal and pathological tissue. The spectral centroid downshift (CDS) method is one of the most common frequency-domain approaches applied to this problem. In this study, a statistical analysis of this method’s performance was carried out based on a parametric m...

  3. Performance Approach, Performance Avoidance and Depth of Information Processing: A Fresh Look at Relations between Students' Academic Motivation and Cognition.

    Science.gov (United States)

    Barker, Katrina L.; McInerney, Dennis M.; Dowson, Martin

    2002-01-01

    Examines effects of the motivational approach on the recall of verbal information processed at shallow and deep levels. Explains that students were assigned to a mastery focused condition, performance approach condition, or a control group. Reports that students remembered more stimulus words during cued recall than free recall. Includes…

  4. Adopting a blended learning approach to teaching evidence based medicine: a mixed methods study.

    Science.gov (United States)

    Ilic, Dragan; Hart, William; Fiddes, Patrick; Misso, Marie; Villanueva, Elmer

    2013-12-17

    Evidence Based Medicine (EBM) is a core unit delivered across many medical schools. Few studies have investigated the most effective method of teaching a course in EBM to medical students. The objective of this study was to identify whether a blended-learning approach to teaching EBM is more effective than a didactic-based approach at increasing medical student competency in EBM. A mixed-methods study was conducted consisting of a controlled trial and focus groups with second year graduate medical students. Students received the EBM course delivered using either a didactic approach (DID) to learning EBM or a blended-learning approach (BL). Student competency in EBM was assessed using the Berlin tool and a criterion-based assessment task, with student perceptions on the interventions assessed qualitatively. A total of 61 students (85.9%) participated in the study. Competency in EBM did not differ between the groups when assessed using the Berlin tool (p = 0.29). Students using the BL approach performed significantly better in one of the criterion-based assessment tasks (p = 0.01) and reported significantly higher self-perceived competence in critical appraisal skills. Qualitative analysis identified that students had a preference for the EBM course to be delivered using the BL approach. Implementing a blended-learning approach to EBM teaching promotes greater student appreciation of EBM principles within the clinical setting. Integrating a variety of teaching modalities and approaches can increase student self-confidence and assist in bridging the gap between the theory and practice of EBM.

  5. Quantifying Neonatal Sucking Performance: Promise of New Methods.

    Science.gov (United States)

    Capilouto, Gilson J; Cunningham, Tommy J; Mullineaux, David R; Tamilia, Eleonora; Papadelis, Christos; Giannone, Peter J

    2017-04-01

    Neonatal feeding has been traditionally understudied so guidelines and evidence-based support for common feeding practices are limited. A major contributing factor to the paucity of evidence-based practice in this area has been the lack of simple-to-use, low-cost tools for monitoring sucking performance. We describe new methods for quantifying neonatal sucking performance that hold significant clinical and research promise. We present early results from an ongoing study investigating neonatal sucking as a marker of risk for adverse neurodevelopmental outcomes. We include quantitative measures of sucking performance to better understand how movement variability evolves during skill acquisition. Results showed the coefficient of variation of suck duration was significantly different between preterm neonates at high risk for developmental concerns (HRPT) and preterm neonates at low risk for developmental concerns (LRPT). For HRPT, results indicated the coefficient of variation of suck smoothness increased from initial feeding to discharge and remained significantly greater than healthy full-term newborns (FT) at discharge. There was no significant difference in our measures between FT and LRPT at discharge. Our findings highlight the need to include neonatal sucking assessment as part of routine clinical care in order to capture the relative risk of adverse neurodevelopmental outcomes at discharge. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  6. Comparison of Different Approaches to Predict the Performance of Pumps As Turbines (PATs

    Directory of Open Access Journals (Sweden)

    Mauro Venturini

    2018-04-01

    Full Text Available This paper deals with the comparison of different methods which can be used for the prediction of the performance curves of pumps as turbines (PATs). Four approaches are considered: one physics-based simulation model (“white box” model), two “gray box” models, which integrate theory on turbomachines with specific data correlations, and one “black box” model. In more detail, the modeling approaches are: (1) a physics-based simulation model developed by the same authors, which includes the equations for estimating head, power, and efficiency and uses loss coefficients and specific parameters; (2) a model developed by Derakhshan and Nourbakhsh, which first predicts the best efficiency point of a PAT and then reconstructs the complete characteristic curves by means of two ad hoc equations; (3) the prediction model developed by Singh and Nestmann, which predicts the complete turbine characteristics based on pump shape and size; (4) an Evolutionary Polynomial Regression model, which represents a data-driven hybrid scheme that can be used for identifying the explicit mathematical relationship between PAT and pump curves. All approaches are applied to literature data, relying on both pump and PAT performance curves of head, power, and efficiency over the entire range of operation. The experimental data were provided by Derakhshan and Nourbakhsh for four different turbomachines, working in both pump and PAT mode with specific speed values in the range 1.53–5.82. This paper provides a quantitative assessment of the predictions made by means of the considered approaches and also analyzes their consistency from a physical point of view. Advantages and drawbacks of each method are also analyzed and discussed.

  7. A Control Approach for Performance of Big Data Systems

    OpenAIRE

    Berekmeri , Mihaly; Serrano , Damián; Bouchenak , Sara; Marchand , Nicolas; Robu , Bogdan

    2014-01-01

    International audience; We are at the dawn of a huge data explosion; therefore, companies have fast-growing amounts of data to process. For this purpose Google developed MapReduce, a parallel programming paradigm which is slowly becoming the de facto tool for Big Data analytics. Although its use is already widespread in industry to some extent, ensuring performance constraints for such a complex system poses great challenges, and its management requires a high level of expertise. This paper...

  8. Improved approach to characterizing and presenting streak camera performance

    International Nuclear Information System (INIS)

    Wiedwald, J.D.; Jones, B.A.

    1985-01-01

    The performance of a streak camera recording system is strongly linked to the technique used to amplify, detect, and quantify the streaked image. At the Lawrence Livermore National Laboratory (LLNL), streak camera images have been recorded both on film and by fiber-optic coupling to charge-coupled devices (CCDs). During the development of a new process for recording these images (lens-coupling the image onto a cooled CCD), the definitions of important performance characteristics such as resolution and dynamic range were re-examined. As a result of this development, these performance characteristics are now presented to the streak camera user in a more useful format than in the past. This paper describes how these techniques are used within the Laser Fusion Program at LLNL. The system resolution is presented as a modulation transfer function, including the seldom-reported effects that flare and light scattering have at low spatial frequencies. Data are presented such that a user can adjust image intensifier gain and pixel averaging to optimize the useful dynamic range in any particular application.

  9. Sampling Approaches for Multi-Domain Internet Performance Measurement Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Calyam, Prasad

    2014-09-15

    The next-generation of high-performance networks being developed in DOE communities are critical for supporting current and emerging data-intensive science applications. The goal of this project is to investigate multi-domain network status sampling techniques and tools to measure/analyze performance, and thereby provide “network awareness” to end-users and network operators in DOE communities. We leverage the infrastructure and datasets available through perfSONAR, which is a multi-domain measurement framework that has been widely deployed in high-performance computing and networking communities; the DOE community is a core developer and the largest adopter of perfSONAR. Our investigations include development of semantic scheduling algorithms, measurement federation policies, and tools to sample multi-domain and multi-layer network status within perfSONAR deployments. We validate our algorithms and policies with end-to-end measurement analysis tools for various monitoring objectives such as network weather forecasting, anomaly detection, and fault-diagnosis. In addition, we develop a multi-domain architecture for an enterprise-specific perfSONAR deployment that can implement monitoring-objective based sampling and that adheres to any domain-specific measurement policies.

  10. Immigration and Firm Performance: a city-level approach

    Directory of Open Access Journals (Sweden)

    Mercedes Teruel Carrizosa

    2009-10-01

    Full Text Available This article analyses the effect of immigration flows on the growth and efficiency of manufacturing firms in Spanish cities. While most studies have focused on the effect immigrants have on labour markets at an aggregate level, here we argue that the impact of immigration on firm performance should not only be considered in terms of the labour market, but also in terms of how a city's amenities can affect the performance of firms. Implementing a panel data methodology, we show that increasing immigrant pressure has a positive effect on labour productivity and wages and a negative effect on the job evolution of these manufacturing firms. In addition, both small and new firms are more sensitive to the pressures of immigrant inflow, while foreign-market-oriented firms report higher productivity levels and a less marked impact of immigration than their counterparts. We also present a set of instruments to control for endogeneity, which allows us to confirm the effect of local immigration flows on the performance of manufacturing firms.

  11. Quantifiable and objective approach to organizational performance enhancement.

    Energy Technology Data Exchange (ETDEWEB)

    Scholand, Andrew Joseph; Tausczik, Yla R. (University of Texas at Austin, Austin, TX)

    2009-10-01

    This report describes a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to identify socially situated relationships between individuals which, though subtle, are highly influential. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized. This report outlines the philosophical antecedents of SLNA, the mechanics of preprocessing, processing, and post-processing stages, and some example results obtained by applying this approach to a 15-month corporate discussion archive.

  12. ''In situ'' electronic testing method of neutron detector performance

    International Nuclear Information System (INIS)

    Gonzalez, J.M.; Levai, F.

    1987-01-01

    The method allows detection of any important change in the electrical characteristics of a neutron sensor channel. It checks the response signal produced by an electronic detector circuit when a pulse generator is connected as an input signal to the high-voltage supply. The electronic circuit compares the previously measured detector capacitance value against a reference value, adjusted in a window-type comparator circuit, to detect any significant degradation of the capacitance of the detector-cable system. The ''in situ'' electronic testing method of neutron detector performance has been verified in the laboratory as a potential method to detect any significant change in the capacitance of a nuclear sensor and its connecting cable, while also checking for detector disconnections, cable disconnections, length changes of the connecting cable, electrical short or open circuits in the sensor channel, and any electrical trouble in the detector-connector-cable system. The experimental work was carried out by simulating several electrical changes in a nuclear sensor-cable system of a linear D.C. channel which measures reactor power during operation, at the Training Reactor Electronic Laboratory. The results and conclusions obtained at the laboratory were verified satisfactorily on the electronic instrumentation of the Budapest Technical University Training Reactor, Hungary.
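
    The window-comparator logic reduces to a simple acceptance test: flag the channel when the measured capacitance leaves a tolerance band around its reference. A minimal software analogue, with hypothetical picofarad values (the actual hardware thresholds are not given in the record):

```python
def capacitance_within_window(measured_pf, reference_pf, tolerance_pf):
    """Software analogue of the window comparator: True while the
    detector-cable capacitance stays inside the reference window."""
    return abs(measured_pf - reference_pf) <= tolerance_pf

# Hypothetical values: 250 pF reference, +/-15 pF acceptance window.
ok = capacitance_within_window(258.0, 250.0, 15.0)          # healthy channel
fault = not capacitance_within_window(198.0, 250.0, 15.0)   # e.g. cable break
```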

  13. Performance of analytical methods for tomographic gamma scanning

    International Nuclear Information System (INIS)

    Prettyman, T.H.; Mercer, D.J.

    1997-01-01

    The use of gamma-ray computerized tomography for nondestructive assay of radioactive materials has led to the development of specialized analytical methods. Over the past few years, Los Alamos has developed and implemented a computer code, called ARC-TGS, for the analysis of data obtained by tomographic gamma scanning (TGS). ARC-TGS reduces TGS transmission and emission tomographic data, providing the user with images of the sample contents, the activity or mass of selected radionuclides, and an estimate of the uncertainty in the measured quantities. The results provided by ARC-TGS can be corrected for self-attenuation when the isotope of interest emits more than one gamma-ray. In addition, ARC-TGS provides information needed to estimate TGS quantification limits and to estimate the scan time needed to screen for small amounts of radioactivity. In this report, an overview of the analytical methods used by ARC-TGS is presented along with an assessment of the performance of these methods for TGS

  14. Regression Benchmarking: An Approach to Quality Assurance in Performance

    OpenAIRE

    Bulej, Lubomír

    2005-01-01

    The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specifically, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking to a real software project and conclude with a glimpse at the challenges for the fu...

  15. Objective Method for Selecting Outdoor Reporting Conditions for Photovoltaic Performance

    International Nuclear Information System (INIS)

    Maish, A.

    1999-01-01

    Outdoor performance of photovoltaic modules and systems depends on prevailing conditions at the time of measurement. Outdoor test conditions must be relevant to device performance and readily attainable. Flat-plate, nonconcentrator PV device performance is reported with respect to fixed conditions referred to as Standard Reporting Conditions (SRC) of 1 kW/m² plane-of-array total irradiance, 25 °C device temperature, and a reference spectral distribution at air mass 1.5 under certain atmospheric conditions. We report a method of analyzing historical meteorological and irradiance data to determine the range of outdoor environmental parameters and solar irradiance components that affect solar collector performance when the SRC 1 kW/m² total irradiance value occurs outdoors. We used data from the 30-year U.S. National Solar Radiation Data Base (NSRDB), restricting irradiance conditions to within ±25 W/m² of 1 kW/m² on a solar tracking flat-plate collector. The distributions of environmental parameter values under these conditions are non-Gaussian and site dependent. Therefore the median, as opposed to the mean, of the observed distributions is chosen to represent appropriate outdoor reporting conditions. We found the average medians for the direct beam component (834 W/m²), ambient temperature (24.4 °C), total column water vapor (1.4 cm), and air mass (1.43) are near commonly used SRC values. Average median wind speed (4.4 m/s) and broadband aerosol optical depth (0.08) were significantly different from commonly used values.
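
    The median-versus-mean choice for skewed distributions is easy to see numerically. A small sketch with invented temperature readings (not NSRDB data) filtered to the irradiance window described above:

```python
import statistics

# Hypothetical ambient-temperature readings (deg C) recorded only when
# total irradiance fell within 1000 +/- 25 W/m^2.  The values are skewed
# to mimic the non-Gaussian distributions reported in the study.
temps_at_src_irradiance = [18.2, 21.5, 23.9, 24.3, 24.6, 25.1, 25.4, 31.8, 36.0]

median_temp = statistics.median(temps_at_src_irradiance)
mean_temp = statistics.fmean(temps_at_src_irradiance)
# For a right-skewed distribution the mean is pulled above the median,
# which is why the median is the more robust reporting condition.
```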

  16. Maximum entropy method approach to the θ term

    International Nuclear Information System (INIS)

    Imachi, Masahiro; Shinno, Yasuhiko; Yoneyama, Hiroshi

    2004-01-01

    In Monte Carlo simulations of lattice field theory with a θ term, one confronts the complex weight problem, or the sign problem. This is circumvented by performing the Fourier transform of the topological charge distribution P(Q). This procedure, however, causes a flattening phenomenon of the free energy f(θ), which makes study of the phase structure unfeasible. In order to treat this problem, we apply the maximum entropy method (MEM) to a Gaussian form of P(Q), which serves as a good example to test whether the MEM can be applied effectively to the θ term. We study the case with flattening as well as that without flattening. In the latter case, the results of the MEM agree with those obtained from the direct application of the Fourier transform. For the former, the MEM gives a smoother f(θ) than that of the Fourier transform. Among various default models investigated, the images which yield the least error do not show flattening, although some others cannot be excluded given the uncertainty related to statistical error. (author)

  17. An approach to build knowledge base for reactor accident diagnostic system using statistical method

    International Nuclear Information System (INIS)

    Kohsaka, Atsuo; Yokobayashi, Masao; Matsumoto, Kiyoshi; Fujii, Minoru

    1988-01-01

    In the development of a rule-based expert system, one of the key issues is how to build a knowledge base (KB). A systematic approach has been attempted for building an objective KB efficiently. The approach is based on the concept that a prototype KB should first be generated in a systematic way and then modified and/or improved by experts for practical use. The statistical method, Factor Analysis, was applied to build a prototype KB for the JAERI expert system DISKET using source information obtained from a PWR simulator. The prototype KB was obtained and inference with this KB was performed against several types of transients. In each diagnosis, the transient type was well identified. From this study, it is concluded that the statistical method used is useful for building a prototype knowledge base. (author)
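
    The core of the factor-analytic step is extracting a few latent factors that group correlated plant symptoms, which then seed candidate diagnostic rules. A hedged sketch using a principal-factor extraction from the correlation matrix (a simplified stand-in for the Factor Analysis used for DISKET; all data below are simulated, not PWR simulator output):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulator data: 200 time samples x 6 plant symptoms,
# driven by two latent transients plus measurement noise.
transient_a = rng.normal(size=200)
transient_b = rng.normal(size=200)
noise = 0.3 * rng.normal(size=(200, 6))
symptoms = np.column_stack([
    transient_a, transient_a, transient_a,   # symptoms 0-2 track transient A
    transient_b, transient_b, transient_b,   # symptoms 3-5 track transient B
]) + noise

# Principal-factor extraction from the correlation matrix.
corr = np.corrcoef(symptoms, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)          # ascending eigenvalues
loadings = eigvecs[:, ::-1][:, :2] * np.sqrt(eigvals[::-1][:2])
# Two factors pass the Kaiser criterion (eigenvalue > 1), matching the
# two simulated transients; symptoms loading on the same factor would be
# grouped into one candidate diagnostic rule.
```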

  18. An Approach for Performance Assessments of Extravehicular Activity Gloves

    Science.gov (United States)

    Aitchison, Lindsay; Benson, Elizabeth

    2014-01-01

    The Space Suit Assembly (SSA) Development Team at NASA Johnson Space Center has invested heavily in the advancement of rear-entry planetary exploration suit design but largely deferred development of extravehicular activity (EVA) glove designs, accepting the risk of using the current flight glove, the Phase VI, for unique mission scenarios outside the Space Shuttle and International Space Station (ISS) Program realm of experience. However, as design reference missions mature, the risks of using heritage hardware have highlighted the need for developing robust new glove technologies. To address the technology gap, the NASA Game-Changing Technology group provided start-up funding for the High Performance EVA Glove (HPEG) Project in the spring of 2012. The overarching goal of the HPEG Project is to develop a robust glove design that increases human performance during EVA and creates a pathway for future implementation of emergent technologies, with specific aims of increasing pressurized mobility to 60% of barehanded capability, increasing durability by 100%, and decreasing the potential of gloves to cause injury during use. The HPEG Project focused initial efforts on identifying potential new technologies and benchmarking the performance of current state-of-the-art gloves to identify trends in design and fit, and to establish standards and metrics against which emerging technologies can be assessed at both the component and assembly levels. The first of the benchmarking tests evaluated the quantitative mobility performance and subjective fit of two sets of prototype EVA gloves developed by ILC Dover and David Clark Company as compared to the Phase VI. Both companies were asked to design and fabricate gloves to the same set of NASA-provided hand measurements (which corresponded to a single size of Phase VI glove) and focus their efforts on improving mobility in the metacarpal phalangeal and carpometacarpal joints. Four test subjects representing the design-to hand

  19. Reliability assessment of serviceability performance of braced retaining walls using a neural network approach

    Science.gov (United States)

    Goh, A. T. C.; Kulhawy, F. H.

    2005-05-01

    In urban environments, one major concern with deep excavations in soft clay is the potentially large ground deformations in and around the excavation. Excessive movements can damage adjacent buildings and utilities. There are many uncertainties associated with the calculation of the ultimate or serviceability performance of a braced excavation system. These include the variabilities of the loadings, geotechnical soil properties, and engineering and geometrical properties of the wall. A risk-based approach to serviceability performance failure is necessary to incorporate systematically the uncertainties associated with the various design parameters. This paper demonstrates the use of an integrated neural network-reliability method to assess the risk of serviceability failure through the calculation of the reliability index. By first performing a series of parametric studies using the finite element method and then approximating the non-linear limit state surface (the boundary separating the safe and failure domains) through a neural network model, the reliability index can be determined with the aid of a spreadsheet. Two illustrative examples are presented to show how the serviceability performance for braced excavation problems can be assessed using the reliability index.
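
    The workflow above (parametric runs, surrogate of the limit state, then reliability index) can be sketched compactly. In this hedged example a least-squares polynomial surrogate stands in for both the finite element runs and the neural network of the paper, the deflection formula and all parameter values are invented, and the reliability index is obtained from a Monte Carlo failure probability rather than a spreadsheet FORM calculation:

```python
import statistics
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical serviceability limit state: g = allowable - predicted wall
# deflection, with the "true" FEM response replaced by an assumed closed
# form so the example stays self-contained.
def wall_deflection_mm(undrained_strength_kpa, wall_stiffness):
    return 2000.0 / (undrained_strength_kpa * wall_stiffness)

# Step 1: parametric study (stand-in for the finite element runs).
su = rng.uniform(20.0, 60.0, size=40)    # undrained shear strength (kPa)
ei = rng.uniform(0.5, 2.0, size=40)      # relative wall stiffness
delta = wall_deflection_mm(su, ei)

# Step 2: fit a cheap response surface to the runs (stand-in for the
# neural network surrogate of the limit state surface).
design = np.column_stack([np.ones_like(su), 1 / su, 1 / ei, 1 / (su * ei)])
coeff, *_ = np.linalg.lstsq(design, delta, rcond=None)

# Step 3: Monte Carlo on the surrogate to estimate the probability of
# exceeding a 50 mm serviceability limit, then the reliability index.
su_mc = rng.normal(40.0, 8.0, size=200_000).clip(min=1.0)
ei_mc = rng.normal(1.2, 0.25, size=200_000).clip(min=0.1)
basis = np.column_stack([np.ones_like(su_mc), 1 / su_mc, 1 / ei_mc,
                         1 / (su_mc * ei_mc)])
p_failure = float(np.mean(basis @ coeff > 50.0))
beta = -statistics.NormalDist().inv_cdf(p_failure)   # reliability index
```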

  20. Research design: qualitative, quantitative and mixed methods approaches. Creswell, John W. Sage, 320pp, £29, ISBN 0761924426.

    Science.gov (United States)

    2004-09-01

    The second edition of Creswell's book has been significantly revised and updated. The author clearly sets out three approaches to research: quantitative, qualitative and mixed methods. As someone who has used mixed methods in my research, it is refreshing to read a textbook that addresses this. The differences between the approaches are clearly identified and a rationale for using each methodological stance provided.

  1. A new approach for reliability analysis with time-variant performance characteristics

    International Nuclear Information System (INIS)

    Wang, Zequn; Wang, Pingfeng

    2013-01-01

    Reliability represents the safety level in industry practice and may vary due to time-variant operating conditions and component deterioration throughout a product life-cycle. Thus, the capability to perform time-variant reliability analysis is of vital importance in practical engineering applications. This paper presents a new approach, referred to as nested extreme response surface (NERS), that can efficiently tackle the time dependency issue in time-variant reliability analysis and solve such problems by integrating easily with advanced time-independent tools. The key of the NERS approach is to build a nested response surface of the time corresponding to the extreme value of the limit state function by employing a Kriging model. To obtain the data for the Kriging model, the efficient global optimization technique is integrated with NERS to extract the extreme time responses of the limit state function for any given system input. An adaptive response prediction and model maturation mechanism is developed based on mean square error (MSE) to concurrently improve the accuracy and computational efficiency of the proposed approach. With the nested response surface of time, the time-variant reliability analysis can be converted into a time-independent reliability analysis, and existing advanced reliability analysis methods can be used. Three case studies are used to demonstrate the efficiency and accuracy of the NERS approach.
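
    The nesting idea can be illustrated end-to-end on a toy problem. In this hedged sketch a dense grid search stands in for the Kriging + efficient global optimization loop, a simple polynomial fit stands in for the nested response surface, and the limit state, deterioration rates, and distributions are all invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical time-variant limit state g(x, t): strength degrades
# linearly while the load cycles.  Failure: g < 0 at ANY t in [0, 10].
def g(x, t):
    strength = x - 0.05 * t
    load = 1.0 + 0.3 * np.sin(2.0 * np.pi * t)
    return strength - load

t_grid = np.linspace(0.0, 10.0, 401)

# Step 1: for training inputs, locate the time of the extreme (minimum)
# response -- grid search standing in for Kriging + EGO.
x_train = np.linspace(1.0, 3.0, 9)
t_extreme = np.array([t_grid[np.argmin(g(x, t_grid))] for x in x_train])

# Step 2: nested response surface T*(x) of the extreme time.
fit = np.polynomial.Polynomial.fit(x_train, t_extreme, deg=1)

# Step 3: time-INDEPENDENT reliability analysis on g(x, T*(x)).
x_mc = rng.normal(2.0, 0.3, size=100_000)
p_failure = float(np.mean(g(x_mc, fit(x_mc)) < 0.0))
```

Once the extreme-time surface is in hand, any standard time-independent method (FORM, importance sampling, plain Monte Carlo as here) applies unchanged, which is the point of the NERS conversion.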

  2. Approach to geologic repository post closure system performance assessment

    International Nuclear Information System (INIS)

    Pahwa, S.B.; Felton, W.; Duguid, J.O.

    1992-01-01

    An essential part of the license application for a geologic repository will be the demonstration of compliance with the standards set by the Environmental Protection Agency. The performance assessments that produce the demonstration must rely on models of various levels of detail. The most detailed of these models are needed for understanding thoroughly the complex physical and chemical processes affecting the behavior of the system. For studying the behavior of major components of the system, less detailed models are often useful. For predicting the behavior of the total system, models of a third kind may be needed. These models must cover all the important processes that contribute to the behavior of the system, because they must estimate the behavior under all significant conditions for 10,000 years. In addition, however, computer codes that embody these models must calculate very rapidly because of the EPA standard's requirement for probabilistic estimates, which will be produced by sampling thousands of times from probability distributions of parameters. For this reason, the total-system models must be less complex than the detailed-process and subsystem models. The total-system performance is evaluated through modeling of the following components: radionuclide release from the engineered-barrier system; fluid flow in the geologic units; radionuclide transport to the accessible environment; and radionuclide release to the accessible environment and dose to man.

  3. Approaching human performance the functionality-driven Awiwi robot hand

    CERN Document Server

    Grebenstein, Markus

    2014-01-01

    Humanoid robotics has made remarkable progress since the dawn of robotics. So why don't we have humanoid robot assistants in day-to-day life yet? This book analyzes the keys to building a successful humanoid robot for field robotics, where collisions become an unavoidable part of the game. The author argues that the design goal should be real anthropomorphism, as opposed to mere human-like appearance. He deduces three major characteristics to aim for when designing a humanoid robot, particularly robot hands: _ Robustness against impacts _ Fast dynamics _ Human-like grasping and manipulation performance   Instead of blindly copying human anatomy, this book opts for a holistic design methodology. It analyzes human hands and existing robot hands to elucidate the important functionalities that are the building blocks toward these necessary characteristics. They are the keys to designing an anthropomorphic robot hand, as illustrated in the high performance anthropomorphic Awiwi Hand presented in this book.  ...

  4. Paired comparisons analysis: an axiomatic approach to ranking methods

    NARCIS (Netherlands)

    Gonzalez-Diaz, J.; Hendrickx, Ruud; Lohmann, E.R.M.A.

    2014-01-01

    In this paper we present an axiomatic analysis of several ranking methods for general tournaments. We find that the ranking method obtained by applying maximum likelihood to the (Zermelo-)Bradley-Terry model, the most common method in statistics and psychology, is one of the ranking methods that
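
    The maximum-likelihood ranking mentioned above is classically computed with Zermelo's iterative algorithm for the Bradley-Terry model. A minimal sketch with an invented round-robin win matrix (the paper's axiomatic analysis itself is not reproduced here):

```python
def bradley_terry_ranking(wins, num_items, iterations=200):
    """Zermelo's iterative MLE for the Bradley-Terry model.
    wins[i][j] = number of times item i beat item j."""
    strength = [1.0] * num_items
    for _ in range(iterations):
        new = []
        for i in range(num_items):
            w_i = sum(wins[i][j] for j in range(num_items) if j != i)
            denom = sum(
                (wins[i][j] + wins[j][i]) / (strength[i] + strength[j])
                for j in range(num_items) if j != i
            )
            new.append(w_i / denom if denom > 0 else strength[i])
        total = sum(new)
        strength = [s / total for s in new]   # normalize each sweep
    return strength

# Hypothetical round-robin results for 3 players.
wins = [[0, 8, 9],
        [2, 0, 6],
        [1, 4, 0]]
strength = bradley_terry_ranking(wins, 3)
ranking = sorted(range(3), key=lambda i: -strength[i])
```

The resulting strengths induce the ranking; ties in comparison data, not handled here, are one of the cases the axiomatic treatment has to address.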

  5. Activity Based Costing (ABC) as an Approach to Optimize Purchasing Performance in Hospitality Industry

    Directory of Open Access Journals (Sweden)

    Mohamed S. El-Deeb

    2011-07-01

    Full Text Available ABC (Activity Based Costing) has proved successful in both products and services. The researchers propose a new model applying the ABC approach in the purchasing department, one of the most dynamic departments in the service sector, to optimize the performance of purchasing activities. The researchers propose purchasing measures targeting customers' loyalty and ensuring the continuous flow of supplies. A questionnaire was used as the data collection method for verifying the hypothesis of the research, and the data obtained were analyzed using the Statistical Package for the Social Sciences (SPSS). The results of the research are based on a limited survey distributed to a number of hotels in the Greater Cairo region, targeting three hundred purchasing managers and staff in five-star hotels. It is recognized that further research is necessary to establish the exact nature of the causal linkages between the proposed performance measures and strategic intent in order to gain insights into practice elsewhere.

  6. Improving operational anodising process performance using simulation approach

    International Nuclear Information System (INIS)

    Liong, Choong-Yeun; Ghazali, Syarah Syahidah

    2015-01-01

    The use of aluminium is very widespread, especially in transportation, electrical and electronics, architectural, automotive and engineering applications sectors. Therefore, the anodizing process is an important process for aluminium in order to make the aluminium durable, attractive and weather resistant. This research is focused on the anodizing process operations in manufacturing and supplying of aluminium extrusion. The data required for the development of the model is collected from the observations and interviews conducted in the study. To study the current system, the processes involved in the anodizing process are modeled by using Arena 14.5 simulation software. Those processes consist of five main processes, namely the degreasing process, the etching process, the desmut process, the anodizing process, the sealing process and 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on the comparisons that have been done between the improvement methods, the productivity could be increased by reallocating the workers and reducing loading time
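
    The effect of cutting loading time in a serial line of tanks can be gauged with the classic flow-shop recurrence, without a full Arena model. A hedged sketch: the stage times and the 4-minute saving below are invented, and a single-capacity tandem line stands in for the 21-process Arena model described in the record:

```python
def makespan(stage_times, num_jobs):
    """Completion time of the last job through a serial line of
    single-capacity stages (tanks), via the flow-shop recurrence:
    C[i][k] = max(C[i-1][k], C[i][k-1]) + t[k]."""
    num_stages = len(stage_times)
    c = [[0.0] * num_stages for _ in range(num_jobs)]
    for i in range(num_jobs):
        for k in range(num_stages):
            prev_job = c[i - 1][k] if i > 0 else 0.0
            prev_stage = c[i][k - 1] if k > 0 else 0.0
            c[i][k] = max(prev_job, prev_stage) + stage_times[k]
    return c[-1][-1]

# Hypothetical per-rack times (minutes) for the five main tanks, with
# loading folded into the first stage.
baseline = makespan([12.0, 15.0, 8.0, 30.0, 10.0], num_jobs=20)
# Reallocating workers to cut loading by 4 minutes per rack:
improved = makespan([8.0, 15.0, 8.0, 30.0, 10.0], num_jobs=20)
# The gain is bounded by the bottleneck tank: only the first rack's
# traversal shortens, since the 30-minute stage still paces the line.
```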

  7. Improving operational anodising process performance using simulation approach

    Energy Technology Data Exchange (ETDEWEB)

    Liong, Choong-Yeun, E-mail: lg@ukm.edu.my; Ghazali, Syarah Syahidah, E-mail: syarah@gapps.kptm.edu.my [School of Mathematical Sciences, Faculty of Science and Technology, Universiti Kebangsaan Malaysia, 43600 UKM Bangi, Selangor DE (Malaysia)

    2015-10-22

    The use of aluminium is very widespread, especially in transportation, electrical and electronics, architectural, automotive and engineering applications sectors. Therefore, the anodizing process is an important process for aluminium in order to make the aluminium durable, attractive and weather resistant. This research is focused on the anodizing process operations in manufacturing and supplying of aluminium extrusion. The data required for the development of the model is collected from the observations and interviews conducted in the study. To study the current system, the processes involved in the anodizing process are modeled by using Arena 14.5 simulation software. Those processes consist of five main processes, namely the degreasing process, the etching process, the desmut process, the anodizing process, the sealing process and 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on the comparisons that have been done between the improvement methods, the productivity could be increased by reallocating the workers and reducing loading time.

  8. Improving operational anodising process performance using simulation approach

    Science.gov (United States)

    Liong, Choong-Yeun; Ghazali, Syarah Syahidah

    2015-10-01

    The use of aluminium is very widespread, especially in transportation, electrical and electronics, architectural, automotive and engineering applications sectors. Therefore, the anodizing process is an important process for aluminium in order to make the aluminium durable, attractive and weather resistant. This research is focused on the anodizing process operations in manufacturing and supplying of aluminium extrusion. The data required for the development of the model is collected from the observations and interviews conducted in the study. To study the current system, the processes involved in the anodizing process are modeled by using Arena 14.5 simulation software. Those processes consist of five main processes, namely the degreasing process, the etching process, the desmut process, the anodizing process, the sealing process and 16 other processes. The results obtained were analyzed to identify the problems or bottlenecks that occurred and to propose improvement methods that can be implemented on the original model. Based on the comparisons that have been done between the improvement methods, the productivity could be increased by reallocating the workers and reducing loading time.

  9. Performance of various mathematical methods for calculation of radioimmunoassay results

    International Nuclear Information System (INIS)

    Sandel, P.; Vogt, W.

    1977-01-01

    Interpolation and regression methods are available for computer aided determination of radioimmunological end results. We compared the performance of eight algorithms (weighted and unweighted linear logit-log regression, quadratic logit-log regression, Rodbards logistic model in the weighted and unweighted form, smoothing spline interpolation with a large and small smoothing factor and polygonal interpolation) on the basis of three radioimmunoassays with different reference curve characteristics (digoxin, estriol, human chorionic somatomammotropin = HCS). Great store was set by the accuracy of the approximation at the intermediate points on the curve, ie. those points that lie midway between two standard concentrations. These concentrations were obtained by weighing and inserted as unknown samples. In the case of digoxin and estriol the polygonal interpolation provided the best results while the weighted logit-log regression proved superior in the case of HCS. (orig.) [de

  10. Improving the road wear performance of heavy vehicles in South Africa using a performance-based standards approach

    CSIR Research Space (South Africa)

    Nordengen, Paul A

    2010-05-01

    Full Text Available of the world to achieve regional harmonisation and effective road use have had limited success. Another approach is to consider performance-based standards (PBS); in this case standards specify the performance required from the operation of a vehicle on a...

  11. Easy and difficult performance-approach goals: Their moderating effect on the link between task interest and performance attainment

    NARCIS (Netherlands)

    Blaga, Monica; Van Yperen, N.W.

    2008-01-01

    The purpose of this study was to demonstrate that the positive link between task interest and performance attainment can be negatively affected by the pursuit of difficult performance-approach goals. This was tested in a sample of 60 undergraduate students at a Dutch university. In line with

  12. Solution of the neutron point kinetics equations with temperature feedback effects applying the polynomial approach method

    Energy Technology Data Exchange (ETDEWEB)

    Tumelero, Fernanda, E-mail: fernanda.tumelero@yahoo.com.br [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica; Petersen, Claudio Z.; Goncalves, Glenio A.; Lazzari, Luana, E-mail: claudiopeteren@yahoo.com.br, E-mail: gleniogoncalves@yahoo.com.br, E-mail: luana-lazzari@hotmail.com [Universidade Federal de Pelotas (DME/UFPEL), Capao do Leao, RS (Brazil). Instituto de Fisica e Matematica

    2015-07-01

    In this work, we present a solution of the Neutron Point Kinetics Equations with temperature feedback effects applying the Polynomial Approach Method. For the solution, we consider one and six groups of delayed neutron precursors with temperature feedback effects and constant reactivity. The main idea is to expand the neutron density, delayed neutron precursors, and temperature as a power series, considering the reactivity as an arbitrary function of time in a relatively short interval around an ordinary point. In the first interval one applies the initial conditions of the problem, and analytical continuation is used to determine the solutions of the subsequent intervals. With the application of the Polynomial Approach Method it is possible to overcome the stiffness problem of the equations. In this way, one varies the time step size of the Polynomial Approach Method and analyzes the precision and computational time. Moreover, we compare the method with different orders of approximation (linear, quadratic, and cubic) of the power series. The neutron density and temperature results obtained by numerical simulations with the linear approximation are compared with results in the literature. (author)
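
    The per-interval power-series idea can be shown for the simplest case: one delayed group, constant reactivity, and no feedback. The Taylor coefficients of n(t) and C(t) follow a two-term recurrence from the point kinetics equations. A hedged sketch with assumed, illustrative kinetics parameters (not the values used in the paper):

```python
# One-group point kinetics, constant reactivity, advanced per interval by
# a truncated power (Taylor) series -- a minimal sketch of the Polynomial
# Approach Method.  BETA, LAMBDA_GEN, DECAY are assumed values.
BETA = 0.0065      # delayed neutron fraction
LAMBDA_GEN = 1e-4  # neutron generation time (s)
DECAY = 0.08       # precursor decay constant (1/s)

def series_step(n, c, rho, dt, order=8):
    """Advance (n, C) over one interval via the Taylor recurrence:
    (k+1) n_{k+1} = ((rho - beta)/Lambda) n_k + lam c_k
    (k+1) c_{k+1} = (beta/Lambda) n_k - lam c_k"""
    n_k, c_k = n, c
    n_new, c_new, dt_pow = n, c, 1.0
    for k in range(order):
        n_next = (((rho - BETA) / LAMBDA_GEN) * n_k + DECAY * c_k) / (k + 1)
        c_next = ((BETA / LAMBDA_GEN) * n_k - DECAY * c_k) / (k + 1)
        dt_pow *= dt
        n_new += n_next * dt_pow
        c_new += c_next * dt_pow
        n_k, c_k = n_next, c_next
    return n_new, c_new

# Critical reactor at equilibrium: C0 = beta * n0 / (Lambda * lam).
n, c = 1.0, BETA * 1.0 / (LAMBDA_GEN * DECAY)
for _ in range(100):
    n, c = series_step(n, c, rho=0.0, dt=0.01)
# At rho = 0 with equilibrium precursors, the density should stay at 1.
```

The stiffness the paper mentions comes from the large (rho - beta)/Lambda eigenvalue; the per-interval series with analytical continuation keeps each step's expansion within its radius of convergence.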

  13. Approach and methods to evaluate the uncertainty in system thermalhydraulic calculations

    International Nuclear Information System (INIS)

    D'Auria, F.

    2004-01-01

    The evaluation of uncertainty constitutes the necessary supplement to Best Estimate (BE) calculations performed to understand accident scenarios in water cooled nuclear reactors. The needs come from the imperfection of computational tools on the one side and from the interest in using such tools to obtain a more precise evaluation of safety margins on the other. In the present paper the approaches to uncertainty are outlined and the CIAU (Code with capability of Internal Assessment of Uncertainty) method proposed by the University of Pisa is described, including the ideas at its basis and results from applications. An activity in progress at the International Atomic Energy Agency (IAEA) is also considered. Two approaches are distinguished, characterized as 'propagation of code input uncertainty' and 'propagation of code output errors'. For both methods, the thermal-hydraulic code is at the centre of the process of uncertainty evaluation: in the former case the code itself is adopted to compute the error bands and to propagate the input errors; in the latter case the errors in code application to relevant measurements are used to derive the error bands. The CIAU method exploits the idea of the 'status approach' for identifying the thermalhydraulic conditions of an accident in any Nuclear Power Plant (NPP). Errors in predicting such status are derived from the comparison between predicted and measured quantities and, in the stage of the application of the method, are used to compute the uncertainty. (author)
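
    The 'propagation of code input uncertainty' approach can be illustrated in a few lines: sample the uncertain inputs, run the code once per sample, and take percentiles of the outputs as error bands. In the sketch below the "code" is a toy stand-in function, and all distributions and numbers are invented for illustration; a real application would wrap a thermal-hydraulic code.

```python
# Hedged sketch of 'propagation of code input uncertainty': uncertain
# inputs are sampled, the code is run for each sample, and percentiles
# of the outputs give the uncertainty bands. The "code" here is a toy
# stand-in, not a real thermal-hydraulic model.
import random

def code(power, gap_conductance):
    """Toy stand-in for a best-estimate code: a peak temperature [K]."""
    return 600.0 + 0.8 * power / gap_conductance

random.seed(0)
runs = []
for _ in range(200):
    power = random.gauss(1000.0, 50.0)   # uncertain input 1 (illustrative)
    gap = random.gauss(4.0, 0.4)         # uncertain input 2 (illustrative)
    runs.append(code(power, gap))

runs.sort()
lower, upper = runs[4], runs[194]        # ~2.5th / 97.5th percentile band
print(f"error band ~ [{lower:.0f}, {upper:.0f}] K")
```

    The 'propagation of code output errors' alternative would instead build the band from accuracy data accumulated by comparing code predictions against relevant measurements.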

  14. Traction cytometry: regularization in the Fourier approach and comparisons with finite element method.

    Science.gov (United States)

    Kulkarni, Ankur H; Ghosh, Prasenjit; Seetharaman, Ashwin; Kondaiah, Paturu; Gundiah, Namrata

    2018-05-09

    Traction forces exerted by adherent cells are quantified using displacements of embedded markers on polyacrylamide substrates due to cell contractility. Fourier Transform Traction Cytometry (FTTC) is widely used to calculate tractions but has inherent limitations due to errors in the displacement fields; these are mitigated through a regularization parameter (γ) in the Reg-FTTC method. An alternate finite element (FE) approach computes tractions on a domain using known boundary conditions. Robust verification and recovery studies are lacking but essential in assessing the accuracy and noise sensitivity of the traction solutions from the different methods. We implemented the L2 regularization method and defined the maximum-curvature point in the traction versus γ plot as the optimal regularization parameter (γ*) in the Reg-FTTC approach. Traction reconstructions using γ* yield accurate values of low and maximum tractions (Tmax) in the presence of up to 5% noise. Reg-FTTC is hence a clear improvement over the FTTC method but is inadequate to reconstruct low stresses such as those at nascent focal adhesions. FE, implemented using a node-by-node comparison, showed an intermediate reconstruction quality compared to Reg-FTTC. We performed experiments using mouse embryonic fibroblasts (MEF) and compared results between these approaches. Tractions from FTTC and FE showed differences of ∼92% and 22%, respectively, as compared to Reg-FTTC. Selection of an optimum value of γ for each cell reduced variability in the computed tractions as compared to using a single value of γ for all the MEF cells in this study.
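
    The two ingredients of Reg-FTTC — L2 (Tikhonov) regularization and a curvature-based choice of γ* — can be sketched on a toy 1-D inverse problem. Everything below is an assumption-laden stand-in: the forward operator G is an arbitrary smoothing kernel, not the Boussinesq solution used in traction cytometry, and the maximum-curvature rule is approximated by the largest second difference of log||x|| over a log-spaced γ grid.

```python
# Hedged sketch of L2 (Tikhonov) regularization with a curvature-based
# choice of gamma, in the spirit of Reg-FTTC, on a toy 1-D problem.
import numpy as np

rng = np.random.default_rng(1)
n = 50
# Toy smoothing forward operator (NOT a real traction-to-displacement kernel)
G = np.exp(-0.3 * np.abs(np.subtract.outer(np.arange(n), np.arange(n))))
x_true = np.zeros(n); x_true[20:30] = 1.0          # "traction" patch
u = G @ x_true + 0.05 * rng.standard_normal(n)     # noisy "displacements"

gammas = np.logspace(-4, 2, 40)
norms = []
for g in gammas:
    # closed-form Tikhonov solution: min ||G x - u||^2 + g ||x||^2
    x = np.linalg.solve(G.T @ G + g * np.eye(n), G.T @ u)
    norms.append(np.linalg.norm(x))

# crude maximum-curvature pick: largest second difference of log||x||
# over the log-spaced gamma grid
curv = np.abs(np.diff(np.log(norms), 2))
g_star = gammas[np.argmax(curv) + 1]
x_star = np.linalg.solve(G.T @ G + g_star * np.eye(n), G.T @ u)
print(f"gamma* ~ {g_star:.2e}, peak recovered at index {np.argmax(x_star)}")
```

    In the real 2-D Fourier formulation the same solve is done per spatial frequency, which is what makes FTTC fast.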

  15. Solution of the neutron point kinetics equations with temperature feedback effects applying the polynomial approach method

    International Nuclear Information System (INIS)

    Tumelero, Fernanda; Petersen, Claudio Z.; Goncalves, Glenio A.; Lazzari, Luana

    2015-01-01

    In this work, we present a solution of the Neutron Point Kinetics Equations with temperature feedback effects applying the Polynomial Approach Method. For the solution, we consider one and six groups of delayed neutron precursors with temperature feedback effects and constant reactivity. The main idea is to expand the neutron density, delayed neutron precursors and temperature as a power series, considering the reactivity as an arbitrary function of time in a relatively short time interval around an ordinary point. In the first interval one applies the initial conditions of the problem, and analytical continuation is used to determine the solutions of the subsequent intervals. With the application of the Polynomial Approach Method it is possible to overcome the stiffness problem of the equations. In such a way, one varies the time step size of the Polynomial Approach Method and performs an analysis of the precision and computational time. Moreover, we compare different orders of approximation (linear, quadratic and cubic) of the power series. The results for neutron density and temperature obtained by numerical simulations with the linear approximation are compared with results in the literature. (author)

  16. Battery Performance Modelling and Simulation: A Neural Network Based Approach

    Science.gov (United States)

    Ottavianelli, Giuseppe; Donati, Alessandro

    2002-01-01

    This project developed against the background of ongoing research within the Control Technology Unit (TOS-OSC) of the Special Projects Division at the European Space Operations Centre (ESOC) of the European Space Agency. The purpose of this research is to develop and validate an Artificial Neural Network (ANN) tool able to model, simulate and predict the Cluster II battery system's performance degradation. (The Cluster II mission consists of four spacecraft flying in tetrahedral formation, aimed at observing and studying the interaction between the Sun and the Earth by passing in and out of our planet's magnetic field.) This prototype tool, named BAPER and developed with a commercial neural network toolbox, could be used to support short- and medium-term mission planning in order to improve and maximise the batteries' lifetime, determining the future best charge/discharge cycles for the batteries given their present states, in view of a Cluster II mission extension. This study focuses on the five Silver-Cadmium batteries onboard Tango, the fourth Cluster II satellite, but time constraints have so far allowed an assessment of only the first battery. In their most basic form, ANNs are hyper-dimensional curve fits for non-linear data. With their remarkable ability to derive meaning from complicated or imprecise historical data, ANNs can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. ANNs learn by example, and this is why they can be described as inductive, or data-based, models for the simulation of input/target mappings. A trained ANN can be thought of as an "expert" in the category of information it has been given to analyse, and this expert can then be used, as in this project, to provide projections given new situations of interest and answer "what if" questions. The most appropriate algorithm, in terms of training speed and memory storage requirements, is clearly the Levenberg

  17. Stereotype Threat and College Academic Performance: A Latent Variables Approach*

    Science.gov (United States)

    Owens, Jayanti; Massey, Douglas S.

    2013-01-01

    Stereotype threat theory has gained experimental and survey-based support in helping explain the academic underperformance of minority students at selective colleges and universities. Stereotype threat theory states that minority students underperform because of pressures created by negative stereotypes about their racial group. Past survey-based studies, however, are characterized by methodological inefficiencies and potential biases: key theoretical constructs have only been measured using summed indicators and predicted relationships modeled using ordinary least squares. Using the National Longitudinal Survey of Freshman, this study overcomes previous methodological shortcomings by developing a latent construct model of stereotype threat. Theoretical constructs and equations are estimated simultaneously from multiple indicators, yielding a more reliable, valid, and parsimonious test of key propositions. Findings additionally support the view that social stigma can indeed have strong negative effects on the academic performance of pejoratively stereotyped racial-minority group members, not only in laboratory settings, but also in the real world. PMID:23950616

  18. An approach for prediction of petroleum production facility performance considering Arctic influence factors

    International Nuclear Information System (INIS)

    Gao Xueli; Barabady, Javad; Markeset, Tore

    2010-01-01

    As the oil and gas (O and G) industry is increasing the focus on petroleum exploration and development in the Arctic region, it is becoming increasingly important to design exploration and production facilities to suit the local operating conditions. The cold and harsh climate, the long distance from customers' and suppliers' markets, and the sensitive environment may have considerable influence on the choice of design solutions and production performance characteristics such as throughput capacity, reliability, availability, maintainability, and supportability (RAMS) as well as operational and maintenance activities. Due to this, data and information collected for similar systems used in a normal climate may not be suitable. Hence, it is important to study and develop methods for prediction of the production performance characteristics during the design and operation phases. The aim of this paper is to present an approach for prediction of the production performance of oil and gas production facilities considering influencing factors in Arctic conditions. The proportional repair model (PRM) is developed in order to predict the repair rate in Arctic conditions. The model is based on the proportional hazard model (PHM). A simple case study is used to demonstrate how the proposed approach can be applied.

  19. Co-evolving prisoner's dilemma: Performance indicators and analytic approaches

    Science.gov (United States)

    Zhang, W.; Choi, C. W.; Li, Y. S.; Xu, C.; Hui, P. M.

    2017-02-01

    Understanding the intrinsic relation between the dynamical processes in a co-evolving network and the necessary ingredients in formulating a reliable theory is an important question and a challenging task. Using two slightly different definitions of performance indicator in the context of a co-evolving prisoner's dilemma game, it is shown that very different cooperative levels result and theories of different complexity are required to understand the key features. When the payoff per opponent is used as the indicator (Case A), the non-cooperative strategy has an edge and dominates a large part of the parameter space formed by the cutting-and-rewiring probability and the strategy imitation probability. When the payoff from all opponents is used (Case B), the cooperative strategy has an edge and dominates the parameter space. Two distinct phases, one homogeneous and dynamical and another inhomogeneous and static, emerge, and the phase boundary in the parameter space is studied in detail. A simple theory assuming an average competing environment for cooperative agents and another for non-cooperative agents is shown to perform well in Case A. The same theory, however, fails badly for Case B. It is necessary to include more spatial correlation in a theory for Case B. We show that the local configuration approximation, which takes into account the different competing environments for agents with different strategies and degrees, is needed to give reliable results for Case B. The results illustrate that formulating a proper theory requires both a conceptual understanding of the effects of the adaptive processes in the problem and a delicate balance between simplicity and accuracy.

  20. A rapid method of evaluating fluoroscopic system performance

    International Nuclear Information System (INIS)

    Sprawls, P.

    1989-01-01

    This paper presents a study to develop a method for the rapid evaluation and documentation of fluoroscopic image quality. All objects contained within a conventional contrast-detail test phantom (Leeds TO-10) are displayed in an array format according to their contrast and size. A copy of the display is used as the data collection form and a permanent record of system performance. A fluoroscope is evaluated by viewing the test phantom and marking the visible objects on the display. A line drawn through the objects with minimum visibility in each size group forms a contrast-detail curve for the system. This is compared with a standard or reference line, which is in the display. Deviations in curve position are useful indicators of specific image quality problems, such as excessive noise or blurring. The use of a special object-visibility array format display makes it possible to collect data, analyze the results, and create a record of fluoroscopic performance in less than 2 minutes for each viewing mode.

  1. Mechanical/structural performance test method of a spacer grid

    International Nuclear Information System (INIS)

    Yoon, Kyung Ho

    2000-06-01

    The spacer grid is one of the main structural components in the fuel assembly: it supports the fuel rods, guides cooling water, and protects the system from external impact loads, such as earthquakes. In order to develop a spacer grid with high mechanical performance, the mechanical and structural properties of the spacer grid must be extensively examined during its design. In this report, the mechanical/structural test methods, i.e. the characteristic test of a spacer grid spring or dimple, the static buckling test of a partial or full size spacer grid, and the dynamic impact test of the same, are described. The characteristic test of a spacer grid spring or dimple is accomplished with a universal tensile testing machine: a specimen is fixed in a test fixture and a compressive load is then applied. The characteristic test data are recorded during the loading and unloading events. The static buckling test of a partial or full size spacer grid is executed with the same universal tensile testing machine: a specimen is fixed between the cross-heads and the compressive load is then applied. The buckling strength is defined as the maximum load on the load vs. displacement curve. The dynamic impact test of a partial or full size spacer grid is performed with a pendulum type impact machine and a free fall shock test machine: a specimen is fixed in a test fixture and the impact load is then applied by an impact hammer. Notably, the pendulum type impact test can also be performed at operating temperature because a furnace is attached to the test machine.

  2. Review of scenario selection approaches for performance assessment of high-level waste repositories and related issues

    International Nuclear Information System (INIS)

    Banano, E.J.; Baca, R.G.

    1995-08-01

    The selection of scenarios representing plausible realizations of the future conditions-with associated probabilities of occurrence-that can affect the long-term performance of a high-level radioactive waste (HLW) repository is the commonly used method for treating the uncertainty in the prediction of the future states of the system. This method, conventionally referred to as the ''scenario approach,'' while common, is not the only method to deal with this uncertainty; other methods, such as the environmental simulation approach (ESA), have also been proposed. Two of the difficulties with the scenario approach are the lack of uniqueness in the definition of the term ''scenario'' and the lack of uniqueness in the approach to formulate scenarios, which relies considerably on subjective judgments. Consequently, it is difficult to assure that a complete and unique set of scenarios can be defined for use in a performance assessment. Because scenarios are key to the determination of the long-term performance of the repository system, this lack of uniqueness can present a considerable challenge when attempting to reconcile the set of scenarios, and their level of detail, obtained using different approaches, particularly among proponents and regulators of a HLW repository.

  3. Review of scenario selection approaches for performance assessment of high-level waste repositories and related issues.

    Energy Technology Data Exchange (ETDEWEB)

    Banano, E.J. [Beta Corporation International, Albuquerque, NM (United States); Baca, R.G. [Southwest Research Inst., San Antonio, TX (United States). Center for Nuclear Waste Regulatory Analyses

    1995-08-01

    The selection of scenarios representing plausible realizations of the future conditions-with associated probabilities of occurrence-that can affect the long-term performance of a high-level radioactive waste (HLW) repository is the commonly used method for treating the uncertainty in the prediction of the future states of the system. This method, conventionally referred to as the ``scenario approach,`` while common, is not the only method to deal with this uncertainty; other methods, such as the environmental simulation approach (ESA), have also been proposed. Two of the difficulties with the scenario approach are the lack of uniqueness in the definition of the term ``scenario`` and the lack of uniqueness in the approach to formulate scenarios, which relies considerably on subjective judgments. Consequently, it is difficult to assure that a complete and unique set of scenarios can be defined for use in a performance assessment. Because scenarios are key to the determination of the long-term performance of the repository system, this lack of uniqueness can present a considerable challenge when attempting to reconcile the set of scenarios, and their level of detail, obtained using different approaches, particularly among proponents and regulators of a HLW repository.

  4. Physics methods for calculating light water reactor increased performances

    International Nuclear Information System (INIS)

    Vandenberg, C.; Charlier, A.

    1988-01-01

    The intensive use of light water reactors (LWRs) has induced modification of their characteristics and performances in order to improve fissile material utilization and to increase their availability and flexibility under operation. From the conceptual point of view, adequate methods must be used to calculate core characteristics, taking into account present design requirements, e.g., use of burnable poison, plutonium recycling, etc. From the operational point of view, nuclear plants, which produce a large percentage of the electricity in some countries, must adapt their planning to the needs of the electrical network and operate on a load-follow basis. Consequently, plant behavior must be predicted and accurately followed in order to improve the plant's capability within safety limits. The Belgonucleaire code system has been developed and extensively validated. It is an accurate, flexible, easily usable, fast-running tool for solving the problems related to LWR technology development. The methods and validation of the two computer codes LWR-WIMS and MICROLUX, which are the main components of the physics calculation system, are explained.

  5. Model Multi Criteria Decision Making with Fuzzy ANP Method for Performance Measurement Small Medium Enterprise (SME)

    Science.gov (United States)

    Rahmanita, E.; Widyaningrum, V. T.; Kustiyahningsih, Y.; Purnama, J.

    2018-04-01

    SMEs have a very important role in the development of the economy in Indonesia. SMEs assist the government in creating new jobs and can support household income. The large number of SMEs in Madura and the large number of measurement indicators in SME mapping call for a systematic method. This research uses the Fuzzy Analytic Network Process (FANP) method for SME performance measurement. The FANP method can handle data that contain uncertainty, and a consistency index guides the decisions. Performance measurement in this study is based on the perspectives of the Balanced Scorecard: the approach integrates the internal business perspective and the learning and growth perspective with the fuzzy Analytic Network Process (FANP). The result of this research is a framework of priority weights for SME assessment indicators.

  6. Vehicle safety performance improvements using a performance-based standards approach: four case studies

    CSIR Research Space (South Africa)

    Nordengen, Paul A

    2014-01-01

    Full Text Available programme is to gain practical experience in the PBS approach and to quantify and evaluate the potential infrastructure preservation, safety and productivity benefits for road freight transport. The Smart Truck demonstration vehicles have been designed...

  7. Optimization of the gas turbine-modular helium reactor using statistical methods to maximize performance without compromising system design margins

    International Nuclear Information System (INIS)

    Lommers, L.J.; Parme, L.L.; Shenoy, A.S.

    1995-07-01

    This paper describes a statistical approach for determining the impact of system performance and design uncertainties on power plant performance. The objectives of this design approach are to ensure that adequate margin is provided, that excess margin is minimized, and that full advantage can be taken of unconsumed margin. It is applicable to any thermal system in which these factors are important. The method is demonstrated using the Gas Turbine Modular Helium Reactor as an example. The quantitative approach described allows the characterization of plant performance and the specification of the system design requirements necessary to achieve the desired performance with high confidence. Performance variations due to design evolution, in-service degradation, and basic performance uncertainties are considered. The impact of all performance variabilities is combined using Monte Carlo analysis to predict the range of expected operation.
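
    The Monte Carlo combination of performance variabilities can be sketched in a few lines. The component efficiencies, their standard deviations, and the 450 MW requirement below are invented for illustration and are not GT-MHR design values:

```python
# Hedged sketch: combining independent performance uncertainties by
# Monte Carlo to check design margin with high confidence. Component
# names and numbers are illustrative only.
import random

random.seed(42)
samples = []
for _ in range(5000):
    turbine_eff = random.gauss(0.93, 0.010)      # design evolution
    compressor_eff = random.gauss(0.90, 0.012)   # in-service degradation
    recuperator_eff = random.gauss(0.95, 0.008)  # basic uncertainty
    net_power = 600.0 * turbine_eff * compressor_eff * recuperator_eff
    samples.append(net_power)

samples.sort()
p5 = samples[int(0.05 * len(samples))]   # 95% of runs exceed this power
margin = p5 - 450.0                      # against a hypothetical requirement
print(f"5th-percentile power {p5:.1f} MW, margin {margin:.1f} MW")
```

    Sampling the individual variabilities jointly, rather than stacking worst-case allowances, is what keeps excess margin from accumulating.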

  8. A Review of Lightweight Thread Approaches for High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Castello, Adrian; Pena, Antonio J.; Seo, Sangmin; Mayo, Rafael; Balaji, Pavan; Quintana-Orti, Enrique S.

    2016-09-12

    High-level, directive-based solutions are becoming the programming models (PMs) of choice for multi/many-core architectures. Several solutions relying on operating system (OS) threads work well with a moderate number of cores. However, exascale systems will spawn hundreds of thousands of threads in order to exploit their massively parallel architectures, and conventional OS threads are too heavy for that purpose. Several lightweight thread (LWT) libraries have recently appeared, offering lighter mechanisms to tackle massive concurrency. In order to examine the suitability of LWTs in high-level runtimes, we develop a set of microbenchmarks consisting of commonly found patterns in current parallel codes. Moreover, we study the semantics offered by some LWT libraries in order to expose the similarities between different LWT application programming interfaces. This study reveals that a reduced set of LWT functions can be sufficient to cover the common parallel code patterns, and that LWT libraries outperform OS-thread-based solutions for task and nested parallelism, patterns that are becoming more common with new architectures.

  9. A systematic method for characterizing the time-range performance of ground penetrating radar

    International Nuclear Information System (INIS)

    Strange, A D

    2013-01-01

    The fundamental performance of ground penetrating radar (GPR) is linked to the ability to measure the signal time-of-flight in order to provide an accurate radar-to-target range estimate. Having knowledge of the actual time range and timing nonlinearities of a trace is therefore important when seeking to make quantitative range estimates. However, very few practical methods have been formally reported in the literature to characterize GPR time-range performance. This paper describes a method to accurately measure the true time range of a GPR to provide a quantitative assessment of the timing system performance and detect and quantify the effects of timing nonlinearity due to timing jitter. The effect of varying the number of samples per trace on the true time range has also been investigated and recommendations on how to minimize the effects of timing errors are described. The approach has been practically applied to characterize the timing performance of two commercial GPR systems. The importance of the method is that it provides the GPR community with a practical means to readily characterize the underlying accuracy of GPR systems. This in turn leads to enhanced target depth estimation as well as improving the accuracy of more sophisticated GPR signal processing methods. (paper)

  10. A new ART iterative method and a comparison of performance among various ART methods

    International Nuclear Information System (INIS)

    Tan, Yufeng; Sato, Shunsuke

    1993-01-01

    Many algebraic reconstruction technique (ART) image reconstruction algorithms, for instance, the simultaneous iterative reconstruction technique (SIRT), the relaxation method and multiplicative ART (MART), have been proposed and their convergence properties have been studied. SIRT and the underrelaxed relaxation method converge to the least-squares solution, but the convergence speeds are very slow. The Kaczmarz method converges very quickly, but the reconstructed images contain a lot of noise. Comparative studies between these algorithms have been done by Gilbert and others, but are not adequate. In this paper, we (1) propose a new method, a modified Kaczmarz method, and prove its convergence property, and (2) study the performance of 7 algorithms, including the one proposed here, by computer simulation for 3 kinds of typical phantoms. The method proposed here does not give the least-squares solution, but the root mean square errors of its reconstructed images decrease very quickly after a few iterations. The result shows that the method proposed here gives a better reconstructed image. (author)
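
    For context, the classic Kaczmarz update that the proposed method modifies projects the current image estimate onto one measurement hyperplane at a time; the relax parameter below corresponds to the relaxation methods discussed. This is a generic sketch on a tiny dense system, not the authors' algorithm:

```python
# Hedged sketch of the classic Kaczmarz row-action update underlying
# many ART variants: each inner step projects the current estimate onto
# the hyperplane of one measurement equation a_i . x = b_i.
import numpy as np

def kaczmarz(A, b, sweeps=50, relax=1.0):
    """Cyclic (relaxed) Kaczmarz iteration; relax=1 is the pure method."""
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            x += relax * (b[i] - a @ x) / (a @ a) * a
    return x

# Tiny consistent system standing in for projection data
A = np.array([[1.0, 2.0], [3.0, 1.0], [1.0, -1.0]])
x_true = np.array([2.0, -1.0])
b = A @ x_true
x = kaczmarz(A, b)
print(x)
```

    On noisy (inconsistent) data the same fast convergence is what amplifies noise, which is the trade-off the abstract describes; underrelaxation (relax < 1) damps it at the cost of speed.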

  11. Support vector methods for survival analysis: a comparison between ranking and regression approaches.

    Science.gov (United States)

    Van Belle, Vanya; Pelckmans, Kristiaan; Van Huffel, Sabine; Suykens, Johan A K

    2011-10-01

    To compare and evaluate ranking, regression and combined machine learning approaches for the analysis of survival data. The literature describes two approaches based on support vector machines to deal with censored observations. In the first approach the key idea is to rephrase the task as a ranking problem via the concordance index, a problem which can be solved efficiently in a context of structural risk minimization and convex optimization techniques. In a second approach, one uses a regression approach, dealing with censoring by means of inequality constraints. The goal of this paper is then twofold: (i) introducing a new model combining the ranking and regression strategy, which retains the link with existing survival models such as the proportional hazards model via transformation models; and (ii) comparison of the three techniques on 6 clinical and 3 high-dimensional datasets and discussing the relevance of these techniques over classical approaches for survival data. We compare svm-based survival models based on ranking constraints, based on regression constraints, and models based on both ranking and regression constraints. The performance of the models is compared by means of three different measures: (i) the concordance index, measuring the model's discriminating ability; (ii) the logrank test statistic, indicating whether patients with a prognostic index lower than the median prognostic index have a significantly different survival than patients with a prognostic index higher than the median; and (iii) the hazard ratio after normalization to restrict the prognostic index between 0 and 1. Our results indicate a significantly better performance for models including regression constraints over models only based on ranking constraints. This work gives empirical evidence that svm-based models using regression constraints perform significantly better than svm-based models based on ranking constraints. Our experiments show a comparable performance for methods
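
    The concordance index that drives the ranking formulation can be computed directly. The sketch below is a plain Harrell-style c-index on invented toy data, not the authors' svm formulation: a pair (i, j) is comparable when the shorter observed time is an event, and concordant when the model assigns that subject the higher risk.

```python
# Hedged sketch: Harrell-style concordance index with right censoring.
def concordance_index(times, events, risks):
    """Fraction of comparable pairs ranked correctly by the risk score."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # pair is comparable only if subject i failed first
            if times[i] < times[j] and events[i]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    concordant += 0.5          # ties count half
    return concordant / comparable

times  = [2, 4, 5, 7, 9]
events = [1, 1, 0, 1, 0]                 # 0 = censored observation
risks  = [0.9, 0.7, 0.6, 0.4, 0.2]       # perfectly anti-ordered with time
print(concordance_index(times, events, risks))  # → 1.0
```

    The ranking-constrained svm then maximizes a convex surrogate of this quantity over the comparable pairs.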

  12. Educational Accountability: A Qualitatively Driven Mixed-Methods Approach

    Science.gov (United States)

    Hall, Jori N.; Ryan, Katherine E.

    2011-01-01

    This article discusses the importance of mixed-methods research, in particular the value of qualitatively driven mixed-methods research for quantitatively driven domains like educational accountability. The article demonstrates the merits of qualitative thinking by describing a mixed-methods study that focuses on a middle school's system of…

  13. The large break LOCA evaluation method with the simplified statistic approach

    International Nuclear Information System (INIS)

    Kamata, Shinya; Kubo, Kazuo

    2004-01-01

    The USNRC published the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology for large break LOCA, which supported the revised rule for Emergency Core Cooling System performance, in 1989. USNRC Regulatory Guide 1.157 requires that the peak cladding temperature (PCT) not exceed 2200 °F with high probability (95th percentile). In recent years, overseas countries have developed statistical methodologies and best estimate codes with models that provide more realistic simulation of the phenomena, based on the CSAU evaluation methodology. In order to calculate the PCT probability distribution by Monte Carlo trials, there are approaches such as the response surface technique using polynomials, the order statistics method, etc. For the purpose of performing a rational statistical analysis, Mitsubishi Heavy Industries, Ltd. (MHI) developed a statistical LOCA method using the best estimate LOCA code MCOBRA/TRAC and the simplified code HOTSPOT. HOTSPOT is a Monte Carlo heat conduction solver used to evaluate the uncertainties of the significant fuel parameters at the PCT positions of the hot rod. Direct uncertainty sensitivity studies can be performed without a response surface because the Monte Carlo simulation for key parameters can be performed in a short time using HOTSPOT. With regard to the parameter uncertainties, MHI established the treatment in which bounding conditions are given for the LOCA boundary and plant initial conditions, and the Monte Carlo simulation using HOTSPOT is applied to the significant fuel parameters. The paper describes the large break LOCA evaluation method with the simplified statistical approach and the results of applying the method to a representative four-loop nuclear power plant. (author)
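
    The order statistics method mentioned above rests on a simple counting argument (commonly attributed to Wilks): with N independent code runs, the probability that the largest observed PCT falls below the true 95th percentile is 0.95^N, so choosing N such that 0.95^N < 0.05 gives a 95/95 statement. A minimal sketch, independent of any plant data:

```python
# Hedged sketch of the order-statistics (Wilks) sizing argument behind
# 95/95 statements: find the smallest N with coverage**N <= 1-confidence.
def runs_needed(coverage=0.95, confidence=0.95):
    n = 1
    while coverage ** n > 1.0 - confidence:
        n += 1
    return n

print(runs_needed())  # → 59
```

    This is why 59 (or, with extra conservatism, 93 or 124 runs for higher-order statistics) appears so often in best-estimate-plus-uncertainty LOCA analyses.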

  14. Variable Pitch Approach for Performance Improving of Straight-Bladed VAWT at Rated Tip Speed Ratio

    Directory of Open Access Journals (Sweden)

    Zhenzhou Zhao

    2018-06-01

    Full Text Available This paper presents a new variable pitch (VP) approach to increase the peak power coefficient of the straight-bladed vertical-axis wind turbine (VAWT), by widening the azimuthal angle band of the blade with the highest aerodynamic torque, instead of increasing the highest torque. The new VP-approach provides a curve of pitch angle designed for the blade operating at the rated tip speed ratio (TSR) corresponding to the peak power coefficient of the fixed pitch (FP)-VAWT. The effects of the new approach are explored by using the double multiple stream tubes (DMST) model and Prandtl's mathematics to evaluate the blade tip loss. The research describes the effects from six aspects, including the lift, drag, angle of attack (AoA), resultant velocity, torque, and power output, through a comparison between VP-VAWTs and FP-VAWTs working at four TSRs: 4, 4.5, 5, and 5.5. Compared with the FP-blade, the VP-blade has a wider azimuthal zone with the maximum AoA, lift, drag, and torque in the upwind half-cycle, and yields two new, larger maxima in the downwind half-cycle. The power distribution in the swept area of the turbine changes from the arched shape of the FP-VAWT into the rectangular shape of the VP-VAWT. The new VP-approach markedly widens the highest-performance zone of the blade in a revolution, and ultimately achieves an 18.9% growth of the peak power coefficient of the VAWT at the optimum TSR. Besides this growth, the new pitching method enhances performance at TSRs higher than the current optimum, and an increase in torque is also generated.

  15. Combining multiple FDG-PET radiotherapy target segmentation methods to reduce the effect of variable performance of individual segmentation methods

    Energy Technology Data Exchange (ETDEWEB)

    McGurk, Ross J. [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Bowsher, James; Das, Shiva K. [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Lee, John A [Molecular Imaging and Experimental Radiotherapy Unit, Universite Catholique de Louvain, 1200 Brussels (Belgium)

    2013-04-15

    Purpose: Many approaches have been proposed to segment high-uptake objects in 18F-fluoro-deoxy-glucose positron emission tomography images, but none provides consistent performance across the large variety of imaging situations. This study investigates two methods of combining individual segmentation methods to reduce the impact of their inconsistent performance: simple majority voting and probabilistic estimation. Methods: The National Electrical Manufacturers Association image quality phantom containing five glass spheres with diameters of 13-37 mm and two irregularly shaped volumes (16 and 32 cc) formed by deforming high-density polyethylene bottles in a hot water bath was filled with 18F-fluoro-deoxy-glucose and iodine contrast agent. Repeated 5-min positron emission tomography (PET) images were acquired at 4:1 and 8:1 object-to-background contrasts for the spherical objects and 4.5:1 and 9:1 for the irregular objects. Five individual methods were used to segment each object: 40% thresholding, adaptive thresholding, k-means clustering, seeded region-growing, and a gradient-based method. Volumes were combined using a majority vote (MJV) or the Simultaneous Truth And Performance Level Estimate (STAPLE) method. Accuracy of segmentations relative to CT ground truth volumes was assessed using the Dice similarity coefficient (DSC) and the symmetric mean absolute surface distance (SMASD). Results: MJV had median DSC values of 0.886 and 0.875, and SMASD of 0.52 and 0.71 mm, for spheres and irregular shapes, respectively. STAPLE provided similar results, with median DSC of 0.886 and 0.871 and median SMASD of 0.50 and 0.72 mm for spheres and irregular shapes, respectively. STAPLE had significantly higher DSC and lower SMASD values than MJV for spheres (DSC, p < 0.0001; SMASD, p = 0.0101), but MJV had significantly higher DSC and lower SMASD values compared to STAPLE for irregular shapes (DSC, p < 0.0001; SMASD, p = 0.0027). DSC was not significantly
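The MJV combination and the DSC accuracy metric are straightforward to express. A minimal sketch over flattened binary masks (the study itself operates on 3D PET volumes; the toy masks are illustrative assumptions):

```python
def majority_vote(masks):
    # Combine binary masks: a voxel is foreground when more than half
    # of the individual segmentation methods mark it.
    n = len(masks)
    return [1 if 2 * sum(col) > n else 0 for col in zip(*masks)]

def dice(a, b):
    # Dice similarity coefficient: 2|A∩B| / (|A| + |B|).
    inter = sum(x * y for x, y in zip(a, b))
    return 2.0 * inter / (sum(a) + sum(b))

# Five illustrative method outputs over a flattened 4-voxel grid.
masks = [[1, 1, 0, 0],
         [1, 0, 1, 0],
         [1, 1, 0, 0],
         [0, 1, 0, 0],
         [1, 1, 1, 0]]
combined = majority_vote(masks)
truth = [1, 1, 0, 0]
```

STAPLE replaces the flat vote with per-method sensitivity/specificity weights estimated by expectation-maximization, which is why the two combinations can rank differently on different shapes.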

  16. Effect of the sequence data deluge on the performance of methods for detecting protein functional residues.

    Science.gov (United States)

    Garrido-Martín, Diego; Pazos, Florencio

    2018-02-27

    The exponential accumulation of new sequences in public databases is expected to improve the performance of all the approaches for predicting protein structural and functional features. Nevertheless, this was never assessed or quantified for some widely used methodologies, such as those aimed at detecting functional sites and functional subfamilies in protein multiple sequence alignments. Using raw protein sequences as only input, these approaches can detect fully conserved positions, as well as those with a family-dependent conservation pattern. Both types of residues are routinely used as predictors of functional sites and, consequently, understanding how the sequence content of the databases affects them is relevant and timely. In this work we evaluate how the growth and change with time in the content of sequence databases affect five sequence-based approaches for detecting functional sites and subfamilies. We do that by recreating historical versions of the multiple sequence alignments that would have been obtained in the past based on the database contents at different time points, covering a period of 20 years. Applying the methods to these historical alignments allows quantifying the temporal variation in their performance. Our results show that the number of families to which these methods can be applied sharply increases with time, while their ability to detect potentially functional residues remains almost constant. These results are informative for the methods' developers and final users, and may have implications in the design of new sequencing initiatives.

  17. Advanced nuclear power plant regulation using risk-informed and performance-based methods

    International Nuclear Information System (INIS)

    Modarres, Mohammad

    2009-01-01

    This paper proposes and discusses the implications of a largely probabilistic regulatory framework using best-estimate, goal-driven, risk-informed, and performance-based methods. This framework relies on continuous probabilistic assessment of the performance of a set of time-dependent, safety-critical systems, structures, components, and procedures that assure attainment of a broad set of overarching technology-neutral protective, mitigative, and preventive goals under all phases of plant operations. In this framework, acceptable levels of performance are set through formal apportionment so that they are commensurate with the overarching goals. Regulatory acceptance would be based on the confidence level with which the plant conforms to these goals and performance objectives. The proposed framework retains the traditional defense-in-depth design and operation regulatory philosophy when uncertainty in conforming to specific goals and objectives is high. Finally, the paper discusses the steps needed to develop a corresponding technology-neutral regulatory approach from the proposed framework.

  18. Staff Performance Analysis: A Method for Identifying Brigade Staff Tasks

    National Research Council Canada - National Science Library

    Ford, Laura

    1997-01-01

    ... members of conventional mounted brigade staff. Initial analysis of performance requirements in existing documentation revealed that the performance specifications were not sufficiently detailed for brigade battle staffs...

  19. Framework for benchmarking online retailing performance using fuzzy AHP and TOPSIS method

    Directory of Open Access Journals (Sweden)

    M. Ahsan Akhtar Hasin

    2012-08-01

    Full Text Available Due to the increasing penetration of internet connectivity, on-line retail is growing from the pioneer phase to increasing integration within people's lives and companies' normal business practices. In this increasingly competitive environment, on-line retail service providers require a systematic and structured approach to gain a cutting edge over rivals. The use of benchmarking has thus become indispensable for accomplishing the superior performance that supports on-line retail service providers. This paper uses the fuzzy analytic hierarchy process (FAHP) approach to support a generic on-line retail benchmarking process. Critical success factors for on-line retail service have been identified from a structured questionnaire and the literature, and prioritized using fuzzy AHP. Using these critical success factors, the performance level of ORENET, an on-line retail service provider, is benchmarked along with four other on-line service providers using the TOPSIS method. Based on the benchmark, their relative ranking is also illustrated.
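The TOPSIS ranking step used here can be sketched as follows. The weights and the two-provider matrix are illustrative assumptions (in the paper the weights come from fuzzy AHP), and the sketch uses crisp scores with vector normalization:

```python
import math

def topsis(matrix, weights, benefit):
    # TOPSIS: 1) vector-normalize each criterion column (assumed nonzero),
    # 2) apply weights, 3) measure Euclidean distances to the ideal and
    # anti-ideal solutions, 4) rank by relative closeness to the ideal.
    n = len(matrix[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n)] for row in matrix]
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((x - t) ** 2 for x, t in zip(row, ideal)))
        d_neg = math.sqrt(sum((x - t) ** 2 for x, t in zip(row, anti)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Two illustrative providers scored on two benefit criteria
# (e.g., service quality and delivery reliability), equal weights.
scores = topsis([[5.0, 5.0], [1.0, 1.0]], [0.5, 0.5], [True, True])
```

The provider closest to the ideal (and farthest from the anti-ideal) gets a closeness score near 1 and ranks first.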

  20. Interdisciplinary Approaches and Methods for Sustainable Transformation and Innovation

    Directory of Open Access Journals (Sweden)

    Sangkyun Kim

    2015-04-01

    Full Text Available To increase the likelihood of success and sustainability, organizations must fundamentally reposition themselves and try to change current processes or create new products and services. One of the most effective approaches to finding a solution for transformation and innovation is to learn from other domains where a solution for similar problems is already available. This paper briefly introduces the definition of and approaches to convergence of academic disciplines and industries, and overviews several representative convergence cases focusing on gamification for sustainable education, environments, and business management.

  1. Performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data.

    Science.gov (United States)

    Yelland, Lisa N; Salter, Amy B; Ryan, Philip

    2011-10-15

    Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.

  2. A Dynamic Fuzzy Approach Based on the EDAS Method for Multi-Criteria Subcontractor Evaluation

    Directory of Open Access Journals (Sweden)

    Mehdi Keshavarz-Ghorabaee

    2018-03-01

    Full Text Available Selection of appropriate subcontractors for outsourcing is very important for the success of construction projects. It can improve the overall quality of projects and promote the qualification and reputation of the main contractors. The evaluation of subcontractors can be made by experts or decision-makers with respect to a set of criteria. If this process is carried out over different time periods, it can be defined as a dynamic multi-criteria group decision-making (MCGDM) problem. In this study, we propose a new fuzzy dynamic MCGDM approach based on the EDAS (Evaluation based on Distance from Average Solution) method for subcontractor evaluation. In the procedure of the proposed approach, the sets of alternatives, criteria and decision-makers can change at different time periods. The proposed approach also gives more weight to newer decision information when aggregating the overall performance of alternatives. A numerical example is used to illustrate the proposed approach and show its application to subcontractor evaluation. The results demonstrate that the proposed approach is efficient and useful in real-world decision-making problems.
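The crisp EDAS core on which the proposed fuzzy dynamic extension builds can be sketched as follows (the matrix, weights, and benefit flags are illustrative assumptions):

```python
def edas(matrix, weights, benefit):
    # EDAS core: score alternatives by weighted positive/negative
    # distances from the average solution (column averages assumed > 0).
    n = len(matrix[0])
    m = len(matrix)
    avg = [sum(row[j] for row in matrix) / m for j in range(n)]
    sp, sn = [], []
    for row in matrix:
        p = sum(weights[j] * max(0.0, (row[j] - avg[j]) if benefit[j]
                                 else (avg[j] - row[j])) / avg[j]
                for j in range(n))
        q = sum(weights[j] * max(0.0, (avg[j] - row[j]) if benefit[j]
                                 else (row[j] - avg[j])) / avg[j]
                for j in range(n))
        sp.append(p)
        sn.append(q)
    msp = max(sp) or 1.0
    msn = max(sn) or 1.0
    # Appraisal score: average of the normalized positive distance and
    # the complement of the normalized negative distance.
    return [0.5 * (s / msp + 1.0 - t / msn) for s, t in zip(sp, sn)]

# Illustrative subcontractors on two benefit criteria, equal weights.
scores = edas([[5.0, 5.0], [1.0, 1.0], [3.0, 3.0]], [0.5, 0.5], [True, True])
```

An above-average alternative scores near 1, a below-average one near 0, and one sitting exactly at the average solution scores 0.5.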

  3. A METHOD TO ESTIMATE TEMPORAL INTERACTION IN A CONDITIONAL RANDOM FIELD BASED APPROACH FOR CROP RECOGNITION

    Directory of Open Access Journals (Sweden)

    P. M. A. Diaz

    2016-06-01

    Full Text Available This paper presents a method to estimate the temporal interaction in a Conditional Random Field (CRF) based approach for crop recognition from multitemporal remote sensing image sequences. This approach models the phenology of different crop types as a CRF. Interaction potentials are assumed to depend only on the class labels of an image site at two consecutive epochs. In the proposed method, the estimation of the temporal interaction parameters is cast as an optimization problem whose goal is to find the transition matrix that maximizes the CRF performance over a set of labelled data. The objective functions underlying the optimization procedure can be formulated in terms of different accuracy metrics, such as overall and average class accuracy per crop or phenological stage. To validate the proposed approach, experiments were carried out on a dataset consisting of 12 co-registered LANDSAT images of a region in the southeast of Brazil. Pattern Search was used as the optimization algorithm. The experimental results demonstrate that the proposed method substantially outperformed estimates based on joint or conditional class transition probabilities, which rely on training samples.
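Pattern Search, the optimizer used here, probes fixed steps along each coordinate and halves the step when no probe improves the objective. A minimal compass-search sketch (the quadratic test objective is illustrative; the paper instead optimizes a CRF accuracy metric over the transition-matrix entries):

```python
def pattern_search(f, x0, step=0.25, tol=1e-3):
    # Compass (pattern) search: probe +/-step along each coordinate,
    # accept any improvement, halve the step when a sweep stalls.
    x, fx = list(x0), f(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = x[:]
                y[i] += d
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5
    return x, fx

# Illustrative smooth objective with minimum at (1, -2).
best, val = pattern_search(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
                           [0.0, 0.0])
```

Being derivative-free, this suits objectives such as classification accuracy, which are piecewise constant and give no useful gradients.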

  4. Ensemble approach combining multiple methods improves human transcription start site prediction

    LENUS (Irish Health Repository)

    Dineen, David G

    2010-11-30

    Abstract Background The computational prediction of transcription start sites is an important unsolved problem. Some recent progress has been made, but many promoters, particularly those not associated with CpG islands, are still difficult to locate using current methods. These methods use different features and training sets, along with a variety of machine learning techniques, and result in different prediction sets. Results We demonstrate the heterogeneity of current prediction sets, and take advantage of this heterogeneity to construct a two-level classifier ('Profisi Ensemble') using predictions from 7 programs, along with 2 other data sources. Support vector machines using 'full' and 'reduced' data sets are combined in an either/or approach. We achieve a 14% increase in performance over the current state-of-the-art, as benchmarked by a third-party tool. Conclusions Supervised learning methods are a useful way to combine predictions from diverse sources.

  5. Ab initio molecular dynamics, iterative methods and multiscale approaches in electronic structure calculations

    International Nuclear Information System (INIS)

    Bernholc, J.

    1998-01-01

    The field of computational materials physics has grown very quickly in the past decade, and it is now possible to simulate properties of complex materials completely from first principles. The presentation mostly focused on first-principles dynamic simulations. Such simulations were pioneered by Car and Parrinello, who introduced a method for performing realistic simulations within the context of density functional theory. The Car-Parrinello method and related plane-wave approaches are reviewed in depth and illustrated with several applications: the dynamics of the C60 solid, diffusion across Si steps, and the computation of free energy differences. Alternative ab initio simulation schemes, which use preconditioned conjugate gradient techniques for energy minimization and dynamics, were also discussed.

  6. Explaining the Evolution of Performance Measures - A Dual Case-Study Approach

    Directory of Open Access Journals (Sweden)

    Mohammed Salloum

    2013-07-01

    Full Text Available Few empirical studies have examined how performance measures change in practice and the driving forces behind this change. The existing body of literature has taken a prescriptive approach to how managers and organisations ought to manage change in performance measures, without studying the phenomenon itself, and thus a theoretical gap exists. With this gap in mind, the purpose of this paper is to outline how and why performance measures changed at two case companies over the period 2008-2011. To fulfil this purpose, case studies were conducted at two different companies. The choice of data collection method is justified by the ambition to attain an in-depth and holistic understanding of the phenomenon. For each case, the data collection was based on four components: an interview study, analysis of archived data, documentation, and direct observations. In total, 28 interviews were conducted, 14 at each case company. The empirical findings show that performance measures are exposed to continuous and considerable change from several perspectives. The measurement scopes at both case companies are steadily expanding, individual performance measures are constantly replaced, and their characteristics are continuously altered. An array of change triggers has been identified in the empirical findings. In contrast to what is advocated in the literature, the findings illustrate that the most frequent reason for change is the will to improve the performance measures, the measurement process, and overall performance, rather than changing internal and external environments. Several challenges remain to be addressed in the future research agenda.

  7. Performance-Based Financing to Strengthen the Health System in Benin: Challenging the Mainstream Approach

    Directory of Open Access Journals (Sweden)

    Elisabeth Paul

    2018-01-01

    Full Text Available Background Performance-based financing (PBF) is often proposed as a way to improve health system performance. In Benin, PBF was launched in 2012 through a World Bank-supported project. The Belgian Development Agency (BTC) followed suit through a health system strengthening (HSS) project. This paper analyses and draws lessons from the experience of the BTC-supported alternative PBF approach, especially with regard to institutional aspects, the role of demand-side actors, ownership, and cost-effectiveness, and explores the mechanisms at stake so as to better understand how the “PBF package” functions and produces effects. Methods An exploratory, theory-driven evaluation approach was adopted. Causal mechanisms through which PBF is hypothesised to have an impact on results were singled out and explored. This paper stems from the co-authors' capitalisation of experience; mixed methods were used to collect, triangulate and analyse information. Results are structured along the framework of Witter et al. Results The influence of context on PBF in Benin is strong; the policy is donor-driven. BTC did not adopt the World Bank's mainstream PBF model, but developed an alternative approach in line with its HSS support programme, which is grounded in existing domestic institutions. The main features of this approach are described (decentralised governance, peer-review verification, counter-verification entrusted to health service users' platforms), as well as its adaptive process. PBF has contributed to strengthening various aspects of the health system and has led to modest progress in utilisation of health services but noticeable improvements in healthcare quality. Three mechanisms explaining the observed outcomes within the context are described: comprehensive HSS at the district level; acting on health workers' motivation through a complex package of incentives; and increased accountability through reinforced dialogue with demand-side actors. Cost-effectiveness and

  8. Transactional approach in assessment of operational performance of companies in transport infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Dubrovsky, V.; Yaroshevich, N.; Kuzmin, E.

    2016-07-01

    The paper offers an alternative method to assess the operational performance of companies in the transport infrastructure of a region by comparing transaction costs. The method is intended to be cross-functional and applicable to the analysis of economic entities of different orders (country, region, sector, company) when evaluating the “viscosity”/complexity of the external and internal environment. The paper includes an analysis of various methodological approaches to assessing the development level of the transport infrastructure in a region. Within the authors' approach, and for the purposes of the research, an index of transaction capacity, or the transactionalness index, is proposed, which determines the level of transaction costs calculated against the cost of production and revenue. The approach is piloted using region-wise consolidated financial data of companies involved in the Russian transport infrastructure for 2005-2013. The proposed alternative way to measure corporate operating efficiency has proved its academic consistency. A comparison of transaction costs using the transactionalness index allows, first, identifying companies or regions/sectors with excess complexity of economic communication in bargaining. Second, the index points, if indirectly, to the degree of development of the institutional environment as well as of the infrastructure (the transport one in the example given). Third, the transactionalness level may speak to uncertainty and risks. As an addition to the theoretical and methodological aspects of transaction costs, the authors justify an approach to estimating their size, as well as their differentiation into two groups: those of a natural type and those of a background type. In the course of their discussion, the authors conclude that some transaction costs are, in a manner of speaking, standard. There is a discussion whether it is scientifically reasonable to use an

  9. Analytical methods for prefiltering of close approaches between ...

    African Journals Online (AJOL)

    user

    2010-02-10

    Feb 10, 2010 ... find out the close approach for all objects with simulations. ... the operational satellite and other orbiting objects. ... Recently, space scientists all over the Globe are giving much ... avoidances (Alarcon-Rodriguez et al., 2004, Gronchi, 2005 and Choi et al., 2009) for the stability of future Low Earth Orbit (LEO).

  10. Sustainability appraisal. Quantitative methods and mathematical techniques for environmental performance evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Erechtchoukova, Marina G.; Khaiter, Peter A. [York Univ., Toronto, ON (Canada). School of Information Technology; Golinska, Paulina (eds.) [Poznan Univ. of Technology (Poland)

    2013-06-01

    The book will present original research papers on the quantitative methods and techniques for the evaluation of the sustainability of business operations and organizations' overall environmental performance. The book contributions will describe modern methods and approaches applicable to the multi-faceted problem of sustainability appraisal and will help to fulfil generic frameworks presented in the literature with the specific quantitative techniques so needed in practice. The scope of the book is interdisciplinary in nature, making it of interest to environmental researchers, business managers and process analysts, information management professionals and environmental decision makers, who will find valuable sources of information for their work-related activities. Each chapter will provide sufficient background information, a description of problems, and results, making the book useful for a wider audience. Additional software support is not required. One of the most important issues in developing sustainable management strategies and incorporating ecodesigns in production, manufacturing and operations management is the assessment of the sustainability of business operations and organizations' overall environmental performance. The book presents the results of recent studies on sustainability assessment. It provides a solid reference for researchers in academia and industrial practitioners on the state-of-the-art in sustainability appraisal including the development and application of sustainability indices, quantitative methods, models and frameworks for the evaluation of current and future welfare outcomes, recommendations on data collection and processing for the evaluation of organizations' environmental performance, and eco-efficiency approaches leading to business process re-engineering.

  11. An Integrated MCDM Method in Ranking BSC Perspectives and key Performance Indicators (KPIs

    Directory of Open Access Journals (Sweden)

    Mohsen Alvandi

    2012-04-01

    Full Text Available The balanced scorecard (BSC) approach is an effective technique for performance evaluation, as it can better reflect the dependence and feedback problems of each factor in real-world situations. This study aims at developing a set of appropriate key performance indicators for SAPCO, according to the BSC approach, using multiple-criteria decision-making (MCDM) methods. We derive key performance indicators from literature reviews and the ideas of experts at SAPCO, which is one of the biggest vehicle spare-part suppliers in Iran. The proposed study uses the decision-making trial and evaluation laboratory (DEMATEL) and the analytic network process (ANP), respectively, to measure the causal relationships between the perspectives and their relative weights. The results based on the ANP method show that ‘‘Customer’’ is the most influential factor, with internal process, financial, and learning and growth in the second to fourth positions. The three most important key performance indicators are: total price of parts, customer satisfaction, and lack of parts in production.
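The DEMATEL step computes a total-relation matrix from the experts' direct-influence matrix. A minimal sketch via the convergent power series T = D + D² + … (equivalently T = D(I − D)⁻¹); normalizing by the largest row sum is one common convention, and the 2×2 example is an illustrative assumption:

```python
def dematel_total_relation(direct, tol=1e-12):
    # DEMATEL: normalize the nonzero direct-influence matrix by its
    # largest row sum (one common convention), then sum the convergent
    # power series T = D + D^2 + ...
    n = len(direct)
    s = max(sum(row) for row in direct)
    d = [[x / s for x in row] for row in direct]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
                for i in range(n)]

    t = [row[:] for row in d]
    power = [row[:] for row in d]
    while True:
        power = matmul(power, d)
        if max(abs(x) for row in power for x in row) < tol:
            break
        t = [[t[i][j] + power[i][j] for j in range(n)] for i in range(n)]
    return t

# Illustrative 2x2 direct-influence matrix between two factors.
T = dematel_total_relation([[0.0, 2.0], [1.0, 0.0]])
```

Row sums of T give each factor's total dispatched influence and column sums its received influence; their sums and differences yield the usual prominence/relation (D+R, D−R) values.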

  12. Analysis of international approaches used in the development of operational safety performance indicators

    International Nuclear Information System (INIS)

    Lyigots'kij, O.Yi.; Nosovs'kij, A.V.; Chemeris, Yi.O.

    2009-01-01

    A description of international approaches and of experience in using operational safety performance indicator systems is provided, for estimation of current status and for making decisions on corrections to operating practice. The state of development of the operational safety performance indicator system by the operating organization is overviewed. The possibility of applying international approaches during development of an integral safety performance indicator system is analyzed. Aims and tasks of future research are formulated in relation to the development of the integral safety performance indicator system.

  13. The Meaning of Anti-Americanism: A Performative Approach to Anti-American Prejudice

    Directory of Open Access Journals (Sweden)

    Felix Knappertsbusch

    2013-06-01

    Full Text Available This article contributes to the ongoing debate on how anti-Americanism can be adequately conceptualized and how such prejudice can be distinguished from legitimate criticism, arguing that part of these conceptual problems arises from too narrow a focus on defining anti-Americanism and on standardized empirical operationalizations. Such approaches exhibit severe limitations in grasping the flexibility of the phenomenon in everyday discourse and often underestimate or ignore the interpretive aspect involved in identifying utterances as anti-American prejudice. Alternatively, a performative approach is proposed, understanding anti-Americanism as a network of speech acts bound by family resemblance rather than identical features. In combination with qualitative empirical research methods, such a conceptualization is especially suited to accounting for the flexible, situated use of anti-American utterances. At the same time, it grants reflexivity to the research concept, in the sense of a close description of the scientific application of the notion of anti-Americanism. Two empirical examples from an interview study on anti-American speech in Germany illustrate the potential of such an approach, providing insight into how anti-Americanism is incorporated into the construction and expression of racist and revisionist national identifications in everyday discourse.

  14. Methods of the Development Strategy of Service Companies: Logistical Approach

    Science.gov (United States)

    Toymentseva, Irina A.; Karpova, Natalya P.; Toymentseva, Angelina A.; Chichkina, Vera D.; Efanov, Andrey V.

    2016-01-01

    The urgency of the analyzed issue is due to lack of attention of heads of service companies to the theory and methodology of strategic management, methods and models of management decision-making in times of economic instability. The purpose of the article is to develop theoretical positions and methodical recommendations on the formation of the…

  15. Mixed-methods approaches in health research in Nepal

    OpenAIRE

    Simkhada, Padam; Van Teijlingen, Edwin; Wasti, Sharada Prasad; Sathian, B.

    2014-01-01

    Combining and integrating a mixture of qualitative and quantitative methods in one single study is widely used in health and social care research in high-income countries. This editorial adds a few words of advice to the novice mixed-methods researcher in Nepal.

  16. Effects of Asphalt Mix Design Properties on Pavement Performance: A Mechanistic Approach

    Directory of Open Access Journals (Sweden)

    Ahmad M. Abu Abdo

    2016-01-01

    Full Text Available The main objective of this study was to investigate the effects of hot mix asphalt material properties on the performance of flexible pavements via a mechanistic approach. The 3D Move Analysis software was utilized to determine rutting and cracking distresses in an asphalt concrete (AC) layer. Fourteen different Superpave mixes were evaluated utilizing results of the Dynamic Modulus (|E⁎|) Test and the Dynamic Shear Modulus (|G⁎|) Test. Results showed that as binder content increased, the tendency toward rutting in the AC layer increased, while cracking of the AC layer lessened. Furthermore, when different binder grades were evaluated, results showed that rutting decreased with an increase of the upper binder grade number and increased with an increase of the lower binder grade number. The analysis also showed that an increase of the lower binder grade number produced a higher percentage of bottom-up cracks. Consequently, binder grade should not be considered in isolation for cracking in the AC layer; binder content and aggregate structure play a big role. Finally, the results illustrate that the mechanistic approach is a better tool for determining the performance of asphalt pavement than commonly used methods.

  17. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Directory of Open Access Journals (Sweden)

    Xuefei Guan

    2011-01-01

    Full Text Available In this paper, two probabilistic prognosis updating schemes are compared. One is based on the classical Bayesian approach and the other on the newly developed maximum relative entropy (MRE) approach. The performance of the two models is evaluated using a set of recently developed prognostics-based metrics. Various uncertainties from measurements, modeling, and parameter estimation are integrated into the prognosis framework as random input variables for fatigue damage of materials. Measurements of response variables are then used to update the statistical distributions of the random variables, and the prognosis results are updated using posterior distributions. The Markov chain Monte Carlo (MCMC) technique is employed to provide posterior samples for model updating in the framework. Experimental data are used to demonstrate the operation of the proposed probabilistic prognosis methodology. A set of prognostics-based metrics is employed to quantitatively evaluate the prognosis performance and compare the proposed entropy method with the classical Bayesian updating algorithm. In particular, model accuracy, precision, robustness, and convergence are rigorously evaluated, in addition to a qualitative visual comparison. Potential developments and improvements of the prognostics-based metrics are then discussed in detail.
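The MCMC updating step can be sketched with a random-walk Metropolis sampler. The normal-mean posterior below is an illustrative stand-in for the fatigue-damage posterior, and the function names, data, and tuning values are assumptions:

```python
import math
import random

def metropolis(log_post, x0, steps, scale, seed=0):
    # Random-walk Metropolis sampler: draws (correlated) samples from
    # a posterior given only its log-density up to a constant.
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        y = x + rng.gauss(0.0, scale)
        lq = log_post(y)
        # Accept with probability min(1, exp(lq - lp)).
        if rng.random() < math.exp(min(0.0, lq - lp)):
            x, lp = y, lq
        samples.append(x)
    return samples

# Stand-in posterior: normal mean, unit-variance likelihood, flat
# prior; the posterior mean for these data is exactly 2.0.
data = [1.8, 2.2, 2.0, 1.9, 2.1]
log_post = lambda mu: -0.5 * sum((d - mu) ** 2 for d in data)
samples = metropolis(log_post, 0.0, 20000, 0.5)
posterior_mean = sum(samples[5000:]) / len(samples[5000:])
```

Discarding the first 5000 draws as burn-in, the remaining samples approximate the posterior used to refresh the prognosis distributions.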

  18. Simplified approaches to some nonoverlapping domain decomposition methods

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Jinchao

    1996-12-31

    An attempt will be made in this talk to present various domain decomposition methods in a way that is intuitively clear and technically coherent and concise. The basic framework used for analysis is the “parallel subspace correction” or “additive Schwarz” method. Other simple technical tools include the “local-global” and “global-local” techniques: the former constructs a subspace preconditioner based on a preconditioner on the whole space, whereas the latter constructs a preconditioner on the whole space based on a subspace preconditioner. The domain decomposition methods discussed in this talk fall into two major categories: one, based on local Dirichlet problems, is related to the “substructuring method”; the other, based on local Neumann problems, is related to the “Neumann-Neumann method” and the “balancing method”. All these methods will be presented in a systematic and coherent manner, and the analysis for both the two- and three-dimensional cases is carried out simultaneously. In particular, some intimate relationships between these algorithms are observed and some new variants of the algorithms are obtained.
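A one-level additive Schwarz ("parallel subspace correction") iteration of the kind surveyed here can be sketched for a small 1D Poisson system. The subdomain layout, iteration count, and damping factor θ = 0.5 (a standard safe choice for two overlapping subdomains) are illustrative assumptions:

```python
def solve_dense(A, b):
    # Small dense Gaussian elimination with partial pivoting
    # (used here for the exact local subdomain solves).
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j]
                              for j in range(i + 1, n))) / M[i][i]
    return x

def additive_schwarz(A, b, domains, theta=0.5, iters=800):
    # Damped one-level additive Schwarz: each sweep solves the residual
    # equation restricted to every overlapping subdomain and adds the
    # damped, zero-extended corrections simultaneously.
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        for idx in domains:
            Aloc = [[A[i][j] for j in idx] for i in idx]
            e = solve_dense(Aloc, [r[i] for i in idx])
            for k, i in enumerate(idx):
                x[i] += theta * e[k]
    return x

# 1D Poisson model problem: tridiagonal (-1, 2, -1) matrix,
# two overlapping subdomains {0..4} and {3..7}.
n = 8
A = [[2.0 if i == j else -1.0 if abs(i - j) == 1 else 0.0
      for j in range(n)] for i in range(n)]
b = [1.0] * n
x = additive_schwarz(A, b, [list(range(0, 5)), list(range(3, 8))])
```

Because the corrections are applied additively from a single residual, the subdomain solves are independent and could run in parallel, which is the point of the "parallel subspace correction" framing.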

  19. A proposal on evaluation method of neutron absorption performance to substitute conventional neutron attenuation test

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Je Hyun; Shim, Chang Ho [Dept. of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of); Kim, Sung Hyun [Nuclear Fuel Cycle Waste Treatment Research Division, Research Reactor Institute, Kyoto University, Osaka (Japan); Choe, Jung Hun; Cho, In Hak; Park, Hwan Seo [Ionizing Radiation Center, Nuclear Fuel Cycle Waste Treatment Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Park, Hyun Seo; Kim, Jung Ho; Kim, Yoon Ho [Ionizing Radiation Center, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2016-12-15

    For the verification of newly developed neutron absorbers, one of the guidelines on their qualification and acceptance is the neutron attenuation test. However, this approach has a drawback: it cannot distinguish how neutrons are attenuated by the material. In this study, a method for estimating the neutron absorption performance of materials is proposed that detects both directly penetrating and back-scattered neutrons. To verify the proposed method, MCNP simulations of the experimental system designed in this study were performed using polyethylene, iron, normal glass and a vitrified form. The results show that neutron absorption ability can easily be tested using a single-absorber model. Furthermore, simulations with single- and double-absorber models verify that the proposed method can evaluate not only the thermal neutrons passing directly through the materials but also the scattered neutrons reflected from them. The neutron absorption performance can therefore be estimated more accurately with the proposed method than with the conventional neutron attenuation test. The proposed method is expected to increase confidence in the reported performance of neutron absorbers.
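The limitation the authors point to can be seen from the exponential attenuation law itself. In the sketch below (cross-sections and thickness are invented numbers), a slab that both absorbs and scatters transmits exactly the same narrow-beam fraction as a purely absorbing slab with the same total cross-section, so a transmission-only test cannot separate the two mechanisms:

```python
import math

# Narrow-beam transmission through a slab: T = exp(-Sigma_total * t).
sigma_abs = 0.8     # assumed macroscopic absorption cross-section, 1/cm
sigma_scat = 0.4    # assumed macroscopic scattering cross-section, 1/cm
t = 2.0             # assumed slab thickness, cm

# Absorber that also scatters:
transmitted = math.exp(-(sigma_abs + sigma_scat) * t)
# Pure absorber with the same total cross-section of 1.2 /cm:
pure_absorber = math.exp(-1.2 * t)
# Both slabs attenuate identically, although only part of the first
# slab's attenuation is true absorption - the ambiguity the proposed
# penetration-plus-backscatter measurement is meant to resolve.
```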

  20. A proposal on evaluation method of neutron absorption performance to substitute conventional neutron attenuation test

    International Nuclear Information System (INIS)

    Kim, Je Hyun; Shim, Chang Ho; Kim, Sung Hyun; Choe, Jung Hun; Cho, In Hak; Park, Hwan Seo; Park, Hyun Seo; Kim, Jung Ho; Kim, Yoon Ho

    2016-01-01

    For the verification of newly developed neutron absorbers, one of the guidelines on their qualification and acceptance is the neutron attenuation test. However, this approach has a drawback: it cannot distinguish how neutrons are attenuated by the material. In this study, a method for estimating the neutron absorption performance of materials is proposed that detects both directly penetrating and back-scattered neutrons. To verify the proposed method, MCNP simulations of the experimental system designed in this study were performed using polyethylene, iron, normal glass and a vitrified form. The results show that neutron absorption ability can easily be tested using a single-absorber model. Furthermore, simulations with single- and double-absorber models verify that the proposed method can evaluate not only the thermal neutrons passing directly through the materials but also the scattered neutrons reflected from them. The neutron absorption performance can therefore be estimated more accurately with the proposed method than with the conventional neutron attenuation test. The proposed method is expected to increase confidence in the reported performance of neutron absorbers

  1. Applying a life cycle approach to project management methods

    OpenAIRE

    Biggins, David; Trollsund, F.; Høiby, A.L.

    2016-01-01

    Project management is increasingly important to organisations because projects are the method by which organisations respond to their environment. A key element within project management is the standards and methods that are used to control and conduct projects, collectively known as project management methods (PMMs) and exemplified by PRINCE2, the Project Management Institute's and the Association for Project Management's Bodies of Knowledge (PMBOK and APMBOK). The purpose of t...

  2. A taxonomy of behaviour change methods: an Intervention Mapping approach.

    Science.gov (United States)

    Kok, Gerjo; Gottlieb, Nell H; Peters, Gjalt-Jorn Y; Mullen, Patricia Dolan; Parcel, Guy S; Ruiter, Robert A C; Fernández, María E; Markham, Christine; Bartholomew, L Kay

    2016-09-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness of behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, the practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. Ideally, the taxonomy should be readily usable for this goal; but alternatively, it should be

  3. Review Of Mechanistic Understanding And Modeling And Uncertainty Analysis Methods For Predicting Cementitious Barrier Performance

    International Nuclear Information System (INIS)

    Langton, C.; Kosson, D.

    2009-01-01

    Cementitious barriers for nuclear applications are one of the primary controls for preventing or limiting radionuclide release into the environment. At the present time, performance and risk assessments do not fully incorporate the effectiveness of engineered barriers because the processes that influence performance are coupled and complicated. A better understanding of the behavior of cementitious barriers is necessary to evaluate and improve the design of materials and structures used for radioactive waste containment, life extension of current nuclear facilities, and design of future nuclear facilities, including those needed for nuclear fuel storage and processing, nuclear power production and waste management. The focus of the Cementitious Barriers Partnership (CBP) literature review is to document the current level of knowledge with respect to: (1) mechanisms and processes that directly influence the performance of cementitious materials, (2) methodologies for modeling the performance of these mechanisms and processes, and (3) approaches to addressing and quantifying uncertainties associated with performance predictions. This will serve as an important reference document for the professional community responsible for the design and performance assessment of cementitious materials in nuclear applications. This review also provides a multi-disciplinary foundation for the identification, research, development and demonstration of improvements in conceptual understanding, measurements and performance modeling that would lead to significant reductions in the uncertainties and improved confidence in estimating the long-term performance of cementitious materials in nuclear applications. This report identifies: (1) technology gaps that may be filled by the CBP project and (2) information and computational methods that are currently being applied in related fields but have not yet been incorporated into performance assessments of cementitious barriers. The various

  4. REVIEW OF MECHANISTIC UNDERSTANDING AND MODELING AND UNCERTAINTY ANALYSIS METHODS FOR PREDICTING CEMENTITIOUS BARRIER PERFORMANCE

    Energy Technology Data Exchange (ETDEWEB)

    Langton, C.; Kosson, D.

    2009-11-30

    Cementitious barriers for nuclear applications are one of the primary controls for preventing or limiting radionuclide release into the environment. At the present time, performance and risk assessments do not fully incorporate the effectiveness of engineered barriers because the processes that influence performance are coupled and complicated. A better understanding of the behavior of cementitious barriers is necessary to evaluate and improve the design of materials and structures used for radioactive waste containment, life extension of current nuclear facilities, and design of future nuclear facilities, including those needed for nuclear fuel storage and processing, nuclear power production and waste management. The focus of the Cementitious Barriers Partnership (CBP) literature review is to document the current level of knowledge with respect to: (1) mechanisms and processes that directly influence the performance of cementitious materials, (2) methodologies for modeling the performance of these mechanisms and processes, and (3) approaches to addressing and quantifying uncertainties associated with performance predictions. This will serve as an important reference document for the professional community responsible for the design and performance assessment of cementitious materials in nuclear applications. This review also provides a multi-disciplinary foundation for the identification, research, development and demonstration of improvements in conceptual understanding, measurements and performance modeling that would lead to significant reductions in the uncertainties and improved confidence in estimating the long-term performance of cementitious materials in nuclear applications. This report identifies: (1) technology gaps that may be filled by the CBP project and (2) information and computational methods that are currently being applied in related fields but have not yet been incorporated into performance assessments of cementitious barriers. The various

  5. Three-dimensional vision enhances task performance independently of the surgical method.

    Science.gov (United States)

    Wagner, O J; Hagen, M; Kurmann, A; Horgan, S; Candinas, D; Vorburger, S A

    2012-10-01

    Within the next few years, the medical industry will launch increasingly affordable three-dimensional (3D) vision systems for the operating room (OR). This study aimed to evaluate the effect of two-dimensional (2D) and 3D visualization on surgical skills and task performance. In this study, 34 individuals with varying laparoscopic experience (18 inexperienced individuals) performed three tasks to test spatial relationships, grasping and positioning, dexterity, precision, and hand-eye and hand-hand coordination. Each task was performed in 3D using binocular vision for open performance, the Viking 3Di Vision System for laparoscopic performance, and the DaVinci robotic system. The same tasks were repeated in 2D using an eye patch for monocular vision, conventional laparoscopy, and the DaVinci robotic system. Loss of 3D vision significantly increased the perceived difficulty of a task and the time required to perform it, independently of the approach. In 2D, task performance was better with the robot than with laparoscopy (P = 0.005). In every case, 3D robotic performance was superior to conventional laparoscopy (2D) (P < 0.001-0.015). The more complex the task, the more 3D vision accelerates task completion compared with 2D vision. The gain in task performance is independent of the surgical method.

  6. A taxonomy of behaviour change methods: an Intervention Mapping approach

    OpenAIRE

    Kok, Gerjo; Gottlieb, Nell H.; Peters, Gjalt-Jorn Y.; Mullen, Patricia Dolan; Parcel, Guy S.; Ruiter, Robert A.C.; Fernández, María E.; Markham, Christine; Bartholomew, L. Kay

    2015-01-01

    ABSTRACT In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters fo...

  7. Comparative performance of different stochastic methods to simulate drug exposure and variability in a population.

    Science.gov (United States)

    Tam, Vincent H; Kabbara, Samer

    2006-10-01

    Monte Carlo simulations (MCSs) are increasingly being used to predict the pharmacokinetic variability of antimicrobials in a population. However, various MCS approaches may differ in the accuracy of their predictions. We compared the performance of 3 different MCS approaches using a data set with known parameter values and dispersion. Ten concentration-time profiles were randomly generated and used to determine the best-fit parameter estimates. Three MCS methods were subsequently used to simulate the AUC(0-infinity) of the population, using the central tendency and dispersion of the following in the subject sample: 1) K and V; 2) clearance and V; 3) AUC(0-infinity). In each scenario, 10000 subject simulations were performed. Compared to the true AUC(0-infinity) of the population, the mean biases of the various methods were 1) 58.4, 2) 380.7, and 3) 12.5 mg h L(-1), respectively. Our results suggest that the most realistic MCS approach appeared to be the one based on the variability of AUC(0-infinity) in the subject sample.
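Two of the three simulation strategies can be sketched with invented subject-sample statistics (not the paper's data): method 1 samples K and V and derives AUC as dose/(K·V), while method 3 samples AUC(0-infinity) directly from its own observed variability.

```python
import math
import random

random.seed(1)
dose = 500.0                       # hypothetical IV dose, mg

# Hypothetical subject-sample statistics (mean, sd) - not the paper's data
K_stats = (0.2, 0.05)              # elimination rate constant, 1/h
V_stats = (30.0, 6.0)              # volume of distribution, L
auc_stats = (85.0, 20.0)           # AUC(0-inf), mg h/L

def lognormal(mean, sd):
    """Sample a lognormal with the given arithmetic mean and sd."""
    s2 = math.log(1.0 + (sd / mean) ** 2)
    return random.lognormvariate(math.log(mean) - s2 / 2.0, math.sqrt(s2))

n = 10000
# Method 1: simulate K and V, derive AUC = dose / (K * V)
auc_m1 = [dose / (lognormal(*K_stats) * lognormal(*V_stats))
          for _ in range(n)]
# Method 3: simulate AUC(0-inf) directly from its own variability
auc_m3 = [lognormal(*auc_stats) for _ in range(n)]

mean_m1 = sum(auc_m1) / n
mean_m3 = sum(auc_m3) / n
```

Even in this toy, the derived-parameter route drifts upward relative to the directly simulated AUC, because the mean of the reciprocal of a lognormal product exceeds the reciprocal of the means.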

  8. Performance Analysis of Cyber Security Awareness Delivery Methods

    Science.gov (United States)

    Abawajy, Jemal; Kim, Tai-Hoon

    In order to decrease information security threats caused by human-related vulnerabilities, an increased concentration on information security awareness and training is necessary. There are numerous information security awareness training delivery methods. The purpose of this study was to determine which delivery method is most successful in providing security awareness training. We conducted security awareness training using various delivery methods, such as text-based and game-based training and a short video presentation, with the aim of determining users' preferred delivery methods. Our study suggests that combined delivery methods are better than any individual security awareness delivery method.

  9. Measuring economy-wide energy efficiency performance: A parametric frontier approach

    International Nuclear Information System (INIS)

    Zhou, P.; Ang, B.W.; Zhou, D.Q.

    2012-01-01

    This paper proposes a parametric frontier approach to estimating economy-wide energy efficiency performance from a production efficiency point of view. It uses the Shephard energy distance function to define an energy efficiency index and adopts the stochastic frontier analysis technique to estimate the index. A case study of measuring the economy-wide energy efficiency performance of a sample of OECD countries using the proposed approach is presented. It is found that the proposed parametric frontier approach has higher discriminating power in energy efficiency performance measurement compared to its nonparametric frontier counterparts.
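As a deliberately simplified stand-in for the paper's stochastic frontier estimation, the sketch below fits a log-log energy-GDP line by ordinary least squares and shifts it down to the best performer (corrected OLS, a deterministic-frontier cousin of SFA); each country's efficiency index is then the ratio of frontier to actual energy use. All data are made up:

```python
import math

# Hypothetical (GDP index, energy-use index) pairs for five countries
data = [(1.0, 1.2), (2.0, 2.1), (3.0, 2.7), (4.0, 4.4), (5.0, 4.8)]

# OLS fit of ln(E) = a + b * ln(GDP)
xs = [math.log(g) for g, _ in data]
ys = [math.log(e) for _, e in data]
n = len(data)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
a = ybar - b * xbar

# COLS frontier: shift the fitted line down to the best performer
resid = [y - (a + b * x) for x, y in zip(xs, ys)]
shift = min(resid)
# Efficiency index in (0, 1]: frontier (minimum feasible) / actual energy
eff = [math.exp(shift - r) for r in resid]
```

Full SFA would instead decompose the residual into noise and an inefficiency term by maximum likelihood, which is what gives the parametric approach its discriminating power.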

  10. Hourly forecasting of global solar radiation based on multiscale decomposition methods: A hybrid approach

    International Nuclear Information System (INIS)

    Monjoly, Stéphanie; André, Maïna; Calif, Rudy; Soubdhan, Ted

    2017-01-01

    This paper introduces a new approach for forecasting solar radiation series 1 h ahead. We investigated several techniques for multiscale decomposition of clear-sky index K_c data, such as Empirical Mode Decomposition (EMD), Ensemble Empirical Mode Decomposition (EEMD) and Wavelet Decomposition. From these different methods, we built 11 decomposition components and 1 residual signal presenting different time scales. We applied classic forecasting models based on a linear method (autoregressive process, AR) and a nonlinear method (neural network model, NN). The choice of forecasting method is adapted to the characteristics of each component. Hence, we propose a modeling process built from a hybrid structure according to the defined flowchart. An analysis of predictive performance for solar forecasting from the different multiscale decompositions and forecast models is presented. With multiscale decomposition, the solar forecast accuracy is significantly improved, particularly using the wavelet decomposition method. Moreover, multistep forecasting with the proposed hybrid method resulted in additional improvement. For example, in terms of RMSE error, the error obtained with the classical NN model is about 25.86%; this decreases to 16.91% with the EMD-Hybrid Model, 14.06% with the EEMD-Hybrid Model and 7.86% with the WD-Hybrid Model. - Highlights: • Hourly forecasting of GHI in tropical climate with many cloud formation processes. • Clear-sky index decomposition using three multiscale decomposition methods. • Combination of multiscale decomposition methods with AR-NN models to predict GHI. • Comparison of the proposed hybrid model with the classical models (AR, NN). • Best results using the Wavelet-Hybrid model in comparison with classical models.
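The decompose-then-forecast structure can be sketched with stand-ins for the paper's building blocks: a moving average in place of EMD/wavelet decomposition and a least-squares AR(2) in place of the AR/NN pair. Everything below, including the synthetic series, is illustrative:

```python
import math

# Synthetic "clear-sky index"-style series with two time scales
# (illustrative only; the paper uses measured irradiance data)
N = 200
s = [1.0 + 0.3 * math.sin(2 * math.pi * t / 24)
         + 0.1 * math.sin(2 * math.pi * t / 6) for t in range(N)]

# Step 1: two-scale decomposition - a 13-point centred moving average
# splits the series into a smooth component and a fast residual
w, half = 13, 6
trend = [sum(s[i - half:i + half + 1]) / w for i in range(half, N - half)]
resid = [a - b for a, b in zip(s[half:N - half], trend)]

def ar2_forecast(x):
    """Least-squares AR(2) fit x[t] ~ c1*x[t-1] + c2*x[t-2]; one-step forecast."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for t in range(2, len(x)):
        x1, x2, y = x[t - 1], x[t - 2], x[t]
        s11 += x1 * x1; s12 += x1 * x2; s22 += x2 * x2
        b1 += x1 * y;   b2 += x2 * y
    det = s11 * s22 - s12 * s12
    c1 = (b1 * s22 - b2 * s12) / det
    c2 = (b2 * s11 - b1 * s12) / det
    return c1 * x[-1] + c2 * x[-2]

# Step 2: forecast each component separately, then recombine
forecast = ar2_forecast(trend) + ar2_forecast(resid)
actual = s[N - half]        # next value covered by the decomposed range
```

Fitting a simple model per time scale is what lets the hybrid beat a single model fitted to the raw mixture.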

  11. The equivalent energy method: an engineering approach to fracture

    International Nuclear Information System (INIS)

    Witt, F.J.

    1981-01-01

    The equivalent energy method for elastic-plastic fracture evaluations was developed around 1970 for determining realistic engineering estimates of the maximum load-displacement or stress-strain conditions for fracture of flawed structures. The basic principles were summarized, but the supporting experimental data, most of which were obtained after the method was proposed, have never been collated. This paper restates the original bases more explicitly and presents the validating data in graphical form. Extensive references are given. The volumetric energy ratio, a modelling parameter encompassing both size and temperature, is the fundamental parameter of the equivalent energy method. It is demonstrated that, in an engineering sense, the volumetric energy ratio is a unique material characteristic for a steel, much like a material property, except that size must be taken into account. With this as a proposition, the basic formula of the equivalent energy method is derived. Sufficient information is presented so that investigators and analysts may judge the viability and applicability of the method to their areas of interest. (author)

  12. THE EFFECTIVENESS OF PRUDENTIAL BANKING SUPERVISION: PECULIARITIES OF METHODICAL APPROACHES

    Directory of Open Access Journals (Sweden)

    S. Naumenkova

    2015-10-01

    Full Text Available In this article, the theoretical fundamentals of prudential banking supervision effectiveness are investigated, and approaches to calculating an integral indicator of supervisory-system compliance with the Basel Committee Core Principles are substantiated. The concepts of "functional effectiveness" and "institutional effectiveness" of supervisory activity are suggested. The authors determine the influence of the supervisory organizational structure on GDP growth by groups of countries in the world. A list of priority measures focused on increasing the effectiveness of prudential supervisory activity is systematized with a view to restoring the sustainability of the national banking sector.

  13. Peak detection method evaluation for ion mobility spectrometry by using machine learning approaches.

    Science.gov (United States)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna; Baumbach, Jörg Ingo; Rahmann, Sven; Baumbach, Jan

    2013-04-16

    Ion mobility spectrometry with pre-separation by multi-capillary columns (MCC/IMS) has become an established inexpensive, non-invasive bioanalytics technology for detecting volatile organic compounds (VOCs) with various metabolomics applications in medical research. To pave the way for this technology towards daily usage in medical practice, different steps still have to be taken. With respect to modern biomarker research, one of the most important tasks is the automatic classification of patient-specific data sets into different groups, healthy or not, for instance. Although sophisticated machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert ("manual") and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods and systematically study their classification performance based on the four peak detectors' results. Second, we investigate the classification variance and robustness regarding perturbation and overfitting. Our main finding is that the power of the classification accuracy is almost equally good for all methods, the manually created gold standard as well as the four automatic peak finding methods. In addition, we note that all tools, manual and automatic, are similarly robust against perturbations. However, the classification performance is more robust against overfitting when using the PME as peak calling preprocessor. In summary, we conclude that all methods, though small differences exist, are largely reliable and enable a wide spectrum of real-world biomedical applications.
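Of the four detectors, local maxima search is simple enough to sketch directly: accept points that exceed their neighbours and a height threshold, then suppress weaker peaks that crowd a stronger one. The thresholds and toy spectrum below are hypothetical:

```python
def find_peaks(signal, min_height, min_separation):
    """Local-maxima peak search with a height threshold and
    suppression of weaker peaks near a stronger one."""
    candidates = [i for i in range(1, len(signal) - 1)
                  if signal[i] >= min_height
                  and signal[i] > signal[i - 1]
                  and signal[i] >= signal[i + 1]]
    peaks = []
    for i in sorted(candidates, key=lambda i: -signal[i]):
        if all(abs(i - p) >= min_separation for p in peaks):
            peaks.append(i)
    return sorted(peaks)

# Toy 1D intensity trace (a real MCC/IMS map is two-dimensional)
spectrum = [0, 1, 3, 7, 3, 1, 0, 0, 2, 5, 9, 5, 2, 0, 1, 0]
peaks = find_peaks(spectrum, min_height=4, min_separation=3)
# peaks -> [3, 10]
```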

  14. Introduction to quantitative research methods an investigative approach

    CERN Document Server

    Balnaves, Mark

    2001-01-01

    Introduction to Quantitative Research Methods is a student-friendly introduction to quantitative research methods and basic statistics. It uses a detective theme throughout the text and in multimedia courseware to show how quantitative methods have been used to solve real-life problems. The book focuses on principles and techniques that are appropriate to introductory level courses in media, psychology and sociology. Examples and illustrations are drawn from historical and contemporary research in the social sciences. The multimedia courseware provides tutorial work on sampling, basic statistics, and techniques for seeking information from databases and other sources. The statistics modules can be used either as part of the detective games or directly in teaching and learning. Brief video lessons in SPSS, using real datasets, are also a feature of the CD-ROM.

  15. An approximate methods approach to probabilistic structural analysis

    Science.gov (United States)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
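A brute-force Monte Carlo baseline shows what fast probability integration is designed to avoid. For a linear limit state g = R − S with normal strength and load (numbers invented, not from the SSME study), the failure probability is also available in closed form via the reliability index, so the two estimates can be compared:

```python
import math
import random

random.seed(2)

# Hypothetical limit state g = R - S with normal strength R and load S
R_mu, R_sd = 10.0, 1.0
S_mu, S_sd = 7.0, 1.5

# Closed form for this linear-normal case via the reliability index
beta_idx = (R_mu - S_mu) / math.hypot(R_sd, S_sd)
pf_exact = 0.5 * (1.0 - math.erf(beta_idx / math.sqrt(2.0)))

# Brute-force Monte Carlo estimate of the same failure probability
n = 200000
fails = sum(random.gauss(R_mu, R_sd) - random.gauss(S_mu, S_sd) < 0.0
            for _ in range(n))
pf_mc = fails / n
```

FPI-style methods approximate the same integral from a handful of limit-state evaluations, which is what makes probabilistic analysis of expensive structural models affordable.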

  16. Improvement Methods in NPP's Radiation Emergency Plan: An Administrative Approach

    International Nuclear Information System (INIS)

    Lee, Yoon Wook; Yang, He Sun

    2009-01-01

    The Radiation Emergency Plan (REP) can be divided into a technical and an administrative response. In this treatise, the domestic NPPs' REPs are reviewed from the viewpoint of the administrative response, and improvement methods are suggested. The fields reviewed are the composition of the emergency response organizations, the activation criteria of those organizations, the selection of staffing and the reasonableness of the REPs' volume. In addition, the limitations of the current radiation exercises are reviewed and an improvement method for the exercises is presented. It is expected that the suggested recommendations will be helpful in establishing useful REPs and conducting practical radiation exercises in Korea

  17. A novel time series link prediction method: Learning automata approach

    Science.gov (United States)

    Moradabadi, Behnaz; Meybodi, Mohammad Reza

    2017-09-01

    Link prediction is a main social network challenge that uses the network structure to predict future links. Common link prediction approaches predict hidden links using a static graph representation, where a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a common traditional approach that calculates a similarity metric for each non-connected link, sorts the links based on their similarity metrics and labels the links with higher similarity scores as the future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of social networks may not be appropriate. In the time-series link prediction problem, the time series of link occurrences is used to predict future links. In this paper, we propose a new time-series link prediction method based on learning automata. In the proposed algorithm, for each link that must be predicted there is one learning automaton, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series of link occurrences are considered.
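The core update is easy to sketch for one link with a two-action linear reward-inaction (L_RI) automaton. The toy occurrence history and learning rate below are hypothetical, and this omits the paper's full chain-of-stages construction:

```python
import random

random.seed(3)

# Toy occurrence history of one link over past snapshots (hypothetical)
history = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]   # link present 80% of the time

p = 0.5     # automaton's probability for action "link exists"
a = 0.1     # learning rate of the linear reward-inaction (L_RI) scheme
for present in history * 20:               # repeated passes over the stages
    action = 1 if random.random() < p else 0
    if action == present:                  # reward: reinforce chosen action
        p = p + a * (1 - p) if action == 1 else p - a * p
    # on penalty, L_RI leaves the action probabilities unchanged

prediction = 1 if p > 0.5 else 0           # predicted state of the link
```

Because only rewarded actions move the probability vector, the automaton drifts toward whichever action the environment rewards more often, which is what makes the scheme suitable for noisy, non-stationary link histories.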

  18. Simulating elastic light scattering using high performance computing methods

    NARCIS (Netherlands)

    Hoekstra, A.G.; Sloot, P.M.A.; Verbraeck, A.; Kerckhoffs, E.J.H.

    1993-01-01

    The Coupled Dipole method, as originally formulated by Purcell and Pennypacker, is a very powerful method to simulate the Elastic Light Scattering from arbitrary particles. This method, which is a particle simulation model for Computational Electromagnetics, has one major drawback: if the size of the

  19. A new approach to enhance the performance of decision tree for classifying gene expression data.

    Science.gov (United States)

    Hassan, Md; Kotagiri, Ramamohanarao

    2013-12-20

    Gene expression data classification is a challenging task due to the large dimensionality and the very small number of samples. The decision tree is one of the popular machine learning approaches for such classification problems. However, existing decision tree algorithms use a single gene feature at each node to split the data into child nodes and hence might suffer from poor performance, especially when classifying gene expression datasets. By using a new decision tree algorithm in which each node of the tree consists of more than one gene, we enhance the classification performance of traditional decision tree classifiers. Our method selects suitable genes that are combined using a linear function to form a derived composite feature. To determine the structure of the tree we use the area under the Receiver Operating Characteristic curve (AUC). Experimental analysis demonstrates higher classification accuracy using the new decision tree compared to the other decision trees in the literature. We experimentally compare the effect of our scheme against other well-known decision tree techniques. Experiments show that our algorithm can substantially boost the classification performance of the decision tree.
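The node-splitting criterion can be sketched with a rank-based AUC on a derived composite feature. In the toy data below (hypothetical expression values, with fixed rather than learned weights), the linear combination of two genes separates the classes better than either gene alone:

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney statistic: fraction of (positive, negative)
    pairs ranked correctly, counting ties as half."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical expression values for two genes across 8 samples
gene_a = [0.1, 0.8, 0.3, 0.4, 0.9, 0.2, 0.7, 0.6]
gene_b = [0.2, 0.1, 0.6, 0.3, 0.4, 0.9, 0.8, 0.7]
labels = [0, 0, 0, 0, 1, 1, 1, 1]

# Derived composite feature: fixed linear combination of the two genes
composite = [a + b for a, b in zip(gene_a, gene_b)]

auc_single = auc(gene_a, labels)        # split quality of one gene alone
auc_composite = auc(composite, labels)  # split quality of the combination
```

The paper's algorithm would search for the genes and weights that maximise this AUC at each node; here the weights are simply fixed to show the effect.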

  20. Methodical approaches to solving special problems of testing. Seminar papers

    International Nuclear Information System (INIS)

    1996-01-01

    This Seminar volume introduces concepts and applications from different areas of application of ultrasonic testing and other non-destructive test methods in 18 lectures, in order to give an idea of new trends in development and stimuli for special solutions to problems. 3 articles were recorded separately for the ENERGY data bank. (orig./MM) [de]

  1. An Integrated Approach to Research Methods and Capstone

    Science.gov (United States)

    Postic, Robert; McCandless, Ray; Stewart, Beth

    2014-01-01

    In 1991, the AACU issued a report on improving undergraduate education suggesting, in part, that a curriculum should be both comprehensive and cohesive. Since 2008, we have systematically integrated our research methods course with our capstone course in an attempt to accomplish the twin goals of comprehensiveness and cohesion. By taking this…

  2. Perceptual-cognitive expertise in sport: some considerations when applying the expert performance approach.

    Science.gov (United States)

    Williams, A Mark; Ericsson, K Anders

    2005-06-01

    The number of researchers studying perceptual-cognitive expertise in sport is increasing. The intention in this paper is to review the currently accepted framework for studying expert performance and to consider implications for undertaking research work in the area of perceptual-cognitive expertise in sport. The expert performance approach presents a descriptive and inductive approach for the systematic study of expert performance. The nature of expert performance is initially captured in the laboratory using representative tasks that identify reliably superior performance. Process-tracing measures are employed to determine the mechanisms that mediate expert performance on the task. Finally, the specific types of activities that lead to the acquisition and development of these mediating mechanisms are identified. General principles and mechanisms may be discovered and then validated by more traditional experimental designs. The relevance of this approach to the study of perceptual-cognitive expertise in sport is discussed and suggestions for future work highlighted.

  3. Prediction of polymer flooding performance using an analytical method

    International Nuclear Information System (INIS)

    Tan Czek Hoong; Mariyamni Awang; Foo Kok Wai

    2001-01-01

    The study investigated the applicability to polymer flooding of an analytical method developed by El-Khatib. Results from the simulator UTCHEM and from experiments were compared with the El-Khatib prediction method. In general, by assuming constant-viscosity polymer injection, the method gave much higher recovery values than the simulation runs and the experiments. A modification of the method gave a better correlation, albeit only for oil production. Investigation is continuing on modifying the method so that a better overall fit can be obtained for polymer flooding. (Author)

  4. Analysis of ECT Synchronization Performance Based on Different Interpolation Methods

    Directory of Open Access Journals (Sweden)

    Yang Zhixin

    2014-01-01

    Full Text Available There are two synchronization methods for electronic transformers in the IEC60044-8 standard: impulsive synchronization and interpolation. When the impulsive synchronization method is inapplicable, data synchronization of the electronic transformer can be realized by using the interpolation method. Typical interpolation methods are piecewise linear interpolation, quadratic interpolation, cubic spline interpolation and so on. In this paper, the influences of piecewise linear interpolation, quadratic interpolation and cubic spline interpolation on the data synchronization of the electronic transformer are computed; then the computational complexity, synchronization precision, reliability and application range of the different interpolation methods are analyzed and compared, which can serve as a guide for practical applications.
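
    As a rough illustration of the comparison described above, the following sketch (not the paper's code) resamples one cycle of a 50 Hz waveform with the three interpolation methods and reports the worst-case reconstruction error; the sampling rates are arbitrary choices:

```python
import numpy as np
from scipy.interpolate import CubicSpline, interp1d

# Sample one cycle of a 50 Hz waveform at a coarse rate, then
# "resynchronize" it onto a finer time grid with three interpolators.
t_coarse = np.linspace(0.0, 0.02, 21)          # 21 samples per cycle
x_coarse = np.sin(2 * np.pi * 50 * t_coarse)

t_fine = np.linspace(0.0, 0.02, 201)           # target synchronized grid
x_true = np.sin(2 * np.pi * 50 * t_fine)

interpolators = {
    "piecewise linear": interp1d(t_coarse, x_coarse, kind="linear"),
    "quadratic": interp1d(t_coarse, x_coarse, kind="quadratic"),
    "cubic spline": CubicSpline(t_coarse, x_coarse),
}

# Worst-case reconstruction error of each method on the fine grid.
errors = {name: float(np.max(np.abs(f(t_fine) - x_true)))
          for name, f in interpolators.items()}
for name, e in errors.items():
    print(f"{name:16s} max error = {e:.2e}")
```

    For smooth waveforms of this kind, the cubic spline typically gives the smallest error at the highest computational cost, mirroring the trade-off between precision and complexity that the paper analyzes.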

  5. Determination of Settling Tanks Performance Using an Eulerian- Lagrangian Method

    OpenAIRE

    A Tamayol; B Firoozabadi; G Ahmadi

    2008-01-01

    Circulation regions always exist in settling tanks. These regions reduce the tank’s performance and decrease its effective volume. The recirculation zones would result in short-circuiting and high flow mixing problems. The inlet position would also affect the size and location of the recirculation region. Using a proper baffle configuration could substantially increase the performance of the settling tanks. A common procedure for the comparison of the performances of diffe...

  6. A performance model for the communication in fast multipole methods on high-performance computing platforms

    KAUST Repository

    Ibeid, Huda

    2016-03-04

    Exascale systems are predicted to have approximately 1 billion cores, assuming gigahertz cores. Limitations on affordable network topologies for distributed memory systems of such massive scale bring new challenges to the currently dominant parallel programming model. Currently, there are many efforts to evaluate the hardware and software bottlenecks of exascale designs. It is therefore of interest to model application performance and to understand what changes need to be made to ensure extrapolated scalability. The fast multipole method (FMM) was originally developed for accelerating N-body problems in astrophysics and molecular dynamics but has recently been extended to a wider range of problems. Its high arithmetic intensity combined with its linear complexity and asynchronous communication patterns make it a promising algorithm for exascale systems. In this paper, we discuss the challenges for FMM on current parallel computers and future exascale architectures, with a focus on internode communication. We focus on the communication part only; the efficiency of the computational kernels is beyond the scope of the present study. We develop a performance model that considers the communication patterns of the FMM and observe a good match between our model and the actual communication time on four high-performance computing (HPC) systems, when latency, bandwidth, network topology, and multicore penalties are all taken into account. To our knowledge, this is the first formal characterization of internode communication in FMM that validates the model against actual measurements of communication time. The ultimate communication model is predictive in an absolute sense; however, on complex systems, this objective is often out of reach or of a difficulty out of proportion to its benefit when there exists a simpler model that is inexpensive and sufficient to guide coding decisions leading to improved scaling. The current model provides such guidance.
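
    The style of model described can be sketched, in its simplest latency-bandwidth ("alpha-beta") form, as follows; the constants and message counts are illustrative assumptions, not the paper's fitted parameters:

```python
# Minimal latency-bandwidth cost sketch for internode communication:
# time = (number of messages) * latency + (total volume) / bandwidth.
# alpha (s/message) and beta (s/byte) below are placeholder values.

def comm_time(num_msgs, bytes_per_msg, alpha=1.0e-6, beta=1.0 / 10.0e9):
    """Estimated time: per-message latency plus volume over bandwidth."""
    return num_msgs * alpha + num_msgs * bytes_per_msg * beta

# E.g., a process exchanging 2,000 multipole expansions of 4 KiB each:
t = comm_time(2000, 4096)
print(f"estimated communication time: {t * 1e3:.2f} ms")
```

    A full model of the kind validated in the paper would add terms for network topology and multicore penalties; the point here is only the additive latency/bandwidth decomposition.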

  7. A Neuro-Fuzzy Approach in the Classification of Students’ Academic Performance

    Directory of Open Access Journals (Sweden)

    Quang Hung Do

    2013-01-01

    Full Text Available Classifying students' academic performance with high accuracy facilitates admission decisions and enhances educational services at educational institutions. The purpose of this paper is to present a neuro-fuzzy approach for classifying students into different groups. The neuro-fuzzy classifier used previous exam results and other related factors as input variables and labeled students based on their expected academic performance. The results showed that the proposed approach achieved high accuracy. The results were also compared with those obtained from other well-known classification approaches, including support vector machine, naive Bayes, neural network, and decision tree approaches. The comparative analysis indicated that the neuro-fuzzy approach performed better than the others. It is expected that this work may be used to support student admission procedures and to strengthen the services of educational institutions.

  8. An analysis of clinical transition stresses experienced by dental students: A qualitative methods approach.

    Science.gov (United States)

    Botelho, M; Gao, X; Bhuyan, S Y

    2018-04-17

    Stress in dental students is well established, with potential psychological distress, emotional exhaustion and burnout-related symptoms. Little attention has been given to the problems encountered by dental students during the transition from theoretical or paraclinical training to the clinical environment. The aim of this study was to adopt a qualitative research methods approach to understand the perceived stressors during students' clinical transition and provide insights for curriculum planners to enhance learning. This study analysed four groups of 2nd- and 3rd-year BDS students' experiences in focus group interviews relating to their pre-clinical and clinical transitions. The interviews were recorded and transcribed verbatim, and a thematic analysis was performed using an inductive qualitative approach. Key overlapping domains identified were the transition gap and stresses. The transition gap was subclassified into knowledge and skill (hard and soft), and stresses were subcategorised into internal and external stresses. On first coming to clinics, students experienced knowledge gaps of unfamiliar clinical treatments, with mismatches between knowledge acquisition and clinical exposure. Students felt incompetent owing to the stresses attributable to curriculum design, staff and the patient. This negatively affected their confidence and clinical performance. A range of challenges have been identified that will allow curriculum designers to plan a more supportive learning experience to help students during their transition to clinical practice, giving them timely knowledge, confidence and clinical performance to better prepare them for entering clinics. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. A blended learning approach to teaching sociolinguistic research methods

    Directory of Open Access Journals (Sweden)

    Olivier, Jako

    2014-12-01

    Full Text Available This article reports on the use of Wiktionary, an open source online dictionary, as well as generic wiki pages within a university's e-learning environment, as teaching and learning resources in an Afrikaans sociolinguistics module. In a communal constructivist manner, students not only learnt but also constructed learning content. From the qualitative research conducted with students it is clear that wikis provide for effective facilitation of a blended learning approach to sociolinguistic research. The use of this medium was positively received; however, some students did prefer handing in assignments in hard copy. The issues of computer literacy and access to the internet were also raised by the respondents. The use of wikis and Wiktionary prompted useful unplanned discussions around the reliability and quality of public wikis. The use of a public wiki such as Wiktionary served as encouragement for students, as they were able to contribute to the promotion of Afrikaans in this way.

  10. Economic Sustainability in International Business: Peculiarities, Methods and Approaches

    Directory of Open Access Journals (Sweden)

    Otenko Iryna Pavlivna

    2016-05-01

    Full Text Available This article is intended as a contribution to the ongoing analysis of economic sustainability in international business. The study is presented with a view toward furthering understanding of, and agreement on, the key concepts of sustainability. Approaches to sustainability are considered, and important benchmarks and essential characteristics of sustainable development in international business are included. The article demonstrates how the concept of economic sustainability can be applied at the business level. The main ideas of the most widespread concepts of resource management are presented. The incorporation of ESG and financial factors in the concept of sustainable investing is considered. Emissions responsible for climate change, namely top emitters, key issues and figures, are presented.

  11. The performance regulatory approach in quality assurance: Its application to safety in nuclear power plants

    International Nuclear Information System (INIS)

    Sajaroff, Pedro M.

    2000-01-01

    In early 1991, the IAEA assembled an Advisory Group on the Comprehensive Revision of the Code and the Safety Guides on Quality Assurance of the NUSS Programme. The Group was made up of specialists from a number of countries and from ISO, FORATOM, the EC and the IAEA itself, and its objective was completed in June 1995. This paper is aimed at describing the conceptual contents of the final draft of revision 2 of the 50-C-QA Code 'Quality Assurance for Safety in Nuclear Power Plants and other Nuclear Facilities' (hereinafter, the Code), which is essentially based on performance. Although the performance regulatory approach is not new in Argentina and in other countries, what is indeed novel is applying performance-based QA. In such a way the Code will contribute to preventing both QA misinterpretations (i.e., a formalistic regulatory requirement) and the execution of non-effective work without attaining the needed quality level (what may be seen as a pathological deviation of QA). The Code contains ten basic requirements to be adopted when QA programmes are established and implemented in nuclear power plants. The goal is improving safety through an improvement in the methods applied for attaining quality. In line with current developments in quality management techniques, priority is given to the effectiveness of the QA programme. All the involved individuals (that is, those at the managerial level, those performing the work and those assessing the work performed) must contribute to quality in a co-ordinated manner. The revised Safety Guides are being introduced, notably those that did not exist before. The interrelation between quality assurance, safety culture and quality culture is to be noted. In addition, QA for safety-related software is mentioned as an issue to be considered by the IAEA. (author)

  12. Error performance analysis in K-tier uplink cellular networks using a stochastic geometric approach

    KAUST Repository

    Afify, Laila H.; Elsawy, Hesham; Al-Naffouri, Tareq Y.; Alouini, Mohamed-Slim

    2015-01-01

    -in-Distribution approach that utilizes stochastic geometric tools to account for the network geometry in the performance characterization. Different from the other stochastic geometry models adopted in the literature, the developed analysis accounts for important

  13. A Comprehensive Approach in Assessing the Performance of an Automobile Closed-Loop Supply Chain

    Directory of Open Access Journals (Sweden)

    Ezutah Udoncy Olugu

    2010-03-01

    Full Text Available The ecological issues arising from manufacturing operations have led to a focus on environmental sustainability in manufacturing. This can be addressed adequately using a closed-loop supply chain (CLSC). To attain an effective and efficient CLSC, it is necessary to adopt a holistic performance measurement approach. In order to achieve this, there is a need to adopt a specific approach for a particular product rather than a generic one. Since sustainability has direct environmental footprints that involve organizational stakeholders, suppliers, customers and society at large, complexities surrounding supply chain performance measurement have multiplied. In this study, a suitable approach has been proposed for CLSC performance measurement in the automotive industry, based on reviewed literature. It is believed that this approach will result in increased effectiveness and efficiency in CLSC performance measurement.

  14. A High Performance Approach to Minimizing Interactions between Inbound and Outbound Signals in Helmet, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose a high performance approach to enhancing communications between astronauts. In the new generation of NASA audio systems for astronauts, inbound signals...

  15. A general approach toward enhancement of pseudocapacitive performance of conducting polymers by redox-active electrolytes

    KAUST Repository

    Chen, Wei; Xia, Chuan; Baby, Rakhi Raghavan; Alshareef, Husam N.

    2014-01-01

    A general approach is demonstrated where the pseudocapacitive performance of different conducting polymers is enhanced in redox-active electrolytes. The concept is demonstrated using several electroactive conducting polymers, including polyaniline

  16. Method of performing a layer operation in a communications network

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Narlikar, G.J.

    2006-01-01

    In an embodiment of the present invention, a scheduling operation is performed at a lower layer based on upper layer information. In another embodiment of the present invention, an action is performed at an upper layer based on received lower layer information. Also, a scheduling operation may be

  17. Deviation-based spam-filtering method via stochastic approach

    Science.gov (United States)

    Lee, Daekyung; Lee, Mi Jin; Kim, Beom Jun

    2018-03-01

    In the presence of a huge number of possible purchase choices, ranks or ratings of items by others often play very important roles for a buyer making a final purchase decision. Perfectly objective rating is an impossible task to achieve, and we often use an average rating built on how previous buyers estimated the quality of the product. The problem with using a simple average rating is that it can easily be polluted by careless users whose evaluation of products cannot be trusted, and by malicious spammers who try to bias the rating result on purpose. In this letter we suggest how the trustworthiness of individual users can be systematically and quantitatively reflected to build a more reliable rating system. We compute a suitably defined reliability of each user based on the user's rating pattern for all products she evaluated. We call our proposed method the deviation-based ranking, since the statistical significance of each user's rating pattern with respect to the average rating pattern is the key ingredient. We find that our deviation-based ranking method outperforms existing methods in filtering out careless random evaluators as well as malicious spammers.
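
    A minimal sketch of the idea (simplified; the paper's exact reliability statistic differs) down-weights users whose ratings deviate strongly from the per-item averages:

```python
import numpy as np

# Toy ratings matrix: rows = users, columns = items.
R = np.array([
    [5.0, 4.0, 5.0],
    [5.0, 5.0, 4.0],
    [1.0, 1.0, 1.0],   # a spammer pushing every rating down
])

item_mean = R.mean(axis=0)

# Reliability: inverse of a user's mean squared deviation from the
# item averages. Spammers and careless raters get small weights.
dev = ((R - item_mean) ** 2).mean(axis=1)
w = 1.0 / (1.0 + dev)

# Reliability-weighted item ratings vs. the plain average.
weighted = (w[:, None] * R).sum(axis=0) / w.sum()
print("plain averages   :", item_mean)
print("weighted averages:", weighted)
```

    With the spammer down-weighted, the weighted averages move back toward the ratings of the consistent users.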

  18. Breast cancer tumor classification using LASSO method selection approach

    International Nuclear Information System (INIS)

    Celaya P, J. M.; Ortiz M, J. A.; Martinez B, M. R.; Solis S, L. O.; Castaneda M, R.; Garza V, I.; Martinez F, M.; Ortiz R, J. M.

    2016-10-01

    Breast cancer is one of the leading causes of death worldwide among women. Early tumor detection is key in reducing breast cancer deaths, and screening mammography is the most widely available method for early detection. Mammography is the most common and effective breast cancer screening test. However, the rate of positive findings is very low, making the radiologic interpretation monotonous and biased toward errors. In an attempt to alleviate radiological workload, this work presents a computer-aided diagnosis (CADx) method aimed at automatically classifying tumor lesions into malign or benign as a means to a second opinion. The CADx method extracts image features and classifies the screening mammogram abnormality into one of two categories: subject at risk of having a malignant tumor (malign) and healthy subject (benign). In this study, 143 abnormal segmentations (57 malign and 86 benign) from the Breast Cancer Digital Repository (BCDR) public database were used to train and evaluate the CADx system. Percentile rank (p-rank) was used to standardize the data. Using the LASSO feature selection methodology, the model achieved a leave-one-out cross-validation area under the receiver operating characteristic curve (AUC) of 0.950. The proposed method has the potential to rank abnormal lesions with a high probability of malignant findings, aiding in the detection of potential malign cases as a second opinion to the radiologist. (Author)

  19. Breast cancer tumor classification using LASSO method selection approach

    Energy Technology Data Exchange (ETDEWEB)

    Celaya P, J. M.; Ortiz M, J. A.; Martinez B, M. R.; Solis S, L. O.; Castaneda M, R.; Garza V, I.; Martinez F, M.; Ortiz R, J. M., E-mail: morvymm@yahoo.com.mx [Universidad Autonoma de Zacatecas, Av. Ramon Lopez Velarde 801, Col. Centro, 98000 Zacatecas, Zac. (Mexico)

    2016-10-15

    Breast cancer is one of the leading causes of death worldwide among women. Early tumor detection is key in reducing breast cancer deaths, and screening mammography is the most widely available method for early detection. Mammography is the most common and effective breast cancer screening test. However, the rate of positive findings is very low, making the radiologic interpretation monotonous and biased toward errors. In an attempt to alleviate radiological workload, this work presents a computer-aided diagnosis (CADx) method aimed at automatically classifying tumor lesions into malign or benign as a means to a second opinion. The CADx method extracts image features and classifies the screening mammogram abnormality into one of two categories: subject at risk of having a malignant tumor (malign) and healthy subject (benign). In this study, 143 abnormal segmentations (57 malign and 86 benign) from the Breast Cancer Digital Repository (BCDR) public database were used to train and evaluate the CADx system. Percentile rank (p-rank) was used to standardize the data. Using the LASSO feature selection methodology, the model achieved a leave-one-out cross-validation area under the receiver operating characteristic curve (AUC) of 0.950. The proposed method has the potential to rank abnormal lesions with a high probability of malignant findings, aiding in the detection of potential malign cases as a second opinion to the radiologist. (Author)
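
    A hedged sketch of this kind of pipeline, using an L1-penalized logistic regression as a LASSO-style selector on synthetic stand-in data (the BCDR features and p-rank standardization are not reproduced here):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneOut

# Synthetic stand-in for the 143 lesions (roughly 86 benign / 57 malign).
X, y = make_classification(n_samples=143, n_features=20, n_informative=5,
                           weights=[0.6, 0.4], random_state=0)

# L1 ("lasso") penalized logistic regression performs embedded feature
# selection by driving uninformative coefficients to zero.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)

# Leave-one-out cross-validation: hold out each sample once.
scores = np.empty(len(y))
for train, test in LeaveOneOut().split(X):
    clf.fit(X[train], y[train])
    scores[test] = clf.predict_proba(X[test])[:, 1]

auc = roc_auc_score(y, scores)
print(f"LOOCV AUC = {auc:.3f}")
```

    The AUC obtained here depends entirely on the synthetic data; it is not comparable to the 0.950 reported by the authors on BCDR.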

  20. Performance evaluation of 2D and 3D deep learning approaches for automatic segmentation of multiple organs on CT images

    Science.gov (United States)

    Zhou, Xiangrong; Yamada, Kazuma; Kojima, Takuya; Takayama, Ryosuke; Wang, Song; Zhou, Xinxin; Hara, Takeshi; Fujita, Hiroshi

    2018-02-01

    The purpose of this study is to evaluate and compare the performance of modern deep learning techniques for automatically recognizing and segmenting multiple organ regions on 3D CT images. CT image segmentation is one of the important tasks in medical image analysis and is still very challenging. Deep learning approaches have demonstrated the capability of scene recognition and semantic segmentation on natural images and have been used to address segmentation problems of medical images. Although several works have shown promising results of CT image segmentation using deep learning approaches, there is no comprehensive evaluation of the segmentation performance of deep learning on multiple organs across different portions of CT scans. In this paper, we evaluated and compared the segmentation performance of two different deep learning approaches that used 2D and 3D deep convolutional neural networks (CNN), without and with a pre-processing step. A conventional approach that represents the state-of-the-art performance of CT image segmentation without deep learning was also used for comparison. A dataset that includes 240 CT images scanned on different portions of human bodies was used for performance evaluation. Up to 17 types of organ regions in each CT scan were segmented automatically and compared to the human annotations using the ratio of intersection over union (IU) as the criterion. The experimental results demonstrated that the IUs of the segmentation results had mean values of 79% and 67%, averaged over the 17 types of organs segmented by the 3D and 2D deep CNN, respectively. All the results of the deep learning approaches showed better accuracy and robustness than the conventional segmentation method that used probabilistic atlas and graph-cut methods. The effectiveness and usefulness of deep learning approaches were demonstrated for solving the multiple-organ segmentation problem on 3D CT images.
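
    The intersection-over-union criterion used above can be computed for binary segmentation masks as in this short sketch (illustrative, not the authors' evaluation code):

```python
import numpy as np

def iou(pred, truth):
    """Intersection-over-union of two boolean segmentation masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    union = np.logical_or(pred, truth).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return np.logical_and(pred, truth).sum() / union

# Tiny example: a 2x2 predicted region vs. a 2x3 ground-truth region.
pred = np.zeros((4, 4), dtype=bool);  pred[1:3, 1:3] = True   # 4 pixels
truth = np.zeros((4, 4), dtype=bool); truth[1:3, 1:4] = True  # 6 pixels
print(iou(pred, truth))  # intersection 4 / union 6
```

    Averaging this score over all organ types and scans yields the mean IU figures quoted in the abstract.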

  1. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action

  2. A Formal Methods Approach to the Analysis of Mode Confusion

    Science.gov (United States)

    Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.

    2004-01-01

    The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data with a focus on the most recent history. Pilot error is the most commonly cited cause for fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness, mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness including the Bangalore A320 crash 2/14/90, the Strasbourg A320 crash 1/20/92, the Mulhouse-Habsheim A320 crash 6/26/88, and the Toulouse A330 crash 6/30/94. These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal

  3. Method to fabricate high performance tubular solid oxide fuel cells

    Science.gov (United States)

    Chen, Fanglin; Yang, Chenghao; Jin, Chao

    2013-06-18

    In accordance with the present disclosure, a method for fabricating a solid oxide fuel cell is described. The method includes forming an asymmetric porous ceramic tube by using a phase inversion process. The method further includes forming an asymmetric porous ceramic layer on a surface of the asymmetric porous ceramic tube by using a phase inversion process. The tube is co-sintered to form a structure having a first porous layer, a second porous layer, and a dense layer positioned therebetween.

  4. An efficient Bayesian inference approach to inverse problems based on an adaptive sparse grid collocation method

    International Nuclear Information System (INIS)

    Ma Xiang; Zabaras, Nicholas

    2009-01-01

    A new approach to modeling inverse problems using a Bayesian inference method is introduced. The Bayesian approach considers the unknown parameters as random variables and seeks the probabilistic distribution of the unknowns. By introducing the concept of the stochastic prior state space to the Bayesian formulation, we reformulate the deterministic forward problem as a stochastic one. The adaptive hierarchical sparse grid collocation (ASGC) method is used for constructing an interpolant to the solution of the forward model in this prior space which is large enough to capture all the variability/uncertainty in the posterior distribution of the unknown parameters. This solution can be considered as a function of the random unknowns and serves as a stochastic surrogate model for the likelihood calculation. Hierarchical Bayesian formulation is used to derive the posterior probability density function (PPDF). The spatial model is represented as a convolution of a smooth kernel and a Markov random field. The state space of the PPDF is explored using Markov chain Monte Carlo algorithms to obtain statistics of the unknowns. The likelihood calculation is performed by directly sampling the approximate stochastic solution obtained through the ASGC method. The technique is assessed on two nonlinear inverse problems: source inversion and permeability estimation in flow through porous media

  5. An approach to a black carbon emission inventory for Mexico by two methods

    International Nuclear Information System (INIS)

    Cruz-Núñez, Xochitl

    2014-01-01

    A black carbon (BC) emission inventory for Mexico is presented. The estimate was performed using two approaches: the first based on fuel consumption and emission factors in a top-down scheme, and the second on PM2.5 emission data and its correlation with black carbon by source category, assuming that black carbon = elemental carbon. Results show that black carbon emissions are in the interval 53–473 Gg using the fuel consumption approach and between 62 and 89 Gg using the sector method. Black carbon key sources come from biomass burning in the rural sector, with a 47 percent share of the national total. Mobile source emissions account for 16% of the total. An opportunity to reduce carbon dioxide equivalent (CO2-eq) emissions in the short term by reducing black carbon emissions lies mainly in reducing emissions from biomass burning in the rural housing sector and diesel emissions in the transport sector, with important co-benefits in direct radiative forcing, public health and air quality. - Highlights: • Black carbon emissions are estimated between 53 and 473 Gg/year by a fuel consumption method. • Black carbon emissions are estimated between 62 and 89 Gg/year by a sector method

  6. An approach to a black carbon emission inventory for Mexico by two methods

    Energy Technology Data Exchange (ETDEWEB)

    Cruz-Núñez, Xochitl, E-mail: xcruz@unam.mx

    2014-05-01

    A black carbon (BC) emission inventory for Mexico is presented. The estimate was performed using two approaches: the first based on fuel consumption and emission factors in a top-down scheme, and the second on PM2.5 emission data and its correlation with black carbon by source category, assuming that black carbon = elemental carbon. Results show that black carbon emissions are in the interval 53–473 Gg using the fuel consumption approach and between 62 and 89 Gg using the sector method. Black carbon key sources come from biomass burning in the rural sector, with a 47 percent share of the national total. Mobile source emissions account for 16% of the total. An opportunity to reduce carbon dioxide equivalent (CO2-eq) emissions in the short term by reducing black carbon emissions lies mainly in reducing emissions from biomass burning in the rural housing sector and diesel emissions in the transport sector, with important co-benefits in direct radiative forcing, public health and air quality. - Highlights: • Black carbon emissions are estimated between 53 and 473 Gg/year by a fuel consumption method. • Black carbon emissions are estimated between 62 and 89 Gg/year by a sector method.
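
    The two estimation routes can be sketched as follows; all activity data, emission factors and BC fractions below are hypothetical placeholders, not values from the Mexican inventory:

```python
# Route 1: top-down, fuel consumption x emission factor.
fuel_use_PJ = {"residential fuelwood": 250.0, "road diesel": 600.0}
ef_g_per_GJ = {"residential fuelwood": 110.0, "road diesel": 40.0}

bc_fuel_Gg = sum(
    fuel_use_PJ[s] * 1e6 * ef_g_per_GJ[s] * 1e-9  # PJ -> GJ, g -> Gg
    for s in fuel_use_PJ
)

# Route 2: sector method, PM2.5 emissions x BC fraction of PM2.5.
pm25_Gg = {"residential fuelwood": 60.0, "road diesel": 25.0}
bc_fraction = {"residential fuelwood": 0.55, "road diesel": 0.70}

bc_sector_Gg = sum(pm25_Gg[s] * bc_fraction[s] for s in pm25_Gg)

print(f"fuel-based estimate  : {bc_fuel_Gg:.1f} Gg BC/yr")
print(f"sector-based estimate: {bc_sector_Gg:.1f} Gg BC/yr")
```

    Comparing the two totals, as the paper does, gives a rough consistency check on the inventory and flags the sectors driving any disagreement.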

  7. A dynamic approach to real-time performance measurement in design projects

    DEFF Research Database (Denmark)

    Skec, Stanko; Cash, Philip; Storga, Mario

    2017-01-01

    Recent developments in engineering design management point to the need for more dynamic, fine-grain measurement approaches able to deal with multi-dimensional, cross-level process performance in product design. Thus, this paper proposes a new approach to the measurement and management of individu...

  8. The Integral Method, a new approach to quantify bactericidal activity.

    Science.gov (United States)

    Gottardi, Waldemar; Pfleiderer, Jörg; Nagl, Markus

    2015-08-01

    The bactericidal activity (BA) of antimicrobial agents is generally derived from the results of killing assays. A reliable quantitative characterization and particularly a comparison of these substances, however, are impossible with this information. We here propose a new method that takes into account the course of the complete killing curve for assaying BA and that allows a clear-cut quantitative comparison of antimicrobial agents with only one number. The new Integral Method, based on the reciprocal area below the killing curve, reliably calculates an average BA [log10 CFU/min] and, by implementation of the agent's concentration C, the average specific bactericidal activity SBA=BA/C [log10 CFU/min/mM]. Based on experimental killing data, the pertaining BA and SBA values of exemplary active halogen compounds were established, allowing quantitative assertions. N-chlorotaurine (NCT), chloramine T (CAT), monochloramine (NH2Cl), and iodine (I2) showed extremely diverging SBA values of 0.0020±0.0005, 1.11±0.15, 3.49±0.22, and 291±137log10 CFU/min/mM, respectively, against Staphylococcus aureus. This immediately demonstrates an approximately 550-fold stronger activity of CAT, 1730-fold of NH2Cl, and 150,000-fold of I2 compared to NCT. The inferred quantitative assertions and conclusions prove the new method suitable for characterizing bactericidal activity. Its application comprises the effect of defined agents on various bacteria, the consequence of temperature shifts, the influence of varying drug structure, dose-effect relationships, ranking of isosteric agents, comparison of competing commercial antimicrobial formulations, and the effect of additives. Copyright © 2015 Elsevier B.V. All rights reserved.
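
    A sketch of the reciprocal-area idea with made-up killing-curve data; the statistic below reduces to the slope for a straight-line killing curve, i.e. an average kill rate in log10 CFU/min, though the paper's exact formulation may differ in detail:

```python
import numpy as np

# Killing curve: time (min) vs. surviving count (log10 CFU/mL).
t = np.array([0.0, 5.0, 10.0, 20.0, 30.0])
logN = np.array([6.0, 4.0, 2.5, 1.0, 0.0])

# Area below the killing curve by the trapezoid rule (log10 CFU x min).
area = np.sum((logN[:-1] + logN[1:]) / 2.0 * np.diff(t))

# Reciprocal-area statistic: equals the slope when the curve is linear.
BA = logN[0] ** 2 / (2.0 * area)       # average activity, log10 CFU/min
C_mM = 1.0                             # assumed agent concentration, mM
SBA = BA / C_mM                        # specific activity per mM
print(f"BA  ~ {BA:.3f} log10 CFU/min")
print(f"SBA ~ {SBA:.3f} log10 CFU/min/mM")
```

    Because the whole curve enters through the area, early fast killing and late tailing both influence the single summary number, which is the point of the Integral Method.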

  9. Methods and models in mathematical biology deterministic and stochastic approaches

    CERN Document Server

    Müller, Johannes

    2015-01-01

    This book developed from classes in mathematical biology taught by the authors over several years at the Technische Universität München. The main themes are modeling principles, mathematical principles for the analysis of these models, and model-based analysis of data. The key topics of modern biomathematics are covered: ecology, epidemiology, biochemistry, regulatory networks, neuronal networks, and population genetics. A variety of mathematical methods are introduced, ranging from ordinary and partial differential equations to stochastic graph theory and  branching processes. A special emphasis is placed on the interplay between stochastic and deterministic models.

  10. New Approaches to Aluminum Integral Foam Production with Casting Methods

    Directory of Open Access Journals (Sweden)

    Ahmet Güner

    2015-08-01

Full Text Available Integral foam has been used in the production of polymer materials for a long time. Metal integral foam casting systems are obtained by transferring and adapting polymer injection technology. Metal integral foam produced by casting has a solid skin at the surface and a foam core. Producing near-net-shape parts reduces production expenses. Insurance companies nowadays want the automotive industry to use metallic foam parts because of their higher impact energy absorption properties. In this paper, manufacturing processes of aluminum integral foam with casting methods will be discussed.

  11. Time interval approach to the pulsed neutron logging method

    International Nuclear Information System (INIS)

    Zhao Jingwu; Su Weining

    1994-01-01

The time interval of neighbouring neutrons emitted from a steady state neutron source can be treated as that from a time-dependent neutron source. In the rock space, the neutron flux is given by the neutron diffusion equation and is composed of an infinite number of terms. Each term is composed of two die-away curves. The delay action is discussed and used to measure the time interval with only one detector in the experiment. Nuclear reactions with the time distribution due to different types of radiations observed in the neutron well-logging methods are presented with a view to deriving the rock nuclear parameters from the time interval technique
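The series structure described (an infinite number of terms, each built from two die-away curves) can be written generically as follows; the symbols are our own illustrative notation, since the abstract gives neither the coefficients nor the decay constants, which depend on the rock parameters and source geometry:

```latex
\phi(t) \;=\; \sum_{n=1}^{\infty} \left( A_n\, e^{-\lambda_n t} \;+\; B_n\, e^{-\mu_n t} \right),
\qquad \lambda_n,\ \mu_n > 0
```

Each term thus decays on two characteristic time scales, which is what makes the time-interval (delay) measurement with a single detector informative about the rock's nuclear parameters.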

  12. Review of Methods and Approaches for Deriving Numeric ...

    Science.gov (United States)

    EPA will propose numeric criteria for nitrogen/phosphorus pollution to protect estuaries, coastal areas and South Florida inland flowing waters that have been designated Class I, II and III , as well as downstream protective values (DPVs) to protect estuarine and marine waters. In accordance with the formal determination and pursuant to a subsequent consent decree, these numeric criteria are being developed to translate and implement Florida’s existing narrative nutrient criterion, to protect the designated use that Florida has previously set for these waters, at Rule 62-302.530(47)(b), F.A.C. which provides that “In no case shall nutrient concentrations of a body of water be altered so as to cause an imbalance in natural populations of aquatic flora or fauna.” Under the Clean Water Act and EPA’s implementing regulations, these numeric criteria must be based on sound scientific rationale and reflect the best available scientific knowledge. EPA has previously published a series of peer reviewed technical guidance documents to develop numeric criteria to address nitrogen/phosphorus pollution in different water body types. EPA recognizes that available and reliable data sources for use in numeric criteria development vary across estuarine and coastal waters in Florida and flowing waters in South Florida. In addition, scientifically defensible approaches for numeric criteria development have different requirements that must be taken into consider

  13. METHODICAL APPROACHES TO THE CREATION OF MOOC (EXPERIENCE OF LAC)

    Directory of Open Access Journals (Sweden)

    Larysa Nozdrina

    2016-03-01

    Full Text Available The article presents a number of problems that determine the current state of development of the domestic market of massive open online courses (MOOCs). In line with the aim of the research, national experience in this area is described, and a number of criteria and methodological approaches to the implementation of MOOCs in higher education are proposed and substantiated. The development of MOOCs at the Lviv Academy of Commerce (LAC) is reviewed as an example. The main factors examined in the research that determine the success of this educational innovation in the domestic market are software platforms, multimedia software for creating video lectures, course structure, and support of the learning process. The results of the study can have a positive impact on the functioning of the MOOC market in Ukrainian universities. The article focuses on finding ways to improve the process of developing and implementing MOOCs in higher education, using the example of LAC, where a Web-center on the MOODLE platform is used for e-learning. Further research should focus on the development of institutional mechanisms to ensure the effective design, implementation and operation of MOOCs in Ukrainian universities. Particular attention during learning in a massive open online course should be paid to improving the educational process and strengthening students' motivation. The experience of MOOC development at LAC can be useful in creating similar courses at other institutions of higher education, both on the MOODLE platform and on others

  14. Methodical Approach to Managing Resources at an Industrial Enterprise

    Directory of Open Access Journals (Sweden)

    Antonets Olga O.

    2013-11-01

    Full Text Available The goal of the article lies in the identification of optimal ways of managing material resources of an industrial enterprise on the basis of economic and mathematical modelling. In the process of analysis and systematisation of the works of foreign and domestic scientists, the article concludes that complex solutions for the formation of logistic resource-management systems that would be both simple and adaptive are insufficiently developed. The article provides results of a study of specific features of resource management at enterprises, among which are surplus (or deficit) of resources and the availability of non-liquid reserves. In order to eliminate these shortcomings, the article offers a situational order of management that takes into account the possible state of reserves. The article improves the model for selecting the volume of supply of material resources and identifies optimal solutions under interval uncertainty. The further direction of the study lies in integrating the proposed approach to resource management with the system of financial planning at an industrial enterprise.

  15. Organizational Learning and Strategy: Information Processing Approach of Organizational Learning to Perform Strategic Choice Analysis

    Directory of Open Access Journals (Sweden)

    Agustian Budi Prasetya

    2017-03-01

    Full Text Available The study of organizational learning requires a discussion of strategy in order to understand a company's organizational knowledge and how the company applies that knowledge in response to a changing environment. The method of analysis for this research was a thorough desk review of the existing literature. This research analyzed the viewpoints of different researchers in organizational learning and elaborates the information-processing approach to Organizational Learning (OL). Based on the desk research, the paper uses the information-processing approach to explain organizational learning and strategic choice, describing the importance of information and assumptions; the activities of knowledge acquisition, interpretation, and distribution of knowledge; and the typology of exploitation and exploration learning. It proposes that, while learning, the company should align its internal managerial processes with the external environment, based on the strategic choice space as a theoretical clustering map of the learning, the fit, the alignment, and the alliances of the organization. This research finds that the strategic space may help in balancing exploitation and exploration learning when analyzing varied firm characteristics, strategic orientations, and industrial environments.

  16. On e-business strategy planning and performance evaluation: An adaptive algorithmic managerial approach

    Directory of Open Access Journals (Sweden)

    Alexandra Lipitakis

    2017-07-01

    Full Text Available A new e-business strategy planning and performance evaluation scheme based on adaptive algorithmic modelling techniques is presented. The effect of the financial and non-financial performance of organizations on e-business strategy planning is investigated. The relationships between the four strategic planning parameters are examined, the directions of these relationships are given, and six additional basic components are also considered. A new conceptual model has been constructed for e-business strategic planning and performance evaluation, and an adaptive algorithmic modelling approach is presented. The new adaptive algorithmic modelling scheme, which includes eleven dynamic modules, can be optimized and used effectively in e-business strategic planning and strategic planning evaluation of various e-services in very large organizations and businesses. A synoptic statistical analysis and comparative numerical results for the cases of the UK and Greece are given. The proposed e-business models indicate how e-business strategic planning may affect financial and non-financial performance in businesses and organizations, by exploring whether models used for strategy planning can be applied to e-business planning and whether these models would be valid in different environments. A conceptual model has been constructed and qualitative research methods have been used for testing a predetermined number of hypotheses. The proposed models have been tested in the UK and Greece, and the conclusions, including numerical results and statistical analyses, indicated existing relationships between the considered dependent and independent variables. The proposed e-business models are expected to contribute to the e-business strategy planning of businesses and organizations, and managers should consider applying these models to their e-business strategy planning to improve their companies' performance. This research study brings together elements of e

  17. Standardless quantification approach of TXRF analysis using fundamental parameter method

    International Nuclear Information System (INIS)

    Szaloki, I.; Taniguchi, K.

    2000-01-01

    A new standardless evaluation procedure based on the fundamental parameter method (FPM) has been developed for TXRF analysis. The theoretical calculation describes the relationship between the characteristic intensities and the geometrical parameters of the excitation and detection system and the specimen parameters: size, thickness, angle of the excitation beam to the surface, and the optical properties of the specimen holder. Most TXRF methods apply empirical calibration, which requires the application of a special preparation technique. However, the characteristic lines of the specimen holder (Si Kα,β) carry information about the local excitation and geometrical conditions on the substrate surface. On the basis of the theoretical calculation of the substrate characteristic intensity, the excitation beam flux can be approximated. Taking into consideration the elements present in the specimen material, a system of non-linear equations can be formulated involving the unknown concentration values and the geometrical and detection parameters. In order to solve this mathematical problem, PASCAL software was written, which calculates the sample composition and the average sample thickness by a gradient algorithm. Therefore, this quantitative estimation of the specimen composition requires neither an external nor an internal standard sample. For verification of the theoretical calculation and the numerical procedure, several experiments were carried out using a mixed standard solution containing the elements K, Sc, V, Mn, Co and Cu in the 0.1 - 10 ppm concentration range. (author)
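The gradient-solver step can be illustrated with a toy version of the idea: fit unknown concentrations and a flux-like scale factor to measured intensities by gradient descent on the squared residuals. The linear model I_i = k·s_i·c_i, the sensitivity factors, and the solver details below are all our own simplifying assumptions, not the paper's actual FPM equations.

```python
def fit_concentrations(measured, sensitivities, steps=20000, lr=0.01):
    """Toy gradient-descent solver in the spirit of the FPM approach.

    measured      : observed characteristic intensities per element
    sensitivities : assumed-known fundamental-parameter sensitivity factors
    Model (illustrative only): I_i = k * s_i * c_i with sum(c) = 1, where k
    plays the role of the excitation-flux scale estimated from the
    substrate line. Minimizes the squared intensity residuals over (k, c).
    """
    n = len(measured)
    c = [1.0 / n] * n   # start from a uniform composition
    k = 1.0
    for _ in range(steps):
        resid = [k * s * ci - m
                 for s, ci, m in zip(sensitivities, c, measured)]
        # Gradients of sum(resid**2) with respect to k and each c_i.
        gk = sum(2 * r * s * ci for r, s, ci in zip(resid, sensitivities, c))
        gc = [2 * r * k * s for r, s in zip(resid, sensitivities)]
        k -= lr * gk
        c = [max(ci - lr * g, 1e-9) for ci, g in zip(c, gc)]
        total = sum(c)
        c = [ci / total for ci in c]   # project back: concentrations sum to 1
    return k, c

# Synthetic check: true composition (0.2, 0.3, 0.5), flux scale 4.0.
s = [1.0, 0.8, 1.2]
meas = [4.0 * si * ci for si, ci in zip(s, [0.2, 0.3, 0.5])]
k, c = fit_concentrations(meas, s)
print(round(k, 2), [round(ci, 2) for ci in c])
```

On this noise-free synthetic data the solver recovers the generating composition; with real spectra the residuals would of course not vanish.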

  18. Methods of counting ribs on chest CT: the modified sternomanubrial approach

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Kyung Sik; Kim, Sung Jin; Jeon, Min Hee; Lee, Seung Young; Bae, Il Hun [Chungbuk National University, Cheongju (Korea, Republic of)

    2007-08-15

    The purpose of this study was to evaluate the accuracy of each method of counting ribs on chest CT and to propose a new method: the anterior approach with using the sternocostal joints. CT scans of 38 rib lesions of 27 patients were analyzed (fracture: 25, metastasis: 11, benign bone disease: 2). Each lesion was independently counted by three radiologists with using three different methods for counting ribs: the sternoclavicular approach, the xiphisternal approach and the modified sternomanubrial approach. The rib lesions were divided into three parts of evaluation of each method according to the location of the lesion as follows: the upper part (between the first and fourth thoracic vertebra), the middle part (between the fifth and eighth) and the lower part (between the ninth and twelfth). The most accurate method was a modified sternomanubrial approach (99.1%). The accuracies of a xiphisternal approach and a sternoclavicular approach were 95.6% and 88.6%, respectively. A modified sternomanubrial approach showed the highest accuracies in all three parts (100%, 100% and 97.9%, respectively). We propose a new method for counting ribs, the modified sternomanubrial approach, which was more accurate than the known methods in any parts of the bony thorax, and it may be an easier and quicker method than the others in clinical practice.

  19. An enhanced performance through agent-based secure approach for mobile ad hoc networks

    Science.gov (United States)

    Bisen, Dhananjay; Sharma, Sanjeev

    2018-01-01

    This paper proposes an agent-based secure enhanced performance approach (AB-SEP) for mobile ad hoc network. In this approach, agent nodes are selected through optimal node reliability as a factor. This factor is calculated on the basis of node performance features such as degree difference, normalised distance value, energy level, mobility and optimal hello interval of node. After selection of agent nodes, a procedure of malicious behaviour detection is performed using fuzzy-based secure architecture (FBSA). To evaluate the performance of the proposed approach, comparative analysis is done with conventional schemes using performance parameters such as packet delivery ratio, throughput, total packet forwarding, network overhead, end-to-end delay and percentage of malicious detection.
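Agent selection from a node-reliability factor can be sketched as a weighted score over the listed features. The weights and the exact combination rule below are our own assumptions for illustration; the abstract names the features but not the formula.

```python
def node_reliability(degree_diff, norm_distance, energy, mobility, hello,
                     weights=(0.2, 0.2, 0.3, 0.2, 0.1)):
    """Illustrative weighted-sum reliability score for agent-node selection.

    All inputs are assumed pre-normalized to [0, 1]. Higher energy and a
    well-tuned hello interval raise the score; large degree difference,
    distance, and mobility lower it. Weights are hypothetical.
    """
    w1, w2, w3, w4, w5 = weights
    return (w1 * (1 - degree_diff) + w2 * (1 - norm_distance)
            + w3 * energy + w4 * (1 - mobility) + w5 * hello)

def select_agents(nodes, k=2):
    """Pick the k most reliable nodes as agent nodes."""
    return sorted(nodes, key=lambda n: node_reliability(*n[1:]),
                  reverse=True)[:k]

nodes = [  # (node_id, degree_diff, distance, energy, mobility, hello)
    ("A", 0.1, 0.2, 0.9, 0.1, 0.8),
    ("B", 0.5, 0.6, 0.4, 0.7, 0.5),
    ("C", 0.2, 0.3, 0.8, 0.2, 0.7),
]
print([n[0] for n in select_agents(nodes)])  # ['A', 'C']
```

The selected agents would then run the malicious-behaviour detection step (the FBSA stage), which is outside the scope of this sketch.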

  20. Performance Measures for Public Participation Methods : Final Report

    Science.gov (United States)

    2018-01-01

    Public engagement is an important part of transportation project development, but measuring its effectiveness is typically piecemeal. Performance measurement, described by the Urban Institute as the measurement on a regular basis of the results (o...

  1. Modeling the Performance of Fast Multipole Method on HPC platforms

    KAUST Repository

    Ibeid, Huda

    2012-01-01

    In this thesis, we discuss the challenges for FMM on current parallel computers and future exascale architectures. Furthermore, we develop a novel performance model for FMM. Our ultimate aim of this thesis

  2. Instrumental performance of an etude after three methods of practice.

    Science.gov (United States)

    Vanden Ark, S

    1997-12-01

    For 80 fifth-grade students three practice conditions (mental, mental with physical simulation, and physical with singing) produced significant mean differences in instrumental performance of an etude. No significant differences were found for traditional, physical practice.

  3. Energy management in production: A novel method to develop key performance indicators for improving energy efficiency

    International Nuclear Information System (INIS)

    May, Gökan; Barletta, Ilaria; Stahl, Bojan; Taisch, Marco

    2015-01-01

    Highlights: • We propose a 7-step methodology to develop firm-tailored energy-related KPIs (e-KPIs). • We provide a practical guide for companies to identify their most important e-KPIs. • e-KPIs support identification of energy efficiency improvement areas in production. • The method employs an action plan for achieving energy saving targets. • The paper strengthens the theoretical base for energy-based decision making in manufacturing. - Abstract: Measuring the energy efficiency performance of equipment, processes and factories is the first step to effective energy management in production. The energy-related information thus made available allows the assessment of the progress of manufacturing companies toward their energy efficiency goals. In that respect, the study addresses the challenge that current industrial approaches lack the means and appropriate performance indicators to compare energy-use profiles of machines and processes, and to compare their energy efficiency performance to that of competitors. Focusing on this challenge, the main objective of the paper is to present a method that supports manufacturing companies in the development of energy-based performance indicators. For this purpose, we provide a 7-step method to develop production-tailored and energy-related key performance indicators (e-KPIs). These indicators allow the interpretation of cause-effect relationships and therefore support companies in their operative decision-making process. Consequently, the proposed method supports the identification of weaknesses and areas for energy efficiency improvements related to the management of production and operations. The study therefore aims to strengthen the theoretical base necessary to support energy-based decision making in manufacturing industries

  4. EVALUATION OF SERVICE SUPPLY CHAIN PERFORMANCE CRITERIA WITH DANP METHOD

    OpenAIRE

    ÖZVERİ, Onur; GÜÇLÜ, Pembe; AYCİN, Ejder

    2018-01-01

    Although the service industry composes a large part of the world economy, academic studies and applications on supply chains are mainly about the production industry. Because of the different structure of services, the service supply chain, and likewise its performance criteria and metrics, differ from those of the product supply chain. The aim of this paper is to evaluate supply chain performance metrics for the restaurant sector. For this purpose, in the first and second parts of the paper the service supply chain conc...

  5. Technical methods for a risk-informed, performance-based fire protection program at nuclear power plants

    International Nuclear Information System (INIS)

    Dey, M.K.

    1998-01-01

    This paper presents a technical review and examination of technical methods that are available for developing a risk-informed, performance-based fire protection program at a nuclear plant. The technical methods include ''engineering tools'' for examining the fire dynamics of fire protection problems, reliability techniques for establishing an optimal fire protection surveillance program, fire computer codes for analyzing important fire protection safety parameters, and risk-informed approaches that can range from drawing qualitative insights from risk information to quantifying the risk impact of alternative fire protection approaches. Based on this technical review and examination, it is concluded that methods for modeling fires, and reliability and fire PRA analyses are currently available to support the initial implementation of simple risk-informed, performance-based approaches in fire protection programs. (author)

  6. Technical methods for a risk-informed, performance-based fire protection program at nuclear power plants

    International Nuclear Information System (INIS)

    Dey, M.K.

    2000-01-01

    This paper presents a technical review and examination of technical methods that are available for developing a risk-informed, performance-based fire protection program at a nuclear plant. The technical methods include 'engineering tools' for examining the fire dynamics of fire protection problems, reliability techniques for establishing an optimal fire protection surveillance program, fire computer codes for analyzing important fire protection safety parameters, and risk-informed approaches that can range from drawing qualitative insights from risk information to quantifying the risk impact of alternative fire protection approaches. Based on this technical review and examination, it is concluded that methods for modeling fires, and reliability and fire probabilistic risk analyses (PRA) are currently available to support the initial implementation of simple risk-informed, performance-based approaches in fire protection programs. (orig.) [de

  7. Generalized perturbation theory (GPT) methods. A heuristic approach

    International Nuclear Information System (INIS)

    Gandini, A.

    1987-01-01

    Wigner first proposed a perturbation theory as early as 1945 to study fundamental quantities such as the reactivity worths of different materials. The first formulation, CPT, for conventional perturbation theory, is based on universal quantum mechanics concepts. Since that early conception, significant contributions have been made to CPT, in particular by Soodak, who rendered a heuristic interpretation of the adjoint function (referred to as the GPT method, for generalized perturbation theory). The author illustrates the GPT methodology in a variety of linear and nonlinear domains encountered in nuclear reactor analysis. The author begins with the familiar linear neutron field and then generalizes the methodology to other linear and nonlinear fields, using heuristic arguments. The author believes that the inherent simplicity and elegance of the heuristic derivation, although intended here for reactor physics problems, might be usefully adopted in collateral fields, and includes examples to that end

  8. Comparing performances of Clements, Box-Cox, Johnson methods with Weibull distributions for assessing process capability

    Directory of Open Access Journals (Sweden)

    Ozlem Senvar

    2016-08-01

    Full Text Available Purpose: This study examines the Clements’ Approach (CA), Box-Cox transformation (BCT), and Johnson transformation (JT) methods for process capability assessments through Weibull-distributed data with different parameters, in order to figure out the effects of the tail behaviours on process capability, and compares their estimation performances in terms of accuracy and precision. Design/methodology/approach: The process performance index (PPI) Ppu is used for process capability analysis (PCA), because the comparisons are performed through generating Weibull data without subgroups. Box plots, descriptive statistics, the root-mean-square deviation (RMSD), which is used as a measure of error, and a radar chart are utilized all together for evaluating the performances of the methods. In addition, the bias of the estimated values is as important as the efficiency measured by the mean square error. In this regard, the Relative Bias (RB) and the Relative Root Mean Square Error (RRMSE) are also considered. Findings: The results reveal that the performance of a method depends on its capability to fit the tail behaviour of the Weibull distribution and on the targeted values of the PPIs. It is observed that the effect of tail behaviour is more significant when the process is more capable. Research limitations/implications: Some other methods, such as the Weighted Variance method, which also give good results, were also examined; however, we later realized that including them would be confusing for consistent interpretation of the comparisons between the methods. Practical implications: The Weibull distribution covers a wide class of non-normal processes due to its capability to yield a variety of distinct curves based on its parameters. Weibull distributions are known to have significantly different tail behaviours, which greatly affects the process capability.
In quality and reliability applications, they are widely used for the analyses of failure data in order to understand how
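A minimal sketch of the Ppu-after-transformation idea: Box-Cox transform skewed Weibull data toward normality (here via a coarse grid search over lambda on the profile log-likelihood), then compute Ppu = (USL - mean) / (3·sd) on the transformed scale. The study's actual estimation procedures are more elaborate; this only illustrates the BCT route.

```python
import math
import random
import statistics

def box_cox(x, lam):
    """Box-Cox transform of a positive sample for a given lambda."""
    if abs(lam) < 1e-12:
        return [math.log(v) for v in x]
    return [(v ** lam - 1) / lam for v in x]

def box_cox_loglik(x, lam):
    """Profile log-likelihood of lambda (normality of the transformed data)."""
    y = box_cox(x, lam)
    n = len(x)
    return (-n / 2 * math.log(statistics.pvariance(y))
            + (lam - 1) * sum(math.log(v) for v in x))

def ppu_box_cox(x, usl):
    """Ppu after a Box-Cox transform; lambda chosen by coarse grid search."""
    lam = max((l / 10 for l in range(-20, 21)),
              key=lambda l: box_cox_loglik(x, l))
    y = box_cox(x, lam)
    usl_t = box_cox([usl], lam)[0]   # transform the spec limit the same way
    return (usl_t - statistics.mean(y)) / (3 * statistics.stdev(y))

random.seed(1)
# Skewed Weibull(shape=1.5, scale=2.0) sample via inverse-CDF sampling.
data = [2.0 * (-math.log(1 - random.random())) ** (1 / 1.5) for _ in range(500)]
print(round(ppu_box_cox(data, usl=8.0), 2))
```

Running the same pipeline with different Weibull shape parameters (and hence different tail behaviours) is how the tail effect on the estimated Ppu can be explored.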

  9. On summary measure analysis of linear trend repeated measures data: performance comparison with two competing methods.

    Science.gov (United States)

    Vossoughi, Mehrdad; Ayatollahi, S M T; Towhidi, Mina; Ketabchi, Farzaneh

    2012-03-22

    The summary measure approach (SMA) is sometimes the only applicable tool for the analysis of repeated measurements in medical research, especially when the number of measurements is relatively large. This study aimed to describe techniques based on summary measures for the analysis of linear trend repeated measures data and then to compare the performances of SMA, the linear mixed model (LMM), and the unstructured multivariate approach (UMA). Practical guidelines based on the least squares regression slope and mean of response over time for each subject were provided to test time, group, and interaction effects. Through Monte Carlo simulation studies, the efficacy of SMA vs. LMM and traditional UMA, under different types of covariance structures, was illustrated. All the methods were also employed to analyze two real data examples. Based on the simulation and example results, it was found that the SMA completely dominated the traditional UMA and performed convincingly close to the best-fitting LMM in testing all the effects. However, the LMM was often not robust and led to non-sensible results when the covariance structure for errors was misspecified. The results emphasized discarding the UMA, which often yielded extremely conservative inferences for such data. It was shown that the summary measure is a simple, safe and powerful approach in which the loss of efficiency compared to the best-fitting LMM was generally negligible. The SMA is recommended as the first choice to reliably analyze linear trend data with a moderate to large number of measurements and/or small to moderate sample sizes.
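The summary-measure idea for linear-trend data can be sketched directly: reduce each subject's repeated measurements to a least-squares slope, then compare the group mean slopes with an ordinary two-sample t statistic. The times, data, and pooled-variance choice below are illustrative, not the study's.

```python
import math
import statistics

def ols_slope(times, values):
    """Least-squares slope of one subject's repeated measurements."""
    tbar, ybar = statistics.mean(times), statistics.mean(values)
    num = sum((t - tbar) * (y - ybar) for t, y in zip(times, values))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

def sma_group_test(group_a, group_b, times):
    """Summary-measure analysis of a linear-trend design: reduce each
    subject to a slope, then compare group mean slopes with a pooled
    two-sample t statistic."""
    sa = [ols_slope(times, y) for y in group_a]
    sb = [ols_slope(times, y) for y in group_b]
    na, nb = len(sa), len(sb)
    sp2 = (((na - 1) * statistics.variance(sa)
            + (nb - 1) * statistics.variance(sb)) / (na + nb - 2))
    t = ((statistics.mean(sa) - statistics.mean(sb))
         / math.sqrt(sp2 * (1 / na + 1 / nb)))
    return t, na + nb - 2   # t statistic and its degrees of freedom

times = [0, 1, 2, 3]
treated = [[0, 1.1, 2.0, 3.2], [0, 0.9, 2.1, 2.9], [0, 1.0, 1.9, 3.1]]
control = [[0, 0.1, 0.3, 0.2], [0, 0.2, 0.1, 0.4], [0, 0.0, 0.2, 0.3]]
t_stat, df = sma_group_test(treated, control, times)
print(df, t_stat > 0)  # 4 True
```

The group effect is tested on the slopes; the corresponding test on the per-subject means (not shown) would address the overall level rather than the trend.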

  10. A kernel plus method for quantifying wind turbine performance upgrades

    KAUST Repository

    Lee, Giwhyun; Ding, Yu; Xie, Le; Genton, Marc G.

    2014-01-01

    Power curves are commonly estimated using the binning method recommended by the International Electrotechnical Commission, which primarily incorporates wind speed information. When such power curves are used to quantify a turbine's upgrade

  11. Performance of spectral fitting methods for vegetation fluorescence quantification

    NARCIS (Netherlands)

    Meroni, M.; Busetto, D.; Colombo, R.; Guanter, L.; Moreno, J.; Verhoef, W.

    2010-01-01

    The Fraunhofer Line Discriminator (FLD) principle has long been considered as the reference method to quantify solar-induced chlorophyll fluorescence (F) from passive remote sensing measurements. Recently, alternative retrieval algorithms based on the spectral fitting of hyperspectral radiance

  12. Performance of density functional theory methods to describe ...

    Indian Academy of Sciences (India)

    Unknown

    Chemical compounds present different types of isomerism. When two isomers differ by ... of DFT methods to describe intramolecular hydrogen shifts. Three small ... qualitative descriptions of intramolecular hydrogen shifts when large basis ...

  13. Exploring "psychic transparency" during pregnancy: a mixed-methods approach.

    Science.gov (United States)

    Oriol, Cécile; Tordjman, Sylvie; Dayan, Jacques; Poulain, Patrice; Rosenblum, Ouriel; Falissard, Bruno; Dindoyal, Asha; Naudet, Florian

    2016-08-12

    Psychic transparency is described as a psychic crisis occurring during pregnancy. The objective was to test whether it is clinically detectable. Seven primiparous and seven nulliparous subjects were recorded during 5 min of spontaneous speech about their dreams. 25 raters from five groups (psychoanalysts, psychiatrists, general practitioners, pregnant women and medical students) listened to the audiotapes. They were asked to rate the probability of each woman being pregnant or not. Their ability to discriminate the primiparous women was tested. The probability of being identified correctly or not was calculated for each woman. A qualitative analysis of the speech samples was performed. No group of raters was able to correctly classify pregnant and non-pregnant women. However, the raters' choices were not completely random. The wish to be pregnant or to have a baby could be linked to a primiparous classification, whereas job priorities could be linked to a nulliparous classification. It was not possible to detect psychic transparency in this study. The wish for a child might be easier to identify. In addition, the raters' choices seemed to be connected to social representations of motherhood.

  14. MIMO Terminal Performance Evaluation with a Novel Wireless Cable Method

    DEFF Research Database (Denmark)

    Fan, Wei; Kyösti, Pekka; Hentilä, Lassi

    2017-01-01

    chamber, which might be impractical and expensive. In this paper, a novel wireless cable method is proposed and experimentally validated. By recording the average power (i.e. reference signal received power (RSRP) in the LTE) per DUT antenna port and selecting optimal complex weights at the channel emulator output ports, a wireless cable connection can be achieved. The proposed method can be executed in a small RF shielded anechoic box, and offers low system cost, high measurement reliability and repeatability.

  15. Are the new automated methods for bone age estimation advantageous over the manual approaches?

    Science.gov (United States)

    De Sanctis, Vincenzo; Soliman, Ashraf T; Di Maio, Salvatore; Bedair, Said

    2014-12-01

    Bone Age Assessment (BAA) is performed worldwide for the evaluation of endocrine, genetic and chronic diseases, to monitor response to medical therapy and to determine the growth potential of children and adolescents. It is also used for consultation in planning orthopedic procedures, for determination of chronological age for adopted children, for youth sports participation, and in forensic settings. The main clinical methods for skeletal bone age estimation are the Greulich and Pyle (GP) and the Tanner and Whitehouse (TW) methods. Seventy-six percent (76%) of radiologists or pediatricians usually use the GP method, 20% the TW method, and 4% other methods. The advantages of using the TW method, as opposed to the GP method, are that it overcomes the subjectivity problem and its results are more reproducible. However, it is complex and time consuming; for this reason its usage is only about 20% on a world-wide scale. Moreover, there is some evidence that bone age assignments by different physicians can differ significantly. Computerized and Quantitative Ultrasound Technologies (QUS) for assessing skeletal maturity have been developed with the aim of reducing many of the inconsistencies associated with radiographic investigations. Although the number of automated methods for BAA has increased, the majority of them are still in an early phase of development. QUS is comparable to the GP-based method, but there are not yet enough established data for the healthy population. The authors wish to draw attention to the accuracy, reliability and consistency of BAA and to initiate a debate on manual versus automated approaches, to enhance the assessment of skeletal maturation in children and adolescents.

  16. Comparison of two approaches for establishing performance criteria related to Maintenance Rule

    International Nuclear Information System (INIS)

    Jerng, Dong-Wook; Kim, Man Cheol

    2015-01-01

    Probabilistic safety assessment (PSA) serves as a tool for systematically analyzing the safety of nuclear power plants. This paper explains and compares two approaches for the establishment of performance criteria related to the Maintenance Rule: (1) the individual reliability-based approach, and (2) the PSA importance measure-based approach. Different characteristics of the two approaches were compared in a qualitative manner, while a quantitative comparison was performed through application of the two approaches to a nuclear power plant. It was observed that the individual reliability-based approach resulted in more conservative performance criteria than the PSA importance measure-based approach. It is thus expected that the PSA importance measure-based approach will allow for a more flexible maintenance policy under conditions of limited resources, while providing a macroscopic view of overall plant safety. Based on insights derived through this analysis, we emphasize the importance of a balance between reliability and safety significance, and propose a balance measure accordingly. The conclusions of this analysis are likely to be applicable to other types of nuclear power plants. (author)

  17. The strategic selecting criteria and performance by using the multiple criteria method

    Directory of Open Access Journals (Sweden)

    Lisa Y. Chen

    2008-02-01

    Full Text Available With the increasing competitive intensity in the current service market, organizational capabilities have been recognized as important for sustaining competitive advantage. The pursuit of profitable growth has fueled a need for firms to systematically assess and renew the organization. The purpose of this study is to analyze the financial performance of firms in order to create an effective evaluation structure for Taiwan's service industry. This study utilized the TOPSIS (technique for order preference by similarity to ideal solution) method to evaluate the operating performance of 12 companies. TOPSIS is a multiple criteria decision making method that identifies solutions from a finite set of alternatives based upon simultaneous minimization of the distance from an ideal point and maximization of the distance from a nadir point. Using this approach, this study measures the financial performance of firms through two aspects and ten indicators. The results indicated that e-life had outstanding performance among the 12 retailers. The findings of this study help managers better understand their market position, competition, and profitability for future strategic planning and operational management.
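The TOPSIS ranking described above (vector-normalize each criterion, weight it, then score each alternative by its simultaneous closeness to the ideal point and distance from the nadir point) can be sketched as follows; the firms, indicator values and weights here are hypothetical, not the study's data.

```python
import numpy as np

def topsis(X, weights, benefit):
    """Rank alternatives (rows) on criteria (columns) by closeness to the
    ideal point and distance from the nadir (anti-ideal) point."""
    X = np.asarray(X, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    V = X / np.linalg.norm(X, axis=0) * weights
    # Ideal point: best value per criterion; nadir point: worst value.
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    nadir = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)   # distance to the ideal point
    d_neg = np.linalg.norm(V - nadir, axis=1)   # distance to the nadir point
    return d_neg / (d_pos + d_neg)              # relative closeness in [0, 1]

# Three hypothetical firms scored on two financial indicators
# (both benefit criteria: higher is better).
scores = topsis([[7, 9], [8, 7], [3, 4]],
                weights=np.array([0.6, 0.4]),
                benefit=np.array([True, True]))
```

An alternative with the highest closeness score is ranked best; a dominated alternative (worst on every criterion) receives a score of zero.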

  18. Towards better environmental performance of wastewater sludge treatment using endpoint approach in LCA methodology

    Directory of Open Access Journals (Sweden)

    Isam Alyaseri

    2017-03-01

    Full Text Available The aim of this study is to use the life cycle assessment (LCA) method to measure the environmental performance of the sludge incineration process in a wastewater treatment plant and to propose an alternative that can reduce the environmental impact. To show the damages caused by the treatment processes, the study used an endpoint approach to evaluate the impacts of the processes on human health, ecosystem quality, and resources. A case study was conducted at the Bissell Point Wastewater Treatment Plant in Saint Louis, Missouri, U.S. Plant-specific data along with literature data from technical publications were used to build an inventory and then analyze the environmental burdens of the sludge handling unit in the year 2011. The impact assessment method chosen was ReCiPe 2008. The existing scenario (dewatering-multiple hearth incineration-ash to landfill) was evaluated, and three alternative scenarios (fluid bed incineration and anaerobic digestion with and without land application, with energy recovery from heat or biogas) were proposed and analyzed to find the one with the least environmental impact. The existing scenario shows that the most significant impacts relate to resource depletion and damage to human health. These impacts come mainly from the operation phase (electricity and fuel consumption) and emissions related to combustion. The alternatives showed better performance than the existing scenario. Using the ReCiPe endpoint methodology, and among the three alternatives tested, anaerobic digestion had the best overall environmental performance. It is recommended to convert to fluid bed incineration if the concerns are mainly about human health, or to anaerobic digestion if the concerns are mainly about resource depletion.
The endpoint approach may simplify the outcomes of this study as follows: if the plant is converted to fluid bed incineration, it could prevent an average of 43.2 DALYs in human life, save 0.059 species in the area

  19. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    Science.gov (United States)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

    digital soil mapping methods and sets of ancillary variables for producing the most accurate spatial prediction of texture classes in a given area of interest. Both legacy and recently collected data on PSD were used as reference information. The predictor variable data set consisted of a digital elevation model and its derivatives, lithology, land use maps, as well as various bands and indices of satellite images. Two conceptually different approaches can be applied in the mapping process. Textural classification can be carried out after the particle size data have been spatially extended by a proper geostatistical method. Alternatively, the textural classification is carried out first, followed by spatial extension through a suitable data mining method. According to the first approach, maps of sand, silt and clay percentage have been computed through regression kriging (RK). Since the three maps are compositional (their sum must be 100%), we applied the Additive Log-Ratio (alr) transformation, instead of kriging them independently. Finally, the texture class map has been compiled according to the USDA categories from the three maps. Different combinations of reference and training soil data and auxiliary covariables resulted in several different maps. Following the second approach, the PSD were first classified into the USDA categories, then the texture class maps were compiled directly by data mining methods (classification trees and random forests). The various results were compared to each other as well as to the RK maps. The performance of the different methods and data sets has been examined by testing the accuracy of the geostatistically computed and the directly classified results, to assess the most predictive and accurate method. Acknowledgement: Our work was supported by the Hungarian National Scientific Research Foundation (OTKA, Grant No. K105167).
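The additive log-ratio step mentioned above, transforming sand/silt/clay fractions before kriging so that the back-transformed predictions stay positive and sum to 100%, can be sketched as follows (a minimal illustration; the fractions are made up):

```python
import numpy as np

def alr(comp):
    """Additive log-ratio: map a composition (parts summing to 1) to
    unconstrained coordinates, using the last part as the denominator."""
    comp = np.asarray(comp, dtype=float)
    return np.log(comp[..., :-1] / comp[..., -1:])

def alr_inv(y):
    """Inverse alr: back-transform interpolated coordinates to a
    composition that is positive and sums exactly to 1."""
    expy = np.exp(y)
    total = 1.0 + expy.sum(axis=-1, keepdims=True)
    return np.concatenate([expy / total, 1.0 / total], axis=-1)

# Sand/silt/clay fractions at two sample points (each row sums to 1).
psd = np.array([[0.60, 0.25, 0.15],
                [0.30, 0.45, 0.25]])
coords = alr(psd)          # krige these columns instead of the raw fractions
restored = alr_inv(coords) # back-transform; rows again sum to 1
```

Kriging the two alr coordinates independently and back-transforming guarantees a valid composition at every prediction location, which kriging the three raw percentages separately does not.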

  20. Supplier Selection based on the Performance by using PROMETHEE Method

    Science.gov (United States)

    Sinaga, T. S.; Siregar, K.

    2017-03-01

    Generally, companies face problems in identifying vendors that can provide excellent service in raw material availability and on-time delivery. The performance of suppliers to a company has to be monitored to ensure their ability to fulfill the company's needs. This research is intended to explain how to assess suppliers in order to improve manufacturing performance. The criteria considered in evaluating suppliers are Dickson's criteria. There are four main criteria, further split into seven sub-criteria, namely compliance with accuracy, consistency, on-time delivery, right order quantity, flexibility and negotiation, timely order confirmation, and responsiveness. This research uses the PROMETHEE methodology to assess supplier performance and to select the best supplier, as shown by the degree of preference in the pairwise comparisons between suppliers.
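The pairwise-preference logic behind PROMETHEE II can be sketched as below. This is a minimal version using the simple "usual" preference function (any positive difference counts as full preference) rather than the paper's full seven sub-criteria; the suppliers, weights and scores are hypothetical.

```python
import numpy as np

def promethee_net_flows(X, weights, benefit):
    """PROMETHEE II net outranking flows with the 'usual' preference
    function: rank alternatives by how strongly each outranks the rest
    minus how strongly it is outranked."""
    X = np.asarray(X, dtype=float)
    n = len(X)
    # Orient cost criteria so that larger is always better.
    X = np.where(benefit, X, -X)
    # pi[a, b]: weighted degree to which alternative a is preferred to b.
    diff = X[:, None, :] - X[None, :, :]
    pi = ((diff > 0).astype(float) * weights).sum(axis=-1)
    phi_pos = pi.sum(axis=1) / (n - 1)   # positive (leaving) flow
    phi_neg = pi.sum(axis=0) / (n - 1)   # negative (entering) flow
    return phi_pos - phi_neg             # net flow: higher = better supplier

# Three hypothetical suppliers on three sub-criteria:
# on-time delivery %, quantity accuracy %, lead time in days (a cost).
flows = promethee_net_flows(
    [[95, 98, 5], [90, 99, 7], [80, 90, 10]],
    weights=np.array([0.5, 0.3, 0.2]),
    benefit=np.array([True, True, False]))
```

Net flows always sum to zero across alternatives, so the sign of a supplier's flow shows whether it outranks more than it is outranked.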

  1. Assessing the stability of free-energy perturbation calculations by performing variations in the method

    Science.gov (United States)

    Manzoni, Francesco; Ryde, Ulf

    2018-03-01

    We have calculated relative binding affinities for eight tetrafluorophenyl-triazole-thiogalactoside inhibitors of galectin-3 with the alchemical free-energy perturbation approach. We obtain a mean absolute deviation from experimental estimates of only 2-3 kJ/mol and a correlation coefficient (R²) of 0.5-0.8 for seven relative affinities spanning a range of up to 11 kJ/mol. We also studied the effect of using different methods to calculate the charges of the inhibitor and different sizes of the perturbed group (the atoms that are described by soft-core potentials and are allowed to have differing coordinates). However, the various approaches gave rather similar results and it is not possible to point out one approach as consistently and significantly better than the others. Instead, we suggest that such small and reasonable variations in the computational method can be used to check how stable the calculated results are and to obtain a more accurate estimate of the uncertainty than if performing only one calculation with a single computational setup.
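The two summary statistics quoted above, mean absolute deviation and squared correlation against experiment, are straightforward to compute; a sketch with made-up relative binding affinities, not the paper's values:

```python
import numpy as np

def mad_and_r2(calc, expt):
    """Mean absolute deviation and squared Pearson correlation between
    calculated and experimental relative binding affinities (kJ/mol)."""
    calc, expt = np.asarray(calc, dtype=float), np.asarray(expt, dtype=float)
    mad = np.abs(calc - expt).mean()
    r2 = np.corrcoef(calc, expt)[0, 1] ** 2
    return mad, r2

# Illustrative (not the paper's) relative affinities for seven ligand pairs.
calc = [-2.0, 1.5, 4.0, -6.0, 0.5, 3.0, -4.5]
expt = [-1.0, 2.5, 5.5, -5.0, 2.0, 5.0, -3.0]
mad, r2 = mad_and_r2(calc, expt)
```

Repeating this over several small method variations, as the paper suggests, gives a spread of MAD and R² values that serves as an uncertainty estimate.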

  2. SYSTEMATIC PRINCIPLES AND METHODS OF SYMBOLIC APPROACHES IN URBAN DESIGN

    Directory of Open Access Journals (Sweden)

    BULAKH I. V

    2015-12-01

    Full Text Available Formulation of the problem. The low level of expressiveness and personalization in the mass architecture of the second half of the twentieth century is connected with the spread of industrial technology and, to an even greater extent, with a mechanistic, traditionally functional treatment of the average person as an abstract consumer of architecture. The way out of this critical situation is to focus on aesthetic matters, artistic understanding, and the creation of a harmonious image of the environment. The problem of raising the architectural and artistic level of architectural and urban planning solutions, overcoming the monotony of planning and development, and creating an aesthetically expressive urban environment has not lost relevance over the past decades. Understanding and accepting the enigmatic and dynamic development of cities encourages architects to find new design techniques that can provide a reasonable basis for forming the artistic and aesthetic image of the modern city. Purpose. To define and systematize the principles of symbolization of architectural and planning images, and to propose methods of symbolism in the architectural planning of the image of the urban environment. Conclusion. Based on analysis of the enhanced concept of symbolizing the architectural and planning image, the place, role and trends of symbolization are established at all levels of the urban environment: planning, three-dimensional composition, and improvement of urban areas. The main stages and levels of symbolization (analogization, schematization and allegorization) are identified for the first time, together with their features and characteristics, and the basic principles of symbolization of the architectural and planning image are formulated, namely the principles of communication between figurative analogies, transformation of subsequent schemes, switching of allegorical groupings, and metamorphism with the ultimate goal of the birth of a symbol.

  3. The Methodical Approaches to Activation of Innovative Potential of Enterprise

    Directory of Open Access Journals (Sweden)

    Berveno Oksana V.

    2017-12-01

    Full Text Available The article is aimed at developing theoretical provisions and practical recommendations on methods for managing and activating the innovative potential of enterprises. Assessment of the innovative potential of an enterprise should be carried out from different positions, taking into consideration all external and internal possibilities of the enterprise for carrying out innovation activity. The system for managing innovation activity and implementing innovative potential should be closely woven into the general management of the enterprise. Activation of the innovative potential of an enterprise involves the adoption of a whole system of strategic decisions aimed at creating the most favorable conditions for implementing the innovative potential and obtaining the planned results. The system for activating innovative potential should develop a number of organizational decisions on the interaction of the elements of the innovative potential itself in the process of innovation activity, and on the cooperation of the innovative potential with other subsystems of the enterprise. The results of these organizational decisions largely determine the efficiency of the enterprise's innovation activity.

  4. Measuring energy performance with sectoral heterogeneity: A non-parametric frontier approach

    International Nuclear Information System (INIS)

    Wang, H.; Ang, B.W.; Wang, Q.W.; Zhou, P.

    2017-01-01

    Evaluating economy-wide energy performance is an integral part of assessing the effectiveness of a country's energy efficiency policy. The non-parametric frontier approach has been widely used by researchers for this purpose. This paper proposes an extended non-parametric frontier approach to studying economy-wide energy efficiency and productivity performance by accounting for sectoral heterogeneity. Relevant techniques from index number theory are incorporated to quantify the driving forces behind changes in the economy-wide energy productivity index. The proposed approach facilitates flexible modelling of different sectors' production processes, and helps to examine each sector's impact on the aggregate energy performance. A case study of China's economy-wide energy efficiency and productivity performance in its 11th five-year plan period (2006–2010) is presented. It is found that sectoral heterogeneity in terms of energy performance is significant in China. Meanwhile, China's economy-wide energy productivity increased slightly during the study period, driven mainly by technical efficiency improvement. A number of other findings are also reported. - Highlights: • We model economy-wide energy performance by considering sectoral heterogeneity. • The proposed approach can identify sectors' impact on the aggregate energy performance. • Obvious sectoral heterogeneities are identified in evaluating China's energy performance.

  5. EEG cross-frequency coupling associated with attentional performance: An RDoC approach to attention

    NARCIS (Netherlands)

    Gerrits, B.J.L.; Vollebregt, M.A.; Olbrich, S.; Kessels, R.P.C.; Palmer, D.; Gordon, E.; Arns, M.W.

    2016-01-01

    19th biennial IPEG Meeting: Nijmegen, The Netherlands. 26-30 October 2016. The quality of attentional performance plays a crucial role in goal-directed behavior in daily life activities, cognitive task performance, and in multiple psychiatric illnesses. The Research Domain Criteria (RDoC) approach

  6. Students’ Approaches to Learning and its Relationship with their Academic Engagement and Qualitative Performance

    Directory of Open Access Journals (Sweden)

    Mohammad Amin Bahrami

    2018-03-01

    Conclusion: The failure to confirm a relationship between students' approaches to learning and their qualitative performance can be attributed to the students' performance assessment mechanisms. At the same time, given the heterogeneity in the results of studies in this field, further studies are considered necessary.

  7. Identifying Key Performance Indicators for Holistic Hospital Management with a Modified DEMATEL Approach.

    Science.gov (United States)

    Si, Sheng-Li; You, Xiao-Yue; Liu, Hu-Chen; Huang, Jia

    2017-08-19

    Performance analysis is an important way for hospitals to achieve higher efficiency and effectiveness in providing services to their customers. The performance of the healthcare system can be measured by many indicators, but it is difficult to improve them all simultaneously due to limited resources. A feasible way is to identify the central and influential indicators so as to improve healthcare performance in a stepwise manner. In this paper, we propose a hybrid multiple criteria decision making (MCDM) approach to identify key performance indicators (KPIs) for holistic hospital management. First, by integrating the evidential reasoning approach and interval 2-tuple linguistic variables, the various assessments of performance indicators provided by healthcare experts are modeled. Then, the decision making trial and evaluation laboratory (DEMATEL) technique is adopted to build an interactive network and visualize the causal relationships between the performance indicators. Finally, an empirical case study is provided to demonstrate the proposed approach for improving the efficiency of healthcare management. The results show that "accidents/adverse events", "nosocomial infection", "incidents/errors" and "number of operations/procedures" are significantly influential indicators. Also, the indicators "length of stay", "bed occupancy" and "financial measures" play important roles in performance evaluation of the healthcare organization. The proposed decision making approach could serve as a reference for healthcare administrators seeking to enhance the performance of their healthcare institutions.
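The core DEMATEL computation, normalizing a direct-influence matrix and deriving the total-relation matrix T = D(I - D)^-1, whose row and column sums separate central indicators from net causes and effects, can be sketched as follows (toy influence scores, not the study's hospital data):

```python
import numpy as np

def dematel(A):
    """DEMATEL: from a direct-influence matrix A (A[i][j] = how strongly
    indicator i influences indicator j), compute prominence (R + C,
    centrality) and relation (R - C, net cause vs net effect)."""
    A = np.asarray(A, dtype=float)
    D = A / A.sum(axis=1).max()                 # a common normalization
    # Total-relation matrix: direct plus all indirect influence chains,
    # T = D + D^2 + D^3 + ... = D (I - D)^-1.
    T = D @ np.linalg.inv(np.eye(len(A)) - D)
    R, C = T.sum(axis=1), T.sum(axis=0)         # row / column sums
    return R + C, R - C

# Toy 3-indicator direct-influence matrix (hypothetical scores 0-4).
prominence, relation = dematel([[0, 3, 2],
                                [1, 0, 2],
                                [1, 2, 0]])
```

Indicators with a positive relation value are net causes (good levers for stepwise improvement); those with a negative value are net effects.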

  8. Direct expansion solar assisted heat pumps – A clean steady state approach for overall performance analysis

    International Nuclear Information System (INIS)

    Tagliafico, Luca A.; Scarpa, Federico; Valsuani, Federico

    2014-01-01

    Traditional thermal solar panel technologies have limited efficiency, and the required economic investments make them noncompetitive in the space heating market. The greatest limit to the diffusion of thermal solar systems is the characteristic temperatures they can reach: the strong connection between the user temperature and the collector temperature makes it possible to achieve high thermal (collector) efficiency only at low, often useless, user temperatures. By using solar collectors as thermal exchange units (evaporators) in a heat pump system (direct expansion solar assisted heat pump, DX-SAHP), the overall efficiency greatly increases, with a significant reduction of the associated investment in terms of pay-back time. In this study, an approach to the steady state analysis of DX-SAHP is proposed, based on the simplified inverse Carnot cycle and on the second-law efficiency concept. This method, without the need to calculate the refrigerant fluid properties and the detailed processes occurring in the refrigeration device, allows us to link the main features of the plant to its relevant interactions with the surroundings. The very nature of the proposed method makes the relationships among all the involved variables explicit and meaningful. The paper, after the description of the method, presents an explanatory application of this technique by reviewing various aspects of the performance of a typical DX-SAHP, in which the savings on primary energy consumption are regarded as the main feature of the plant and highlighted in a monthly averaged analysis. Results agree with those coming from a common standard steady state thermodynamic analysis. The application to a typical DX-SAHP system demonstrates that a mean primary energy saving of about 50% with respect to a standard gas burner can be achieved for the same user needs.
Such a result is almost independent from the type of flat plate solar panel used (double or single glazed, or even bare panels) as a result of
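The simplified inverse-Carnot / second-law-efficiency reasoning described above can be illustrated with a back-of-the-envelope sketch. The second-law efficiency, grid efficiency, burner efficiency and temperatures below are illustrative assumptions, not figures from the study:

```python
def heat_pump_cop(t_evap_c, t_cond_c, eta_second_law=0.45):
    """Real COP estimated as a second-law efficiency times the inverse
    Carnot COP between evaporator and condenser temperatures."""
    t_evap = t_evap_c + 273.15   # solar-warmed evaporator, in kelvin
    t_cond = t_cond_c + 273.15   # condenser (user side), in kelvin
    cop_carnot = t_cond / (t_cond - t_evap)
    return eta_second_law * cop_carnot

def primary_energy_saving(cop, eta_grid=0.40, eta_burner=0.90):
    """Fractional primary-energy saving of the heat pump vs a gas burner
    delivering the same useful heat."""
    pe_heat_pump = 1.0 / (cop * eta_grid)   # primary energy per unit heat
    pe_burner = 1.0 / eta_burner
    return 1.0 - pe_heat_pump / pe_burner

# Solar-warmed evaporator at 15 degC, condenser at 50 degC.
cop = heat_pump_cop(15.0, 50.0)
saving = primary_energy_saving(cop)
```

With these assumed values the sketch lands in the neighborhood of the ~50% primary energy saving reported in the abstract, and it shows why the result depends mainly on the temperature lift rather than on the collector type.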

  9. HANDBOOK OF SOCCER MATCH ANALYSIS: A SYSTEMATIC APPROACH TO IMPROVING PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Christopher Carling

    2006-03-01

    Full Text Available DESCRIPTION This book addresses and clearly explains soccer match analysis, looks at the very latest in match analysis research, and at the innovative technologies used by professional clubs. This handbook also bridges the gap between research, theory and practice. The methods in it can be used by coaches, sport scientists and fitness coaches to improve: styles of play, technical ability and physical fitness; objective feedback to players; the development of specific training routines; use of available notation software, video analysis and manual systems; and understanding of current academic research in soccer notational analysis. PURPOSE The aim is to provide a practical manual on soccer match analysis in general for coaches and sport scientists. Thus, professionals in this field can gather objective data on the players and the team, which in turn can be used by coaches and players to learn more about performance as a whole and gain a competitive advantage as a result. The book efficiently meets these objectives. AUDIENCE The book is targeted at the athlete, the coach, the sports science professional or any sport-conscious person who wishes to analyze relevant soccer performance. The editors and the contributors are authorities in their respective fields, and this handbook depends on their extensive experience and knowledge accumulated over the years. FEATURES The book demonstrates how a notation system can be established to produce data to analyze and improve performance in soccer. It is composed of 9 chapters which present the information in an order that is considered logical and progressive, as in most texts. Chapter headings are: 1. Introduction to Soccer Match Analysis, 2. Developing a Manual Notation System, 3. Video and Computerized Match Analysis Technology, 4. General Advice on Analyzing Match Performance, 5. Analysis and Presentation of the Results, 6. Motion Analysis and Consequences for Training, 7. What Match

  10. See me! A Discussion on the Quality in Performing Arts for Children Based on a Performative Approach

    Directory of Open Access Journals (Sweden)

    Lisa Nagel

    2013-12-01

    Full Text Available In this article, the writer discusses and analyses what happens to our evaluation of quality in performing arts for children when we move from the notion of art as an object to art as an event. Erika Fischer-Lichte's theory on the so-called performative turn in the arts and, more specifically, the term the feedback loop, constitutes the article's theoretical backdrop. Two audience-related episodes, respectively the dance performance BZz BZz-DADA dA bee by ICB Productions (3 - 6 year olds) and the theatre performance Thought Lab by Cirka Teater (for 6 year olds and above), serve as starting points for the theoretical discussion. By adopting Siemke Böhnisch's performative approach to performance analysis, focusing on the terms henvendthet (directed-ness, the actors' and spectators' mutual turning to the other) and kontakt (connection) in relation to the audience, the writer makes it possible to show a dissonance (and its reverse) between the performers and the audience in the two respective performances. The term dissonance describes moments of unintended breaks in communication, moments of which the performers are most likely unaware. These moments, however, become apparent when the audience's reactions are included in the analysis. The author concludes that by deferring to a performative perspective, we become almost obliged to consider the child audience as qualified judges of quality, as opposed to allowing ourselves to dismiss their interactions as either noise or enthusiasm. Such a perspective is important not only for how we see and evaluate performing arts for children, but also for how artists must think when producing performances for this audience.

  11. New test methods for BIPV. Results from IP performance

    International Nuclear Information System (INIS)

    Jol, J.C.; Van Kampen, B.J.M.; De Boer, B.J.; Reil, F.; Geyer, D.

    2009-11-01

    Within the Performance project, new test procedures have been drafted for PV building products and for the building performance as a whole when PV is applied in buildings. This has resulted in a first draft of new test procedures for PV building products and proposals for tests of novel BIPV technology such as thin film. The tests proposed are a module breakage test, a fire safety test and a dynamic load test for BIPV products. Furthermore, first proposals are presented for how flexible PV modules could be tested in an appropriate way to ensure the long-term quality and safety of these new products.

  12. Statistical multi-model approach for performance assessment of cooling tower

    International Nuclear Information System (INIS)

    Pan, Tian-Hong; Shieh, Shyan-Shu; Jang, Shi-Shang; Tseng, Wen-Hung; Wu, Chan-Wei; Ou, Jenq-Jang

    2011-01-01

    This paper presents a data-driven, model-based assessment strategy to investigate the performance of a cooling tower. To achieve this objective, the operations of a cooling tower are first characterized using a data-driven multiple-model method, which represents the system as a set of local models in the form of linear equations. A fuzzy c-means clustering algorithm is used to classify operating data into several groups, from which the local models are built. The developed models are then applied to predict the performance of the system based on the design input parameters provided by the manufacturer. The tower characteristics are also investigated using the proposed models via the effects of the water/air flow ratio. The predicted results agree well with the tower characteristics calculated using actual measured operating data from an industrial plant. By comparison with the design characteristic curve provided by the manufacturer, the effectiveness of the cooling tower can be obtained. A case study conducted in a commercial plant demonstrates the validity of the proposed approach. It should be noted that this is the first attempt to assess the deviation of cooling efficiency from the original design value using operating data for an industrial-scale process. Moreover, the evaluation need not interrupt the normal operation of the cooling tower, which should be of particular interest in industrial applications.
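The clustering step described above, partitioning operating data into groups with fuzzy c-means before fitting a local linear model per group, can be sketched as follows (synthetic two-regime data; the paper's exact algorithm and plant data are not reproduced):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: soft-cluster operating data into c groups,
    so that a local linear model can later be fitted per group."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1
    for _ in range(iters):
        W = U ** m                               # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1) + 1e-12
        U = d ** (-2.0 / (m - 1.0))              # standard FCM update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Synthetic data with two operating regimes
# (e.g. low vs high water/air flow ratio).
X = np.array([[0.10, 0.20], [0.20, 0.10], [0.15, 0.15],
              [2.00, 2.10], [2.10, 1.90], [1.90, 2.00]])
centers, U = fuzzy_c_means(X)
```

Each data point then belongs to every cluster with a membership degree, and the local linear models can be blended with those memberships to cover the whole operating range.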

  13. Scintigraphic acquisition entropy (2). A new approach in the quality control of the scintillation camera performances

    International Nuclear Information System (INIS)

    Elloumi, I.; Bouhdima, M.S.

    2002-01-01

    A new approach to monitoring the performance of gamma cameras, based on the entropy associated with the scintigraphic acquisition, is presented. We take into account the sensitivity, the variation of the collimator response as a function of depth, the uncertainty in the number of counts, the multiplex effect and the spatial uncertainty. This entropy function is expressed in terms of all the acquisition parameters: intrinsic crystal resolution, collimator characteristics, emitter object parameters and source activity. Application of this method to the study of the influence of collimation shows that the entropy associated with a collimator permits a better appreciation of the quality of the acquisition and therefore a better analysis of collimator performance. Likewise, the evolution of the entropy associated with the acquisition of a uniform source image is in agreement with the variation of the quality of the image histogram. It is thus shown that neither the spatial resolution, nor the sensitivity, nor the signal-to-noise ratio can detect a variation in image quality when analysed one by one. (author)
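As a simplified illustration of the idea that acquisition quality shows up in the entropy of the image histogram (the paper's full entropy function also folds in collimator, crystal and source parameters), consider the Shannon entropy of a uniform-source image under two noise levels; the images and noise values below are synthetic:

```python
import numpy as np

def histogram_entropy(image, bins=64, value_range=(0, 300)):
    """Shannon entropy (in bits) of an image's count histogram,
    computed over a fixed binning so acquisitions are comparable."""
    counts, _ = np.histogram(image, bins=bins, range=value_range)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins (0 log 0 = 0)
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
# A uniform source imaged twice: a clean acquisition (Poisson counting
# noise only) and a degraded one with extra additive noise.
good = rng.poisson(100.0, size=(64, 64)).astype(float)
bad = good + rng.normal(0.0, 30.0, size=(64, 64))

e_good = histogram_entropy(good)
e_bad = histogram_entropy(bad)   # broader histogram, higher entropy
```

The degraded acquisition spreads its counts over more histogram bins, so its entropy rises, a single scalar tracking a quality change that resolution or sensitivity alone might miss.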

  14. Actively Teaching Research Methods with a Process Oriented Guided Inquiry Learning Approach

    Science.gov (United States)

    Mullins, Mary H.

    2017-01-01

    Active learning approaches have been shown to improve student learning outcomes and the experience of students in the classroom. This article compares a Process Oriented Guided Inquiry Learning approach to a more traditional teaching method in an undergraduate research methods course. Moving from a more traditional learning environment to…

  15. The Impact of a Multifaceted Approach to Teaching Research Methods on Students' Attitudes

    Science.gov (United States)

    Ciarocco, Natalie J.; Lewandowski, Gary W., Jr.; Van Volkom, Michele

    2013-01-01

    A multifaceted approach to teaching five experimental designs in a research methodology course was tested. Participants included 70 students enrolled in an experimental research methods course in the semesters both before and after the implementation of instructional change. When using a multifaceted approach to teaching research methods that…

  16. LUDIC APPROACH OF THE SCIENTIFIC METHOD IN BIOCHEMICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    W.B. Maia

    2008-05-01

    Full Text Available Education in the current times concerns intellectual autonomy, the capacity for decision-making and the student's possibilities for making a difference. This work arose from the multi- and transdisciplinary perception of a team of educators from UFPE and UFRJ, for whom scientific research means intentional intellectual activity to attend to human needs. As an answer to teachers' questions about students' difficulties in learning scientific methodology, a ludic activity was included in classrooms at the high school, undergraduate and postgraduate levels. First of all, a CD-ROM entitled "O Método Científico: Uma Leitura Virtual" was made with the Adobe Flash program and used as a neurodidactic strategy (auditory and visual stimulus). After thirty minutes of ludic activities, discussions about the importance and utility of the scientific method took place, and Science and Technology arose as subject matters. Thus, a democratic and interfacial environment was produced. As a result of this didactic activity, curiosity was stimulated, attention was awakened, specific observation abilities were developed, the association of ideas was eased and the capacity for analysis and synthesis was improved in all the groups. Ludic activities in education create possibilities for sociability (such as the consolidation of the student's individuality) and also stimulate interest in the Sciences. In this way, for the development of educational projects focused on paradigm changes that lead to the rise of a new didactic culture, the use of diverse interfacial resources becomes necessary.

  17. A new approach to performance assessment of barriers in a repository. Executive summary, draft, technical appendices. Final report

    International Nuclear Information System (INIS)

    Mueller-Hoeppe, N.; Krone, J.; Niehues, N.; Poehler, M.; Raitz von Frentz, R.; Gauglitz, R.

    1999-06-01

    Multi-barrier systems are accepted as the basic approach for the long-term environmentally safe isolation of radioactive waste in geological repositories. Assessing the performance of natural and engineered barriers is one of the major difficulties in producing evidence of environmental safety for any radioactive waste disposal facility, due to the enormous complexity of the scenarios and uncertainties to be considered. This report outlines a new methodological approach originally developed for a repository in salt, but transferable with minor modifications to any other host rock formation. The approach is based on the integration of the following elements: (1) Implementation of a simple method and efficient criteria to assess and prove the tightness of geological and engineered barriers; (2) Use of the method of Partial Safety Factors in order to assess barrier performance at a certain reasonable level of confidence; (3) Integration of a diverse geochemical barrier in the near field of waste emplacement, systematically limiting the radiological consequences of any radionuclide release in safety investigations; and (4) A risk-based approach for the assessment of radionuclide releases. Indicative calculations performed with extremely conservative assumptions make it possible to exclude any radiological health consequences from a HLW repository in salt to a reference person with a safety level of 99.9999% per year. (orig.)

  18. Advanced fabrication method for the preparation of MOF thin films: Liquid-phase epitaxy approach meets spin coating method.

    KAUST Repository

    Chernikova, Valeriya; Shekhah, Osama; Eddaoudi, Mohamed

    2016-01-01

    Here we report a new and advanced method for the fabrication of highly oriented/polycrystalline metal-organic framework (MOF) thin films. Building on the attractive features of the liquid-phase epitaxy (LPE) approach, a facile spin coating method

  19. Development of Nuclear Safety Culture evaluation method for an operation team based on the probabilistic approach

    International Nuclear Information System (INIS)

    Han, Sang Min; Lee, Seung Min; Yim, Ho Bin; Seong, Poong Hyun

    2018-01-01

    Highlights: •We proposed a Probabilistic Safety Culture Healthiness Evaluation Method. •A positive relationship between the 'success' states of NSC and performance was shown. •The state probability profile showed a unique ratio regardless of the scenarios. •Cutset analysis provided not only the root causes but also the latent causes of failures. •Pro-SCHEMe was found to be applicable to Korean NPPs. -- Abstract: The aim of this study is to propose a new quantitative evaluation method for Nuclear Safety Culture (NSC) in Nuclear Power Plant (NPP) operation teams based on the probabilistic approach. Various NSC evaluation methods have been developed, and the Korean NPP utility company has conducted NSC assessments according to international practice. However, most methods rely on interviews, observations, and self-assessment. Consequently, the results are often qualitative, subjective, and mainly dependent on the evaluator's judgement, so the assessment results can be interpreted from different perspectives. To resolve the limitations of present evaluation methods, the concept of Safety Culture Healthiness was suggested to produce quantitative results and provide a faster evaluation process. This paper presents the Probabilistic Safety Culture Healthiness Evaluation Method (Pro-SCHEMe), which generates quantitative inputs for Human Reliability Assessment (HRA) in Probabilistic Safety Assessment (PSA). Evaluation items, each corresponding to a basic event in PSA, are derived in the first part of the paper through a literature survey, mostly of nuclear-related organizations such as the International Atomic Energy Agency (IAEA), the United States Nuclear Regulatory Commission (U.S.NRC), and the Institute of Nuclear Power Operations (INPO). Event trees (ETs) and fault trees (FTs) are devised to apply the evaluation items to PSA based on the relationships among such items. Modeling guidelines are also suggested to classify and calculate NSC characteristics of
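The fault-tree and cutset machinery the abstract describes can be sketched generically: given minimal cut sets (sets of basic events whose joint occurrence fails the top event) and basic-event probabilities, the top-event probability follows by inclusion-exclusion. The basic events, probabilities, and cut sets below are invented for illustration and are not Pro-SCHEMe's actual evaluation items.

```python
# Generic minimal-cut-set quantification sketch (illustrative only;
# event names and probabilities are hypothetical, not from Pro-SCHEMe).
from itertools import combinations

def cutset_probability(events, p):
    """Probability that every basic event in the given set occurs
    (assumes independent basic events)."""
    prob = 1.0
    for event in events:
        prob *= p[event]
    return prob

def top_event_probability(cutsets, p):
    """Exact union probability over minimal cut sets via
    inclusion-exclusion, assuming independent basic events."""
    total = 0.0
    for k in range(1, len(cutsets) + 1):
        for combo in combinations(cutsets, k):
            union = set().union(*combo)
            total += (-1) ** (k + 1) * cutset_probability(union, p)
    return total

# Hypothetical basic events: failures of individual NSC evaluation items.
p = {"leadership": 0.02, "reporting": 0.05, "training": 0.03}
cutsets = [{"leadership", "reporting"}, {"training"}]
print(round(top_event_probability(cutsets, p), 6))  # 0.03097
```

Enumerating which cut sets dominate the total is what the highlights call cutset analysis: it surfaces both the direct (single-event) and latent (multi-event) failure combinations.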

  20. Quality initiatives: lean approach to improving performance and efficiency in a radiology department.

    Science.gov (United States)

    Kruskal, Jonathan B; Reedy, Allen; Pascal, Laurie; Rosen, Max P; Boiselle, Phillip M

    2012-01-01

    Many hospital radiology departments are adopting "lean" methods developed in automobile manufacturing to improve operational efficiency, eliminate waste, and optimize the value of their services. The lean approach, which emphasizes process analysis, has particular relevance to radiology departments, which depend on a smooth flow of patients and uninterrupted equipment function for efficient operation. However, the application of lean methods to isolated problems is not likely to improve overall efficiency or to produce a sustained improvement. Instead, the authors recommend a gradual but continuous and comprehensive "lean transformation" of work philosophy and workplace culture. Fundamental principles that must consistently be put into action to achieve such a transformation include equal involvement of and equal respect for all staff members, elimination of waste, standardization of work processes, improvement of flow in all processes, use of visual cues to communicate and inform, and use of specific tools to perform targeted data collection and analysis and to implement and guide change. Many categories of lean tools are available to facilitate these tasks: value stream mapping for visualizing the current state of a process and identifying activities that add no value; root cause analysis for determining the fundamental cause of a problem; team charters for planning, guiding, and communicating about change in a specific process; management dashboards for monitoring real-time developments; and a balanced scorecard for strategic oversight and planning in the areas of finance, customer service, internal operations, and staff development. © RSNA, 2012.