WorldWideScience

Sample records for methods performance approach

  1. Total System Performance Assessment-License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2002-09-13

    ''Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach'' provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issue (KTI) agreements, the ''Yucca Mountain Review Plan'' (CNWRA 2002 [158449]), and 10 CFR Part 63. This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document.

  2. Total System Performance Assessment - License Application Methods and Approach

    International Nuclear Information System (INIS)

    McNeish, J.

    2003-01-01

    ''Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach'' provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the ''Yucca Mountain Review Plan'' (YMRP), ''Final Report'' (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document.

  3. Total System Performance Assessment - License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2003-12-08

    ''Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach'' provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the ''Yucca Mountain Review Plan'' (YMRP), ''Final Report'' (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document.

  4. Performance analysis of demodulation with diversity -- A combinatorial approach I: Symmetric function theoretical methods

    OpenAIRE

    Jean-Louis Dornstetter; Daniel Krob; Jean-Yves Thibon; Ekaterina A. Vassilieva

    2002-01-01

    This paper is devoted to the presentation of a combinatorial approach, based on the theory of symmetric functions, for analyzing the performance of a family of demodulation methods used in mobile telecommunications.

  5. Performance analysis of demodulation with diversity -- A combinatorial approach I: Symmetric function theoretical methods

    Directory of Open Access Journals (Sweden)

    Jean-Louis Dornstetter

    2002-12-01

    This paper is devoted to the presentation of a combinatorial approach, based on the theory of symmetric functions, for analyzing the performance of a family of demodulation methods used in mobile telecommunications.

  6. Technical Efficiency and Organ Transplant Performance: A Mixed-Method Approach

    Science.gov (United States)

    de-Pablos-Heredero, Carmen; Fernández-Renedo, Carlos; Medina-Merodio, Jose-Amelio

    2015-01-01

    Mixed methods research is useful for understanding complex processes. Organ transplants are complex processes in need of improved final performance in times of budgetary restrictions. As its main objective, this article uses a mixed-methods approach to quantify the technical efficiency and the excellence achieved in organ transplant systems and to demonstrate the influence of organizational structures and internal processes on the observed technical efficiency. The results show that it is possible to implement mechanisms for the measurement of the different components by making use of quantitative and qualitative methodologies. The analyses show a positive relationship between the levels of the Baldrige indicators and the observed technical efficiency in the donation and transplant units of the 11 analyzed hospitals. Therefore, it is possible to conclude that high levels in the Baldrige indexes are a necessary condition for reaching an increased level of service. PMID:25950653

  7. Performance and separation occurrence of binary probit regression estimator using maximum likelihood method and Firth's approach under different sample sizes

    Science.gov (United States)

    Lusiana, Evellin Dewi

    2017-12-01

    The parameters of a binary probit regression model are commonly estimated by the Maximum Likelihood Estimation (MLE) method. However, MLE has a limitation when the binary data contain separation. Separation is the condition in which one or several independent variables exactly predict the categories of the binary response. As a result, the MLE estimators fail to converge and cannot be used in modeling. One way to resolve separation is to use Firth's approach instead. This research has two aims: first, to identify the chance of separation occurring in a binary probit regression model under the MLE method and Firth's approach; second, to compare the performance of the binary probit regression estimators obtained by the MLE method and Firth's approach using the RMSE criterion. Both are examined by simulation under different sample sizes. The results showed that the chance of separation under the MLE method is higher than under Firth's approach for small sample sizes; for larger sample sizes, the probability decreases and is nearly identical between the two. Meanwhile, Firth's estimators have smaller RMSE than the MLE estimators, especially for smaller sample sizes, while for larger sample sizes the RMSEs are not much different. This means that Firth's estimators outperform the MLE estimators.

  8. An applicable approach for performance auditing in ERP

    Directory of Open Access Journals (Sweden)

    Wan Jian Guo

    2016-01-01

    This paper addresses the practical problem of performance auditing in an ERP environment. Traditional performance auditing methods and existing approaches for evaluating ERP implementations work poorly here, because they are either difficult to apply or rely on subjective judgments. This paper proposes an applicable performance auditing approach for SAP ERP based on quantitative analysis. The approach consists of three parts: system utilization, data quality, and the effectiveness of system control. For each part, we describe the main process for conducting the audit, in particular how to calculate the online settlement rate of the SAP system. This approach has played an important role in practical auditing work, and a case study at the end of the paper illustrates its effectiveness. The approach is also relevant to the performance auditing of other ERP products.

  9. An Investigation into Native and Non-Native Teachers' Judgments of Oral English Performance: A Mixed Methods Approach

    Science.gov (United States)

    Kim, Youn-Hee

    2009-01-01

    This study used a mixed methods research approach to examine how native English-speaking (NS) and non-native English-speaking (NNS) teachers assess students' oral English performance. The evaluation behaviors of two groups of teachers (12 Canadian NS teachers and 12 Korean NNS teachers) were compared with regard to internal consistency, severity,…

  10. Approaches towards airport economic performance measurement

    Directory of Open Access Journals (Sweden)

    Ivana STRYČEKOVÁ

    2011-01-01

    The paper assesses how economic benchmarking is used by airports as a means of performance measurement and comparison among major international airports. The study focuses on current benchmarking practices and methods, taking into account the different factors against which it is efficient to benchmark airport performance. The methods considered are mainly data envelopment analysis and stochastic frontier analysis; other approaches used by airports for economic benchmarking are also discussed. The main objective of this article is to evaluate the efficiency of the airports and to address open questions concerning their economic benchmarking.

  11. COMPANY PERFORMANCE MEASUREMENT AND REPORTING METHODS

    Directory of Open Access Journals (Sweden)

    Nicu Ioana Elena

    2012-12-01

    One of the priorities of economic research has been, and remains, the re-evaluation of the notion of performance, and especially the search for indicators that reflect as accurately as possible the subtleties of the economic entity. The main purpose of this paper is to highlight the main methods for measuring and reporting company performance. Performance is a concept that raises many questions about the most accurate or the best method of reporting performance at the company level. The research methodology consisted of studying the Romanian and foreign specialized literature in the field, including journals specializing in company performance measurement. While financial performance indicators are considered to offer an accurate image of the situation of the company, the modern approach through non-financial indicators offers a new perspective on performance measurement, one based on simplicity. In conclusion, after the theoretical study, I have noticed that the methods of performance measurement, reporting, and interpretation are varied, that opinions regarding the best performance measurement methods are contradictory, and that companies prefer financial indicators, which still play a more important role in company performance measurement than non-financial indicators do.

  12. Approaches to chronic disease management evaluation in use in Europe: a review of current methods and performance measures.

    Science.gov (United States)

    Conklin, Annalijn; Nolte, Ellen; Vrijhoef, Hubertus

    2013-01-01

    An overview was produced of approaches currently used to evaluate chronic disease management in selected European countries. The study aims to describe the methods and metrics used in Europe as a first step toward advancing the methodological basis for their assessment. A common template for collecting evaluation methods and performance measures was sent to key informants in twelve European countries; responses were summarized in tables based on the template's evaluation categories, and the extracted data were descriptively analyzed. Approaches to the evaluation of chronic disease management vary widely in objectives, designs, metrics, observation periods, and data collection methods. Half of the reported studies used noncontrolled designs. The majority measure clinical processes, patient behavior and satisfaction, and cost and utilization; several also use a range of structural indicators. Effects are usually observed over one to three years on patient populations with a single, commonly prevalent chronic disease. There is wide variation within and between European countries in how chronic disease management is evaluated, in objectives, designs, indicators, target audiences, and actors involved. This study is the first extensive international overview of the area reported in the literature.

  13. A Performance Prediction Method for Pumps as Turbines (PAT) Using a Computational Fluid Dynamics (CFD) Modeling Approach

    Directory of Open Access Journals (Sweden)

    Emma Frosina

    2017-01-01

    Small and micro hydropower systems represent an attractive solution for generating electricity at low cost and with low environmental impact. The pump-as-turbine (PAT) approach has promise in this application due to its low purchase and maintenance costs. In this paper, a new method to predict the inverse characteristic of industrial centrifugal pumps is presented. This method is based on results of simulations performed with commercial three-dimensional Computational Fluid Dynamics (CFD) software. Model results have been first validated in pumping mode using data supplied by pump manufacturers. Then, the results have been compared to experimental data for a pump running in reverse. Experimentation has been performed on a dedicated test bench installed in the Department of Civil Construction and Environmental Engineering of the University of Naples Federico II. Three different pumps, with different specific speeds, have been analyzed. Using the model results, the inverse characteristic and the best efficiency point have been evaluated. Finally, results have been compared to prediction methods available in the literature.

  14. Weighting Performance Evaluation Criteria Based on the Balanced Score Card Approach with Use of the Combined Shapley Value & Bull's-eye Method

    Directory of Open Access Journals (Sweden)

    Mohammad Hassan Kamfiroozi

    2014-05-01

    Performance evaluation as a control tool is important to managers in organizations and manufacturing. In this paper we present a new model for performance evaluation and for ranking industrial companies under uncertain conditions. Performance evaluation is implemented on the basis of the balanced scorecard (BSC) method, and three-parameter interval grey numbers are used in place of linguistic variables. Evaluation and weighting of the four indicators is then done using the combined Bull's-eye-Shapley method, a new approach introduced in this article. Three-parameter interval grey numbers and the combined weighting method are used to reduce the effect of environmental uncertainty on the data and the model; the combined weighting method can also serve as a new method in decision-making science. Finally, a case study of industrial companies (nail manufacturers) is presented, in which the companies are ranked using the grey-TOPSIS method (a generalization of classic TOPSIS to three-parameter interval grey numbers).
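
The final grey-TOPSIS ranking step generalizes classic TOPSIS, which can be sketched in a few lines. The code below illustrates only the classic (crisp) version; the matrix entries and weights are invented stand-ins for the BSC scores and the Bull's-eye-Shapley combination weights, and all criteria are treated as benefit criteria.

```python
import math

def topsis(matrix, weights):
    """Classic TOPSIS ranking (all criteria treated as benefit criteria).

    Returns one closeness coefficient per alternative (higher is better).
    """
    n_alt, n_crit = len(matrix), len(matrix[0])
    # 1. Vector-normalize each criterion column, then apply weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    # 2. Ideal and anti-ideal solutions (benefit criteria: max / min).
    ideal = [max(v[i][j] for i in range(n_alt)) for j in range(n_crit)]
    anti = [min(v[i][j] for i in range(n_alt)) for j in range(n_crit)]
    # 3. Euclidean distances to both, then relative closeness.
    scores = []
    for row in v:
        d_plus = math.sqrt(sum((row[j] - ideal[j]) ** 2 for j in range(n_crit)))
        d_minus = math.sqrt(sum((row[j] - anti[j]) ** 2 for j in range(n_crit)))
        scores.append(d_minus / (d_plus + d_minus))
    return scores

# Three companies scored on four BSC perspectives (illustrative numbers);
# the weights stand in for the Bull's-eye-Shapley combination weights.
matrix = [[9, 8, 9, 9],    # company A dominates on every criterion
          [5, 6, 4, 7],
          [2, 3, 8, 1]]
weights = [0.4, 0.3, 0.2, 0.1]

scores = topsis(matrix, weights)
ranking = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
assert ranking[0] == 0 and abs(scores[0] - 1.0) < 1e-9  # dominant firm ranks first
```

The grey version replaces each crisp score with a three-parameter interval grey number and redefines the distance computations accordingly; the overall normalize-weight-distance-closeness pipeline is unchanged.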

  15. Proposal for an Evaluation Method for the Performance of Work Procedures.

    Science.gov (United States)

    Mohammed, Mouda; Mébarek, Djebabra; Wafa, Boulagouas; Makhlouf, Chati

    2016-12-01

    Noncompliance of operators with work procedures is a recurrent problem. This human behavior has been described as situational and has been studied through many different approaches (ergonomic and others), which take noncompliance with work procedures as a given and seek to analyze its causes and consequences. The objective of the proposed method is to address this problem by focusing on the performance of the work procedures themselves and ensuring that this performance is improved on a continuous basis. The study has three main results: (1) assessment of work procedures' performance by a multicriteria approach; (2) use of a continuous improvement approach as a framework for sustaining the assessment method; and (3) adaptation of the Stop-Card as a facilitating support for continuous improvement of work procedures. The proposed method thus emphasizes the value of continuously improving the work procedures, in contrast with conventional approaches, which accept noncompliance as given and seek to analyze the cause-effect relationships behind this unacceptable phenomenon, especially in strategic industries.

  16. The impact of case specificity and generalisable skills on clinical performance: a correlated traits-correlated methods approach.

    Science.gov (United States)

    Wimmers, Paul F; Fung, Cha-Chi

    2008-06-01

    The finding of case or content specificity in medical problem solving moved the focus of research away from generalisable skills towards the importance of content knowledge. However, controversy about the content dependency of clinical performance and the generalisability of skills remains. This study aimed to explore the relative impact of both perspectives (case specificity and generalisable skills) on different components (history taking, physical examination, communication) of clinical performance within and across cases. Data from a clinical performance examination (CPX) taken by 350 Year 3 students were used in a correlated traits-correlated methods (CTCM) approach using confirmatory factor analysis, whereby 'traits' refers to generalisable skills and 'methods' to individual cases. The baseline CTCM model was analysed and compared with four nested models using structural equation modelling techniques. The CPX consisted of three skills components and five cases. Comparison of the four different models with the least-restricted baseline CTCM model revealed that a model with uncorrelated generalisable skills factors and correlated case-specific knowledge factors represented the data best. The generalisable processes found in history taking, physical examination and communication were responsible for half the explained variance, compared with the variance related to case specificity. In conclusion, pure knowledge-based and pure skill-based perspectives on clinical performance both seem too one-dimensional, and the evidence supports the idea that substantial amounts of variance are attributable to both aspects of performance. Generalisable skills and specialised knowledge go hand in hand: both are essential aspects of clinical performance.

  17. A Gold Standards Approach to Training Instructors to Evaluate Crew Performance

    Science.gov (United States)

    Baker, David P.; Dismukes, R. Key

    2003-01-01

    The Advanced Qualification Program requires that airlines evaluate crew performance in Line Oriented Simulation. For this evaluation to be meaningful, instructors must observe relevant crew behaviors and evaluate those behaviors consistently and accurately against standards established by the airline. The airline industry has largely settled on an approach in which instructors evaluate crew performance on a series of event sets, using standardized grade sheets on which the behaviors specific to each event set are listed. Typically, new instructors are given a class in which they learn to use the grade sheets and practice evaluating crew performance observed on videotapes. These classes emphasize reliability, providing detailed instruction and practice in scoring so that all instructors within a given class will give similar scores to similar performance. This approach has value but also has important limitations: (1) ratings within one class of new instructors may differ from those of other classes; (2) ratings may not be driven primarily by the specific behaviors on which the company wanted the crews to be scored; and (3) ratings may not be calibrated to company standards for the level of performance skill required. In this paper we provide a way to extend the existing method of training instructors to address these three limitations. We call this the "gold standards" approach because it uses ratings from the company's most experienced instructors as the basis for training rater accuracy. This approach ties the training to the specific behaviors on which the experienced instructors based their ratings.

  18. Enhanced Portfolio Performance Using a Momentum Approach to Annual Rebalancing

    Directory of Open Access Journals (Sweden)

    Michael D. Mattei

    2018-02-01

    After diversification, periodic portfolio rebalancing has become one of the most widely practiced methods for reducing portfolio risk and enhancing returns. Most of the rebalancing strategies found in the literature are generally regarded as contrarian approaches to rebalancing. A recent article proposed a rebalancing approach that incorporates momentum. The momentum approach had a better risk-adjusted return than either the traditional approach or a buy-and-hold approach. This article identifies an improvement to the momentum approach and then examines the impact of transaction costs and taxes on the portfolio performance of four active rebalancing approaches.

  19. Do Robot Performance and Behavioral Style affect Human Trust? : A Multi-Method Approach

    NARCIS (Netherlands)

    van den Brule, Rik; Dotsch, Ron; Bijlstra, Gijsbert; Wigboldus, D.H.J.; Haselager, Pim

    2014-01-01

    An important aspect of a robot’s social behavior is to convey the right amount of trustworthiness. Task performance has shown to be an important source for trustworthiness judgments. Here, we argue that factors such as a robot’s behavioral style can play an important role as well. Our approach to

  20. Performative Schizoid Method

    DEFF Research Database (Denmark)

    Svabo, Connie

    2016-01-01

    A performative schizoid method is developed as a method contribution to performance as research. The method is inspired by contemporary research in the human and social sciences urging experimentation and researcher engagement with creative and artistic practice. In the article, the method is presented and an example is provided of a first exploratory engagement with it. The method is used in a specific project, Becoming Iris, making inquiry into arts-based knowledge creation during a three-month visiting scholarship at a small, independent visual art academy. Using the performative schizoid method in Becoming Iris results in four audio-visual and performance-based productions, centered on an emergent theme of the scholartist as a bird in borrowed feathers. Interestingly, the moral lesson of the fable about the vain jackdaw, who dresses in borrowed peacock feathers and becomes a castout...

  1. Approaching direct optimization of as-built lens performance

    Science.gov (United States)

    McGuire, James P.; Kuper, Thomas G.

    2012-10-01

    We describe a method approaching direct optimization of the rms wavefront error of a lens, including tolerances. By including the effect of tolerances in the error function, the designer can choose to improve the as-built performance with a fixed set of tolerances and/or reduce the cost of production lenses with looser tolerances. The method relies on the speed of differential tolerance analysis and has recently become practical due to the combination of continuing increases in computer hardware speed and multiple-core processing. We illustrate the method's use on a Cooke triplet, a double Gauss, and two plastic mobile phone camera lenses.
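
The idea of folding tolerance effects into the merit function can be sketched with a hypothetical one-parameter "lens": the expected as-built error is estimated from a finite-difference (differential) sensitivity to a toleranced perturbation, and minimizing it trades a little nominal performance for robustness. Everything here, the error model and the tolerance value, is invented for illustration and is not the authors' code.

```python
# Toy sketch of tolerance-aware optimization. W(c, t) models squared
# wavefront error for design variable c (say, a curvature) and a
# toleranced perturbation t with standard deviation SIGMA.

SIGMA = 1.0      # assumed 1-sigma tolerance on the perturbation t
H = 1e-4         # finite-difference step

def W(c, t):
    """Hypothetical squared wavefront error: nominal term plus t-sensitivity."""
    return (c - 1.0) ** 2 + 0.5 * (c * t) ** 2

def as_built(c):
    """Expected error E[W(c, t)] for t ~ N(0, SIGMA^2), using a central
    finite-difference estimate of d^2W/dt^2 (differential tolerancing)."""
    d2 = (W(c, H) - 2.0 * W(c, 0.0) + W(c, -H)) / (H * H)
    return W(c, 0.0) + 0.5 * SIGMA ** 2 * d2

grid = [k / 1000.0 for k in range(0, 1501)]
c_nominal = min(grid, key=lambda c: W(c, 0.0))   # ignores tolerances
c_robust = min(grid, key=as_built)               # tolerance-aware optimum

# The robust design backs off nominal performance to reduce sensitivity:
assert abs(c_nominal - 1.0) < 1e-9
assert c_robust < c_nominal
assert abs(c_robust - 2.0 / 3.0) < 0.01          # analytic optimum 2/(2 + sigma^2)
```

For this quadratic toy the tolerance-aware optimum can be checked analytically (c* = 2/(2 + sigma^2)); a real lens optimizer would instead evaluate many toleranced sensitivities per merit-function call, which is why the speed of differential tolerance analysis matters.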

  2. Differentiating Performance Approach Goals and Their Unique Effects

    Science.gov (United States)

    Edwards, Ordene V.

    2014-01-01

    The study differentiates between two types of performance approach goals (competence demonstration performance approach goal and normative performance approach goal) by examining their unique effects on self-efficacy, interest, and fear of failure. Seventy-nine students completed questionnaires that measure performance approach goals,…

  3. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

    This work presents an efficient solution using a computer algebra system to perform verification of linear temporal properties for synchronous digital systems. The method is essentially based on Groebner bases approaches combined with symbolic simulation. A mechanism for constructing canonical polynomial-set-based symbolic representations for both circuit descriptions and assertions is studied. We then present a complete checking algorithm framework based on these algebraic representations using Groebner bases. The computational results in this work show that the algebraic approach is a quite competitive checking method and a useful supplement to existing verification methods based on simulation.
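
The flavor of polynomial-based checking can be conveyed with a toy that sidesteps full Groebner-basis machinery: over GF(2) with x^2 = x, the algebraic normal form of a Boolean function is canonical, so equivalence of two circuit descriptions reduces to comparing polynomials. This is only a sketch of the underlying idea; the paper's method handles much richer temporal properties.

```python
# Toy polynomial-based equivalence check over GF(2): a Boolean function's
# algebraic normal form (XOR of AND-monomials) is canonical, so two
# circuits are equivalent iff their polynomials are identical.
# A polynomial is a set of monomials; a monomial is a frozenset of vars.

def const(bit):
    return {frozenset()} if bit else set()

def var(name):
    return {frozenset([name])}

def xor(p, q):                   # addition in GF(2)[x] / (x^2 - x)
    return p ^ q                 # symmetric difference: equal terms cancel

def mul(p, q):                   # multiplication; x * x = x absorbs exponents
    out = set()
    for m1 in p:
        for m2 in q:
            out ^= {m1 | m2}     # duplicate monomials cancel (characteristic 2)
    return out

def NOT(p):
    return xor(const(1), p)

def OR(p, q):                    # a OR b = a + b + a*b over GF(2)
    return xor(xor(p, q), mul(p, q))

s, a, b = var("s"), var("a"), var("b")

# Specification of a 2:1 mux versus a rewritten implementation:
spec = OR(mul(s, a), mul(NOT(s), b))     # (s AND a) OR (NOT s AND b)
impl = xor(b, mul(s, xor(a, b)))         # b XOR (s AND (a XOR b))

assert spec == impl                      # identical canonical polynomials
assert spec != OR(mul(s, a), b)          # a genuinely different circuit differs
```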

  4. Methods for implementing Building Information Modeling and Building Performance Simulation approaches

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø

    The first is to investigate (a) BIM as a platform for Architecture, Engineering, Construction, and Facility Management (AEC/FM) communication, and (b) BPS as a platform for early-stage building performance prediction. The second is to develop (a) relevant AEC/FM communication support instruments, and (b) standardized BIM and BPS execution guidelines and information exchange methodologies. Thesis studies showed that BIM approaches have the potential to improve AEC/FM communication and collaboration. BIM is by its nature multidisciplinary, bringing AEC/FM project participants together and creating constant communication. However, BIM adoption can lead to technical challenges, for example, getting BIM-compatible tools to communicate properly. Furthermore, BIM adoption requires organizational change, that is, changes in AEC/FM work practices and interpersonal dynamics. Consequently, to ensure that the adoption of BIM is successful, it is recommended that common IT regulations...

  5. Cognitive Task Complexity Effects on L2 Writing Performance: An Application of Mixed-Methods Approaches

    Science.gov (United States)

    Abdi Tabari, Mahmoud; Ivey, Toni A.

    2015-01-01

    This paper provides a methodological review of previous research on cognitive task complexity, since the term emerged in 1995, and investigates why much research was more quantitative rather than qualitative. Moreover, it sheds light onto the studies which used the mixed-methods approach and determines which version of the mixed-methods designs…

  6. High-performance parallel approaches for three-dimensional light detection and ranging point clouds gridding

    Science.gov (United States)

    Rizki, Permata Nur Miftahur; Lee, Heezin; Lee, Minsu; Oh, Sangyoon

    2017-01-01

    With the rapid advance of remote sensing technology, the amount of three-dimensional point-cloud data has increased extraordinarily, requiring faster processing in the construction of digital elevation models. There have been several attempts to accelerate the computation using parallel methods; however, little attention has been given to investigating different approaches for selecting the parallel programming model best suited to a given computing environment. We present our findings and insights from implementing three popular high-performance parallel approaches (message passing interface, MapReduce, and GPGPU) for time-demanding but accurate kriging interpolation. The performances of the approaches are compared by varying the size of the grid and input data. In our empirical experiment, we demonstrate the significant acceleration achieved by all three approaches compared to a C-implemented sequential-processing method. In addition, we discuss the pros and cons of each method in terms of usability, complexity, infrastructure, and platform limitations to give readers a better understanding of utilizing these parallel approaches for gridding purposes.
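
The grid-partitioning idea common to all three parallel approaches can be sketched as follows. This is a hedged stand-in: inverse-distance weighting replaces kriging, and a Python thread pool replaces MPI/MapReduce/GPGPU workers (threads illustrate the decomposition but, being GIL-bound, give no real CPU speedup; processes or native kernels would).

```python
# Sketch of row-block partitioning for parallel gridding of scattered points.
from concurrent.futures import ThreadPoolExecutor

# (x, y, z) sample points and output grid resolution -- illustrative values.
POINTS = [(0.0, 0.0, 10.0), (5.0, 0.0, 20.0), (0.0, 5.0, 30.0), (5.0, 5.0, 40.0)]
NX = NY = 8

def idw(px, py, power=2.0):
    """Inverse-distance-weighted estimate at one grid node (kriging stand-in)."""
    num = den = 0.0
    for x, y, z in POINTS:
        d2 = (px - x) ** 2 + (py - y) ** 2
        if d2 == 0.0:
            return z             # node coincides with a sample point
        w = d2 ** (-power / 2.0)
        num += w * z
        den += w
    return num / den

def grid_rows(rows):
    """Interpolate a block of grid rows (one worker's share)."""
    return [[idw(5.0 * i / (NX - 1), 5.0 * j / (NY - 1)) for i in range(NX)]
            for j in rows]

def gridded(workers):
    """Split rows among workers, process in parallel, reassemble the grid."""
    blocks = [range(k, NY, workers) for k in range(workers)]  # round-robin split
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(grid_rows, blocks))
    out = [None] * NY
    for block, rows in zip(blocks, results):
        for j, row in zip(block, rows):
            out[j] = row
    return out

serial = gridded(1)
parallel = gridded(4)
assert serial == parallel        # partitioning does not change the result
```

Because every grid node is computed independently, the decomposition is embarrassingly parallel and the partitioned result matches the serial one exactly; kriging adds a shared solve of the variogram system but partitions the prediction step the same way.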

  7. An integrated approach to validation of safeguards and security program performance

    International Nuclear Information System (INIS)

    Altman, W.D.; Hunt, J.S.; Hockert, J.W.

    1988-01-01

    Department of Energy (DOE) requirements for safeguards and security programs are becoming increasingly performance oriented. Master Safeguards and Security Agreements specify performance levels for systems protecting DOE security interests. In order to measure and validate security system performance, Lawrence Livermore National Laboratory (LLNL) has developed cost-effective validation tools and a comprehensive validation approach that synthesizes information gained from different activities, such as force-on-force exercises, limited-scope performance tests, equipment testing, vulnerability analyses, and computer modeling, into an overall assessment of the performance of the protection system. The analytic approach employs logic diagrams adapted from the fault and event trees used in probabilistic risk assessment. The synthesis of the results from the various validation activities is accomplished using a method developed by LLNL, based upon Bayes' theorem.
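
The Bayes'-theorem synthesis step can be illustrated with a conjugate toy model (not LLNL's actual logic-diagram method): outcomes of hypothetical validation activities update a common Beta posterior on the system's detection probability.

```python
# Illustrative Bayesian synthesis of validation results: independent
# validation activities update one posterior for the probability that
# the protection system defeats an intrusion attempt. All numbers are
# hypothetical.

# Beta(a, b) prior on the detection probability.
a, b = 1.0, 1.0                  # uninformative prior

# (successes, trials) from hypothetical validation activities.
activities = {
    "force-on-force exercises": (9, 10),
    "limited-scope performance tests": (18, 20),
    "equipment tests": (28, 30),
}

for wins, trials in activities.values():
    a += wins                    # conjugate Beta-binomial update (Bayes' theorem)
    b += trials - wins

posterior_mean = a / (a + b)
print(f"posterior mean detection probability: {posterior_mean:.3f}")
assert a == 56.0 and b == 6.0
```

The conjugate update makes the synthesis order-independent: each activity's evidence simply accumulates into the posterior counts, which is the property that lets heterogeneous tests be combined into one overall assessment.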

  8. A multiparameter chaos control method based on OGY approach

    International Nuclear Information System (INIS)

    Souza de Paula, Aline; Amorim Savi, Marcelo

    2009-01-01

    Chaos control exploits the richness of responses of chaotic behavior and may be understood as the use of tiny perturbations to stabilize an unstable periodic orbit (UPO) embedded in a chaotic attractor. Since one of these UPOs may provide better performance than others in a particular situation, chaos control can make this kind of behavior desirable in a variety of applications. The OGY method is a discrete technique that applies small perturbations in the neighborhood of the desired orbit when the trajectory crosses a specific surface, such as a Poincaré section. This contribution proposes a multiparameter semi-continuous method based on the OGY approach in order to control chaotic behavior. Two different approaches are possible with this method: a coupled approach, where all control parameters influence system dynamics even when they are not active, and an uncoupled approach, a particular case in which control parameters return to their reference values when they become passive. As an application of the general formulation, a two-parameter actuation of a nonlinear pendulum is investigated, employing both coupled and uncoupled approaches. Analyses are carried out considering signals generated by numerical integration of the mathematical model using experimentally identified parameters. Results show that the procedure can be a good alternative for chaos control, since it provides more effective UPO stabilization than the classical single-parameter approach.
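
The underlying single-parameter OGY rule is easy to demonstrate on the logistic map, a standard textbook target (the paper itself treats a nonlinear pendulum with two control parameters). A tiny perturbation of the parameter r is applied only when the orbit wanders near the unstable fixed point:

```python
# Classic single-parameter OGY control of the logistic map x -> r x (1 - x).

R0 = 3.9                          # nominal (chaotic) parameter value
X_STAR = 1.0 - 1.0 / R0           # unstable fixed point for r = R0
LAM = R0 * (1.0 - 2.0 * X_STAR)   # local multiplier df/dx at the fixed point
G = X_STAR * (1.0 - X_STAR)       # parameter sensitivity df/dr
EPS = 0.01                        # activation window around x*

x = 0.4
for _ in range(20000):
    e = x - X_STAR
    # OGY rule: cancel the unstable direction with delta_r = -(lambda/g) * e,
    # applied only inside the small window (tiny perturbation of r).
    dr = -(LAM / G) * e if abs(e) < EPS else 0.0
    x = (R0 + dr) * x * (1.0 - x)

# The orbit is pinned onto the formerly unstable fixed point.
assert abs(x - X_STAR) < 1e-6
```

Once the chaotic orbit ergodically enters the window, the linearized correction cancels the unstable direction and the residual error shrinks quadratically; the multiparameter scheme of the paper generalizes this by distributing the correction over several actuation parameters.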

  9. A statistical approach to nuclear fuel design and performance

    Science.gov (United States)

    Cunning, Travis Andrew

As CANDU fuel failures can have significant economic and operational consequences for the Canadian nuclear power industry, it is essential that factors impacting fuel performance are adequately understood. Current industrial practice relies on deterministic safety analysis and the highly conservative "limit of operating envelope" approach, where all parameters are assumed to be at their limits simultaneously. This results in a conservative prediction of event consequences, with little consideration given to the high quality and precision of current manufacturing processes. This study employs a novel approach to the prediction of CANDU fuel reliability. Probability distributions are fitted to actual fuel manufacturing datasets provided by Cameco Fuel Manufacturing, Inc. They are used to form input for two industry-standard fuel performance codes: ELESTRES for the steady-state case and ELOCA for the transient case, a hypothesized 80% reactor outlet header break loss of coolant accident. Using a Monte Carlo technique for input generation, 10⁵ independent trials are conducted and probability distributions are fitted to key model output quantities. Comparing model output against recognized industrial acceptance criteria, no fuel failures are predicted for either case. Output distributions are well removed from failure limit values, implying that margin exists in current fuel manufacturing and design. To validate the results and attempt to reduce the simulation burden of the methodology, two dimension-reduction methods are assessed. Using just 36 trials, both methods are able to produce output distributions that agree strongly with those obtained via the brute-force Monte Carlo method, often to a relative discrepancy of less than 0.3% when predicting the first statistical moment, and a relative discrepancy of less than 5% when predicting the second statistical moment. In terms of global sensitivity, pellet density proves to have the greatest impact on fuel performance.
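The moment-fitting logic of the record above can be sketched as a toy Monte Carlo propagation. Everything here is illustrative: the parameter distributions are invented (not the Cameco datasets) and `toy_response` is a linear stand-in for the ELESTRES/ELOCA performance codes.

```python
import random
import statistics

random.seed(0)

# Invented manufacturing-parameter distributions (NOT the Cameco data):
# pellet density [g/cm^3] and cladding thickness [mm].
N = 10_000
density = [random.gauss(10.6, 0.05) for _ in range(N)]
clad = [random.gauss(0.42, 0.01) for _ in range(N)]

def toy_response(rho, t):
    """Stand-in for a fuel performance code output (NOT ELESTRES/ELOCA)."""
    return 1000.0 + 80.0 * (rho - 10.6) - 50.0 * (t - 0.42)

# Propagate the sampled inputs and fit the first two output moments,
# as done for the key model output quantities in the study.
out = [toy_response(r, t) for r, t in zip(density, clad)]
mean, std = statistics.mean(out), statistics.stdev(out)
```

The output moments can then be compared against an acceptance limit, mirroring the study's check that the output distributions sit well clear of failure values.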

  10. A Production Approach to Performance of Banks with Microfinance Operations

    OpenAIRE

    Emilyn Cabanda; Eleanor C. Domingo

    2014-01-01

Banking institutions, nowadays, serve as intermediaries of funds to a variety of clients, including micro-entrepreneurs. This study analyzes and measures the performance of rural and thrift banks with microfinance operations in the Philippines, using combined measures of data envelopment analysis and traditional financial performance indicators. The data envelopment analysis (DEA) method is employed to measure the productive efficiency of these banks under the production approach. The variable...
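In the single-input, single-output special case, CCR-type DEA efficiency under the production approach reduces to scaling each bank's output/input ratio by the best observed ratio. A minimal sketch with invented figures (the study used multiple inputs and outputs and full DEA):

```python
# Hypothetical per-bank figures (millions): output = total loans granted,
# input = operating expenses. NOT the Philippine dataset.
banks = {"Bank A": (120.0, 30.0), "Bank B": (150.0, 50.0),
         "Bank C": (90.0, 20.0), "Bank D": (200.0, 80.0)}

# Output/input ratio per bank; the frontier is the best observed ratio.
ratios = {b: out / inp for b, (out, inp) in banks.items()}
best = max(ratios.values())

# Efficiency score in [0, 1]: 1 means the bank lies on the frontier.
efficiency = {b: r / best for b, r in ratios.items()}
```

With several inputs and outputs the same idea requires solving one linear program per bank, which is what DEA software does.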

  11. Review of scenario selection approaches for performance assessment of high-level waste repositories and related issues

    International Nuclear Information System (INIS)

    Banano, E.J.; Baca, R.G.

    1995-08-01

The selection of scenarios representing plausible realizations of the future conditions, with associated probabilities of occurrence, that can affect the long-term performance of a high-level radioactive waste (HLW) repository is the commonly used method for treating the uncertainty in the prediction of the future states of the system. This method, conventionally referred to as the ''scenario approach,'' while common, is not the only method to deal with this uncertainty; other methods, such as the environmental simulation approach (ESA), have also been proposed. Two of the difficulties with the scenario approach are the lack of uniqueness in the definition of the term ''scenario'' and the lack of uniqueness in the approach to formulating scenarios, which relies considerably on subjective judgments. Consequently, it is difficult to assure that a complete and unique set of scenarios can be defined for use in a performance assessment. Because scenarios are key to the determination of the long-term performance of the repository system, this lack of uniqueness can present a considerable challenge when attempting to reconcile the sets of scenarios, and their level of detail, obtained using different approaches, particularly among proponents and regulators of a HLW repository.

  13. The use of mixed-methods research to diagnose the organisational performance of a local government

    Directory of Open Access Journals (Sweden)

    Benjamin H. Olivier

    2017-07-01

Orientation: The majority of local governments in South Africa are underperforming; a first step to improve their performance is to accurately diagnose their current functioning. The utilisation of a mixed-methods approach for this diagnosis, based on a valid model of organisational performance, will provide a better, holistic understanding of how a local government is performing. Research purpose: The aim of this study is to investigate the utility of mixed-methods research as a diagnostic approach for determining the organisational performance of a local government in South Africa. Motivation for the study: The use of either quantitative or qualitative data gathering in isolation as part of an organisational diagnosis can lead to biased information and failure to identify the root causes of problems. The use of mixed-methods research, in which both quantitative and qualitative data gathering methods are utilised, has been shown to produce numerous benefits, such as confirmation of gathered data, providing richer detail and initiating new lines of thinking. Such multiple methodologies are recognised as an essential component of any organisational diagnosis and can be an effective means of eliminating biases in singular data gathering methods. Research design, approach and method: A concurrent transformative mixed-methods strategy based on the Burke–Litwin model of organisational performance, with triangulation of results and findings to determine convergence validity, was used. A convenience sample of 116 (N = 203) permanent officials in a rural district municipality in South Africa completed a survey questionnaire and were also individually interviewed. Main findings: Results indicate that mixed-methods research is a valid technique for establishing the integrity of survey data and for providing a better and holistic understanding of the functioning of an organisation. The results also indicate that the Burke–Litwin model is a useful and valid

  14. Analytical method development of nifedipine and its degradants binary mixture using high performance liquid chromatography through a quality by design approach

    Science.gov (United States)

    Choiri, S.; Ainurofiq, A.; Ratri, R.; Zulmi, M. U.

    2018-03-01

Nifedipine (NIF) is a photo-labile drug that degrades easily when exposed to sunlight. This research aimed to develop an analytical method using high-performance liquid chromatography, implementing a quality by design approach to obtain an effective, efficient, and validated analytical method for NIF and its degradants. A 2² full factorial design with a curvature check at a center point was applied to optimize the analytical conditions for NIF and its degradants. Mobile phase composition (MPC) and flow rate (FR) were the factors assessed against the system suitability parameters. The selected condition was validated by cross-validation using a leave-one-out technique. Alteration of MPC significantly affected retention time. Furthermore, an increase in FR reduced the tailing factor. In addition, the interaction of both factors increased the theoretical plates and the resolution of NIF and its degradants. The selected analytical condition for NIF and its degradants was validated over the range 1–16 µg/mL, showing good linearity, precision, and accuracy, and was efficient, with an analysis time within 10 min.
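A 2² full factorial with a centre point can be fitted directly by least squares. The coded design below follows the standard layout; the retention-time responses are invented for illustration, not data from the study.

```python
import numpy as np

# Coded 2^2 full factorial (levels -1/+1) plus a centre point (0, 0) for
# mobile phase composition (MPC) and flow rate (FR); responses are
# invented retention times [min].
mpc = np.array([-1.0, 1.0, -1.0, 1.0, 0.0])
fr = np.array([-1.0, -1.0, 1.0, 1.0, 0.0])
y = np.array([9.8, 7.1, 8.9, 6.0, 7.9])

# Fit y = b0 + b1*MPC + b2*FR + b12*MPC*FR by least squares; the
# interaction coefficient b12 captures the MPC x FR effect.
A = np.column_stack([np.ones_like(mpc), mpc, fr, mpc * fr])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# Curvature check: centre-point response vs. mean of the factorial runs.
curvature = y[4] - y[:4].mean()
```

A curvature value near zero (as here) suggests a first-order model with interaction is adequate over the studied region.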

  15. Fuzzy Logic Approach to Diagnosis of Feedwater Heater Performance Degradation

    International Nuclear Information System (INIS)

    Kang, Yeon Kwan; Kim, Hyeon Min; Heo, Gyun Young; Sang, Seok Yoon

    2014-01-01

Since failure, damage, and performance degradation of power generation components operating under the harsh environment of high pressure and high temperature may cause both economic and human loss at power plants, highly reliable operation and control of these components are necessary. Therefore, a systematic method of diagnosing the condition of these components in the early stages is required. There has been much research related to the diagnosis of these components, but our group developed an approach using a regression model and a diagnosis table, specializing in diagnosis of thermal efficiency degradation of power plants. However, there was difficulty in applying the regression-model method to power plants with different operating conditions because the model was sensitive to the input values. In the case of the method using a diagnosis table, it was difficult to find the level at which each performance degradation factor affected the components. Therefore, fuzzy logic was introduced in order to diagnose performance degradation using both qualitative and quantitative results obtained from the components' operation data. The model assesses performance degradation using various degradation variables according to input rules constructed on the basis of fuzzy logic. The purpose of the model is to help the operator diagnose performance degradation of power plant components. This paper analyzes a power plant feedwater heater using fuzzy logic. The feedwater heater is one of the core components governing the life cycle of a power plant, and its performance degradation has a direct effect on power generation efficiency. Performance degradation of a feedwater heater is not easy to observe; on the other hand, troubles such as tube leakage may bring simultaneous damage to the tube bundle, making it an object of economic concern. This study explains the process of diagnosing and verifying typical

  17. A general method for assessing brain-computer interface performance and its limitations

    Science.gov (United States)

    Hill, N. Jeremy; Häuser, Ann-Katrin; Schalk, Gerwin

    2014-04-01

Objective. When researchers evaluate brain-computer interface (BCI) systems, we want quantitative answers to questions such as: How good is the system’s performance? How good does it need to be? and: Is it capable of reaching the desired level in future? In response to the current lack of objective, quantitative, study-independent approaches, we introduce methods that help to address such questions. We identified three challenges: (I) the need for efficient measurement techniques that adapt rapidly and reliably to capture a wide range of performance levels; (II) the need to express results in a way that allows comparison between similar but non-identical tasks; (III) the need to measure the extent to which certain components of a BCI system (e.g. the signal processing pipeline) not only support BCI performance, but also potentially restrict the maximum level it can reach. Approach. For challenge (I), we developed an automatic staircase method that adjusted task difficulty adaptively along a single abstract axis. For challenge (II), we used the rate of information gain between two Bernoulli distributions: one reflecting the observed success rate, the other reflecting chance performance estimated by a matched random-walk method. This measure includes Wolpaw’s information transfer rate as a special case, but addresses the latter’s limitations including its restriction to item-selection tasks. To validate our approach and address challenge (III), we compared four healthy subjects’ performance using an EEG-based BCI, a ‘Direct Controller’ (a high-performance hardware input device), and a ‘Pseudo-BCI Controller’ (the same input device, but with control signals processed by the BCI signal processing pipeline). Main results. Our results confirm the repeatability and validity of our measures, and indicate that our BCI signal processing pipeline reduced attainable performance by about 33% (21 bits min⁻¹). Significance. Our approach provides a flexible basis
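Wolpaw's information transfer rate, the special case mentioned in the abstract, is straightforward to compute. The formula is standard in the BCI literature; the 4-choice task and accuracy figures below are arbitrary.

```python
from math import log2

def wolpaw_bits_per_trial(n_choices, p_correct):
    """Wolpaw information transfer rate per selection (bits/trial):
    log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))."""
    p, n = p_correct, n_choices
    if p >= 1.0:
        return log2(n)
    if p <= 0.0:
        return 0.0
    return log2(n) + p * log2(p) + (1 - p) * log2((1 - p) / (n - 1))

# Example: a 4-choice task at 90% accuracy, 10 selections per minute.
bits = wolpaw_bits_per_trial(4, 0.90)
itr_per_min = bits * 10
```

The paper's rate-of-information-gain measure generalises this by comparing the observed Bernoulli success distribution to an empirically matched chance distribution, lifting the restriction to item-selection tasks.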

  18. Comparison of two methods to determine fan performance curves using computational fluid dynamics

    Science.gov (United States)

    Onma, Patinya; Chantrasmi, Tonkid

    2018-01-01

This work investigates a systematic numerical approach that employs Computational Fluid Dynamics (CFD) to obtain performance curves of a backward-curved centrifugal fan. Generating the performance curves requires a number of three-dimensional simulations with varying system loads at a fixed rotational speed. Two methods were used and their results compared to experimental data. The first method incrementally changes the mass flow rate through the inlet boundary condition, while the second method utilizes a series of meshes representing the physical damper blade at various angles. The performance curves generated by both methods are compared with data from an experimental setup in accordance with the AMCA fan performance testing standard.

  19. A hybrid approach for efficient anomaly detection using metaheuristic methods

    Directory of Open Access Journals (Sweden)

    Tamer F. Ghanem

    2015-07-01

Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated based on the multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative selection-based detector generation. The evaluation of this approach is performed using the NSL-KDD dataset, which is a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1% compared to other competitors among machine learning algorithms.
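The negative-selection idea the record draws on can be shown with a toy detector generator: random candidates are kept only if they match no normal ("self") sample, and anomalies are then points matched by some detector. This is plain random generation, not the paper's multi-start metaheuristic or genetic algorithm, and the 2-D self samples and matching radius are invented.

```python
import random

random.seed(1)

# Invented normal ("self") operating points in a 2-D unit feature space.
self_samples = [(0.2, 0.3), (0.25, 0.35), (0.3, 0.3), (0.22, 0.28)]
RADIUS = 0.15

def matches(a, b):
    """Euclidean match rule: within RADIUS counts as a match."""
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5 < RADIUS

# Negative selection: keep random candidates that match no self sample.
detectors = []
while len(detectors) < 20:
    cand = (random.random(), random.random())
    if not any(matches(cand, s) for s in self_samples):
        detectors.append(cand)

def is_anomaly(point):
    return any(matches(d, point) for d in detectors)
```

Metaheuristics enter when the random generation above is replaced by a guided search that spreads detectors to cover the non-self space with fewer, better-placed detectors.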

  20. Using hybrid method to evaluate the green performance in uncertainty.

    Science.gov (United States)

    Tseng, Ming-Lang; Lan, Lawrence W; Wang, Ray; Chiu, Anthony; Cheng, Hui-Ping

    2011-04-01

Green performance measurement is vital for enterprises in making continuous improvements to maintain sustainable competitive advantages. Evaluation of green performance, however, is a challenging task due to the complexity of dependence among the aspects and criteria, and the linguistic vagueness of some qualitative information together with quantitative data. To deal with this issue, this study proposes a novel approach to evaluate the dependent aspects and criteria of a firm's green performance. The rationale of the proposed approach, namely the green network balanced scorecard, is to use the balanced scorecard to combine fuzzy set theory with the analytical network process (ANP) and importance-performance analysis (IPA) methods, wherein fuzzy set theory accounts for the linguistic vagueness of qualitative criteria and ANP converts the relations among the dependent aspects and criteria into an intelligible structural model used in IPA. For the empirical case study, four dependent aspects and 34 green performance criteria for PCB firms in Taiwan were evaluated. The managerial implications are discussed.
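The fuzzy-linguistic step feeding IPA can be sketched as centroid defuzzification of triangular fuzzy ratings followed by quadrant classification. Criteria names, ratings, and cut-offs below are invented, and the ANP dependence-weighting step is omitted.

```python
# Triangular fuzzy ratings (l, m, u) on a 0-1 scale for each green
# criterion's importance and performance -- names and numbers invented.
ratings = {
    "green design": {"importance": (0.7, 0.9, 1.0), "performance": (0.3, 0.5, 0.7)},
    "green purchasing": {"importance": (0.5, 0.7, 0.9), "performance": (0.7, 0.9, 1.0)},
    "waste reduction": {"importance": (0.3, 0.5, 0.7), "performance": (0.3, 0.5, 0.7)},
}

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    l, m, u = tfn
    return (l + m + u) / 3.0

def ipa_quadrant(imp, perf, imp_cut=0.6, perf_cut=0.6):
    """Classic importance-performance analysis quadrants."""
    if imp >= imp_cut and perf < perf_cut:
        return "concentrate here"
    if imp >= imp_cut:
        return "keep up the good work"
    if perf < perf_cut:
        return "low priority"
    return "possible overkill"

quadrants = {
    name: ipa_quadrant(defuzzify(r["importance"]), defuzzify(r["performance"]))
    for name, r in ratings.items()
}
```

In the full method, ANP-derived weights would adjust the cut-offs and account for interdependence among criteria before the quadrant reading.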

  1. A Simulation Approach for Performance Validation during Embedded Systems Design

    Science.gov (United States)

    Wang, Zhonglei; Haberl, Wolfgang; Herkersdorf, Andreas; Wechs, Martin

    Due to the time-to-market pressure, it is highly desirable to design hardware and software of embedded systems in parallel. However, hardware and software are developed mostly using very different methods, so that performance evaluation and validation of the whole system is not an easy task. In this paper, we propose a simulation approach to bridge the gap between model-driven software development and simulation based hardware design, by merging hardware and software models into a SystemC based simulation environment. An automated procedure has been established to generate software simulation models from formal models, while the hardware design is originally modeled in SystemC. As the simulation models are annotated with timing information, performance issues are tackled in the same pass as system functionality, rather than in a dedicated approach.

  2. Business Intelligence Approach In A Business Performance Context

    OpenAIRE

    Muntean, Mihaela; Cabau, Liviu Gabriel

    2011-01-01

Subordinated to performance management, Business Intelligence approaches help firms to optimize business performance. Key performance indicators are added to the multidimensional model grounding the performance perspectives. With respect to the Business Intelligence value chain, a theoretical approach is introduced and a practical example for the customer perspective, based on Microsoft SQL Server specific services, is implemented.

  3. Performance analysis, quality function deployment and structured methods

    Science.gov (United States)

    Maier, M. W.

Quality function deployment (QFD), an approach to synthesizing several elements of system modeling and design into a single unit, is presented. Behavioral, physical, and performance modeling are usually considered as separate aspects of system design without explicit linkages. Structured methodologies have developed linkages between behavioral and physical models before, but have not considered the integration of performance models. QFD integrates performance models with traditional structured models. In this method, performance requirements such as cost, weight, and detection range are partitioned into matrices. Partitioning is done by developing a performance model, preferably quantitative, for each requirement. The parameters of the model become the engineering objectives in a QFD analysis and the models are embedded in a spreadsheet version of the traditional QFD matrices. The performance model and its parameters are used to derive part of the functional model by recognizing that a given performance model implies some structure to the functionality of the system.

  4. HUMAN ERROR QUANTIFICATION USING PERFORMANCE SHAPING FACTORS IN THE SPAR-H METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Harold S. Blackman; David I. Gertman; Ronald L. Boring

    2008-09-01

    This paper describes a cognitively based human reliability analysis (HRA) quantification technique for estimating the human error probabilities (HEPs) associated with operator and crew actions at nuclear power plants. The method described here, Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) method, was developed to aid in characterizing and quantifying human performance at nuclear power plants. The intent was to develop a defensible method that would consider all factors that may influence performance. In the SPAR-H approach, calculation of HEP rates is especially straightforward, starting with pre-defined nominal error rates for cognitive vs. action-oriented tasks, and incorporating performance shaping factor multipliers upon those nominal error rates.
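A minimal sketch of the calculation the record describes: a nominal human error probability (0.01 for diagnosis tasks in SPAR-H) multiplied by performance shaping factor (PSF) multipliers, with the method's adjustment factor applied when three or more negative PSFs compound. The multiplier values chosen here are arbitrary illustrations.

```python
def spar_h_hep(nominal, psfs):
    """SPAR-H-style HEP: nominal error rate scaled by PSF multipliers.
    When three or more negative PSFs (multiplier > 1) are assigned, the
    SPAR-H adjustment factor keeps the result below 1."""
    composite = 1.0
    for m in psfs:
        composite *= m
    if sum(1 for m in psfs if m > 1) >= 3:
        return nominal * composite / (nominal * (composite - 1.0) + 1.0)
    return min(nominal * composite, 1.0)

# Diagnosis task (SPAR-H nominal 1e-2) under two negative PSFs,
# e.g. high stress (x2) and poor ergonomics/HSI (x10):
hep = spar_h_hep(0.01, [2.0, 10.0])
```

The adjustment branch reproduces the published SPAR-H composite-PSF formula; nominal rates and multiplier anchors should be taken from the SPAR-H worksheets, not from this sketch.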

  5. Approaches to Mixed Methods Dissemination and Implementation Research: Methods, Strengths, Caveats, and Opportunities.

    Science.gov (United States)

    Green, Carla A; Duan, Naihua; Gibbons, Robert D; Hoagwood, Kimberly E; Palinkas, Lawrence A; Wisdom, Jennifer P

    2015-09-01

    Limited translation of research into practice has prompted study of diffusion and implementation, and development of effective methods of encouraging adoption, dissemination and implementation. Mixed methods techniques offer approaches for assessing and addressing processes affecting implementation of evidence-based interventions. We describe common mixed methods approaches used in dissemination and implementation research, discuss strengths and limitations of mixed methods approaches to data collection, and suggest promising methods not yet widely used in implementation research. We review qualitative, quantitative, and hybrid approaches to mixed methods dissemination and implementation studies, and describe methods for integrating multiple methods to increase depth of understanding while improving reliability and validity of findings.

  6. Compression-RSA: New approach of encryption and decryption method

    Science.gov (United States)

    Hung, Chang Ee; Mandangan, Arif

    2013-04-01

The Rivest-Shamir-Adleman (RSA) cryptosystem is a well-known asymmetric cryptosystem that has been applied in a very wide range of areas. Much research with different approaches has been carried out in order to improve the security and performance of the RSA cryptosystem. The enhancement of the performance of the RSA cryptosystem is our main interest. In this paper, we propose a new method to increase the efficiency of RSA by reducing the length of the plaintext before it undergoes the encryption process, without affecting the original content of the plaintext. The concept of simple continued fractions and a new special relationship between them and the Euclidean algorithm are applied in this newly proposed method. By reducing the amount of plaintext and ciphertext, the encryption and decryption of a secret message can be accelerated.
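The continued fraction/Euclidean algorithm relationship the record mentions is simple to state: the quotients produced while computing gcd(a, b) are exactly the continued-fraction coefficients of a/b. (The paper's actual compression scheme is not specified here; this only illustrates the underlying link.)

```python
def continued_fraction(a, b):
    """Continued-fraction coefficients of a/b: the successive quotients
    from the Euclidean algorithm applied to (a, b)."""
    coeffs = []
    while b:
        q, r = divmod(a, b)
        coeffs.append(q)
        a, b = b, r
    return coeffs

def from_coeffs(coeffs):
    """Rebuild the rational number from its expansion (back-substitution)."""
    num, den = coeffs[-1], 1
    for q in reversed(coeffs[:-1]):
        num, den = q * num + den, num
    return num, den

cf = continued_fraction(649, 200)   # 649/200 expands to [3, 4, 12, 4]
```

Because the expansion is lossless and invertible, a message encoded as a rational number can be exchanged with its (often shorter) coefficient list and recovered exactly.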

  7. Corporate Social Responsibility and Financial Performance: A Two Least Regression Approach

    Directory of Open Access Journals (Sweden)

    Alexander Olawumi Dabor

    2017-12-01

The objective of this study is to investigate the causality between corporate social responsibility and firm financial performance. The study employed a two-stage least squares regression approach. Fifty-two firms were selected using the scientific method. The findings revealed that corporate social responsibility and firm performance in the manufacturing sector are mutually related at the 5% significance level. The study recommended that management of manufacturing companies in Nigeria should invest in CSR to boost profitability and corporate image.

  8. Information operator approach and iterative regularization methods for atmospheric remote sensing

    International Nuclear Information System (INIS)

    Doicu, A.; Hilgers, S.; Bargen, A. von; Rozanov, A.; Eichmann, K.-U.; Savigny, C. von; Burrows, J.P.

    2007-01-01

In this study, we present the main features of the information operator approach for solving linear inverse problems arising in atmospheric remote sensing. This method is superior to the stochastic version of Tikhonov regularization (or the optimal estimation method) due to its capability to filter out the noise-dominated components of the solution generated by an inappropriate choice of the regularization parameter. We extend this approach to iterative methods for nonlinear ill-posed problems and derive the truncated versions of the Gauss-Newton and Levenberg-Marquardt methods. Although the paper mostly focuses on discussing the mathematical details of the inverse method, retrieval results are provided which exemplify the performance of the methods. These results correspond to NO₂ retrieval from SCIAMACHY limb scatter measurements and were obtained using the retrieval processors developed at the German Aerospace Center Oberpfaffenhofen and the Institute of Environmental Physics of the University of Bremen.
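Truncated SVD is the simplest way to see what "filtering out noise-dominated components" means for an ill-posed linear retrieval. The sketch below is generic, not the information operator approach or the SCIAMACHY processors themselves; the operator and noise level are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# Ill-conditioned forward operator K = U diag(s) V^T with rapidly
# decaying singular values, mimicking a smoothing measurement kernel.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 10.0 ** -np.arange(n)
K = U @ np.diag(s) @ V.T

x_true = rng.standard_normal(n)
y = K @ x_true + 1e-4 * rng.standard_normal(n)   # noisy measurement

def tsvd_solve(K, y, k):
    """Regularized inverse: keep only the k largest singular components,
    discarding the noise-dominated ones."""
    Uk, sk, Vtk = np.linalg.svd(K)
    coeff = (Uk.T @ y)[:k] / sk[:k]
    return Vtk[:k].T @ coeff

x_reg = tsvd_solve(K, y, k=4)
x_naive = np.linalg.solve(K, y)   # unregularized: noise is amplified
```

The truncation index plays the role the regularization parameter plays in Tikhonov/optimal estimation; the information operator approach selects the retained components in a more principled, data-driven way.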

  9. Adjusted permutation method for multiple attribute decision making with meta-heuristic solution approaches

    Directory of Open Access Journals (Sweden)

    Hossein Karimi

    2011-04-01

The permutation method of multiple attribute decision making has two significant deficiencies: high computational time and wrong priority output in some problem instances. In this paper, a novel permutation method called the adjusted permutation method (APM) is proposed to compensate for the deficiencies of the conventional permutation method. We propose Tabu search (TS) and particle swarm optimization (PSO) to find suitable solutions at a reasonable computational time for large problem instances. The proposed method is examined using some numerical examples to evaluate its performance. The preliminary results show that both approaches provide competent solutions in relatively reasonable amounts of time, while TS performs better in solving APM.
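The conventional permutation method the record improves on can be sketched by scoring every candidate ranking of the alternatives with a net concordance over weighted criteria; the factorial enumeration below is exactly the computational burden that motivates metaheuristics for larger instances. Decision matrix and weights are invented, and ties are counted as concordant for simplicity.

```python
from itertools import permutations

# Decision matrix: rows = alternatives, columns = benefit criteria,
# plus criterion weights -- all values invented.
scores = {
    "A": [7, 9, 4],
    "B": [8, 5, 6],
    "C": [6, 7, 8],
}
weights = [0.5, 0.3, 0.2]

def net_concordance(order):
    """Score one candidate ranking: for every ordered pair (i before j),
    add the weights of criteria agreeing with that order and subtract
    the weights of criteria contradicting it."""
    total = 0.0
    for idx, i in enumerate(order):
        for j in order[idx + 1:]:
            for w, si, sj in zip(weights, scores[i], scores[j]):
                total += w if si >= sj else -w
    return total

# Brute force over all m! permutations -- the source of the method's
# high computational time for many alternatives.
best = max(permutations(scores), key=net_concordance)
```

With m alternatives there are m! rankings to score, which is why TS and PSO are proposed to search this space instead of enumerating it.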

  10. Performance assessment plans and methods for the Salt Repository Project

    International Nuclear Information System (INIS)

    1984-08-01

    This document presents the preliminary plans and anticipated methods of the Salt Repository Project (SRP) for assessing the postclosure and radiological aspects of preclosure performance of a nuclear waste repository in salt. This plan is intended to be revised on an annual basis. The emphasis in this preliminary effort is on the method of conceptually dividing the system into three subsystems (the very near field, the near field, and the far field) and applying models to analyze the behavior of each subsystem and its individual components. The next revision will contain more detailed plans being developed as part of Site Characterization Plan (SCP) activities. After a brief system description, this plan presents the performance targets which have been established for nuclear waste repositories by regulatory agencies (Chapter 3). The SRP approach to modeling, including sensitivity and uncertainty techniques is then presented (Chapter 4). This is followed by a discussion of scenario analysis (Chapter 5), a presentation of preliminary data needs as anticipated by the SRP (Chapter 6), and a presentation of the SRP approach to postclosure assessment of the very near field, the near field, and the far field (Chapters 7, 8, and 9, respectively). Preclosure radiological assessment is discussed in Chapter 10. Chapter 11 presents the SRP approach to code verification and validation. Finally, the Appendix lists all computer codes anticipated for use in performance assessments. The list of codes will be updated as plans are revised

  11. Methods to stimulate national and sub-national benchmarking through international health system performance comparisons: a Canadian approach.

    Science.gov (United States)

    Veillard, Jeremy; Moses McKeag, Alexandra; Tipper, Brenda; Krylova, Olga; Reason, Ben

    2013-09-01

This paper presents, discusses and evaluates methods used by the Canadian Institute for Health Information to present international comparisons of health system performance in ways that facilitate their understanding by the public and health system policy-makers and can stimulate performance benchmarking. We used statistical techniques to normalize the results and present them on a standardized scale facilitating understanding of results. We compared results to the OECD average and to benchmarks. We also applied various data quality rules to ensure the validity of results. In order to evaluate the impact of the public release of these results, we used quantitative and qualitative methods and documented other types of impact. We were able to present results for performance indicators and dimensions at national and sub-national levels; develop performance profiles for each Canadian province; and show pan-Canadian performance patterns for specific performance indicators. The results attracted significant media attention at the national level and reactions from various stakeholders. Other impacts, such as requests for additional analysis and improvement in data timeliness, were observed. The methods used seemed attractive to various audiences in the Canadian context and achieved the objectives originally defined. These methods could be refined and applied in different contexts. Copyright © 2013 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
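The normalization to a standardized scale can be sketched as z-scoring each indicator against the OECD distribution, with the sign flipped for "lower is better" indicators so that above zero always reads as better than average. The values are invented, not CIHI data.

```python
import statistics

# Invented country values for one "lower is better" indicator
# (e.g. a 30-day mortality rate per 100 admissions).
oecd = {"CAN": 6.7, "USA": 5.5, "GBR": 7.6, "FRA": 7.2, "DEU": 8.7, "JPN": 12.2}

mean = statistics.mean(oecd.values())
sd = statistics.stdev(oecd.values())

# Standardize against the OECD distribution; the leading minus flips the
# sign for "lower is better" so positive always means above average.
standardized = {c: -(v - mean) / sd for c, v in oecd.items()}
```

Because every indicator lands on the same zero-centred scale, results from different indicators can be shown side by side in a single performance profile.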

  12. Approach to performance based regulation development

    International Nuclear Information System (INIS)

    Spogen, L.R.; Cleland, L.L.

    1977-06-01

An approach to the development of performance based regulations (PBRs) is described. Initially, a framework is constructed that consists of a function hierarchy and associated measures. The function at the top of the hierarchy is described in terms of societal objectives. Decomposition of this function into subordinate functions and their subsequent decompositions yield the function hierarchy. ''Bottom'' functions describe the roles of system components. When measures are identified for the performance of each function and means of aggregating performances to higher levels are established, the framework may be employed for developing PBRs. Considerations of system flexibility and performance uncertainty guide the determination of the hierarchical level at which regulations are formulated. Ease of testing compliance is also a factor. To show the viability of the approach, the framework developed by Lawrence Livermore Laboratory for the Nuclear Regulatory Commission for evaluation of material control systems at fixed facilities is presented.

  13. Technical methods for a risk-informed, performance-based fire protection program at nuclear power plants

    International Nuclear Information System (INIS)

    Dey, M.K.

    1998-01-01

    This paper presents a technical review and examination of technical methods that are available for developing a risk-informed, performance-based fire protection program at a nuclear plant. The technical methods include ''engineering tools'' for examining the fire dynamics of fire protection problems, reliability techniques for establishing an optimal fire protection surveillance program, fire computer codes for analyzing important fire protection safety parameters, and risk-informed approaches that can range from drawing qualitative insights from risk information to quantifying the risk impact of alternative fire protection approaches. Based on this technical review and examination, it is concluded that methods for modeling fires, and reliability and fire PRA analyses are currently available to support the initial implementation of simple risk-informed, performance-based approaches in fire protection programs. (author)
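One of the reliability techniques mentioned, optimizing a surveillance program, rests on the standard result for the average unavailability of a periodically tested component, which for small λT approaches λT/2. The failure rate and test interval below are arbitrary illustrations, not values from the paper.

```python
from math import exp

def mean_unavailability(lam, T):
    """Average unavailability of a periodically tested component with
    constant failure rate lam [1/h] and surveillance interval T [h]:
    q = 1 - (1 - exp(-lam*T)) / (lam*T)."""
    return 1.0 - (1.0 - exp(-lam * T)) / (lam * T)

# Invented numbers: a detector with lam = 1e-5/h tested monthly (720 h).
q = mean_unavailability(1e-5, 720.0)   # close to lam*T/2
```

Trading this unavailability against the cost and wear of more frequent testing is what an optimal surveillance interval balances.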

  14. Technical methods for a risk-informed, performance-based fire protection program at nuclear power plants

    International Nuclear Information System (INIS)

    Dey, M.K.

    2000-01-01

    This paper presents a technical review and examination of technical methods that are available for developing a risk-informed, performance-based fire protection program at a nuclear plant. The technical methods include 'engineering tools' for examining the fire dynamics of fire protection problems, reliability techniques for establishing an optimal fire protection surveillance program, fire computer codes for analyzing important fire protection safety parameters, and risk-informed approaches that can range from drawing qualitative insights from risk information to quantifying the risk impact of alternative fire protection approaches. Based on this technical review and examination, it is concluded that methods for modeling fires, and reliability and fire probabilistic risk analyses (PRA), are currently available to support the initial implementation of simple risk-informed, performance-based approaches in fire protection programs. (orig.)

  15. A Shot Number Based Approach to Performance Analysis in Table Tennis

    Directory of Open Access Journals (Sweden)

    Tamaki Sho

    2017-01-01

    Full Text Available The current study proposes a novel approach that improves the conventional performance analysis in table tennis by introducing the concept of frequency, or the number of shots, of each shot number. The improvements over the conventional method are as follows: better accuracy in evaluating the skills and tactics of players, additional insights into scoring and returning skills, and ease of understanding the results with a single criterion. A performance analysis of matches played at the 2012 Summer Olympics in London was conducted using the proposed method. The results showed some effects of shot number and gender differences in table tennis. Furthermore, comparisons were made between Chinese players and players from other countries, which shed light on the skills and tactics of the Chinese players. The present findings demonstrate that the proposed method provides useful information and has some advantages over the conventional method.
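
    The frequency-weighted aggregation this abstract describes can be sketched as follows. All numbers below are hypothetical, not data from the study: a single criterion is formed by weighting each shot number's scoring rate by its relative frequency.

```python
def weighted_score_rate(freq, score_rate):
    """Aggregate per-shot-number scoring rates into a single criterion,
    weighting each shot number by its relative frequency."""
    total = sum(freq.values())
    return sum(freq[k] * score_rate[k] for k in freq) / total

# hypothetical data: shot number -> rallies ending there, and the
# probability that the rally-ending player won the point
freq = {3: 120, 4: 90, 5: 60, 6: 30}
score_rate = {3: 0.55, 4: 0.48, 5: 0.52, 6: 0.45}

overall = weighted_score_rate(freq, score_rate)
```

    The single weighted rate makes two players comparable even when their rallies end at different shot numbers.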

  16. Evaluation of micronozzle performance through DSMC, Navier-Stokes and coupled DSMC/Navier-Stokes approaches

    NARCIS (Netherlands)

    Torre, F. la; Kenjeres, S.; Kleijn, C.R.; Moerel, J.L.P.A.

    2009-01-01

    Both the particle based Direct Simulation Monte Carlo (DSMC) method and a compressible Navier-Stokes based continuum method are used to investigate the flow inside micronozzles and to predict the performance of such devices. For the Navier-Stokes approach, both slip and no-slip boundary conditions

  17. Experimental Study Comparing a Traditional Approach to Performance Appraisal Training to a Whole-Brain Training Method at C.B. Fleet Laboratories

    Science.gov (United States)

    Selden, Sally; Sherrier, Tom; Wooters, Robert

    2012-01-01

    The purpose of this study is to examine the effects of a new approach to performance appraisal training. Motivated by split-brain theory and existing studies of cognitive information processing and performance appraisals, this exploratory study examined the effects of a whole-brain approach to training managers for implementing performance…

  18. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal design parameter combination based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory and optimization, is the most commonly used approach to minimize the structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
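
    The reliability calculation underlying RBDO can be sketched as follows. This is a generic illustration with assumed parameters, not the paper's formulation: the failure probability of a limit state g = R - S with normal resistance R and load S is estimated by Monte Carlo sampling and checked against the exact value implied by the reliability index.

```python
import math
import random

def mc_failure_probability(mu_r, sd_r, mu_s, sd_s, n=200_000, seed=1):
    """Crude Monte Carlo estimate of P(R - S < 0)."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0.0
    )
    return failures / n

# hypothetical resistance R and load S (both normally distributed)
mu_r, sd_r, mu_s, sd_s = 10.0, 1.0, 7.0, 1.0

beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)      # reliability index
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2.0))  # Phi(-beta)
pf_mc = mc_failure_probability(mu_r, sd_r, mu_s, sd_s)
```

    In an RBDO loop, such a reliability evaluation sits inside the optimizer as a probabilistic constraint.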

  19. An Integrated MCDM Method in Ranking BSC Perspectives and Key Performance Indicators (KPIs)

    Directory of Open Access Journals (Sweden)

    Mohsen Alvandi

    2012-04-01

    Full Text Available The balanced scorecard (BSC) approach is an effective technique for performance evaluation. BSC can better reflect the dependence and feedback problems of each factor in real-world situations. This study aims at developing a set of appropriate key performance indicators according to the BSC approach for SAPCO using a multiple criteria decision making (MCDM) method. We derive key performance indicators from literature reviews and experts' ideas at SAPCO, which is one of the biggest vehicle spare-part suppliers in Iran. The proposed study uses the decision making trial and evaluation laboratory (DEMATEL) and the analytic network process (ANP), respectively, to measure the causal relationships between the perspectives as well as the relative weights. The results based on the ANP method show that ''Customer'' is the most influential factor, with internal process, financial, and learning and growth ranking second through fourth. The three most important key performance indicators are total price of parts, customer satisfaction, and lack of parts in production.
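
    The DEMATEL step mentioned above can be sketched in a few lines, assuming a small hypothetical direct-influence matrix (not the SAPCO data): the total relation matrix T = D(I - D)^{-1} is accumulated as the convergent series D + D^2 + D^3 + ..., from which prominence (r + c) and relation (r - c) are read off.

```python
def matmul(x, y):
    """Dense square matrix product (pure Python, for a small example)."""
    n = len(x)
    return [[sum(x[i][k] * y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def dematel_total(a, tol=1e-12, max_iter=10_000):
    """Total relation matrix T = D + D^2 + ... for the normalized matrix D."""
    n = len(a)
    s = max(sum(row) for row in a)            # normalize by the largest row sum
    d = [[v / s for v in row] for row in a]
    t = [row[:] for row in d]
    power = [row[:] for row in d]
    for _ in range(max_iter):
        power = matmul(power, d)              # next term D^k of the series
        t = [[t[i][j] + power[i][j] for j in range(n)] for i in range(n)]
        if max(abs(v) for row in power for v in row) < tol:
            break
    return t

# hypothetical direct-influence scores among three BSC perspectives
a = [[0, 3, 2],
     [1, 0, 3],
     [2, 1, 0]]

t = dematel_total(a)
r = [sum(row) for row in t]                                        # influence given
c = [sum(t[i][j] for i in range(len(t))) for j in range(len(t))]   # influence received
```

    A full ANP step would then use these interdependencies to weight the perspectives; the series form avoids an explicit matrix inverse in this small sketch.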

  20. Accounting Student's Learning Approaches And Impact On Academic Performance

    OpenAIRE

    Ismail, Suhaiza

    2009-01-01

    The objective of the study is threefold. Firstly, the study explores the learning approaches adopted by students in completing their Business Finance. Secondly, it examines the impact that learning approaches has on the student's academic performance. Finally, the study considers gender differences in the learning approaches adopted by students and in the relationship between learning approaches and academic performance. The Approaches and Study Skills Inventory for Students (ASSIST) was used...

  1. The large break LOCA evaluation method with the simplified statistic approach

    International Nuclear Information System (INIS)

    Kamata, Shinya; Kubo, Kazuo

    2004-01-01

    The USNRC published the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology for large break LOCA, which supported the revised rule for Emergency Core Cooling System performance in 1989. USNRC Regulatory Guide 1.157 requires that the peak cladding temperature (PCT) not exceed 2200°F at the 95th percentile with high probability. In recent years, overseas countries have developed statistical methodologies and best estimate codes with models that can provide more realistic simulation of the phenomena, based on the CSAU evaluation methodology. To calculate the PCT probability distribution by Monte Carlo trials, there are approaches such as the response surface technique using polynomials, the order statistics method, etc. For the purpose of performing a rational statistical analysis, Mitsubishi Heavy Industries, Ltd. (MHI) sought to develop a statistical LOCA method using the best estimate LOCA code MCOBRA/TRAC and the simplified code HOTSPOT. HOTSPOT is a Monte Carlo heat conduction solver used to evaluate the uncertainties of the significant fuel parameters at the PCT positions of the hot rod. Direct uncertainty sensitivity studies can be performed without the response surface because the Monte Carlo simulation for key parameters can be performed in a short time using HOTSPOT. With regard to the parameter uncertainties, MHI established a treatment in which bounding conditions are given for the LOCA boundary and plant initial conditions, while the Monte Carlo simulation using HOTSPOT is applied to the significant fuel parameters. The paper describes the large break LOCA evaluation method with the simplified statistical approach and the results of applying the method to a representative four-loop nuclear power plant. (author)
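
    The order statistics method mentioned above is commonly associated with the first-order Wilks formula; a minimal sketch (a generic illustration, not MHI's implementation) computes the smallest number of Monte Carlo runs n such that the largest observed PCT bounds the 95th percentile with 95% confidence, i.e. 1 - 0.95**n >= 0.95.

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n with 1 - coverage**n >= confidence (first-order Wilks)."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

n_runs = wilks_sample_size()   # the classic 95/95 criterion gives 59 runs
```

    With 59 runs, the maximum observed value is a one-sided 95/95 tolerance bound, which is why 59 appears so often in best estimate plus uncertainty LOCA analyses.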

  2. A Control Variate Method for Probabilistic Performance Assessment. Improved Estimates for Mean Performance Quantities of Interest

    Energy Technology Data Exchange (ETDEWEB)

    MacKinnon, Robert J.; Kuhlman, Kristopher L

    2016-05-01

    We present a method of control variates for calculating improved estimates for mean performance quantities of interest, E(PQI), computed from Monte Carlo probabilistic simulations. An example of a PQI is the concentration of a contaminant at a particular location in a problem domain computed from simulations of transport in porous media. To simplify the presentation, the method is described in the setting of a one-dimensional elliptical model problem involving a single uncertain parameter represented by a probability distribution. The approach can be easily implemented for more complex problems involving multiple uncertain parameters and in particular for application to probabilistic performance assessment of deep geologic nuclear waste repository systems. Numerical results indicate the method can produce estimates of E(PQI) having superior accuracy on coarser meshes and reduce the required number of simulations needed to achieve an acceptable estimate.
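
    A minimal, generic control-variate sketch (a toy integrand, not the report's repository model) illustrates the estimator: the sample mean of Y is corrected with a control X whose mean is known, using the coefficient c = -Cov(Y, X)/Var(X) estimated from the same samples.

```python
import math
import random

def cv_estimate(n=50_000, seed=7):
    """Control-variate estimate of E[exp(U)] for U ~ Uniform(0, 1).

    The control variate is U itself, whose mean 0.5 is known exactly.
    """
    rng = random.Random(seed)
    xs = [rng.random() for _ in range(n)]
    ys = [math.exp(x) for x in xs]
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    var = sum((x - mx) ** 2 for x in xs) / n
    c = -cov / var
    return my + c * (mx - 0.5)    # mean(Y) corrected by the control term

est = cv_estimate()
exact = math.e - 1.0              # true value of E[exp(U)]
```

    Because U and exp(U) are highly correlated, the corrected estimator has a far smaller variance than the plain sample mean, which is exactly the effect the report exploits to reduce the number of simulations.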

  3. Composite Measures of Health Care Provider Performance: A Description of Approaches

    Science.gov (United States)

    Shwartz, Michael; Restuccia, Joseph D; Rosen, Amy K

    2015-01-01

    Context: Since the Institute of Medicine’s 2001 report Crossing the Quality Chasm, there has been a rapid proliferation of quality measures used in quality-monitoring, provider-profiling, and pay-for-performance (P4P) programs. Although individual performance measures are useful for identifying specific processes and outcomes for improvement and tracking progress, they do not easily provide an accessible overview of performance. Composite measures aggregate individual performance measures into a summary score. By reducing the amount of data that must be processed, they facilitate (1) benchmarking of an organization’s performance, encouraging quality improvement initiatives to match performance against high-performing organizations, and (2) profiling and P4P programs based on an organization’s overall performance. Methods: We describe different approaches to creating composite measures, discuss their advantages and disadvantages, and provide examples of their use. Findings: The major issues in creating composite measures are (1) whether to aggregate measures at the patient level through all-or-none approaches or at the facility level, using one of several possible weighting schemes; (2) when combining measures on different scales, how to rescale measures (using z scores, range percentages, ranks, or 5-star categorizations); and (3) whether to use shrinkage estimators, which increase precision by smoothing rates from smaller facilities but also decrease transparency. Conclusions: Because provider rankings and rewards under P4P programs may be sensitive to both context and the data, careful analysis is warranted before deciding to implement a particular method. A better understanding of both when and where to use composite measures and the incentives created by composite measures are likely to be important areas of research as the use of composite measures grows. PMID:26626986
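
    One of the rescaling-and-weighting recipes described (z scores combined in a facility-level weighted average) can be sketched as follows; the facilities, measures, and weights below are hypothetical, not from the article.

```python
def z_scores(values):
    """Rescale one measure across facilities to zero mean, unit variance."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(v - mean) / sd for v in values]

# rows: facilities; columns: three quality measures on different scales
measures = [
    [0.92, 14.0, 78.0],
    [0.88, 11.0, 85.0],
    [0.95, 16.0, 70.0],
]
weights = [0.5, 0.2, 0.3]   # hypothetical importance weights per measure

z_cols = [z_scores(list(col)) for col in zip(*measures)]
composite = [sum(w * z_cols[j][i] for j, w in enumerate(weights))
             for i in range(len(measures))]
```

    The z-score step puts measures on a common scale before weighting, which is what makes a rate, a count, and a percentage combinable into one summary score.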

  4. A Simulation Modeling Approach Method Focused on the Refrigerated Warehouses Using Design of Experiment

    Science.gov (United States)

    Cho, G. S.

    2017-09-01

    For performance optimization of refrigerated warehouses, design parameters are selected based on physical parameters, such as the number of equipment units and aisles and the speeds of forklifts, for ease of modification. This paper provides a comprehensive framework for the system design of refrigerated warehouses. We propose a modeling approach that aims at simulation optimization to meet the required design specifications using the Design of Experiment (DOE), and we analyze the simulation model using an integrated aspect-oriented modeling approach (i-AOMA). As a result, the suggested method can evaluate the performance of a variety of refrigerated warehouse operations.
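
    The DOE side of such an approach can be sketched by enumerating a full factorial design over warehouse design parameters; the factor names and levels below are assumptions for illustration, not the paper's.

```python
from itertools import product

# hypothetical design factors and levels for a refrigerated warehouse model
factors = {
    "num_forklifts": [2, 4, 6],
    "num_aisles": [8, 12],
    "forklift_speed_mps": [1.5, 2.0],
}

names = list(factors)
design = [dict(zip(names, combo)) for combo in product(*factors.values())]
# each entry of `design` is one scenario a simulation run would evaluate
```

    A fractional design or a response-surface design would trim this grid when the number of factors grows, but the full factorial makes the idea plain.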

  5. Approaches to chronic disease management evaluation in use in Europe : A review of current methods and performance measures

    NARCIS (Netherlands)

    Conklin, A.; Nolte, E.; Vrijhoef, H.J.M.

    2013-01-01

    Objectives: An overview was produced of approaches currently used to evaluate chronic disease management in selected European countries. The study aims to describe the methods and metrics used in Europe as a first to help advance the methodological basis for their assessment. Methods: A common

  6. Maintenance Approaches for Different Production Methods

    Directory of Open Access Journals (Sweden)

    Mungani, Dzivhuluwani Simon

    2013-11-01

    Full Text Available Various production methods are used in industry to manufacture or produce a variety of products needed by industry and consumers. The nature of a product determines which production method is most suitable or cost-effective. A continuous process is typically used to produce large volumes of liquids or gases. Batch processing is often used for small volumes, such as pharmaceutical products. This paper discusses a research project to determine the relationship between maintenance approaches and production methods. A survey was done to determine to what extent three maintenance approaches, namely reliability-centred maintenance (RCM), total productive maintenance (TPM), and business-centred maintenance (BCM), are used for three different production methods (continuous process, batch process, and production line).

  7. ACCOUNTING STUDENT’S LEARNING APPROACHES AND IMPACT ON ACADEMIC PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Suhaiza Ismail

    2009-12-01

    Full Text Available The objective of the study is threefold. Firstly, the study explores the learning approaches adopted by students in completing their Business Finance course. Secondly, it examines the impact that learning approaches have on the students' academic performance. Finally, the study considers gender differences in the learning approaches adopted by students and in the relationship between learning approaches and academic performance. The Approaches and Study Skills Inventory for Students (ASSIST) was used to assess the approaches to learning adopted by students, whilst the students' final examination results were considered in examining their performance. The results indicate that the majority of the accounting students, in both the male and female groups, prefer to use the deep approach in studying Business Finance. The findings also reveal significant relationships between learning approaches and academic performance, with a positive direction for the deep and strategic approaches and a negative relationship for the surface approach.

  8. A comparative study of reinitialization approaches of the level set method for simulating free-surface flows

    Energy Technology Data Exchange (ETDEWEB)

    Sufyan, Muhammad; Ngo, Long Cu; Choi, Hyoung Gwon [Seoul National University, Seoul (Korea, Republic of)

    2016-04-15

    Unstructured grids were used to compare the performance of a direct reinitialization scheme with those of two reinitialization approaches based on the solution of a hyperbolic Partial differential equation (PDE). The problems of moving interface were solved in the context of a finite element method. A least-square weighted residual method was used to discretize the advection equation of the level set method. The benchmark problems of rotating Zalesak's disk, time-reversed single vortex, and two-dimensional sloshing were examined. Numerical results showed that the direct reinitialization scheme performed better than the PDE-based reinitialization approaches in terms of mass conservation, dissipation and dispersion error, and computational time. In the case of sloshing, numerical results were found to be in good agreement with existing experimental data. The direct reinitialization approach consumed considerably less CPU time than the PDE-based simulations for 20 time periods of sloshing. This approach was stable, accurate, and efficient for all the problems considered in this study.
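
    A direct reinitialization in the spirit described can be sketched as a brute-force signed-distance rebuild (a generic illustration, not the authors' unstructured-grid scheme): the magnitude comes from the distance to sampled interface points, the sign from the old level set field. For a circular interface of radius 0.5, the exact signed distance at (x, y) is sqrt(x^2 + y^2) - 0.5.

```python
import math

R = 0.5                                    # interface: circle of radius 0.5
interface = [(R * math.cos(2 * math.pi * k / 720),
              R * math.sin(2 * math.pi * k / 720)) for k in range(720)]

def phi0(x, y):
    """A distorted initial level set field with the same zero level set."""
    return (x * x + y * y - R * R) * 3.0

def reinitialize(x, y, old_value):
    """Signed distance: magnitude from geometry, sign from the old field."""
    d = min(math.hypot(x - px, y - py) for px, py in interface)
    return math.copysign(d, old_value)

phi_out = reinitialize(0.6, 0.0, phi0(0.6, 0.0))  # exact value: +0.1
phi_in = reinitialize(0.0, 0.2, phi0(0.0, 0.2))   # exact value: -0.3
```

    The brute-force search is O(grid points x interface points), which is why PDE-based reinitialization is attractive in general, but on the problems compared above the direct rebuild conserved mass better and cost less CPU time.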

  9. Applying a social network analysis (SNA) approach to understanding radiologists' performance in reading mammograms

    Science.gov (United States)

    Tavakoli Taba, Seyedamir; Hossain, Liaquat; Heard, Robert; Brennan, Patrick; Lee, Warwick; Lewis, Sarah

    2017-03-01

    Rationale and objectives: Observer performance has been widely studied through examining the characteristics of individuals. Applying a systems perspective to understanding the system's output, however, requires studying the interactions between observers. This research explains a mixed-methods approach that applies social network analysis (SNA), together with the more traditional approach of examining personal/individual characteristics, to understanding observer performance in mammography. Materials and Methods: Using social network theories and measures to understand observer performance, we designed a social network survey instrument for collecting personal and network data about observers involved in mammography performance studies. We present the results of a study by our group in which 31 Australian breast radiologists originally reviewed 60 mammographic cases (comprising 20 abnormal and 40 normal cases) and then completed an online questionnaire about their social networks and personal characteristics. A jackknife free-response receiver operating characteristic (JAFROC) method was used to measure the performance of the radiologists. JAFROC was tested against various personal and network measures to verify the theoretical model. Results: The results from this study suggest a strong association between social networks and observer performance for Australian radiologists. Network factors accounted for 48% of the variance in observer performance, compared with 15.5% for personal characteristics in this study group. Conclusion: This study suggests a strong new direction for research into improving observer performance. Future studies of observer performance should consider social network influences as part of their research paradigm, with equal or greater vigour than the traditional constructs of personal characteristics.

  10. Efficacy of a Template Creation Approach for Performance Improvement

    Science.gov (United States)

    Lyons, Paul R.

    2011-01-01

    This article presents the training and performance improvement approach, performance templates (P-T), and provides empirical evidence to support the efficacy of P-T. This approach involves a partnership among managers, trainers, and employees in the creation, use, and improvement of guides to affect the performance of critical tasks in the…

  11. A hollow sphere soft lithography approach for long-term hanging drop methods.

    Science.gov (United States)

    Lee, Won Gu; Ortmann, Daniel; Hancock, Matthew J; Bae, Hojae; Khademhosseini, Ali

    2010-04-01

    In conventional hanging drop (HD) methods, embryonic stem cell aggregates or embryoid bodies (EBs) are often maintained in small inverted droplets. Gravity limits the volumes of these droplets to less than 50 microL, and hence such cell cultures can only be sustained for a few days without frequent media changes. Here we present a new approach to performing long-term HD methods (10-15 days) that can provide larger media reservoirs in a HD format to maintain more consistent culture media conditions. To implement this approach, we fabricated hollow sphere (HS) structures by injecting liquid drops into noncured poly(dimethylsiloxane) mixtures. These structures served as cell culture chambers with large media volumes (500 microL in each sphere) where EBs could grow without media depletion. The results showed that the sizes of the EBs cultured in the HS structures in a long-term HD format were approximately twice those of conventional HD methods after 10 days in culture. Further, HS cultures showed multilineage differentiation, similar to EBs cultured in the HD method. Due to its ease of fabrication and enhanced features, this approach may be of potential benefit as a stem cell culture method for regenerative medicine.

  12. Performance of non-conventional factorization approaches for neutron kinetics

    International Nuclear Information System (INIS)

    Bulla, S.; Nervo, M.

    2013-01-01

    The use of factorization techniques provides an interesting option for simulating the time-dependent behavior of nuclear systems with a reduced computational effort. While point kinetics neglects all spatial and spectral effects, quasi-statics and multipoint kinetics allow results of higher accuracy to be produced for transients involving relevant modifications of the neutron distribution. However, in some conditions these methods cannot work efficiently. In this paper, we discuss some possible alternative formulations of the factorization process for neutron kinetics, leading to mathematical models of reduced complexity that can allow an accurate simulation of transients involving spatial and spectral effects. The performance of these innovative approaches is compared to standard techniques for some test cases, showing the benefits and shortcomings of the proposed methods. (authors)

  13. Learning approaches as predictors of academic performance in first year health and science students.

    Science.gov (United States)

    Salamonson, Yenna; Weaver, Roslyn; Chang, Sungwon; Koch, Jane; Bhathal, Ragbir; Khoo, Cheang; Wilson, Ian

    2013-07-01

    To compare health and science students' demographic characteristics and learning approaches across different disciplines, and to examine the relationship between learning approaches and academic performance. While there is increasing recognition of a need to foster learning approaches that improve the quality of student learning, little is known about students' learning approaches across different disciplines, and their relationships with academic performance. Prospective, correlational design. Using a survey design, a total of 919 first year health and science students studying in a university located in the western region of Sydney from the following disciplines were recruited to participate in the study - i) Nursing: n = 476, ii) Engineering: n = 75, iii) Medicine: n = 77, iv) Health Sciences: n = 204, and v) Medicinal Chemistry: n = 87. Although there was no statistically significant difference in the use of surface learning among the five discipline groups, there were wide variations in the use of deep learning approach. Furthermore, older students and those with English as an additional language were more likely to use deep learning approach. Controlling for hours spent in paid work during term-time and English language usage, both surface learning approach (β = -0.13, p = 0.001) and deep learning approach (β = 0.11, p = 0.009) emerged as independent and significant predictors of academic performance. Findings from this study provide further empirical evidence that underscore the importance for faculty to use teaching methods that foster deep instead of surface learning approaches, to improve the quality of student learning and academic performance. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Damage approach: A new method for topology optimization with local stress constraints

    DEFF Research Database (Denmark)

    Verbart, Alexander; Langelaar, Matthijs; van Keulen, Fred

    2016-01-01

    In this paper, we propose a new method for topology optimization with local stress constraints. In this method, material in which a stress constraint is violated is considered damaged. Since damaged material will contribute less to the overall performance of the structure, the optimizer will promote a design with a minimal amount of damaged material. We tested the method on several benchmark problems, and the results show that the method is a viable alternative to conventional stress-based approaches based on constraint relaxation followed by constraint aggregation.

  15. A Pattern-Oriented Approach to a Methodical Evaluation of Modeling Methods

    Directory of Open Access Journals (Sweden)

    Michael Amberg

    1996-11-01

    Full Text Available The paper describes a pattern-oriented approach to evaluating modeling methods and comparing various methods with each other from a methodical viewpoint. A specific set of principles (the patterns) is defined by investigating the notations and the documentation of comparable modeling methods. Each principle helps to examine some parts of the methods from a specific point of view. All principles together lead to an overall picture of the method under examination. First the core ("method-neutral") meaning of each principle is described. Then the methods are examined with regard to the principle. Afterwards the method-specific interpretations are compared with each other and with the core meaning of the principle. By this procedure, the strengths and weaknesses of modeling methods regarding methodical aspects are identified. The principles are described uniformly using a principle description template, analogous to the descriptions of object-oriented design patterns. The approach is demonstrated by evaluating a business process modeling method.

  16. Uncertainty evaluation methods for waste package performance assessment

    International Nuclear Information System (INIS)

    Wu, Y.T.; Nair, P.K.; Journel, A.G.; Abramson, L.R.

    1991-01-01

    This report identifies and investigates methodologies to deal with uncertainties in assessing high-level nuclear waste package performance. Four uncertainty evaluation methods (probability-distribution approach, bounding approach, expert judgment, and sensitivity analysis) are suggested as the elements of a methodology that, without either diminishing or enhancing the input uncertainties, can evaluate performance uncertainty. Such a methodology can also help identify critical inputs as a guide to reducing uncertainty so as to provide reasonable assurance that the risk objectives are met. This report examines the current qualitative waste containment regulation and shows how, in conjunction with the identified uncertainty evaluation methodology, a framework for a quantitative probability-based rule can be developed that takes account of the uncertainties. Current US Nuclear Regulatory Commission (NRC) regulation requires that the waste packages provide ''substantially complete containment'' (SCC) during the containment period. The term ''SCC'' is ambiguous and subject to interpretation. This report, together with an accompanying report that describes the technical considerations that must be addressed to satisfy high-level waste containment requirements, provides a basis for a third report to develop recommendations for regulatory uncertainty reduction in the ''containment'' requirement of 10 CFR Part 60. 25 refs., 3 figs., 2 tabs

  17. Non-animal methods to predict skin sensitization (II): an assessment of defined approaches *.

    Science.gov (United States)

    Kleinstreuer, Nicole C; Hoffmann, Sebastian; Alépée, Nathalie; Allen, David; Ashikaga, Takao; Casey, Warren; Clouet, Elodie; Cluzel, Magalie; Desprez, Bertrand; Gellatly, Nichola; Göbel, Carsten; Kern, Petra S; Klaric, Martina; Kühnl, Jochen; Martinozzi-Teissier, Silvia; Mewes, Karsten; Miyazawa, Masaaki; Strickland, Judy; van Vliet, Erwin; Zang, Qingda; Petersohn, Dirk

    2018-05-01

    Skin sensitization is a toxicity endpoint of widespread concern, for which the mechanistic understanding and concurrent necessity for non-animal testing approaches have evolved to a critical juncture, with many available options for predicting sensitization without using animals. Cosmetics Europe and the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods collaborated to analyze the performance of multiple non-animal data integration approaches for the skin sensitization safety assessment of cosmetics ingredients. The Cosmetics Europe Skin Tolerance Task Force (STTF) collected and generated data on 128 substances in multiple in vitro and in chemico skin sensitization assays selected based on a systematic assessment by the STTF. These assays, together with certain in silico predictions, are key components of various non-animal testing strategies that have been submitted to the Organization for Economic Cooperation and Development as case studies for skin sensitization. Curated murine local lymph node assay (LLNA) and human skin sensitization data were used to evaluate the performance of six defined approaches, comprising eight non-animal testing strategies, for both hazard and potency characterization. Defined approaches examined included consensus methods, artificial neural networks, support vector machine models, Bayesian networks, and decision trees, most of which were reproduced using open source software tools. Multiple non-animal testing strategies incorporating in vitro, in chemico, and in silico inputs demonstrated equivalent or superior performance to the LLNA when compared to both animal and human data for skin sensitization.
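
    The simplest family of defined approaches named above, consensus methods, can be sketched as a majority vote over binary assay calls; the assay labels below are placeholders, not the STTF's actual inputs.

```python
from collections import Counter

def consensus_call(calls):
    """Majority vote over per-assay sensitizer (1) / non-sensitizer (0) calls."""
    counts = Counter(calls.values())
    return 1 if counts[1] > counts[0] else 0

# placeholder information sources for one substance
substance_calls = {"in_chemico": 1, "in_vitro": 1, "in_silico": 0}
prediction = consensus_call(substance_calls)
```

    The more elaborate defined approaches in the study (neural networks, support vector machines, Bayesian networks, decision trees) replace this vote with a trained combiner, but the input structure, several orthogonal calls per substance, is the same.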

  18. Sustainability appraisal. Quantitative methods and mathematical techniques for environmental performance evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Erechtchoukova, Marina G.; Khaiter, Peter A. [York Univ., Toronto, ON (Canada). School of Information Technology; Golinska, Paulina (eds.) [Poznan Univ. of Technology (Poland)

    2013-06-01

    The book presents original research papers on quantitative methods and techniques for evaluating the sustainability of business operations and organizations' overall environmental performance. The contributions describe modern methods and approaches applicable to the multi-faceted problem of sustainability appraisal and help to flesh out the generic frameworks presented in the literature with the specific quantitative techniques needed in practice. The scope of the book is interdisciplinary, making it of interest to environmental researchers, business managers and process analysts, information management professionals, and environmental decision makers, who will find valuable sources of information for their work-related activities. Each chapter provides sufficient background information, a description of problems, and results, making the book useful for a wider audience. Additional software support is not required. One of the most important issues in developing sustainable management strategies and incorporating ecodesigns in production, manufacturing and operations management is the assessment of the sustainability of business operations and organizations' overall environmental performance. The book presents the results of recent studies on sustainability assessment. It provides a solid reference for researchers in academia and industrial practitioners on the state of the art in sustainability appraisal, including the development and application of sustainability indices, quantitative methods, models and frameworks for the evaluation of current and future welfare outcomes, recommendations on data collection and processing for the evaluation of organizations' environmental performance, and eco-efficiency approaches leading to business process re-engineering.

  19. ACCOUNTING STUDENT’S LEARNING APPROACHES AND IMPACT ON ACADEMIC PERFORMANCE

    OpenAIRE

    Suhaiza Ismail

    2009-01-01

    The objective of the study is threefold. Firstly, the study explores the learning approaches adopted by students in completing their Business Finance course. Secondly, it examines the impact that learning approaches have on students' academic performance. Finally, the study considers gender differences in the learning approaches adopted by students and in the relationship between learning approaches and academic performance. The Approaches and Study Skills Inventory for Students (ASSIST) was used...

  20. Evaluating firms' R&D performance using best worst method.

    Science.gov (United States)

    Salimi, Negin; Rezaei, Jafar

    2018-02-01

    Since research and development (R&D) is the most critical determinant of the productivity, growth and competitive advantage of firms, measuring R&D performance has become a central concern of R&D managers, and an extensive body of literature has examined and identified different R&D measurements and determinants of R&D performance. However, assigning the same level of importance to different R&D measures, which is the common approach in existing studies, can oversimplify the R&D measurement process, which may result in misinterpretation of performance and, consequently, fallacious R&D strategies. The aim of this study is to measure R&D performance taking into account the different levels of importance of R&D measures. A multi-criteria decision-making method called the Best Worst Method (BWM) is used to identify the weights (importance) of R&D measures and to measure the R&D performance of 50 high-tech SMEs in the Netherlands, using data gathered in a survey among SMEs and from R&D experts. The results show how assigning different weights to different R&D measures (in contrast to a simple mean) results in a different ranking of the firms and allows R&D managers to formulate more effective strategies to improve their firm's R&D performance by applying knowledge of the importance of different R&D measures. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
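
    As a minimal illustration of the abstract's point, the sketch below contrasts a simple-mean ranking of firms with a weighted ranking. The firm scores and the criteria weights are invented for illustration, and the weights stand in for the output of a BWM elicitation rather than being derived by the method itself.

```python
import numpy as np

# Hypothetical R&D scores for four firms (rows) on three measures
# (columns), e.g. patents, new-product revenue share, R&D intensity.
scores = np.array([
    [0.9, 0.2, 0.5],
    [0.5, 0.8, 0.4],
    [0.6, 0.6, 0.6],
    [0.3, 0.9, 0.7],
])

# Criteria weights as a BWM elicitation might produce them (assumed
# here, not derived): unequal importance, summing to 1.
w = np.array([0.6, 0.25, 0.15])

mean_rank = np.argsort(-scores.mean(axis=1))  # simple-mean ranking
bwm_rank = np.argsort(-(scores @ w))          # weighted (BWM) ranking

print("simple-mean order:", mean_rank)  # → [3 2 1 0]
print("weighted order:  ", bwm_rank)    # → [0 2 1 3]
```

    With unequal weights the ranking reverses, which is exactly the effect the study attributes to replacing the simple mean with BWM-derived weights.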

  1. Performance-approach and performance-avoidance classroom goals and the adoption of personal achievement goals.

    Science.gov (United States)

    Schwinger, Malte; Stiensmeier-Pelster, Joachim

    2011-12-01

    Students' perceptions of classroom goals influence their adoption of personal goals. To assess different forms of classroom goals, recent studies have favoured an overall measure of performance classroom goals rather than a two-dimensional assessment of performance-approach and performance-avoidance classroom goals (PAVCG). This paper considered the relationship between students' perceptions of classroom goals and their endorsement of personal achievement goals. We proposed that three (instead of only two) classroom goals need to be distinguished. We aimed to provide evidence for this hypothesis by confirmatory factor analysis (CFA) and also by divergent associations between the respective classroom goal and students' personal goal endorsement. A total of 871 (474 female) 10th grade students from several German high schools participated in this study. Students responded to items assessing their perception of mastery, performance-approach, and performance-avoidance goals in the classroom. Additionally, the students reported how much they personally pursue mastery, performance-approach, and performance-avoidance goals. All items referred to German as a specific school subject. Results: A CFA yielded empirical support for the proposed distinction of three (instead of only two) different kinds of classroom goals. Moreover, in hierarchical linear modelling (HLM) analyses all three classroom goals showed unique associations with students' personal goal adoption. The findings emphasized the need to distinguish performance-approach and PAVCG. Furthermore, our results suggest that multiple classroom goals have interactive effects on students' personal achievement strivings. ©2010 The British Psychological Society.

  2. Comparing performances of Clements, Box-Cox and Johnson methods with Weibull distributions for assessing process capability

    Energy Technology Data Exchange (ETDEWEB)

    Senvar, O.; Sennaroglu, B.

    2016-07-01

    This study examines Clements’ Approach (CA), the Box-Cox transformation (BCT), and the Johnson transformation (JT) for process capability assessment using Weibull-distributed data with different parameters, in order to determine the effect of tail behaviour on process capability, and compares their estimation performance in terms of accuracy and precision. Design/methodology/approach: The process performance index (PPI) Ppu is used for process capability analysis (PCA) because the comparisons are performed on generated Weibull data without subgroups. Box plots, descriptive statistics, the root-mean-square deviation (RMSD), used as a measure of error, and a radar chart are employed together to evaluate the performance of the methods. In addition, the bias of the estimated values is as important as the efficiency measured by the mean square error; in this regard, the Relative Bias (RB) and the Relative Root Mean Square Error (RRMSE) are also considered. Findings: The results reveal that the performance of a method depends on its capability to fit the tail behaviour of the Weibull distribution and on the targeted values of the PPIs. The effect of tail behaviour is more significant when the process is more capable. Research limitations/implications: Some other methods, such as the Weighted Variance method, which also gives good results, were also examined. However, we later realized that including them would be confusing for consistent interpretation of the comparisons between the methods... (Author)
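
    The percentile idea behind Clements' Approach can be sketched as follows. This is a simplified illustration only: the full CA estimates the percentiles from Pearson curves via skewness and kurtosis, whereas here they are taken directly from simulated Weibull data, and the specification limit is invented.

```python
import numpy as np

rng = np.random.default_rng(0)
# Right-skewed Weibull process data (shape 1.5, scale 2).
data = rng.weibull(1.5, size=20000) * 2.0
usl = 8.0  # hypothetical upper specification limit

# Conventional (normal-theory) Ppu.
ppu_normal = (usl - data.mean()) / (3 * data.std(ddof=1))

# Clements-style percentile Ppu: replace the mean with the median and
# 3*sigma with the distance from the median to the 99.865th percentile.
med, p99865 = np.percentile(data, [50, 99.865])
ppu_clements = (usl - med) / (p99865 - med)

print(ppu_normal, ppu_clements)
```

    For this right-skewed sample the percentile index is noticeably lower than the normal-theory one, because the long upper tail eats into the margin to the upper specification limit.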

  3. A perturbative approach for enhancing the performance of time series forecasting.

    Science.gov (United States)

    de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C

    2017-04-01

    This paper proposes a method to perform time series prediction based on perturbation theory. The approach continuously adjusts an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecast for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecast. If that residual series is not white noise, it can be used to improve the accuracy of the initial model, and a new predictive model is fitted to the residual series. The whole process is repeated until convergence or until the residual series becomes white noise. The output of the method is then given by summing the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real-world time series. A comparison was made with six other methods and with ten results found in the literature. The results show not only that the performance of the initial model is significantly improved but also that the proposed method outperforms the previously published results. Copyright © 2017 Elsevier Ltd. All rights reserved.
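
    The iterative residual-correction scheme described above can be sketched with a simple autoregressive base model. This is an illustrative reconstruction, not the authors' implementation; the synthetic series and the AR order are assumptions.

```python
import numpy as np

def fit_ar(series, p=3):
    """Least-squares AR(p) with intercept: returns p + 1 coefficients."""
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    y = series[p:]
    coef, *_ = np.linalg.lstsq(
        np.column_stack([X, np.ones(len(y))]), y, rcond=None)
    return coef

def predict_ar(series, coef, p=3):
    """One-step-ahead predictions for series[p:]."""
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    return X @ coef[:p] + coef[p]

# Synthetic series: sine plus trend plus noise (invented example data).
rng = np.random.default_rng(1)
t = np.arange(300)
y = np.sin(0.2 * t) + 0.01 * t + 0.1 * rng.standard_normal(300)

p = 3
coef0 = fit_ar(y, p)
pred = predict_ar(y, coef0, p)   # initial model's one-step forecasts
resid = y[p:] - pred             # residual series

coef1 = fit_ar(resid, p)         # second model fitted to the residuals
pred_resid = predict_ar(resid, coef1, p)

# Combined forecast = initial forecast + residual correction (aligned
# so both target y[2p:]).
combined = pred[p:] + pred_resid
mse0 = np.mean((y[p:] - pred) ** 2)
mse1 = np.mean((y[2 * p:] - combined) ** 2)
print(mse0, mse1)
```

    In the paper's scheme the loop would continue, fitting further models to successive residual series until the residual is white noise; only a single correction step is shown here.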

  4. Methodological approach to strategic performance optimization

    OpenAIRE

    Hell, Marko; Vidačić, Stjepan; Garača, Željko

    2009-01-01

    This paper presents a matrix approach to the measuring and optimization of organizational strategic performance. The proposed model is based on the matrix presentation of strategic performance, which follows the theoretical notions of the balanced scorecard (BSC) and strategy map methodologies, initially developed by Kaplan and Norton. Development of a quantitative record of strategic objectives provides an arena for the application of linear programming (LP), which is a mathematical tech...

  5. Evaluating Method Engineer Performance: an error classification and preliminary empirical study

    Directory of Open Access Journals (Sweden)

    Steven Kelly

    1998-11-01

    We describe an approach to empirically test the use of metaCASE environments to model methods. Both diagrams and matrices have been proposed as a means for presenting the methods. These different paradigms may have their own effects on how easily and well users can model methods. We extend Batra's classification of errors in data modelling to cover metamodelling, and use it to measure the performance of a group of metamodellers using either diagrams or matrices. The tentative results from this pilot study confirm the usefulness of the classification, and show some interesting differences between the paradigms.

  6. Energy performance of three Airtight Drywall Approach houses

    Energy Technology Data Exchange (ETDEWEB)

    Howell, D.G.; Mayhew, W.J.

    1987-03-01

    The objective of this study was to assess a new construction technique, the Airtight Drywall Approach (ADA), as it was implemented in three test houses, and to compare the performance of these houses against three control houses typical of residential construction techniques in Alberta. The study focussed on four aspects of house performance: integrity of the air barrier system; energy conservation; ventilation and indoor air quality; and the development and demonstration of computer-based field monitoring techniques. Data were gathered through continuous computer-based measurements, regular site visits, manual measurements, homeowner interviews, and special site tests. The results of the air-leakage tests indicated that ADA is an effective method of reducing air infiltration in homes. The floor joist sealing technique used in the ADA houses was observed to deteriorate within a year of construction; it is no longer recommended. The monitoring results showed a significant reduction in energy consumption in the homes with energy conservation features. Measurements of air-borne contaminants indicated that the ADA test homes performed similarly to other energy-efficient homes monitored across Canada and that pollutant levels were within accepted guidelines. 6 refs., 6 figs., 14 tabs.

  7. Theoretical analysis of integral neutron transport equation using collision probability method with quadratic flux approach

    International Nuclear Information System (INIS)

    Shafii, Mohammad Ali; Meidianti, Rahma; Wildian; Fitriyani, Dian; Tongkukut, Seni H. J.; Arkundato, Artoto

    2014-01-01

    A theoretical analysis of the integral neutron transport equation using the collision probability (CP) method with a quadratic flux approach has been carried out. In general, the solution of the neutron transport equation using the CP method is performed with a flat flux approach. In this research, the CP method is implemented for a cylindrical nuclear fuel cell with the spatial mesh treated using a non-flat flux approach. This means that the neutron flux at each point in the nuclear fuel cell is considered to differ from point to point, following the distribution pattern of a quadratic flux. The result, presented here in the form of the quadratic flux, gives a better understanding of the real conditions in the cell calculation and serves as a starting point for computational calculation

  8. Theoretical analysis of integral neutron transport equation using collision probability method with quadratic flux approach

    Energy Technology Data Exchange (ETDEWEB)

    Shafii, Mohammad Ali, E-mail: mashafii@fmipa.unand.ac.id; Meidianti, Rahma, E-mail: mashafii@fmipa.unand.ac.id; Wildian,, E-mail: mashafii@fmipa.unand.ac.id; Fitriyani, Dian, E-mail: mashafii@fmipa.unand.ac.id [Department of Physics, Andalas University Padang West Sumatera Indonesia (Indonesia); Tongkukut, Seni H. J. [Department of Physics, Sam Ratulangi University Manado North Sulawesi Indonesia (Indonesia); Arkundato, Artoto [Department of Physics, Jember University Jember East Java Indonesia (Indonesia)

    2014-09-30

    A theoretical analysis of the integral neutron transport equation using the collision probability (CP) method with a quadratic flux approach has been carried out. In general, the solution of the neutron transport equation using the CP method is performed with a flat flux approach. In this research, the CP method is implemented for a cylindrical nuclear fuel cell with the spatial mesh treated using a non-flat flux approach. This means that the neutron flux at each point in the nuclear fuel cell is considered to differ from point to point, following the distribution pattern of a quadratic flux. The result, presented here in the form of the quadratic flux, gives a better understanding of the real conditions in the cell calculation and serves as a starting point for computational calculation.

  9. Methodological approach to organizational performance improvement process

    OpenAIRE

    Buble, Marin; Dulčić, Želimir; Pavić, Ivan

    2017-01-01

    Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.

  10. Application of Grey-TOPSIS approach to evaluate value chain performance of tea processing chains

    Directory of Open Access Journals (Sweden)

    Richard Nyaoga

    2016-03-01

    This study develops an effective method to measure value chain performance based on qualitative criteria and to determine the ranking order of the various forms of performance under study. The approach integrates the advantages of grey systems theory and TOPSIS to evaluate and rank value chain performance. The Grey-TOPSIS approach has been applied to measure and rank the value chain performance of various firms. The results indicate that the proposed model is useful for facilitating multi-criteria decision-making (MCDM) problems under uncertainty and vagueness. The model also provides an appropriate ranking order based on the available alternatives. The Grey-TOPSIS approach, which managers can use to solve similar decision-making problems in their firms in the future, is discussed. Although the problem of choosing a suitable performance option is often addressed in practice and research, very few studies are available in the literature on Grey-TOPSIS decision models. Moreover, applications of the Grey-TOPSIS model to tea processing firms are non-existent; hence this study is the first to apply the model to evaluating value chain performance in tea processing firms.
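
    For reference, the classical (crisp) TOPSIS procedure that Grey-TOPSIS extends can be sketched as follows. The decision matrix, weights and criterion types are invented; the grey variant would replace the crisp scores with interval (grey) numbers but follows the same normalise-weight-distance steps.

```python
import numpy as np

def topsis(X, w, benefit):
    """Classical (crisp) TOPSIS closeness coefficients."""
    R = X / np.linalg.norm(X, axis=0)   # vector normalisation per criterion
    V = R * w                           # weighted normalised matrix
    # Ideal takes the max of benefit criteria and min of cost criteria.
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)      # closer to 1 = better

# Hypothetical value-chain scores for three firms on three criteria
# (two benefit criteria, one cost criterion).
X = np.array([[7.0, 9.0, 4.0],
              [8.0, 6.0, 5.0],
              [6.0, 8.0, 3.0]])
w = np.array([0.5, 0.3, 0.2])
cc = topsis(X, w, benefit=np.array([True, True, False]))
ranking = np.argsort(-cc)  # best alternative first
print(cc, ranking)
```

    The closeness coefficient balances the distance to the ideal and anti-ideal solutions, which is what makes the method suitable for ranking alternatives on conflicting criteria.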

  11. A State-Based Modeling Approach for Efficient Performance Evaluation of Embedded System Architectures at Transaction Level

    Directory of Open Access Journals (Sweden)

    Anthony Barreteau

    2012-01-01

    Abstract models are necessary to assist system architects in the evaluation process of hardware/software architectures and to cope with the still increasing complexity of embedded systems. Efficient methods are required to create reliable models of system architectures and to allow early performance evaluation and fast exploration of the design space. In this paper, we present a specific transaction level modeling approach for performance evaluation of hardware/software architectures. This approach relies on a generic execution model that requires little modeling effort. Created models are used to evaluate, by simulation, the expected processing and memory resources of various architectures. The proposed execution model relies on a specific computation method defined to improve the simulation speed of transaction level models. The benefits of the proposed approach are highlighted through two case studies. The first case study is a didactic example illustrating the modeling approach. In this example, a simulation speed-up by a factor of 7.62 is achieved by using the proposed computation method. The second case study concerns the analysis of a communication receiver supporting part of the physical layer of the LTE protocol. In this case study, architecture exploration is carried out in order to improve the allocation of processing functions.

  12. Methodological approach to organizational performance improvement process

    Directory of Open Access Journals (Sweden)

    Marin Buble

    2001-01-01

    Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.

  13. Common genetic variants associated with cognitive performance identified using the proxy-phenotype method

    NARCIS (Netherlands)

    C.A. Rietveld (Niels); T. Esko (Tõnu); G. Davies (Gail); T.H. Pers (Tune); P. Turley (Patrick); B. Benyamin (Beben); C.F. Chabris (Christopher F.); V. Emilsson (Valur); A.D. Johnson (Andrew); J.J. Lee (James J.); C. de Leeuw (Christiaan); R.E. Marioni (Riccardo); S.E. Medland (Sarah Elizabeth); M. Miller (Mike); O. Rostapshova (Olga); S.J. van der Lee (Sven); A.A.E. Vinkhuyzen (Anna A.); N. Amin (Najaf); D. Conley (Dalton); J. Derringer; C.M. van Duijn (Cornelia); R.S.N. Fehrmann (Rudolf); L. Franke (Lude); E.L. Glaeser (Edward L.); N.K. Hansell (Narelle); C. Hayward (Caroline); W.G. Iacono (William); C.A. Ibrahim-Verbaas (Carla); V.W.V. Jaddoe (Vincent); J. Karjalainen (Juha); D. Laibson (David); P. Lichtenstein (Paul); D.C. Liewald (David C.); P.K. Magnusson (Patrik); N.G. Martin (Nicholas); M. McGue (Matt); G. McMahon (George); N.L. Pedersen (Nancy); S. Pinker (Steven); D.J. Porteous (David J.); D. Posthuma (Danielle); F. Rivadeneira Ramirez (Fernando); B.H. Smith (Blair H.); J.M. Starr (John); H.W. Tiemeier (Henning); N.J. Timpson (Nicholas J.); M. Trzaskowski (Maciej); A.G. Uitterlinden (André); F.C. Verhulst (Frank); M.E. Ward (Mary); M.J. Wright (Margaret); G.D. Smith; I.J. Deary (Ian J.); M. Johannesson (Magnus); R. Plomin (Robert); P.M. Visscher (Peter); D.J. Benjamin (Daniel J.); D. Cesarini (David); Ph.D. Koellinger (Philipp)

    2014-01-01

    We identify common genetic variants associated with cognitive performance using a two-stage approach, which we call the proxy-phenotype method. First, we conduct a genome-wide association study of educational attainment in a large sample (n = 106,736), which produces a set of 69

  14. Investigation on the performance of bridge approach slab

    Directory of Open Access Journals (Sweden)

    Abdelrahman Amr

    2018-01-01

    In Egypt, where highway bridges are to be constructed on soft cohesive soils, the bridge abutments are usually founded on rigid piles, whereas the earth embankments for the bridge approaches are founded directly on the natural soft ground. Consequently, excessive differential settlement frequently occurs between the bridge deck and the bridge approaches, resulting in a “bump” at both ends of the bridge deck. Such a bump not only creates a rough and uncomfortable ride but also represents a hazard to traffic. One effective technique to cope with the bump problem is to use a reinforced concrete approach slab to provide a smooth grade transition between the bridge deck and the approach pavement. Investigating the geotechnical and structural performance of approach slabs and revealing the fundamental factors affecting it have therefore become mandatory. In this paper, a 2-D finite element model is employed to investigate the performance of approach slabs. Moreover, an extensive parametric study is carried out to identify the relatively optimal geometries of the approach slab, i.e. slab length, thickness, embedded depth and slope, that can yield permissible bumps. Different geo-mechanical conditions of the cohesive foundation soil and the fill material of the bridge embankment are examined.

  15. Measuring economy-wide energy efficiency performance: A parametric frontier approach

    International Nuclear Information System (INIS)

    Zhou, P.; Ang, B.W.; Zhou, D.Q.

    2012-01-01

    This paper proposes a parametric frontier approach to estimating economy-wide energy efficiency performance from a production efficiency point of view. It uses the Shephard energy distance function to define an energy efficiency index and adopts the stochastic frontier analysis technique to estimate the index. A case study of measuring the economy-wide energy efficiency performance of a sample of OECD countries using the proposed approach is presented. It is found that the proposed parametric frontier approach has higher discriminating power in energy efficiency performance measurement compared to its nonparametric frontier counterparts.

  16. Approach and methods to evaluate the uncertainty in system thermalhydraulic calculations

    International Nuclear Information System (INIS)

    D'Auria, F.

    2004-01-01

    The evaluation of uncertainty constitutes the necessary supplement to Best Estimate (BE) calculations performed to understand accident scenarios in water cooled nuclear reactors. The need comes from the imperfection of computational tools on the one hand and, on the other, from the interest in using such tools to obtain more precise evaluations of safety margins. In the present paper the approaches to uncertainty are outlined and the CIAU (Code with capability of Internal Assessment of Uncertainty) method proposed by the University of Pisa is described, including the ideas at its basis and results from applications. An activity in progress at the International Atomic Energy Agency (IAEA) is considered. Two approaches are distinguished, characterized as 'propagation of code input uncertainty' and 'propagation of code output errors'. For both methods, the thermal-hydraulic code is at the centre of the process of uncertainty evaluation: in the former case the code itself is adopted to compute the error bands and to propagate the input errors; in the latter case the errors in code application to relevant measurements are used to derive the error bands. The CIAU method exploits the idea of the 'status approach' for identifying the thermalhydraulic conditions of an accident in any Nuclear Power Plant (NPP). Errors in predicting such status are derived from the comparison between predicted and measured quantities and, in the application stage of the method, are used to compute the uncertainty. (author)

  17. Comparative performance of different stochastic methods to simulate drug exposure and variability in a population.

    Science.gov (United States)

    Tam, Vincent H; Kabbara, Samer

    2006-10-01

    Monte Carlo simulations (MCSs) are increasingly being used to predict the pharmacokinetic variability of antimicrobials in a population. However, various MCS approaches may differ in the accuracy of the predictions. We compared the performance of 3 different MCS approaches using a data set with known parameter values and dispersion. Ten concentration-time profiles were randomly generated and used to determine the best-fit parameter estimates. Three MCS methods were subsequently used to simulate the AUC(0-infinity) of the population, using the central tendency and dispersion of the following in the subject sample: 1) K and V; 2) clearance and V; 3) AUC(0-infinity). In each scenario, 10000 subject simulations were performed. Compared to true AUC(0-infinity) of the population, mean biases by various methods were 1) 58.4, 2) 380.7, and 3) 12.5 mg h L(-1), respectively. Our results suggest that the most realistic MCS approach appeared to be based on the variability of AUC(0-infinity) in the subject sample.
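
    The three simulation parameterizations compared in the abstract can be illustrated with a toy one-compartment IV bolus model (AUC = dose/CL, k = CL/V). All numbers and distributional assumptions below are invented and are not the authors' data; they only show how the same 10-subject sample can be resampled three different ways.

```python
import numpy as np

rng = np.random.default_rng(42)
dose = 500.0  # mg, hypothetical IV bolus dose

# "True" population: log-normal clearance CL (L/h) and volume V (L);
# k = CL/V and AUC = dose/CL follow from them.
n = 10000
cl_true = rng.lognormal(mean=np.log(5.0), sigma=0.3, size=n)
v_true = rng.lognormal(mean=np.log(30.0), sigma=0.2, size=n)
auc_true = dose / cl_true

# A small "study sample" of 10 subjects, as in the abstract.
idx = rng.choice(n, size=10, replace=False)
cl_s, v_s = cl_true[idx], v_true[idx]
k_s, auc_s = cl_s / v_s, dose / cl_s

def sim_lognormal(sample, size):
    """Resample assuming log-normality with the sample's log-moments."""
    logs = np.log(sample)
    return rng.lognormal(logs.mean(), logs.std(ddof=1), size)

# Method 1: simulate k and V, derive AUC = dose / (k * V).
auc_m1 = dose / (sim_lognormal(k_s, n) * sim_lognormal(v_s, n))
# Method 2: simulate CL and V, derive AUC = dose / CL.
auc_m2 = dose / sim_lognormal(cl_s, n)
# Method 3: simulate AUC directly from its observed variability.
auc_m3 = sim_lognormal(auc_s, n)

for name, a in [("k,V", auc_m1), ("CL,V", auc_m2), ("AUC", auc_m3)]:
    print(name, "bias:", round(a.mean() - auc_true.mean(), 2))
```

    Simulating k and V as if they were independent ignores their correlation through CL, which is one mechanism behind the large bias the study reports for that parameterization.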

  18. Peak Detection Method Evaluation for Ion Mobility Spectrometry by Using Machine Learning Approaches

    DEFF Research Database (Denmark)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna

    2013-01-01

    machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods...

  19. Towards Multi-Method Research Approach in Empirical Software Engineering

    Science.gov (United States)

    Mandić, Vladimir; Markkula, Jouni; Oivo, Markku

    This paper presents the results of a literature analysis of empirical research approaches in Software Engineering (SE). The analysis explores the reasons why traditional methods, such as statistical hypothesis testing and experiment replication, are weakly utilized in the field of SE. It appears that the basic assumptions and preconditions of the traditional methods contradict the actual situation in SE. Furthermore, we have identified the main issues that should be considered by the researcher when selecting the research approach. Given the reasons for the weak utilization of traditional methods, we propose stronger use of a Multi-Method approach with Pragmatism as the philosophical standpoint.

  20. Performance Evaluation of the Spectral Centroid Downshift Method for Attenuation Estimation

    OpenAIRE

    Samimi, Kayvan; Varghese, Tomy

    2015-01-01

    Estimation of frequency-dependent ultrasonic attenuation is an important aspect of tissue characterization. Along with other acoustic parameters studied in quantitative ultrasound, the attenuation coefficient can be used to differentiate normal and pathological tissue. The spectral centroid downshift (CDS) method is one of the most common frequency-domain approaches applied to this problem. In this study, a statistical analysis of this method’s performance was carried out based on a parametric m...
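
    The principle behind the CDS method can be illustrated for the textbook case of a Gaussian transmit spectrum, where attenuation linear in frequency downshifts the round-trip spectral centroid linearly with depth, f_c(z) = f0 − 4·α·σ²·z. The numbers below are invented, and real implementations estimate the centroids from backscattered echo spectra rather than taking them as given.

```python
import numpy as np

rng = np.random.default_rng(3)
f0, sigma2 = 5.0, 1.0   # MHz centre frequency, MHz^2 spectral variance
alpha_true = 0.05       # Np / (cm * MHz), assumed attenuation slope

# For a Gaussian spectrum and attenuation linear in frequency, the
# round-trip spectral centroid downshifts linearly with depth:
#   f_c(z) = f0 - 4 * alpha * sigma2 * z
z = np.linspace(0.5, 4.0, 30)                       # depths in cm
fc = f0 - 4 * alpha_true * sigma2 * z
fc_noisy = fc + 0.02 * rng.standard_normal(z.size)  # estimation noise

# Recover alpha from the fitted centroid-vs-depth slope.
slope = np.polyfit(z, fc_noisy, 1)[0]
alpha_hat = -slope / (4 * sigma2)
print(alpha_hat)
```

    The linear-fit step is where the statistical behaviour studied in the abstract enters: noise in the per-depth centroid estimates propagates into variance of the recovered attenuation coefficient.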

  1. Building communities through performance: emerging approaches to interculturality.

    Science.gov (United States)

    Parent, Roger

    2009-08-01

    Changing definitions of culture are modifying approaches to intercultural education and training. This paper outlines the principal features of these emerging models for innovation and capacity building in communities. Semiotics provides a theoretical frame for the interdisciplinary analysis of research on cultural competency, especially regarding recent studies on "cultural intelligence", performance and creativity. Interdisciplinary research on cultural literacy is shifting from cultural knowledge to intercultural know-how. This know-how translates into the individual's capacity to innovate and illustrates the influence of culture on individual and group performance. Research on cultural intelligence, performance and creativity provides promising new models for capacity building in communities. These approaches constitute a synthesis of previous research on cultural competency and provide new avenues for innovative social action through intercultural exchange.

  2. Evaluation of a multi-methods approach to the collection and dissemination of feedback on OSCE performance in dental education.

    Science.gov (United States)

    Wardman, M J; Yorke, V C; Hallam, J L

    2018-05-01

    Feedback is an essential part of the learning process, and students expect their feedback to be personalised, meaningful and timely. Objective Structured Clinical Examination (OSCE) assessments allow examiners to observe students carefully over the course of a number of varied station types, across a number of clinical knowledge and skill domains. They therefore present an ideal opportunity to record detailed feedback which allows students to reflect on and improve their performance. This article outlines two methods by which OSCE feedback was collected and then disseminated to undergraduate dental students across 2-year groups in a UK dental school: (i) Individual written feedback comments made by examiners during the examination, (ii) General audio feedback recorded by groups of examiners immediately following the examination. Evaluation of the feedback was sought from students and staff examiners. A multi-methods approach utilising Likert questionnaire items (quantitative) and open-ended feedback questions (qualitative) was used. Data analysis explored student and staff perceptions of the audio and written feedback. A total of 131 students (response rate 68%) and 52 staff examiners (response rate 83%) completed questionnaires. Quantitative data analysis showed that the written and audio formats were reported as a meaningful source of feedback for learning by both students (93% written, 89% audio) and staff (96% written, 92% audio). Qualitative data revealed the complementary nature of both types of feedback. Written feedback gives specific, individual information whilst audio shares general observations and allows students to learn from others. The advantages, limitations and challenges of the feedback methods are discussed, leading to the development of an informed set of implementation guidelines. Written and audio feedback methods are valued by students and staff. 
It is proposed that these may be very easily applied to OSCEs running in other dental schools.

  3. Approaches to greenhouse gas accounting methods for biomass carbon

    International Nuclear Information System (INIS)

    Downie, Adriana; Lau, David; Cowie, Annette; Munroe, Paul

    2014-01-01

    This investigation examines different approaches to the GHG flux accounting of activities within a tight boundary of biomass C cycling, with the scope limited to exclude all other aspects of the lifecycle. Alternative approaches are examined that a) account for all emissions including biogenic CO2 cycling – the biogenic method; b) account for the quantity of C that is moved to and maintained in the non-atmospheric pool – the stock method; and c) assume that the net balance of C taken up by biomass is neutral over the short term and hence there is no requirement to include this C in the calculation – the simplified method. This investigation demonstrates the inaccuracies in both emissions forecasting and abatement calculations that result from the use of the simplified method, which is commonly accepted for use. It was found that the stock method is the most accurate and appropriate approach for calculating GHG inventories; however, shortcomings of this approach emerge when it is applied to abatement projects, as it does not account for the increase in biogenic CO2 emissions that are generated when non-CO2 GHG emissions in the business-as-usual case are offset. Therefore the biogenic method or a modified version of the stock method should be used to accurately estimate the GHG emissions abatement achieved by a project. This investigation uses both the derivation of methodology equations from first principles and worked examples to explore the fundamental differences between the alternative approaches. Examples are developed for three project scenarios: landfill, combustion and slow-pyrolysis (biochar) of biomass. -- Highlights: • Different approaches can be taken to account for the GHG emissions from biomass. • Simplification of GHG accounting methods is useful, however, can lead to inaccuracies. • Approaches used currently are often inadequate for practices that store carbon. • Accounting methods for emissions forecasting can be inadequate for
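
    The difference between the three accounting approaches can be shown with a toy worked example. The numbers are invented and the carbon flows are deliberately simplified: one tonne of biomass carbon enters a pyrolysis-like process in which half the carbon is retained in biochar, plus a small non-CO2 emission.

```python
# Toy illustration (invented numbers) of the three accounting
# approaches for one tonne of biomass carbon, pyrolysis scenario.
C_TO_CO2 = 44.0 / 12.0  # t CO2 per t C

uptake_c = 1.0     # t C fixed by the biomass (biogenic uptake)
stored_c = 0.5     # t C retained in biochar (non-atmospheric pool)
released_c = uptake_c - stored_c
ch4_co2e = 0.02    # t CO2-e of non-CO2 GHGs from the process

# a) Biogenic method: count uptake as negative, all releases as positive.
biogenic = -uptake_c * C_TO_CO2 + released_c * C_TO_CO2 + ch4_co2e

# b) Stock method: count only the change in the non-atmospheric C pool,
#    plus the non-CO2 gases.
stock = -stored_c * C_TO_CO2 + ch4_co2e

# c) Simplified method: biogenic CO2 assumed neutral, count only non-CO2.
simplified = ch4_co2e

print(biogenic, stock, simplified)
```

    In this simple inventory case the biogenic and stock methods agree, while the simplified method misses the stored carbon entirely, which is consistent with the abstract's finding that the simplified method misstates practices that store carbon.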

  4. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action

  5. Combining multiple FDG-PET radiotherapy target segmentation methods to reduce the effect of variable performance of individual segmentation methods

    Energy Technology Data Exchange (ETDEWEB)

    McGurk, Ross J. [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Bowsher, James; Das, Shiva K. [Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Lee, John A [Molecular Imaging and Experimental Radiotherapy Unit, Universite Catholique de Louvain, 1200 Brussels (Belgium)

    2013-04-15

    Purpose: Many approaches have been proposed to segment high-uptake objects in 18F-fluoro-deoxy-glucose positron emission tomography images, but none provides consistent performance across the large variety of imaging situations. This study investigates the use of two methods of combining individual segmentation methods to reduce the impact of inconsistent performance of the individual methods: simple majority voting and probabilistic estimation. Methods: The National Electrical Manufacturers Association image quality phantom, containing five glass spheres with diameters of 13-37 mm and two irregularly shaped volumes (16 and 32 cc) formed by deforming high-density polyethylene bottles in a hot water bath, was filled with 18F-fluoro-deoxy-glucose and iodine contrast agent. Repeated 5-min positron emission tomography (PET) images were acquired at 4:1 and 8:1 object-to-background contrasts for the spherical objects and 4.5:1 and 9:1 for the irregular objects. Five individual methods were used to segment each object: 40% thresholding, adaptive thresholding, k-means clustering, seeded region-growing, and a gradient-based method. Volumes were combined using a majority vote (MJV) or the Simultaneous Truth And Performance Level Estimate (STAPLE) method. Accuracy of the segmentations relative to CT ground truth volumes was assessed using the Dice similarity coefficient (DSC) and the symmetric mean absolute surface distance (SMASD). Results: MJV had median DSC values of 0.886 and 0.875, and SMASD of 0.52 and 0.71 mm, for spheres and irregular shapes, respectively. STAPLE provided similar results, with median DSC of 0.886 and 0.871, and median SMASD of 0.50 and 0.72 mm, for spheres and irregular shapes, respectively. STAPLE had significantly higher DSC and lower SMASD values than MJV for spheres (DSC, p < 0.0001; SMASD, p = 0.0101), but MJV had significantly higher DSC and lower SMASD values compared to STAPLE for irregular shapes (DSC, p < 0.0001; SMASD, p = 0.0027). DSC was not significantly
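
    The majority-vote combination is simple to state: a voxel is kept when more than half of the individual segmentations include it. A minimal sketch (flattened 1-D binary masks for illustration, not the study's PET volumes), together with the Dice similarity coefficient used to score accuracy:

```python
def majority_vote(masks):
    # masks: one binary mask per segmentation method (flattened voxel lists)
    n = len(masks)
    return [1 if sum(voxels) > n / 2.0 else 0 for voxels in zip(*masks)]

def dice(a, b):
    # Dice similarity coefficient between two binary masks
    intersection = sum(1 for x, y in zip(a, b) if x and y)
    return 2.0 * intersection / (sum(a) + sum(b))

# three toy "methods" disagreeing on a 4-voxel object
combined = majority_vote([[1, 1, 0, 0], [1, 1, 1, 0], [1, 0, 0, 0]])
# → [1, 1, 0, 0]: only voxels picked by at least 2 of 3 methods survive
```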

  6. Socratic Method as an Approach to Teaching

    Directory of Open Access Journals (Sweden)

    Haris Delić

    2016-10-01

    In this article we present a theoretical view of Socrates' life and his method of teaching. After the biographical facts of Socrates' life, we explain the method he used in teaching and its two main variants, the Classic and the Modern Socratic Method. Since the core of Socrates' approach is dialogue as a form of teaching, we explain how exactly a Socratic dialogue proceeds. Besides that, we present two examples of dialogues that Socrates led, Meno and Gorgias. The Socratic circle, a seminar format that is crucial for group discussions of a given theme, is also one of the aspects presented in this paper. At the end, some disadvantages of the method are explained. With this paper, the reader can get a conception of this approach to teaching and can use Socrates as an example of how a successful teacher leads his students towards the goal.

  7. A suggested approach toward measuring sorption and applying sorption data to repository performance assessment

    International Nuclear Information System (INIS)

    Rundberg, R.S.

    1992-01-01

    The prediction of radionuclide migration for the purpose of assessing the safety of a nuclear waste repository will be based on a collective knowledge of the hydrologic and geochemical properties of the surrounding rock and groundwater. This knowledge, along with assumptions about the interactions of radionuclides with groundwater and minerals, forms the scientific basis for a model capable of accurately predicting the repository's performance. Because the interaction of radionuclides in geochemical systems is known to be complicated, several fundamental and empirical approaches to measuring the interaction between radionuclides and the geologic barrier have been developed. The approaches applied to the measurement of sorption involve the use of pure minerals and intact or crushed rock in dynamic and static experiments. Each approach has its advantages and disadvantages. There is no single best method for providing sorption data for performance assessment models which can be applied without invoking information derived from multiple experiments. 53 refs., 12 figs
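
    As a small illustration of how a sorption measurement typically feeds a performance assessment model (a standard linear-isotherm relation, not a method specific to this report), a measured distribution coefficient Kd enters transport calculations through the retardation factor R = 1 + (rho_b/theta)*Kd:

```python
def retardation_factor(kd, bulk_density, porosity):
    # R = 1 + (rho_b / theta) * Kd, for a linear (Kd) sorption isotherm;
    # kd in mL/g, bulk_density in g/cm^3, porosity dimensionless
    return 1.0 + (bulk_density / porosity) * kd

# hypothetical values: Kd = 10 mL/g, rho_b = 1.6 g/cm^3, porosity = 0.3
r = retardation_factor(10.0, 1.6, 0.3)
```

    A radionuclide with this R moves roughly R times slower than the groundwater; R = 1 (Kd = 0) recovers a non-sorbing tracer.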

  8. Parallelised Krylov subspace method for reactor kinetics by IQS approach

    International Nuclear Information System (INIS)

    Gupta, Anurag; Modak, R.S.; Gupta, H.P.; Kumar, Vinod; Bhatt, K.

    2005-01-01

    Nuclear reactor kinetics involves the numerical solution of the space-time-dependent multi-group neutron diffusion equation. Two distinct approaches exist for this purpose: the direct (implicit time differencing) approach and the improved quasi-static (IQS) approach. Both approaches need the solution of static space-energy-dependent diffusion equations at successive time-steps, the step being relatively smaller for the direct approach. These solutions are usually obtained by Gauss-Seidel type iterative methods. For a faster solution, Krylov subspace methods have been tried and also parallelised by many investigators. However, these studies seem to have been done only for the direct approach. In the present paper, parallelised Krylov methods are applied to the IQS approach in addition to the direct approach. It is shown that the speed-up obtained for IQS is higher than that for the direct approach. The reasons for this are also discussed. Thus, the use of the IQS approach along with parallelised Krylov solvers seems to be a promising scheme
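
    A minimal sketch of the Krylov idea (illustrative only, not the authors' parallelised solver): the conjugate gradient method, the simplest Krylov subspace method for symmetric positive definite systems, applied to a small 1-D finite-difference operator standing in for the static diffusion problem:

```python
def matvec_diffusion(x):
    # action of the tridiagonal [-1, 2, -1] operator: a 1-D finite-difference
    # stand-in for a static diffusion (leakage) operator
    n = len(x)
    y = []
    for i in range(n):
        v = 2.0 * x[i]
        if i > 0:
            v -= x[i - 1]
        if i < n - 1:
            v -= x[i + 1]
        y.append(v)
    return y

def conjugate_gradient(matvec, b, tol=1e-16, max_iter=100):
    # CG builds the solution inside the Krylov subspace span{b, Ab, A^2 b, ...}
    x = [0.0] * len(b)
    r = list(b)                      # residual b - A x0, with x0 = 0
    p = list(r)
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        if rs < tol:
            break
        Ap = matvec(p)
        alpha = rs / sum(pi * ai for pi, ai in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

flux = conjugate_gradient(matvec_diffusion, [1.0] * 5)  # solve A x = 1 on 5 nodes
```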

  9. A systematic method for characterizing the time-range performance of ground penetrating radar

    International Nuclear Information System (INIS)

    Strange, A D

    2013-01-01

    The fundamental performance of ground penetrating radar (GPR) is linked to the ability to measure the signal time-of-flight in order to provide an accurate radar-to-target range estimate. Having knowledge of the actual time range and timing nonlinearities of a trace is therefore important when seeking to make quantitative range estimates. However, very few practical methods have been formally reported in the literature to characterize GPR time-range performance. This paper describes a method to accurately measure the true time range of a GPR to provide a quantitative assessment of the timing system performance and detect and quantify the effects of timing nonlinearity due to timing jitter. The effect of varying the number of samples per trace on the true time range has also been investigated and recommendations on how to minimize the effects of timing errors are described. The approach has been practically applied to characterize the timing performance of two commercial GPR systems. The importance of the method is that it provides the GPR community with a practical method to readily characterize the underlying accuracy of GPR systems. This in turn leads to enhanced target depth estimation as well as facilitating the accuracy of more sophisticated GPR signal processing methods. (paper)

  10. Performance Assessment Method for a Forged Fingerprint Detection Algorithm

    Science.gov (United States)

    Shin, Yong Nyuo; Jun, In-Kyung; Kim, Hyun; Shin, Woochang

    The threat of invasion of privacy and of the illegal appropriation of information both increase with the expansion of the biometrics service environment to open systems. However, while certificates or smart cards can easily be cancelled and reissued if found to be missing, there is no way to recover the unique biometric information of an individual following a security breach. With the recognition that this threat factor may disrupt the large-scale civil service operations approaching implementation, such as electronic ID cards and e-Government systems, many agencies and vendors around the world continue to develop forged fingerprint detection technology, but no objective performance assessment method has, to date, been reported. Therefore, in this paper, we propose a methodology designed to evaluate the objective performance of the forged fingerprint detection technology that is currently attracting a great deal of attention.

  11. A mixed-methods approach to systematic reviews.

    Science.gov (United States)

    Pearson, Alan; White, Heath; Bath-Hextall, Fiona; Salmond, Susan; Apostolo, Joao; Kirkpatrick, Pamela

    2015-09-01

    There are an increasing number of published single-method systematic reviews that focus on different types of evidence related to a particular topic. As policy makers and practitioners seek clear directions for decision-making from systematic reviews, it is likely that it will be increasingly difficult for them to identify 'what to do' if they are required to find and understand a plethora of syntheses related to a particular topic. Mixed-methods systematic reviews are designed to address this issue and have the potential to produce systematic reviews of direct relevance to policy makers and practitioners. On the basis of the recommendations of the Joanna Briggs Institute International Mixed Methods Reviews Methodology Group in 2012, the Institute adopted a segregated approach to mixed-methods synthesis as described by Sandelowski et al., which consists of separate syntheses of each component method of the review. The Joanna Briggs Institute's mixed-methods synthesis of the findings of the separate syntheses uses a Bayesian approach to translate the findings of the initial quantitative synthesis into qualitative themes and to pool these with the findings of the initial qualitative synthesis.

  12. Nutrition and culture in professional football. A mixed method approach.

    Science.gov (United States)

    Ono, Mutsumi; Kennedy, Eileen; Reeves, Sue; Cronin, Linda

    2012-02-01

    An adequate diet is essential for the optimal performance of professional football (soccer) players. Existing studies have shown that players fail to consume such a diet, without interrogating the reasons for this. The aim of this study was to explore the difficulties professional football players experience in consuming a diet for optimal performance. It utilized a mixed method approach, combining nutritional intake assessment with qualitative interviews, to ascertain both what was consumed and the wider cultural factors that affect consumption. The study found a high variability in individual intake which ranged widely from 2648 to 4606 kcal/day. In addition, the intake of carbohydrate was significantly lower than that recommended. The study revealed that the main food choices for carbohydrate and protein intake were pasta and chicken respectively. Interview results showed the importance of tradition within the world of professional football in structuring the players' approach to nutrition. In addition, the players' personal eating habits that derived from their class and national habitus restricted their food choice by conflicting with the dietary choices promoted within the professional football clubs. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Methods of counting ribs on chest CT: the modified sternomanubrial approach

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Kyung Sik; Kim, Sung Jin; Jeon, Min Hee; Lee, Seung Young; Bae, Il Hun [Chungbuk National University, Cheongju (Korea, Republic of)

    2007-08-15

    The purpose of this study was to evaluate the accuracy of each method of counting ribs on chest CT and to propose a new method: the anterior approach using the sternocostal joints. CT scans of 38 rib lesions in 27 patients were analyzed (fracture: 25, metastasis: 11, benign bone disease: 2). Each lesion was independently counted by three radiologists using three different methods for counting ribs: the sternoclavicular approach, the xiphisternal approach and the modified sternomanubrial approach. For the evaluation of each method, the rib lesions were divided into three parts according to the location of the lesion as follows: the upper part (between the first and fourth thoracic vertebrae), the middle part (between the fifth and eighth) and the lower part (between the ninth and twelfth). The most accurate method was the modified sternomanubrial approach (99.1%). The accuracies of the xiphisternal approach and the sternoclavicular approach were 95.6% and 88.6%, respectively. The modified sternomanubrial approach showed the highest accuracy in all three parts (100%, 100% and 97.9%, respectively). We propose a new method for counting ribs, the modified sternomanubrial approach, which was more accurate than the known methods in all parts of the bony thorax, and it may be an easier and quicker method than the others in clinical practice.

  14. Blended Approach to Occupational Performance (BAOP): Guidelines Enabling Children with Autism

    Directory of Open Access Journals (Sweden)

    Jordan M. Skowronski

    2017-01-01

    The performance of daily activities is impacted by motor impairments in children with autism spectrum disorders (ASD). Research has recently demonstrated the prevalence and specificity of motor impairments in people with ASD. The motor learning of individuals with ASD is partially intact, and evidence suggests that a method to alter skill learning and repeated practice of motor sequences might be beneficial. Aiming to use this knowledge to guide occupational therapy interventions, initial guidelines for children with ASD blending Cognitive Orientation to daily Occupational Performance (CO-OP) with virtual reality (VR) were created. An expert panel reviewed the initial guidelines. The results from the semi-structured expert panel discussion were to (a) increase the number of sessions, (b) provide more visuals to children, and (c) use VR as a reinforcer. The guidelines were revised accordingly. The revised guidelines, called the Blended Approach to Occupational Performance (BAOP), are ready for further testing.

  15. Comparison of Different Approaches to Predict the Performance of Pumps As Turbines (PATs)

    Directory of Open Access Journals (Sweden)

    Mauro Venturini

    2018-04-01

    This paper deals with the comparison of different methods which can be used for the prediction of the performance curves of pumps as turbines (PATs). The considered approaches are four, i.e., one physics-based simulation model (a “white box” model), two “gray box” models, which integrate theory on turbomachines with specific data correlations, and one “black box” model. In more detail, the modeling approaches are: (1) a physics-based simulation model developed by the same authors, which includes the equations for estimating head, power, and efficiency and uses loss coefficients and specific parameters; (2) a model developed by Derakhshan and Nourbakhsh, which first predicts the best efficiency point of a PAT and then reconstructs the complete characteristic curves by means of two ad hoc equations; (3) the prediction model developed by Singh and Nestmann, which predicts the complete turbine characteristics based on pump shape and size; (4) an Evolutionary Polynomial Regression model, which represents a data-driven hybrid scheme which can be used for identifying the explicit mathematical relationship between PAT and pump curves. All approaches are applied to literature data, relying on both pump and PAT performance curves of head, power, and efficiency over the entire range of operation. The experimental data were provided by Derakhshan and Nourbakhsh for four different turbomachines, working in both pump and PAT mode, with specific speed values in the range 1.53–5.82. This paper provides a quantitative assessment of the predictions made by means of the considered approaches and also analyzes their consistency from a physical point of view. Advantages and drawbacks of each method are also analyzed and discussed.

  16. Adopting a blended learning approach to teaching evidence based medicine: a mixed methods study

    Science.gov (United States)

    2013-01-01

    Background Evidence Based Medicine (EBM) is a core unit delivered across many medical schools. Few studies have investigated the most effective method of teaching a course in EBM to medical students. The objective of this study was to identify whether a blended-learning approach to teaching EBM is more effective than a didactic-based approach at increasing medical student competency in EBM. Methods A mixed-methods study was conducted, consisting of a controlled trial and focus groups with second year graduate medical students. Students received the EBM course delivered using either a didactic approach (DID) to learning EBM or a blended-learning approach (BL). Student competency in EBM was assessed using the Berlin tool and a criterion-based assessment task, with student perceptions of the interventions assessed qualitatively. Results A total of 61 students (85.9%) participated in the study. Competency in EBM did not differ between the groups when assessed using the Berlin tool (p = 0.29). Students using the BL approach performed significantly better in one of the criterion-based assessment tasks (p = 0.01) and reported significantly higher self-perceived competence in critical appraisal skills. Qualitative analysis identified that students had a preference for the EBM course to be delivered using the BL approach. Conclusions Implementing a blended-learning approach to EBM teaching promotes greater student appreciation of EBM principles within the clinical setting. Integrating a variety of teaching modalities and approaches can increase student self-confidence and assist in bridging the gap between the theory and practice of EBM. PMID:24341502

  17. Evaluating operator performance on full-scope simulators: A pragmatic approach to an intractable measurement problem

    International Nuclear Information System (INIS)

    Fuld, R.

    1989-01-01

    Industry trends toward full-scope, plant-referenced control room simulators have accelerated. The cost of such training is high, but the cost of training ineffectiveness is even higher if it permits serious errors or operator disqualification to occur. Effective measures of operator performance are needed, but the complexity of the task environment and the many aspects of and requirements for operator performance conspire to make such measurement a challenging problem. Combustion Engineering (C-E) Owners' Group task No. 572 was undertaken to develop a tractable and effective methodology for evaluating team performance in a requalification context on full-scope simulator scenarios. The following concepts were pursued as design goals for the method: 1. validity; 2. sensitivity; 3. reliability; 4. usability. In addition, the resulting approach was to meet the requirements of ES-601, Implementation Guidance of the NRC for Administration of Requalifying Exams. A survey of existing evaluation tools and techniques was made to determine the strengths and weaknesses of each. Based on those findings, a multimethod approach was developed drawing on the combined strengths of several general methods. The paper discusses procedural milestones, comments as subjective ratings, failure criteria, and tracked plant parameters

  18. Development of exploratory approach for scenario analysis in the performance assessment of geological disposal

    International Nuclear Information System (INIS)

    Makino, Hitoshi; Ishiguro, Katsuhiko; Umeki, Hiroyuki; Oyamada, Kiyoshi; Takase, Hiroyasu; Grindrod, Peter

    1998-01-01

    It becomes difficult to apply ordinary methods for scenario analysis as the number of processes and the complexity of their interrelations increase. To address this problem, an exploratory approach that can perform scenario analysis on a wider range of problems was developed. The approach includes ensemble runs of a mass transport model, developed as a generic and flexible model that can cover the effects of various processes on mass transport, and analysis of the sensitivity structure between the input and output spaces of the ensemble runs. The techniques of clustering and principal component analysis were applied in the approach. As a result of a test application, the applicability of the approach for identifying important processes from among a large number of processes in a systematic and objective manner was confirmed. (author)
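
    The clustering step can be illustrated with a toy sketch (hypothetical scalar outputs, not the authors' model): splitting ensemble-run outputs into two groups, so that the runs driving the extreme outcomes can be traced back to their input processes:

```python
def two_means(values, max_iter=100):
    # simple 2-means on scalar ensemble outputs (e.g. peak doses);
    # assumes at least two distinct values
    c1, c2 = min(values), max(values)
    low = high = []
    for _ in range(max_iter):
        low = [v for v in values if abs(v - c1) <= abs(v - c2)]
        high = [v for v in values if abs(v - c1) > abs(v - c2)]
        new1, new2 = sum(low) / len(low), sum(high) / len(high)
        if (new1, new2) == (c1, c2):
            break  # cluster centers stabilized
        c1, c2 = new1, new2
    return low, high

# five hypothetical ensemble runs: three benign, two with much larger outputs
runs_low, runs_high = two_means([0.10, 0.20, 0.15, 5.0, 4.8])
```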

  19. PERFORMANCE EVALUATION OF TURKISH TYPE A MUTUAL FUNDS AND PENSION STOCK FUNDS BY USING TOPSIS METHOD

    Directory of Open Access Journals (Sweden)

    Nesrin ALPTEKIN

    2009-07-01

    In this paper, the performance of Turkish Type A mutual funds and pension stock funds is evaluated by using the TOPSIS method, which is a multicriteria decision making approach. Both of these fund types hold stocks in their portfolios, so they can be compared with each other. Generally, mutual or pension funds are evaluated according to their risk and return. For this purpose, traditional fund performance measurement techniques such as the Sharpe ratio, Sortino ratio, Treynor index and Jensen’s alpha are used. The TOPSIS method takes all of these fund performance measurement techniques into consideration and provides a more reasonable performance measurement.

  20. Coupling model and solving approach for performance evaluation of natural draft counter-flow wet cooling towers

    Directory of Open Access Journals (Sweden)

    Wang Wei

    2016-01-01

    When searching for the optimum condenser cooling water flow in a thermal power plant with natural draft cooling towers, it is essential to evaluate the outlet water temperature of the cooling towers when the cooling water flow and inlet water temperature change. However, the air outlet temperature and the tower draft (or inlet air velocity) are strongly coupled in natural draft cooling towers. Traditional methods, such as the trial and error method, the graphic method and iterative methods, are not simple and efficient enough for plant practice. In this paper, we combine the Merkel equation with the draft equation and develop a coupled description for the performance evaluation of natural draft cooling towers. This model has two inputs, the cooling water flow and the inlet cooling water temperature, and two outputs, the outlet water temperature and the inlet air velocity (equivalent to the tower draft). In this model, we furthermore put forward a soft-sensing algorithm to calculate the total drag coefficient instead of using empirical correlations. Finally, we design an iterative approach to solve this coupled model, and illustrate three cases to show that the coupled model and solving approach proposed in our paper are effective for cooling tower performance evaluation.
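
    The alternating structure of such an iterative solution can be sketched generically as follows (the closure functions here are made-up contractive stand-ins, not the paper's actual Merkel and draft equations):

```python
def solve_coupled(merkel, draft, t0, v0, tol=1e-9, max_iter=1000):
    # alternate between the two coupled relations until both variables settle
    t, v = t0, v0
    for _ in range(max_iter):
        t_new = merkel(v)       # outlet water temperature given the current draft
        v_new = draft(t_new)    # inlet air velocity (draft) given that temperature
        if abs(t_new - t) < tol and abs(v_new - v) < tol:
            return t_new, v_new
        t, v = t_new, v_new
    raise RuntimeError("fixed-point iteration did not converge")

# hypothetical contractive stand-ins: warmer draft air cools the water less,
# and a warmer outlet slightly strengthens the draft
t_out, v_air = solve_coupled(lambda v: 30.0 + 5.0 / (1.0 + v),
                             lambda t: 0.1 * t, 35.0, 3.0)
```

    Convergence is only guaranteed when the alternation is a contraction; stiffer couplings typically need under-relaxation or a Newton-type update.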

  1. The strategic selecting criteria and performance by using the multiple criteria method

    Directory of Open Access Journals (Sweden)

    Lisa Y. Chen

    2008-02-01

    Given the increasing competitive intensity in the current service market, organizational capabilities have been recognized as important for sustaining competitive advantage. The pursuit of profitable growth has fueled a need for firms to systematically assess and renew the organization. The purpose of this study is to analyze the financial performance of firms in order to create an effective evaluation structure for Taiwan's service industry. This study utilized the TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) method to evaluate the operating performance of 12 companies. TOPSIS is a multiple criteria decision making method that identifies solutions from a finite set of alternatives based upon simultaneous minimization of the distance from an ideal point and maximization of the distance from a nadir point. Using this approach, this study measures the financial performance of firms through two aspects and ten indicators. The results indicated that e-life had outstanding performance among the 12 retailers. The findings of this study help managers better understand their market position, competition, and profitability for future strategic planning and operational management.
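
    The TOPSIS calculation itself is compact. A minimal sketch (with hypothetical fund scores, not the study's data): normalize the decision matrix, apply the weights, and score each alternative by its relative closeness to the ideal point:

```python
import math

def topsis(matrix, weights, benefit):
    # matrix[i][j]: score of alternative i on criterion j;
    # benefit[j]: True if higher is better on criterion j, False for cost criteria
    m = len(weights)
    # vector-normalize each criterion column, then apply the weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(m)]
    v = [[weights[j] * row[j] / norms[j] for j in range(m)] for row in matrix]
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    nadir = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)   # distance from the ideal point
        d_neg = math.dist(row, nadir)   # distance from the nadir point
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# hypothetical fund scores on two benefit criteria (say, Sharpe and Sortino ratios)
scores = topsis([[1.2, 1.5], [0.8, 1.0], [1.0, 1.3]], [0.5, 0.5], [True, True])
```

    A dominating alternative scores 1 and a dominated one scores 0, so the score doubles as a ranking index.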

  2. Support vector methods for survival analysis: a comparison between ranking and regression approaches.

    Science.gov (United States)

    Van Belle, Vanya; Pelckmans, Kristiaan; Van Huffel, Sabine; Suykens, Johan A K

    2011-10-01

    To compare and evaluate ranking, regression and combined machine learning approaches for the analysis of survival data. The literature describes two approaches based on support vector machines to deal with censored observations. In the first approach the key idea is to rephrase the task as a ranking problem via the concordance index, a problem which can be solved efficiently in the context of structural risk minimization and convex optimization techniques. In the second approach, one uses a regression approach, dealing with censoring by means of inequality constraints. The goal of this paper is then twofold: (i) introducing a new model combining the ranking and regression strategies, which retains the link with existing survival models such as the proportional hazards model via transformation models; and (ii) comparison of the three techniques on 6 clinical and 3 high-dimensional datasets and discussion of the relevance of these techniques over classical approaches for survival data. We compare SVM-based survival models based on ranking constraints, models based on regression constraints, and models based on both ranking and regression constraints. The performance of the models is compared by means of three different measures: (i) the concordance index, measuring the model's discriminating ability; (ii) the logrank test statistic, indicating whether patients with a prognostic index lower than the median prognostic index have a significantly different survival than patients with a prognostic index higher than the median; and (iii) the hazard ratio after normalization to restrict the prognostic index between 0 and 1. Our results indicate a significantly better performance for models including regression constraints over models only based on ranking constraints. This work gives empirical evidence that SVM-based models using regression constraints perform significantly better than SVM-based models based on ranking constraints. Our experiments show a comparable performance for methods
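
    The concordance index, the first of the three measures, can be computed directly from survival times, censoring indicators and predicted risk scores. A minimal sketch of the standard definition (not the authors' code):

```python
def concordance_index(times, events, risk):
    # pair (i, j) is comparable when times[i] < times[j] and subject i's failure
    # was observed (events[i] == 1); the pair is concordant when the earlier
    # failure has the higher predicted risk; ties in risk count as half
    concordant, comparable = 0.0, 0
    for i in range(len(times)):
        for j in range(len(times)):
            if times[i] < times[j] and events[i] == 1:
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / comparable

# hypothetical cohort: the second subject is censored, risks perfectly ordered
c = concordance_index([1.0, 2.0, 3.0, 4.0], [1, 0, 1, 1], [4.0, 3.0, 2.0, 1.0])
```

    A value of 0.5 corresponds to random ranking and 1.0 to a perfectly discriminating model.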

  3. A new approach to performance assessment of barriers in a repository. Executive summary, draft, technical appendices. Final report

    International Nuclear Information System (INIS)

    Mueller-Hoeppe, N.; Krone, J.; Niehues, N.; Poehler, M.; Raitz von Frentz, R.; Gauglitz, R.

    1999-06-01

    Multi-barrier systems are accepted as the basic approach for the long-term, environmentally safe isolation of radioactive waste in geological repositories. Assessing the performance of natural and engineered barriers is one of the major difficulties in producing evidence of environmental safety for any radioactive waste disposal facility, due to the enormous complexity of the scenarios and uncertainties to be considered. This report outlines a new methodological approach originally developed for a repository in salt, but which can be transferred with minor modifications to any other host rock formation. The approach is based on the integration of the following elements: (1) implementation of a simple method and efficient criteria to assess and prove the tightness of geological and engineered barriers; (2) use of the method of Partial Safety Factors in order to assess barrier performance at a reasonable level of confidence; (3) integration of a diverse geochemical barrier in the near field of waste emplacement, systematically limiting the radiological consequences of any radionuclide release in safety investigations; and (4) a risk-based approach for the assessment of radionuclide releases. Indicative calculations performed with extremely conservative assumptions make it possible to exclude any radiological health consequences from a HLW repository in salt to a reference person with a safety level of 99.9999% per year. (orig.)

  4. A novel hybrid MCDM model for performance evaluation of research and technology organizations based on BSC approach.

    Science.gov (United States)

    Varmazyar, Mohsen; Dehghanbaghi, Maryam; Afkhami, Mehdi

    2016-10-01

    The Balanced Scorecard (BSC) is a strategic evaluation tool that uses both financial and non-financial indicators to determine the business performance of organizations or companies. In this paper, a new integrated approach based on the BSC and multi-criteria decision making (MCDM) methods is proposed to evaluate the performance of the research centers of a research and technology organization (RTO) in Iran. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method is employed to reflect the interdependencies among the BSC perspectives. Then, the Analytic Network Process (ANP) is utilized to weight the indices influencing the considered problem. In the next step, we apply four MCDM methods, including Additive Ratio Assessment (ARAS), Complex Proportional Assessment (COPRAS), Multi-Objective Optimization by Ratio Analysis (MOORA), and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), for the ranking of alternatives. Finally, the utility interval technique is applied to combine the ranking results of the MCDM methods. Weighted utility intervals are computed by constructing a correlation matrix between the ranking methods. A real case is presented to show the efficacy of the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Measuring energy performance with sectoral heterogeneity: A non-parametric frontier approach

    International Nuclear Information System (INIS)

    Wang, H.; Ang, B.W.; Wang, Q.W.; Zhou, P.

    2017-01-01

    Evaluating economy-wide energy performance is an integral part of assessing the effectiveness of a country's energy efficiency policy. The non-parametric frontier approach has been widely used by researchers for such a purpose. This paper proposes an extended non-parametric frontier approach to studying economy-wide energy efficiency and productivity performance by accounting for sectoral heterogeneity. Relevant techniques from index number theory are incorporated to quantify the driving forces behind changes in the economy-wide energy productivity index. The proposed approach facilitates flexible modelling of different sectors' production processes, and helps to examine the sectors' impact on the aggregate energy performance. A case study of China's economy-wide energy efficiency and productivity performance in its 11th five-year plan period (2006–2010) is presented. It is found that sectoral heterogeneities in terms of energy performance are significant in China. Meanwhile, China's economy-wide energy productivity increased slightly during the study period, driven mainly by technical efficiency improvement. A number of other findings have also been reported. - Highlights: • We model economy-wide energy performance by considering sectoral heterogeneity. • The proposed approach can identify sectors' impact on the aggregate energy performance. • Obvious sectoral heterogeneities are identified in evaluating China's energy performance.

  6. A Neuro-Fuzzy Approach in the Classification of Students’ Academic Performance

    Directory of Open Access Journals (Sweden)

    Quang Hung Do

    2013-01-01

    Full Text Available Classifying the student academic performance with high accuracy facilitates admission decisions and enhances educational services at educational institutions. The purpose of this paper is to present a neuro-fuzzy approach for classifying students into different groups. The neuro-fuzzy classifier used previous exam results and other related factors as input variables and labeled students based on their expected academic performance. The results showed that the proposed approach achieved a high accuracy. The results were also compared with those obtained from other well-known classification approaches, including support vector machine, Naive Bayes, neural network, and decision tree approaches. The comparative analysis indicated that the neuro-fuzzy approach performed better than the others. It is expected that this work may be used to support student admission procedures and to strengthen the services of educational institutions.

  7. Approaches for University Students and their Relationship to Academic Performance

    Directory of Open Access Journals (Sweden)

    Evelyn Fernández-Castillo

    2015-05-01

    Full Text Available The way students perceive learning is influenced by multiple factors. The present study aimed at establishing relationships between learning approaches, academic performance, and the academic year in a sample of students from different courses of the Universidad Central “Marta Abreu” de Las Villas. For this ex post facto study, a probabilistic sample of 524 university students, drawn by simple random sampling, completed the Study Process Questionnaire. The analyses of variance (MANOVA and ANOVA) and cluster analysis revealed associations between a deep approach to learning and better academic performance. These analyses also showed differences in the learning approach across courses, with a surface approach predominating.

  8. EVALUATION OF WOOD PERFORMANCE IN BUILDING CONSTRUCTION THROUGH SYSTEM APPROACH

    Directory of Open Access Journals (Sweden)

    Ricardo Pedreschi

    2005-09-01

    Full Text Available Building construction is considered the leading market for the wood industry in both developed and developing countries. Most of the wood produced in Brazil is consumed as firewood and energy, followed by cellulose production, with machined wood third. The use of wood from planted forests can be increased; this would lead to a better use of natural resources and, consequently, to greater sustainability of forest activity in many regions of the country. The performance of wood can be examined from several perspectives: symbolic performance, technical performance, and economical performance, here investigated through the method of systems approach to architecture. Uses of wood were related to the performance of the material, with a redefinition of the parameters of use, outlining a new culture linked to new technologies. This work diagnosed the use of wood in building construction based on systems analysis. Through an opinion survey on the acceptance of wood, we examined the possibilities of its use according to physical and mechanical properties, aesthetic and appearance performance, and post-occupation evaluation. From the results obtained on the culture and knowledge about the use of wood from planted forests, it can be concluded that there is not enough knowledge in this area, and it is therefore necessary to create an information system for professionals and for the general public.

  9. Paper Prototyping: The Surplus Merit of a Multi-Method Approach

    Directory of Open Access Journals (Sweden)

    Stephanie Bettina Linek

    2015-07-01

    Full Text Available This article describes a multi-method approach for usability testing. The approach combines paper prototyping and think-aloud with two supplemental methods: advanced scribbling and a handicraft task. The method of advanced scribbling instructs the participants to use different colors for marking important, unnecessary and confusing elements in a paper prototype. In the handicraft task the participants have to tinker a paper prototype of their wish version. Both methods deliver additional information on the needs and expectations of the potential users and provide helpful indicators for clarifying complex or contradictory findings. The multi-method approach and its surplus benefit are illustrated by a pilot study on the redesign of the homepage of a library 2.0. The findings provide positive evidence for the applicability of the advanced scribbling and the handicraft task as well as for the surplus merit of the multi-method approach. The article closes with a discussion and outlook. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs150379

  10. Critical factors in the empirical performance of temporal difference and evolutionary methods for reinforcement learning

    NARCIS (Netherlands)

    Whiteson, S.; Taylor, M.E.; Stone, P.

    2010-01-01

    Temporal difference and evolutionary methods are two of the most common approaches to solving reinforcement learning problems. However, there is little consensus on their relative merits and there have been few empirical studies that directly compare their performance. This article aims to address

  11. Performance of some numerical Laplace inversion methods on American put option formula

    Science.gov (United States)

    Octaviano, I.; Yuniar, A. R.; Anisa, L.; Surjanto, S. D.; Putri, E. R. M.

    2018-03-01

    Numerical inversion approaches for the Laplace transform are used to obtain semianalytic solutions. Mathematical inversion methods such as those of Durbin-Crump, Widder, and Papoulis can be used to calculate American put options through the optimal exercise price in Laplace space. The methods are first compared on some simple functions to establish their accuracy and the parameters to be used in the calculation of American put options. The result obtained is the performance of each method in terms of accuracy and computational speed: the Durbin-Crump method has an average relative error of 2.006e-004 with a computational speed of 0.04871 seconds, the Widder method has an average relative error of 0.0048 with a computational speed of 3.100181 seconds, and the Papoulis method has an average relative error of 9.8558e-004 with a computational speed of 0.020793 seconds.
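    The flavour of these Fourier-series inversion methods can be sketched with a minimal Durbin-type (Dubner-Abate) implementation, tested on a transform pair with a known inverse. The parameter choices below are illustrative defaults, not those of the paper.

```python
import cmath
import math

def durbin_invert(F, t, T=None, N=4000, A=18.4):
    """Approximate f(t) from its Laplace transform F(s) using the
    Durbin/Dubner-Abate trapezoidal Fourier-series formula:

        f(t) ~ (e^{gt}/T) [ F(g)/2 + sum_k Re( F(g + ik*pi/T) e^{ik*pi*t/T} ) ]

    valid for 0 < t < 2T; A controls the discretization error (~ e^{-A}).
    """
    if T is None:
        T = 5.0 * t                   # place t well inside (0, 2T)
    gamma = A / (2.0 * T)             # Bromwich contour shift
    total = F(gamma).real / 2.0
    for k in range(1, N + 1):
        s = complex(gamma, k * math.pi / T)
        total += (F(s) * cmath.exp(1j * k * math.pi * t / T)).real
    return math.exp(gamma * t) / T * total

# Check against a pair with a known inverse: F(s) = 1/(s+1)  <-->  f(t) = e^{-t}.
approx = durbin_invert(lambda s: 1.0 / (s + 1.0), 1.0)
```

    The truncation level N trades accuracy for speed, which is precisely the accuracy/computational-speed trade-off the abstract reports for the three methods.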

  12. A combined approach for the enhancement and segmentation of mammograms using modified fuzzy C-means method in wavelet domain.

    Science.gov (United States)

    Srivastava, Subodh; Sharma, Neeraj; Singh, S K; Srivastava, R

    2014-07-01

    In this paper, a combined approach for enhancement and segmentation of mammograms is proposed. In the preprocessing stage, a contrast limited adaptive histogram equalization (CLAHE) method is applied to obtain better-contrast mammograms. After this, the proposed combined methods are applied. In the first step of the proposed approach, a two-dimensional (2D) discrete wavelet transform (DWT) is applied to all the input images. In the second step, a proposed nonlinear complex diffusion based unsharp masking and crispening method is applied on the approximation coefficients of the wavelet transformed images to further highlight abnormalities such as micro-calcifications, tumours, etc., and to reduce the false positives (FPs). Thirdly, a modified fuzzy c-means (FCM) segmentation method is applied on the output of the second step. In the modified FCM method, mutual information is proposed as a similarity measure in place of the conventional Euclidean distance based dissimilarity measure for FCM segmentation. Finally, the inverse 2D-DWT is applied. The efficacy of the proposed unsharp masking and crispening method for image enhancement is evaluated in terms of signal-to-noise ratio (SNR) and that of the proposed segmentation method is evaluated in terms of Rand index (RI), global consistency error (GCE), and variation of information (VoI). The performance of the proposed segmentation approach is compared with other commonly used segmentation approaches such as Otsu's thresholding, texture based, k-means, and FCM clustering as well as thresholding. From the obtained results, it is observed that the proposed segmentation approach performs better and takes less processing time in comparison to the standard FCM and the other segmentation methods under consideration.
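    The clustering core of this pipeline, standard fuzzy c-means with the conventional Euclidean dissimilarity (i.e. without the paper's mutual-information modification), can be sketched as follows on toy pixel intensities:

```python
import numpy as np

def fcm(X, c, m=2.0, n_iter=100):
    """Standard fuzzy c-means on rows of X (n_samples, n_features).

    Uses the Euclidean dissimilarity that the paper replaces with mutual
    information; the fuzzifier m > 1 controls cluster softness.
    Returns (centres, U) with membership matrix U of shape (c, n_samples).
    """
    X = np.asarray(X, dtype=float)
    # Deterministic initialisation: spread centres across the data range.
    centres = np.linspace(X.min(axis=0), X.max(axis=0), c)
    for _ in range(n_iter):
        # Distance of every sample to every centre, floored to avoid 0-division.
        d = np.linalg.norm(X[None, :, :] - centres[:, None, :], axis=2)
        d = np.maximum(d, 1e-12)
        U = d ** (-2.0 / (m - 1.0))
        U /= U.sum(axis=0)                       # memberships sum to 1 per sample
        W = U ** m
        centres = (W @ X) / W.sum(axis=1, keepdims=True)
    return centres, U

# Toy 1-D "image": two intensity populations (background vs lesion pixels).
X = np.concatenate([np.linspace(0.05, 0.15, 50),
                    np.linspace(0.85, 0.95, 50)])[:, None]
centres, U = fcm(X, c=2)
```

    Replacing the distance `d` with a mutual-information-based dissimilarity, as the paper proposes, changes only the membership update; the alternating structure of the algorithm stays the same.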

  13. Optimization of the gas turbine-modular helium reactor using statistical methods to maximize performance without compromising system design margins

    International Nuclear Information System (INIS)

    Lommers, L.J.; Parme, L.L.; Shenoy, A.S.

    1995-07-01

    This paper describes a statistical approach for determining the impact of system performance and design uncertainties on power plant performance. The objectives of this design approach are to ensure that adequate margin is provided, that excess margin is minimized, and that full advantage can be taken of unconsumed margin. It is applicable to any thermal system in which these factors are important. The method is demonstrated using the Gas Turbine Modular Helium Reactor as an example. The quantitative approach described allows the characterization of plant performance and the specification of the system design requirements necessary to achieve the desired performance with high confidence. Performance variations due to design evolution, inservice degradation, and basic performance uncertainties are considered. The impact of all performance variabilities is combined using Monte Carlo analysis to predict the range of expected operation
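    The Monte Carlo combination of independent performance variabilities described above can be sketched as follows. The nominal output and the uncertainty magnitudes are invented for illustration; they are not GT-MHR design values.

```python
import random

def net_power_samples(n=100_000, seed=42):
    """Sample net plant output under independent normal perturbations of
    turbine, compressor, and recuperator performance (illustrative numbers).
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        turbine = rng.gauss(1.00, 0.010)      # relative turbine efficiency
        compressor = rng.gauss(1.00, 0.012)   # relative compressor efficiency
        recuperator = rng.gauss(1.00, 0.008)  # relative recuperator effectiveness
        samples.append(286.0 * turbine * compressor * recuperator)  # MWe nominal
    return sorted(samples)

samples = net_power_samples()
# Margin statement: the output level exceeded with 95% confidence.
p5 = samples[int(0.05 * len(samples))]
```

    Comparing such a percentile against the rated requirement shows directly how much margin is consumed by uncertainty and how much remains available, which is the design question the paper addresses.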

  14. Performance improvement integration: a whole systems approach.

    Science.gov (United States)

    Page, C K

    1999-02-01

    Performance improvement integration in health care organizations is a challenge for health care leaders. Required for accreditation by the Joint Commission on Accreditation of Healthcare Organizations (Joint Commission), performance improvement (PI) can be designed as a sustainable model for performance to survive in a turbulent period. Central Baptist Hospital developed a model for PI that focused on strategy established by the leadership team, delineated responsibility through the organizational structure of shared governance, and accountability for outcomes evidenced through the organization's profitability. Such an approach integrated into the culture of the organization can produce positive financial margins, positive customer satisfaction, and commendations from the Joint Commission.

  15. Application of controllable unit approach (CUA) to performance-criterion-based nuclear material control and accounting

    International Nuclear Information System (INIS)

    Foster, K.W.; Rogers, D.R.

    1979-01-01

    The Nuclear Regulatory Commission is considering the use of maximum-loss performance criteria as a means of controlling SNM in nuclear plants. The Controllable Unit Approach to material control and accounting (CUA) was developed by Mound to determine the feasibility of controlling a plant to a performance criterion. The concept was tested with the proposed Anderson, SC, mixed-oxide plant, and it was shown that CUA is indeed a feasible method for controlling a complex process to a performance criterion. The application of CUA to an actual low-enrichment plant to assist the NRC in establishing performance criteria for uranium processes is discussed. 5 refs

  16. Performing Systematic Literature Reviews with Novices: An Iterative Approach

    Science.gov (United States)

    Lavallée, Mathieu; Robillard, Pierre-N.; Mirsalari, Reza

    2014-01-01

    Reviewers performing systematic literature reviews require understanding of the review process and of the knowledge domain. This paper presents an iterative approach for conducting systematic literature reviews that addresses the problems faced by reviewers who are novices in one or both levels of understanding. This approach is derived from…

  17. Transactional approach in assessment of operational performance of companies in transport infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Dubrovsky, V.; Yaroshevich, N.; Kuzmin, E.

    2016-07-01

    This paper offers an alternative method to assess the operational performance of companies in the transport infrastructure of a region by comparing transaction costs. The method is intended to be cross-functional and potentially applicable to the analysis of economic entities of different orders (country, region, sector, company) when evaluating the “viscosity”/complexity of the outside and the inside. The paper includes an analysis of various methodological approaches to assessing the development level of the transport infrastructure in a region. Within the authors' approach, and for the purposes of the research, an index of transaction capacity (the transactionalness index) is proposed, which measures the level of transaction costs relative to the cost of production and revenue. The approach is piloted using region-wise consolidated financial data of companies involved in the Russian transport infrastructure for 2005/2013. The proposed alternative way to measure corporate operating efficiency has proved its academic consistency. A comparison of transaction costs using the transactionalness index allows, first, the identification of companies or regions/sectors with excess complexity of economic communication in bargaining. Second, the index points indirectly not only to the degree of development of the institutional environment but also of the infrastructure (in this example, the transport infrastructure). Third, the transactionalness level may indicate uncertainty and risks. As an addition to the theoretical and methodological aspects of transaction costs, the authors justify an approach to estimating their size, as well as their differentiation into two groups: those of a natural type and those of a background type. In the course of the discussion, the authors conclude that some transaction costs are, in a manner of speaking, standard. There is a discussion whether it is scientifically reasonable to use an

  18. Framework for benchmarking online retailing performance using fuzzy AHP and TOPSIS method

    Directory of Open Access Journals (Sweden)

    M. Ahsan Akhtar Hasin

    2012-08-01

    Full Text Available Due to the increasing penetration of internet connectivity, on-line retail is growing from the pioneer phase towards increasing integration within people's lives and companies' normal business practices. In this increasingly competitive environment, on-line retail service providers require a systematic and structured approach to gain a cutting edge over rivals. The use of benchmarking has thus become indispensable for on-line retail service providers to accomplish superior performance. This paper uses the fuzzy analytic hierarchy process (FAHP) approach to support a generic on-line retail benchmarking process. Critical success factors for on-line retail service have been identified from a structured questionnaire and the literature, and prioritized using fuzzy AHP. Using these critical success factors, the performance level of ORENET, an on-line retail service provider, is benchmarked along with four other on-line service providers using the TOPSIS method. Based on the benchmark, their relative ranking is also illustrated.
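    The TOPSIS ranking step can be sketched as follows. The decision matrix and weights are invented for illustration; in the paper the criteria and their weights come from the fuzzy-AHP stage.

```python
import numpy as np

def topsis(X, w, benefit):
    """Rank alternatives with TOPSIS.

    X: (alternatives x criteria) decision matrix; w: criteria weights;
    benefit: True for benefit criteria, False for cost criteria.
    Returns closeness coefficients in [0, 1]; higher is better.
    """
    X = np.asarray(X, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    V = X / np.linalg.norm(X, axis=0) * np.asarray(w, dtype=float)
    benefit = np.asarray(benefit)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)   # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)    # distance to anti-ideal
    return d_neg / (d_pos + d_neg)

# Three providers scored on service quality (benefit) and delivery time (cost).
scores = topsis([[8, 2], [6, 1], [4, 3]], w=[0.6, 0.4], benefit=[True, False])
ranking = np.argsort(scores)[::-1]  # best first
```

    The closeness coefficient rewards alternatives that are simultaneously near the ideal and far from the anti-ideal, which is why the third provider (worst on both criteria) scores exactly zero here.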

  19. An approach for prediction of petroleum production facility performance considering Arctic influence factors

    International Nuclear Information System (INIS)

    Gao Xueli; Barabady, Javad; Markeset, Tore

    2010-01-01

    As the oil and gas (O and G) industry is increasing the focus on petroleum exploration and development in the Arctic region, it is becoming increasingly important to design exploration and production facilities to suit the local operating conditions. The cold and harsh climate, the long distance from customer and suppliers' markets, and the sensitive environment may have considerable influence on the choice of design solutions and production performance characteristics such as throughput capacity, reliability, availability, maintainability, and supportability (RAMS) as well as operational and maintenance activities. Due to this, data and information collected for similar systems used in a normal climate may not be suitable. Hence, it is important to study and develop methods for prediction of the production performance characteristics during the design and operation phases. The aim of this paper is to present an approach for prediction of the production performance for oil and gas production facilities considering influencing factors in Arctic conditions. The proportional repair model (PRM) is developed in order to predict repair rate in Arctic conditions. The model is based on the proportional hazard model (PHM). A simple case study is used to demonstrate how the proposed approach can be applied.
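    A proportional model of this kind scales a baseline rate by an exponential term in the influence factors, in the manner of the proportional hazard model the PRM builds on. The covariates and coefficients below are invented for illustration, not estimated values from the paper.

```python
import math

def proportional_rate(base_rate, betas, covariates):
    """Proportional-hazards-style adjustment of a baseline repair rate:

        mu(t | z) = mu0(t) * exp( sum_j beta_j * z_j )

    betas: regression coefficients; covariates: influence-factor levels z_j
    (e.g. temperature severity, logistic-delay index; 0 = normal climate).
    """
    return base_rate * math.exp(sum(b * z for b, z in zip(betas, covariates)))

# Baseline repair rate estimated from normal-climate data (repairs/hour),
# adjusted for two illustrative Arctic influence factors. Negative
# coefficients lower the repair rate, i.e. repairs take longer.
mu_normal = proportional_rate(0.05, betas=[-0.4, -0.3], covariates=[0, 0])
mu_arctic = proportional_rate(0.05, betas=[-0.4, -0.3], covariates=[1, 1])
```

    The appeal of this form is exactly the one the paper exploits: data collected in a normal climate fix the baseline, while the covariate term captures the Arctic-specific degradation.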

  20. Direct torque control method applied to the WECS based on the PMSG and controlled with backstepping approach

    Science.gov (United States)

    Errami, Youssef; Obbadi, Abdellatif; Sahnoun, Smail; Ouassaid, Mohammed; Maaroufi, Mohamed

    2018-05-01

    This paper proposes a Direct Torque Control (DTC) method for a Wind Power System (WPS) based on a Permanent Magnet Synchronous Generator (PMSG), combined with a Backstepping approach. In this work, a generator-side converter and a grid-side converter with filter are used as the interface between the wind turbine and the grid. The Backstepping approach demonstrates great performance in the control of complicated nonlinear systems such as the WPS. The control method therefore combines DTC, to achieve Maximum Power Point Tracking (MPPT), with the Backstepping approach, to sustain the DC-bus voltage and to regulate the grid-side power factor. In addition, the control strategy is developed in the sense of the Lyapunov stability theorem for the WPS. Simulation results using MATLAB/Simulink validate the effectiveness of the proposed controllers.
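    The MPPT part of such a scheme is often realized with a textbook optimal-torque law, which can serve as the torque reference fed to the DTC loop. The turbine parameters below are illustrative, not those of the paper's simulation model.

```python
import math

RHO = 1.225       # air density, kg/m^3
R = 2.0           # blade radius, m (illustrative)
CP_MAX = 0.48     # maximum power coefficient (illustrative)
LAMBDA_OPT = 8.1  # optimal tip-speed ratio (illustrative)

# Optimal-torque MPPT gain: T_ref = k_opt * omega^2
K_OPT = 0.5 * RHO * math.pi * R**5 * CP_MAX / LAMBDA_OPT**3

def torque_reference(omega):
    """Generator torque command that tracks the maximum power point."""
    return K_OPT * omega**2

# Sanity check: when the rotor runs at the optimal tip-speed ratio, the
# extracted power T*omega equals the aerodynamic optimum 0.5*rho*pi*R^2*Cp_max*v^3.
v = 10.0                      # wind speed, m/s
omega = LAMBDA_OPT * v / R    # rotor speed at the optimum
p_track = torque_reference(omega) * omega
p_max = 0.5 * RHO * math.pi * R**2 * CP_MAX * v**3
```

    The attraction of this law is that it needs no wind-speed measurement: commanding torque proportional to the square of the measured rotor speed makes the optimal tip-speed ratio a stable equilibrium.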

  1. A method for optimizing the performance of buildings

    DEFF Research Database (Denmark)

    Pedersen, Frank

    2007-01-01

    This thesis describes a method for optimizing the performance of buildings. Design decisions made in early stages of the building design process have a significant impact on the performance of buildings, for instance, the performance with respect to the energy consumption, economical aspects, and the indoor environment. The method is intended for supporting design decisions for buildings, by combining methods for calculating the performance of buildings with numerical optimization methods. The method is able to find optimum values of decision variables representing different features of the building… needed for solving the optimization problem. Furthermore, the algorithm uses so-called domain constraint functions in order to ensure that the input to the simulation software is feasible. Using this technique avoids performing time-consuming simulations for unrealistic design decisions. The algorithm…

  2. The intervals method: a new approach to analyse finite element outputs using multivariate statistics

    Directory of Open Access Journals (Sweden)

    Jordi Marcé-Nogué

    2017-10-01

    Full Text Available Background In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches.

  3. The intervals method: a new approach to analyse finite element outputs using multivariate statistics

    Science.gov (United States)

    De Esteban-Trivigno, Soledad; Püschel, Thomas A.; Fortuny, Josep

    2017-01-01

    Background In this paper, we propose a new method, named the intervals’ method, to analyse data from finite element models in a comparative multivariate framework. As a case study, several armadillo mandibles are analysed, showing that the proposed method is useful to distinguish and characterise biomechanical differences related to diet/ecomorphology. Methods The intervals’ method consists of generating a set of variables, each one defined by an interval of stress values. Each variable is expressed as a percentage of the area of the mandible occupied by those stress values. Afterwards these newly generated variables can be analysed using multivariate methods. Results Applying this novel method to the biological case study of whether armadillo mandibles differ according to dietary groups, we show that the intervals’ method is a powerful tool to characterize biomechanical performance and how this relates to different diets. This allows us to positively discriminate between specialist and generalist species. Discussion We show that the proposed approach is a useful methodology not affected by the characteristics of the finite element mesh. Additionally, the positive discriminating results obtained when analysing a difficult case study suggest that the proposed method could be a very useful tool for comparative studies in finite element analysis using multivariate statistical approaches. PMID:29043107
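    The variable-generation step of the intervals' method can be sketched as follows: bin per-element stress values into fixed intervals and express each interval as a percentage of total area. The element data below are a toy example; real inputs would be von Mises stresses and areas exported from an FEA mesh.

```python
import numpy as np

def interval_variables(stress, area, breaks):
    """Percentage of total element area falling in each stress interval.

    stress: stress value per finite element; area: element areas;
    breaks: interval boundaries, e.g. [0, 1, 2, 3] defines 3 intervals.
    Returns one row of the multivariate data matrix (the percentages sum
    to 100 when all stresses fall within the covered range).
    """
    stress = np.asarray(stress, dtype=float)
    area = np.asarray(area, dtype=float)
    pct = []
    for lo, hi in zip(breaks[:-1], breaks[1:]):
        mask = (stress >= lo) & (stress < hi)
        pct.append(100.0 * area[mask].sum() / area.sum())
    return np.array(pct)

# Toy model: four elements with unequal areas (one mandible = one row).
row = interval_variables(stress=[0.5, 1.5, 1.7, 2.5],
                         area=[2.0, 1.0, 1.0, 1.0],
                         breaks=[0, 1, 2, 3])
```

    Weighting by element area rather than counting elements is what makes the resulting variables insensitive to the characteristics of the finite element mesh, as the papers emphasize; stacking one such row per specimen gives the matrix analysed with PCA or other multivariate methods.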

  4. An integrated lean-methods approach to hospital facilities redesign.

    Science.gov (United States)

    Nicholas, John

    2012-01-01

    Lean production methods for eliminating waste and improving processes in manufacturing are now being applied in healthcare. As the author shows, the methods are appropriate for redesigning hospital facilities. When used in an integrated manner and employing teams of mostly clinicians, the methods produce facility designs that are custom-fit to patient needs and caregiver work processes, and reduce operational costs. The author reviews lean methods and an approach for integrating them in the redesign of hospital facilities. A case example of the redesign of an emergency department shows the feasibility and benefits of the approach.

  5. A Mixed-Method Approach on Digital Educational Games for K12: Gender, Attitudes and Performance

    Science.gov (United States)

    Law, Effie Lai-Chong; Gamble, Tim; Schwarz, Daniel; Kickmeier-Rust, Michael D.; Holzinger, Andreas

    Research on the influence of gender on attitudes towards and performance in digital educational games (DEGs) has quite a long history. Generally, males tend to play such games more engagingly than females, consequently attitude and performance of males using DEGs should be presumably higher than that of females. This paper reports an investigation of a DEG, which was developed to enhance the acquisition of geographical knowledge, carried out on British, German and Austrian K12 students aged between 11 and 14. Methods include a survey on initial design concepts, user tests on the system and two single-gender focus groups. Gender and cultural differences in gameplay habit, game type preferences and game character perceptions were observed. The results showed that both genders similarly improved their geographical knowledge, although boys tended to have a higher level of positive user experience than the girls. The qualitative data from the focus groups illustrated some interesting gender differences in perceiving various aspects of the game.

  6. Systems engineering approach towards performance monitoring of emergency diesel generator

    International Nuclear Information System (INIS)

    Nurhayati Ramli; Lee, Y.K.

    2013-01-01

    Full-text: Systems engineering is an interdisciplinary approach and means to enable the realization of successful systems. In this study, a systems engineering approach towards performance monitoring of the Emergency Diesel Generator (EDG) is presented. Performance monitoring is part and parcel of predictive maintenance, whereby system and component conditions can be detected before they result in failures. In an effort to identify the proposal for addressing performance monitoring, the EDG boundary has been defined. Based on the Probabilistic Safety Analysis (PSA) results and industry operating experience, the most critical component is identified. This paper proposes a systems engineering concept development framework towards EDG performance monitoring. The expected output of this study is that EDG reliability can be improved by the performance monitoring alternatives through the systems engineering concept development effort. (author)

  7. Software performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2009-01-01

    Praise from the Reviewers: "The practicality of the subject in a real-world situation distinguishes this book from others available on the market."—Professor Behrouz Far, University of Calgary. "This book could replace the computer organization texts now in use that every CS and CpE student must take. . . . It is much needed, well written, and thoughtful."—Professor Larry Bernstein, Stevens Institute of Technology. A distinctive, educational text on software performance and scalability. This is the first book to take a quantitative approach to the subject of software performance and scalability

  8. Delineating species with DNA barcodes: a case of taxon dependent method performance in moths.

    Directory of Open Access Journals (Sweden)

    Mari Kekkonen

    Full Text Available The accelerating loss of biodiversity has created a need for more effective ways to discover species. Novel algorithmic approaches for analyzing sequence data, combined with rapidly expanding DNA barcode libraries, provide a potential solution. While several analytical methods are available for the delineation of operational taxonomic units (OTUs), few studies have compared their performance. This study compares the performance of one morphology-based and four DNA-based (BIN, parsimony networks, ABGD, GMYC) methods on two groups of gelechioid moths. It examines 92 species of Finnish Gelechiinae and 103 species of Australian Elachistinae which were delineated by traditional taxonomy. The results reveal a striking difference in performance between the two taxa with all four DNA-based methods. OTU counts in the Elachistinae showed a wider range and a relatively low (ca. 65%) OTU match with reference species, while OTU counts were more congruent and performance was higher (ca. 90%) in the Gelechiinae. Performance rose when only monophyletic species were compared, but the taxon-dependence remained. None of the DNA-based methods produced a correct match with non-monophyletic species, but singletons were handled well. A simulated test of morphospecies-grouping performed very poorly in revealing taxon diversity in these small, dull-colored moths. Despite the strong performance of analyses based on DNA barcodes, species delineated using single-locus mtDNA data are best viewed as OTUs that require validation by subsequent integrative taxonomic work.

  9. The method of educational assessment affects children's neural processing and performance: behavioural and fMRI Evidence

    Science.gov (United States)

    Howard, Steven J.; Burianová, Hana; Calleia, Alysha; Fynes-Clinton, Samuel; Kervin, Lisa; Bokosmaty, Sahar

    2017-08-01

    Standardised educational assessments are now widespread, yet their development has given comparatively more consideration to what to assess than how to optimally assess students' competencies. Existing evidence from behavioural studies with children and neuroscience studies with adults suggest that the method of assessment may affect neural processing and performance, but current evidence remains limited. To investigate the impact of assessment methods on neural processing and performance in young children, we used functional magnetic resonance imaging to identify and quantify the neural correlates during performance across a range of current approaches to standardised spelling assessment. Results indicated that children's test performance declined as the cognitive load of assessment method increased. Activation of neural nodes associated with working memory further suggests that this performance decline may be a consequence of a higher cognitive load, rather than the complexity of the content. These findings provide insights into principles of assessment (re)design, to ensure assessment results are an accurate reflection of students' true levels of competency.

  10. Ensemble approach combining multiple methods improves human transcription start site prediction

    LENUS (Irish Health Repository)

    Dineen, David G

    2010-11-30

    Abstract Background The computational prediction of transcription start sites is an important unsolved problem. Some recent progress has been made, but many promoters, particularly those not associated with CpG islands, are still difficult to locate using current methods. These methods use different features and training sets, along with a variety of machine learning techniques, and result in different prediction sets. Results We demonstrate the heterogeneity of current prediction sets, and take advantage of this heterogeneity to construct a two-level classifier ('Profisi Ensemble') using predictions from 7 programs, along with 2 other data sources. Support vector machines using 'full' and 'reduced' data sets are combined in an either/or approach. We achieve a 14% increase in performance over the current state-of-the-art, as benchmarked by a third-party tool. Conclusions Supervised learning methods are a useful way to combine predictions from diverse sources.
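    The either/or combination idea can be sketched as accepting a site if either classifier predicts it, trading some precision for recall. This toy example is not Profisi Ensemble's actual code; the positions and the base predictors' behaviour are invented for illustration.

```python
def either_or_combine(pred_a, pred_b):
    """Union ('either/or') combination of two predictors' site sets."""
    return set(pred_a) | set(pred_b)

def precision_recall(predicted, truth):
    """Standard set-based precision and recall."""
    tp = len(predicted & truth)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(truth) if truth else 0.0
    return precision, recall

# Toy example: true TSS positions and two base predictors with
# complementary strengths (e.g. CpG-trained vs non-CpG-trained models).
truth = {100, 250, 400, 610}
svm_full = {100, 250, 900}      # finds CpG-island promoters
svm_reduced = {400, 610, 120}   # recovers non-CpG promoters
combined = either_or_combine(svm_full, svm_reduced)
p, r = precision_recall(combined, truth)
```

    Here each base predictor alone recovers only half of the true sites, while the union recovers all of them at the cost of carrying both predictors' false positives, which is why such combinations pay off when the base prediction sets are heterogeneous.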

  11. Performance evaluation and ranking of direct sales stores using BSC approach and fuzzy multiple attribute decision-making methods

    Directory of Open Access Journals (Sweden)

    Mojtaba Soltannezhad Dizaji

    2017-07-01

    Full Text Available In an environment where markets are volatile and rapid, fundamental changes occur due to technological advances, it is important to ensure and maintain good performance measurement. In their performance evaluation, organizations should consider different types of financial and non-financial indicators. In systems like direct sales stores, in which decision units have multiple inputs and outputs, all criteria influencing performance must be combined and examined in one system simultaneously. The purpose of this study is to evaluate the performance of the direct sales stores of a firm named Shirin Asal with a combination of the Balanced Scorecard, fuzzy AHP, and TOPSIS, so that the weaknesses of subjectivity and selective consideration by evaluators in evaluating the performance indicators are reduced, and an integrated evaluation is provided by considering the contribution of each indicator and each indicator group of the balanced scorecard. This case study employs an applied research method. Data were collected through a questionnaire based on previous studies, expert opinions, and a study of documents in the organization. MATLAB and SPSS were used to analyze the data. The results show that the customer and financial perspectives are the most important for assessing the company's branches. Among the sub-criteria, the rate of new customer acquisition in the customer dimension and the ratio of net income to sales in the financial dimension are of the utmost importance.

  12. Comparison of two approaches for establishing performance criteria related to Maintenance Rule

    International Nuclear Information System (INIS)

    Jerng, Dong-Wook; Kim, Man Cheol

    2015-01-01

    Probabilistic safety assessment (PSA) serves as a tool for systematically analyzing the safety of nuclear power plants. This paper explains and compares two approaches for establishing performance criteria related to the Maintenance Rule: (1) the individual reliability-based approach, and (2) the PSA importance measure-based approach. The different characteristics of the two approaches were compared qualitatively, while a quantitative comparison was performed by applying the two approaches to a nuclear power plant. It was observed that the individual reliability-based approach resulted in more conservative performance criteria than the PSA importance measure-based approach. It is thus expected that the PSA importance measure-based approach will allow for a more flexible maintenance policy under conditions of limited resources, while providing a macroscopic view of overall plant safety. Based on insights derived through this analysis, we emphasize the importance of a balance between reliability and safety significance, and propose a balance measure accordingly. The conclusions of this analysis are likely to be applicable to other types of nuclear power plants. (author)
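The PSA importance measure-based approach relies on measures such as Fussell-Vesely (FV) and Risk Achievement Worth (RAW). A toy illustration with an assumed two-cut-set risk model (not the plant model used in the paper):

```python
# Toy risk model with two minimal cut sets, {A} and {A, B}; probabilities
# and coefficients are illustrative, not from any plant PSA.
def cdf(p_a, p_b):
    """Stand-in core damage frequency as a function of basic-event probabilities."""
    return p_a * 1.0e-4 + p_a * p_b * 1.0e-3

base = cdf(0.01, 0.05)                    # baseline risk
fv_a = (base - cdf(0.0, 0.05)) / base     # Fussell-Vesely importance of A
raw_b = cdf(0.01, 1.0) / base             # Risk Achievement Worth of B
```

Here A appears in every cut set, so its FV is 1, while RAW measures how much the risk rises if B is assumed failed.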

  13. Approach to the assessment of the performance of nondestructive test methods in the manufacture of nuclear power station equipment

    International Nuclear Information System (INIS)

    Michaut, J.P.

    1996-01-01

    The safety of a nuclear power station lies largely in the ability to ensure, at the time of in-service inspections of major equipment, that the extent of faults which may appear or develop is no greater than that of faults detrimental to behavior in service. This assurance is based on performance demonstration of the nondestructive test methods used for inspecting the equipment in service, which is the subject of numerous studies in various countries. To ensure that manufacturing faults likely to degrade the safety of the equipment are not first discovered in service, it seems desirable to make sure that the performance of the nondestructive test (NDT) methods used in manufacture is at least as high as that of the methods used in service, so that they are capable of guaranteeing detection of faults significantly smaller than genuinely harmful faults. The performance of NDT methods and their consistency with those that can be used in service is evaluated, before the start of manufacture, on a mock-up representative of the equipment itself. Information is given on research in progress on the bimetal welding of a pressurizer spray nozzle.

  14. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Directory of Open Access Journals (Sweden)

    Xuefei Guan

    2011-01-01

    Full Text Available In this paper, two probabilistic prognosis updating schemes are compared. One is based on the classical Bayesian approach and the other on the newly developed maximum relative entropy (MRE) approach. The algorithm performance of the two models is evaluated using a set of recently developed prognostics-based metrics. Various uncertainties from measurements, modeling, and parameter estimation are integrated into the prognosis framework as random input variables for the fatigue damage of materials. Measurements of response variables are then used to update the statistical distributions of the random variables, and the prognosis results are updated using posterior distributions. The Markov Chain Monte Carlo (MCMC) technique is employed to provide the posterior samples for model updating in the framework. Experimental data are used to demonstrate the operation of the proposed probabilistic prognosis methodology. A set of prognostics-based metrics is employed to quantitatively evaluate the prognosis performance and compare the proposed entropy method with the classical Bayesian updating algorithm. In particular, model accuracy, precision, robustness and convergence are rigorously evaluated, in addition to a qualitative visual comparison. Following this, potential developments and improvements for the prognostics-based metrics are discussed in detail.
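The Bayesian updating step via MCMC can be illustrated with a minimal Metropolis sampler; the observation model, prior, and parameter values below are hypothetical stand-ins for the paper's fatigue-damage model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical model: measured response = 2 * theta + noise; update theta.
obs = np.array([1.9, 2.1, 2.0, 2.2])   # synthetic measurements
sigma = 0.2                             # assumed measurement noise std

def log_post(theta):
    log_prior = -0.5 * ((theta - 1.0) / 0.5) ** 2          # prior N(1.0, 0.5^2)
    log_like = -0.5 * np.sum(((obs - 2.0 * theta) / sigma) ** 2)
    return log_prior + log_like

# Random-walk Metropolis sampler.
theta, chain = 1.0, []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.1)
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop                     # accept proposal
    chain.append(theta)

post = np.array(chain[1000:])            # posterior samples after burn-in
```

The posterior samples then feed the updated prognosis; the MRE scheme would replace the Bayesian update rule while keeping the sampling machinery.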

  15. Permutation statistical methods an integrated approach

    CERN Document Server

    Berry, Kenneth J; Johnston, Janis E

    2016-01-01

    This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
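The core permutation idea, recomputing a test statistic under random relabelings of the pooled data, can be shown in a few lines (illustrative samples; a real analysis would enumerate all permutations or use far more resamples):

```python
import numpy as np

rng = np.random.default_rng(2)

# Two illustrative samples; is the difference in means significant?
a = np.array([5.1, 4.8, 5.6, 5.3, 5.0])
b = np.array([4.2, 4.5, 4.1, 4.4, 4.0])
observed = a.mean() - b.mean()

pooled = np.concatenate([a, b])
n_perm, count = 10_000, 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)                 # random relabeling
    diff = perm[:5].mean() - perm[5:].mean()
    if abs(diff) >= abs(observed):                 # two-sided test
        count += 1
p_value = count / n_perm
```

No normality or equal-variance assumption is needed: the reference distribution comes entirely from the data at hand.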

  16. Performance evaluation of 2D and 3D deep learning approaches for automatic segmentation of multiple organs on CT images

    Science.gov (United States)

    Zhou, Xiangrong; Yamada, Kazuma; Kojima, Takuya; Takayama, Ryosuke; Wang, Song; Zhou, Xinxin; Hara, Takeshi; Fujita, Hiroshi

    2018-02-01

    The purpose of this study is to evaluate and compare the performance of modern deep learning techniques for automatically recognizing and segmenting multiple organ regions in 3D CT images. CT image segmentation is one of the important tasks in medical image analysis and is still very challenging. Deep learning approaches have demonstrated capabilities in scene recognition and semantic segmentation of natural images and have been used to address segmentation problems in medical images. Although several works have shown promising results for CT image segmentation using deep learning, there has been no comprehensive evaluation of deep learning segmentation performance across multiple organs on different portions of CT scans. In this paper, we evaluated and compared the segmentation performance of two deep learning approaches using 2D and 3D deep convolutional neural networks (CNNs), with and without a pre-processing step. A conventional approach representing the state of the art in CT image segmentation without deep learning was also used for comparison. A dataset of 240 CT images covering different portions of the human body was used for performance evaluation. Up to 17 types of organ regions in each CT scan were segmented automatically and compared to human annotations using the intersection over union (IU) as the criterion. The experimental results showed mean IUs of 79% and 67%, averaged over the 17 organ types, for the 3D and 2D deep CNNs, respectively. All results from the deep learning approaches showed better accuracy and robustness than the conventional segmentation method based on probabilistic atlas and graph-cut methods. The effectiveness and usefulness of deep learning approaches were demonstrated for the multiple-organ segmentation problem in 3D CT images.
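The IU criterion used here is the standard intersection-over-union between predicted and annotated masks; a minimal sketch on toy binary masks:

```python
import numpy as np

# Toy binary masks on an 8x8 slice: predicted vs. annotated organ region.
pred = np.zeros((8, 8), dtype=bool)
true_mask = np.zeros((8, 8), dtype=bool)
pred[2:6, 2:6] = True        # 16 predicted pixels
true_mask[3:7, 3:7] = True   # 16 annotated pixels

inter = np.logical_and(pred, true_mask).sum()   # overlapping pixels
union = np.logical_or(pred, true_mask).sum()    # pixels in either mask
iou = inter / union
```

The same computation extends directly to 3D volumes and per-organ label maps by comparing one label at a time.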

  17. Reforming High School Science for Low-Performing Students Using Inquiry Methods and Communities of Practice

    Science.gov (United States)

    Bolden, Marsha Gail

    Some schools fall short of the high demand to increase science scores on state exams because low-performing students enter high school unprepared for high school science. Low-performing students are not successful in high school for many reasons; however, using inquiry methods has improved students' understanding of science concepts. The purpose of this qualitative research study was to investigate teachers' lived experiences with using inquiry methods to motivate low-performing high school science students in an inquiry-based program called Xtreem Science. Fifteen teachers were selected from the Xtreem Science program, a program designed to assist teachers in motivating struggling science students. The research questions involved understanding (a) teachers' experiences in using inquiry methods, (b) challenges teachers face in using inquiry methods, and (c) how teachers describe students' responses to inquiry methods. The data collection and analysis strategy included capturing and understanding the teachers' feelings, perceptions, and attitudes in their lived experience of teaching with inquiry methods and of motivating struggling students. Analysis of interview responses revealed that teachers had good experiences with inquiry, expressed that inquiry shaped their teaching style and approach to topics, and felt that using inquiry methods improved student learning. Inquiry gave low-performing students opportunities to catch up and learn information that moved them to the next level of science courses. Implications for positive social change include providing teachers and school district leaders with information to help improve the performance of low-performing science students.

  18. Approaches and methods for econometric analysis of market power

    DEFF Research Database (Denmark)

    Perekhozhuk, Oleksandr; Glauben, Thomas; Grings, Michael

    2017-01-01

    This study discusses two widely used approaches in the New Empirical Industrial Organization (NEIO) literature and examines the strengths and weaknesses of the Production-Theoretic Approach (PTA) and the General Identification Method (GIM) for the econometric analysis of market power in agricultural and food markets. We provide a framework that may help researchers to evaluate and improve structural models of market power. Starting with the specification of the approaches in question, we compare published empirical studies of market power with respect to the choice of the applied approach, functional forms, estimation methods and derived estimates of the degree of market power. Thereafter, we use our framework to evaluate several structural models based on PTA and GIM to measure oligopsony power in the Ukrainian dairy industry. The PTA-based results suggest that the estimated parameters ...

  19. A new approach for reliability analysis with time-variant performance characteristics

    International Nuclear Information System (INIS)

    Wang, Zequn; Wang, Pingfeng

    2013-01-01

    Reliability represents the safety level in industry practice and may vary due to time-variant operating conditions and component deterioration throughout a product's life-cycle. Thus, the capability to perform time-variant reliability analysis is of vital importance in practical engineering applications. This paper presents a new approach, referred to as nested extreme response surface (NERS), that can efficiently tackle the time dependency issue in time-variant reliability analysis and enables such problems to be solved by easily integrating with advanced time-independent tools. The key of the NERS approach is to build a nested response surface of the time corresponding to the extreme value of the limit state function by employing a Kriging model. To obtain the data for the Kriging model, the efficient global optimization technique is integrated with NERS to extract the extreme time responses of the limit state function for any given system input. An adaptive response prediction and model maturation mechanism is developed based on mean square error (MSE) to concurrently improve the accuracy and computational efficiency of the proposed approach. With the nested response surface of time, the time-variant reliability analysis can be converted into a time-independent reliability analysis, and existing advanced reliability analysis methods can be used. Three case studies are used to demonstrate the efficiency and accuracy of the NERS approach.
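The Kriging step at the heart of NERS can be illustrated with a minimal Gaussian-process interpolator in one dimension; the training inputs and "extreme response" values below are synthetic stand-ins, and the full NERS machinery (EGO sampling, MSE-based model maturation) is omitted:

```python
import numpy as np

# Synthetic "extreme response" samples at five design points.
x_train = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
g_train = np.sin(x_train) - 0.3 * x_train

def kern(a, b, ell=0.7):
    """Squared-exponential covariance between point sets a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

K = kern(x_train, x_train) + 1e-8 * np.eye(x_train.size)  # jitter for stability
alpha = np.linalg.solve(K, g_train)

def predict(x_new):
    """Kriging mean prediction at new design points."""
    return kern(np.atleast_1d(x_new), x_train) @ alpha

g_hat = predict(0.75)[0]   # interpolated extreme response between samples
```

Once the extreme response is captured by such a surrogate, time-independent reliability methods can operate on the surrogate directly.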

  20. Enhanced Portfolio Performance Using a Momentum Approach to Annual Rebalancing

    OpenAIRE

    Michael D. Mattei

    2018-01-01

    After diversification, periodic portfolio rebalancing has become one of the most widely practiced methods for reducing portfolio risk and enhancing returns. Most of the rebalancing strategies found in the literature are generally regarded as contrarian approaches to rebalancing. A recent article proposed a rebalancing approach that incorporates momentum. The momentum approach had a better risk-adjusted return than either the traditional approach or a Buy-and-Hold app...
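The contrast between contrarian (reset-to-target) and momentum (tilt-toward-winner) rebalancing can be sketched on hypothetical two-asset returns; the tilt rule and numbers below are illustrative, not those of the cited article:

```python
import numpy as np

# Hypothetical two-asset illustration (stocks, bonds) over three years.
returns = np.array([[0.10, 0.02],
                    [0.08, 0.01],
                    [-0.05, 0.03]])

def grow(rebalance):
    """Compound portfolio value, applying a rebalancing rule each year."""
    value, w = 1.0, np.array([0.6, 0.4])
    for r in returns:
        value *= w @ (1.0 + r)
        w = rebalance(w, r)
    return value

# Contrarian rule: reset to the 60/40 target every year.
contrarian = grow(lambda w, r: np.array([0.6, 0.4]))

# Momentum rule: tilt 10 points toward last year's winner.
def momentum(w, r):
    tilt = 0.1 if r[0] > r[1] else -0.1
    return np.array([0.6 + tilt, 0.4 - tilt])

momentum_val = grow(momentum)
```

Which rule wins depends on whether returns trend or mean-revert over the rebalancing horizon.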

  1. An approach to build knowledge base for reactor accident diagnostic system using statistical method

    International Nuclear Information System (INIS)

    Kohsaka, Atsuo; Yokobayashi, Masao; Matsumoto, Kiyoshi; Fujii, Minoru

    1988-01-01

    In the development of a rule-based expert system, one of the key issues is how to build a knowledge base (KB). A systematic approach has been attempted for building an objective KB efficiently. The approach is based on the concept that a prototype KB should first be generated in a systematic way and then modified and/or improved by experts for practical use. The statistical method of Factor Analysis was applied to build a prototype KB for the JAERI expert system DISKET, using source information obtained from a PWR simulator. The prototype KB was obtained, and inference with this KB was performed against several types of transients. In each diagnosis, the transient type was well identified. From this study, it is concluded that the statistical method used is useful for building a prototype knowledge base. (author)

  2. Adopting a blended learning approach to teaching evidence based medicine: a mixed methods study.

    Science.gov (United States)

    Ilic, Dragan; Hart, William; Fiddes, Patrick; Misso, Marie; Villanueva, Elmer

    2013-12-17

    Evidence Based Medicine (EBM) is a core unit delivered across many medical schools. Few studies have investigated the most effective method of teaching a course in EBM to medical students. The objective of this study was to identify whether a blended-learning approach to teaching EBM is more effective than a didactic approach at increasing medical student competency in EBM. A mixed-methods study was conducted, consisting of a controlled trial and focus groups with second-year graduate medical students. Students received the EBM course delivered using either a didactic approach (DID) or a blended-learning approach (BL). Student competency in EBM was assessed using the Berlin tool and a criterion-based assessment task, with student perceptions of the interventions assessed qualitatively. A total of 61 students (85.9%) participated in the study. Competency in EBM did not differ between the groups when assessed using the Berlin tool (p = 0.29). Students using the BL approach performed significantly better in one of the criterion-based assessment tasks (p = 0.01) and reported significantly higher self-perceived competence in critical appraisal skills. Qualitative analysis identified that students preferred the EBM course to be delivered using the BL approach. Implementing a blended-learning approach to EBM teaching promotes greater student appreciation of EBM principles within the clinical setting. Integrating a variety of teaching modalities and approaches can increase student self-confidence and assist in bridging the gap between the theory and practice of EBM.

  3. Three-dimensional vision enhances task performance independently of the surgical method.

    Science.gov (United States)

    Wagner, O J; Hagen, M; Kurmann, A; Horgan, S; Candinas, D; Vorburger, S A

    2012-10-01

    Within the next few years, the medical industry will launch increasingly affordable three-dimensional (3D) vision systems for the operating room (OR). This study aimed to evaluate the effect of two-dimensional (2D) and 3D visualization on surgical skills and task performance. In this study, 34 individuals with varying laparoscopic experience (18 inexperienced individuals) performed three tasks to test spatial relationships, grasping and positioning, dexterity, precision, and hand-eye and hand-hand coordination. Each task was performed in 3D using binocular vision for open performance, the Viking 3Di Vision System for laparoscopic performance, and the DaVinci robotic system. The same tasks were repeated in 2D using an eye patch for monocular vision, conventional laparoscopy, and the DaVinci robotic system. Loss of 3D vision significantly increased the perceived difficulty of a task and the time required to perform it, independently of the approach. Task performance was better with the robot than with laparoscopy (P = 0.005). In every case, 3D robotic performance was superior to conventional laparoscopy (2D) (P < 0.001-0.015). The more complex the task, the more 3D vision accelerates task completion compared with 2D vision. The gain in task performance is independent of the surgical method.

  4. A combined volume-of-fluid method and low-Mach-number approach for DNS of evaporating droplets in turbulence

    Science.gov (United States)

    Dodd, Michael; Ferrante, Antonino

    2017-11-01

    Our objective is to perform DNS of finite-size droplets that are evaporating in isotropic turbulence. This requires fully resolving the momentum, heat, and mass transfer between the droplets and the surrounding gas. We developed a combined volume-of-fluid (VOF) method and low-Mach-number approach to simulate this flow. The two main novelties of the method are: (i) the VOF algorithm captures the motion of the liquid-gas interface in the presence of mass transfer due to evaporation and condensation without requiring a projection step for the liquid velocity, and (ii) the low-Mach-number approach allows for local volume changes caused by phase change while the total volume of the liquid-gas system is constant. The method is verified against an analytical solution for a Stefan flow problem, and the D2 law is verified for a single droplet in quiescent gas. We also demonstrate the scheme's robustness when performing DNS of an evaporating droplet in forced isotropic turbulence.
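The D2-law verification mentioned above can be mimicked numerically: for a droplet evaporating in quiescent gas, d(D²)/dt = −K, so D² should decay linearly in time. A sketch with illustrative parameter values (not those of the paper):

```python
# Illustrative parameters (not from the paper).
d0 = 100e-6              # initial droplet diameter, m
K = 1.0e-9               # evaporation rate constant, m^2/s
dt, steps = 1.0e-4, 1000

d = d0
for _ in range(steps):
    d -= K / (2.0 * d) * dt          # Euler step of dD/dt = -K / (2 D)

t_end = dt * steps
analytic_d2 = d0**2 - K * t_end      # linear decay predicted by the d^2 law
```

Checking that the integrated diameter squared tracks the linear analytic decay is the same style of verification the paper performs for its full solver.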

  5. Ab initio molecular dynamics, iterative methods and multiscale approaches in electronic structure calculations

    International Nuclear Information System (INIS)

    Bernholc, J.

    1998-01-01

    The field of computational materials physics has grown very quickly in the past decade, and it is now possible to simulate properties of complex materials completely from first principles. The presentation focused mostly on first-principles dynamic simulations. Such simulations were pioneered by Car and Parrinello, who introduced a method for performing realistic simulations within the context of density functional theory. The Car-Parrinello method and related plane wave approaches were reviewed in depth and illustrated with several applications: the dynamics of the C60 solid, diffusion across Si steps, and computing free energy differences. Alternative ab initio simulation schemes, which use preconditioned conjugate gradient techniques for energy minimization and dynamics, were also discussed.

  6. A Multi-Methods Approach to HRA and Human Performance Modeling: A Field Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David I Gertman

    2012-06-01

    The Advanced Test Reactor (ATR), a research reactor at the Idaho National Laboratory, is primarily designed and used to test materials for other, larger-scale and prototype reactors. The reactor offers various specialized systems and allows certain experiments to be run at their own temperature and pressure. The ATR Canal temporarily stores completed experiments and used fuel. It also has facilities for conducting underwater operations such as experiment examination or removal. In reviewing the ATR safety basis, a number of concerns were identified involving the ATR canal. A brief study identified ergonomic issues involving the manual handling of fuel elements in the canal that may increase the probability of human error and possible unwanted acute physical outcomes to the operator. In response to this concern, an analysis was conducted that refined the previous HRA scoping analysis by determining the probability of inadvertent exposure of a fuel element to air during fuel movement and inspection. The HRA analysis employed the SPAR-H method and was supplemented by information gained from a detailed analysis of the fuel inspection and transfer tasks. This latter analysis included ergonomics, work cycles, task duration, and the workload imposed by tool and workplace characteristics, personal protective clothing, and operational practices that have the potential to increase physical and mental workload. Part of this analysis consisted of NASA-TLX analyses combined with operational sequence analysis, computational human performance analysis (CHPA), and 3D graphical modeling to determine task failures and precursors to such failures that have safety implications. Experience in applying multiple analysis techniques in support of HRA methods is discussed.
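The SPAR-H quantification mentioned above multiplies a nominal human error probability by performance shaping factor (PSF) multipliers, with an adjustment that keeps the result a valid probability. A minimal sketch with illustrative values (not those of the ATR analysis); the adjustment formula is applied unconditionally here, whereas SPAR-H prescribes it for cases with several negative PSFs:

```python
# Illustrative values, not those of the ATR analysis.
NHEP = 0.01                        # nominal HEP for an action-type task
psfs = [2, 5, 1, 1, 2, 1, 1, 1]    # eight PSF multipliers (stress, ergonomics, ...)

composite = 1.0
for m in psfs:
    composite *= m                 # composite PSF

# SPAR-H adjustment keeps the adjusted HEP a valid probability.
hep = NHEP * composite / (NHEP * (composite - 1.0) + 1.0)
```

Without the adjustment, a large composite PSF could push the product NHEP × PSF above 1, which is not a meaningful probability.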

  7. A Multi-Methods Approach to HRA and Human Performance Modeling: A Field Assessment

    International Nuclear Information System (INIS)

    Hugo, Jacques; Gertman, David I.

    2012-01-01

    The Advanced Test Reactor (ATR), a research reactor at the Idaho National Laboratory, is primarily designed and used to test materials for other, larger-scale and prototype reactors. The reactor offers various specialized systems and allows certain experiments to be run at their own temperature and pressure. The ATR Canal temporarily stores completed experiments and used fuel. It also has facilities for conducting underwater operations such as experiment examination or removal. In reviewing the ATR safety basis, a number of concerns were identified involving the ATR canal. A brief study identified ergonomic issues involving the manual handling of fuel elements in the canal that may increase the probability of human error and possible unwanted acute physical outcomes to the operator. In response to this concern, an analysis was conducted that refined the previous HRA scoping analysis by determining the probability of inadvertent exposure of a fuel element to air during fuel movement and inspection. The HRA analysis employed the SPAR-H method and was supplemented by information gained from a detailed analysis of the fuel inspection and transfer tasks. This latter analysis included ergonomics, work cycles, task duration, and the workload imposed by tool and workplace characteristics, personal protective clothing, and operational practices that have the potential to increase physical and mental workload. Part of this analysis consisted of NASA-TLX analyses combined with operational sequence analysis, computational human performance analysis (CHPA), and 3D graphical modeling to determine task failures and precursors to such failures that have safety implications. Experience in applying multiple analysis techniques in support of HRA methods is discussed.

  8. A neuroanatomical approach to exploring organizational performance

    Directory of Open Access Journals (Sweden)

    Gillingwater, D.

    2009-01-01

    Full Text Available Insights gained from studying the human brain have begun to open up promising new areas of research in the behavioural and social sciences. Neuroscience-based principles have been incorporated into areas such as business management, economics and marketing, leading to the development of artificial neural networks, neuroeconomics, neuromarketing and, most recently, organizational cognitive neuroscience. Similarly, the brain has been used as a powerful metaphor for thinking about and analysing the nature of organizations. However, no existing approach to organizational analysis has taken advantage of contemporary neuroanatomical principles, thereby missing the opportunity to translate core neuroanatomical knowledge into other, non-related areas of research. In this essentially conceptual paper, we propose several ways in which neuroanatomical approaches could be used to enhance organizational theory, practice and research. We suggest that truly interdisciplinary and collaborative research between neuroanatomists and organizational analysts is likely to provide novel approaches to exploring and improving organizational performance.

  9. Reliability assessment of serviceability performance of braced retaining walls using a neural network approach

    Science.gov (United States)

    Goh, A. T. C.; Kulhawy, F. H.

    2005-05-01

    In urban environments, one major concern with deep excavations in soft clay is the potentially large ground deformations in and around the excavation. Excessive movements can damage adjacent buildings and utilities. There are many uncertainties associated with the calculation of the ultimate or serviceability performance of a braced excavation system. These include the variabilities of the loadings, geotechnical soil properties, and engineering and geometrical properties of the wall. A risk-based approach to serviceability performance failure is necessary to incorporate systematically the uncertainties associated with the various design parameters. This paper demonstrates the use of an integrated neural network-reliability method to assess the risk of serviceability failure through the calculation of the reliability index. By first performing a series of parametric studies using the finite element method and then approximating the non-linear limit state surface (the boundary separating the safe and failure domains) through a neural network model, the reliability index can be determined with the aid of a spreadsheet. Two illustrative examples are presented to show how the serviceability performance for braced excavation problems can be assessed using the reliability index.
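The reliability-index calculation can be sketched with Monte Carlo sampling; here a simple closed-form deflection model stands in for the trained neural-network surrogate, and all distributions and limits are illustrative:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)

# Random inputs (all values illustrative).
n = 200_000
cu = rng.normal(30.0, 4.5, n)            # soil strength parameter
ei = rng.normal(80.0, 8.0, n)            # wall stiffness parameter

# Stand-in for the trained surrogate: wall deflection in mm.
delta = 5000.0 / (cu * np.sqrt(ei))

g = 25.0 - delta                         # serviceability limit state: 25 mm
pf = np.mean(g < 0)                      # probability of exceeding the limit

def phi(x):                              # standard normal CDF
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Reliability index beta = -Phi^{-1}(pf), found by bisection on phi.
lo, hi = -8.0, 8.0
for _ in range(80):
    mid = 0.5 * (lo + hi)
    if phi(mid) < pf:
        lo = mid
    else:
        hi = mid
beta = -0.5 * (lo + hi)
```

In the paper, the neural network approximates the finite-element limit state surface so that this sampling step becomes cheap.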

  10. Systemic Approach to Architectural Performance

    Directory of Open Access Journals (Sweden)

    Marie Davidova

    2017-04-01

    Full Text Available First-hand experiences in several design projects based on media richness and collaboration are described in this article. Although complex design processes are usually considered socio-technical systems, they are deeply involved with natural systems. My collaborative research in the field of performance-oriented design combines digital and physical conceptual sketches, simulations and prototyping. GIGA-mapping is applied to organise the data. The design process uses the most suitable tools for the subtasks at hand, and the use of media is mixed according to particular requirements. These tools include digital and physical GIGA-mapping, parametric computer-aided design (CAD), digital simulation of analyses, as well as sampling and 1:1 prototyping. Also discussed in this article are the methodologies used in several design projects to strategize these tools, and the developments and trends in the tools employed. The paper argues that digital tools tend to produce similar results through given pre-sets that often do not correspond to real needs. Thus, there is a significant need for mixed methods, including prototyping, in the creative design process. Media mixing and cooperation across disciplines are unavoidable in a holistic approach to contemporary design, which includes the consideration of diverse biotic and abiotic agents. I argue that physical and digital GIGA-mapping is a crucial tool for coping with this complexity. Furthermore, I propose integrating physical and digital outputs in one GIGA-map, and the participation and co-design of biotic and abiotic agents in one rich design research space, resulting in an ever-evolving, time-based research-design process.

  11. Traction cytometry: regularization in the Fourier approach and comparisons with finite element method.

    Science.gov (United States)

    Kulkarni, Ankur H; Ghosh, Prasenjit; Seetharaman, Ashwin; Kondaiah, Paturu; Gundiah, Namrata

    2018-05-09

    Traction forces exerted by adherent cells are quantified from the displacements of embedded markers in polyacrylamide substrates due to cell contractility. Fourier Transform Traction Cytometry (FTTC) is widely used to calculate tractions but has inherent limitations due to errors in the displacement fields; these are mitigated through a regularization parameter (γ) in the Reg-FTTC method. An alternate finite element (FE) approach computes tractions on a domain using known boundary conditions. Robust verification and recovery studies are lacking but essential in assessing the accuracy and noise sensitivity of the traction solutions from the different methods. We implemented the L2 regularization method and defined the maximum curvature point in the traction versus γ plot as the optimal regularization parameter (γ*) in the Reg-FTTC approach. Traction reconstructions using γ* yield accurate values of low and maximum tractions (Tmax) in the presence of up to 5% noise. Reg-FTTC is hence a clear improvement over the FTTC method but is inadequate to reconstruct low stresses such as those at nascent focal adhesions. FE, implemented using a node-by-node comparison, showed an intermediate reconstruction compared to Reg-FTTC. We performed experiments using mouse embryonic fibroblasts (MEF) and compared results between these approaches. Tractions from FTTC and FE showed differences of ∼92% and 22%, respectively, as compared to Reg-FTTC. Selection of an optimum value of γ for each cell reduced variability in the computed tractions as compared to using a single value of γ for all the MEF cells in this study.
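The L2 (Tikhonov) regularization at the heart of Reg-FTTC can be shown on a 1D toy inversion; the transfer function below is a generic low-pass stand-in, not the Boussinesq Green's function used for real gel substrates:

```python
import numpy as np

rng = np.random.default_rng(4)

# 1D toy: displacements u are tractions t blurred by a transfer function Ghat.
n = 256
x = np.arange(n)
t_true = np.exp(-0.5 * ((x - 128) / 6.0) ** 2)    # localized traction peak
k = np.fft.fftfreq(n)
Ghat = 1.0 / (1.0 + (40.0 * k) ** 2)              # generic low-pass stand-in

u = np.fft.ifft(Ghat * np.fft.fft(t_true)).real   # noiseless displacements
u = u + rng.normal(0.0, 0.005, n)                 # add measurement noise

# L2 (Tikhonov) regularized inversion in Fourier space, as in Reg-FTTC.
gamma = 1.0e-3
t_reg = np.fft.ifft(np.conj(Ghat) * np.fft.fft(u)
                    / (np.abs(Ghat) ** 2 + gamma)).real

t_naive = np.fft.ifft(np.fft.fft(u) / Ghat).real  # unregularized inversion

err_reg = np.linalg.norm(t_reg - t_true) / np.linalg.norm(t_true)
err_naive = np.linalg.norm(t_naive - t_true) / np.linalg.norm(t_true)
```

The regularized filter damps the high-frequency modes where the transfer function is small and the naive division amplifies noise, which is exactly the trade-off the γ* selection in the paper tunes.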

  12. Solution of the neutron point kinetics equations with temperature feedback effects applying the polynomial approach method

    International Nuclear Information System (INIS)

    Tumelero, Fernanda; Petersen, Claudio Z.; Goncalves, Glenio A.; Lazzari, Luana

    2015-01-01

    In this work, we present a solution of the Neutron Point Kinetics Equations with temperature feedback effects applying the Polynomial Approach Method. For the solution, we consider one and six groups of delayed neutron precursors with temperature feedback effects and constant reactivity. The main idea is to expand the neutron density, delayed neutron precursors and temperature as a power series, considering the reactivity as an arbitrary function of time in a relatively short time interval around an ordinary point. In the first interval, one applies the initial conditions of the problem, and analytical continuation is used to determine the solutions of the next intervals. With the application of the Polynomial Approach Method, it is possible to overcome the stiffness problem of the equations. In this way, one varies the time step size of the Polynomial Approach Method and analyzes the precision and computational time. Moreover, we compare the method with different types of approximations (linear, quadratic and cubic) of the power series. The results for neutron density and temperature obtained by numerical simulations with the linear approximation are compared with results in the literature. (author)
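The power-series idea, advancing the solution by summing Taylor terms over short intervals, can be sketched for one delayed-neutron group with constant reactivity and no temperature feedback; parameters are illustrative, not those of the paper:

```python
import numpy as np

# One delayed-neutron group, constant reactivity; illustrative parameters.
rho, beta, lam, Lam = 0.001, 0.0065, 0.08, 1.0e-4

# Linear system y' = A y for y = (neutron density n, precursor density C).
A = np.array([[(rho - beta) / Lam, lam],
              [beta / Lam,        -lam]])
y = np.array([1.0, beta / (lam * Lam)])   # n(0) = 1, precursors at equilibrium

h, order, steps = 1.0e-3, 8, 1000          # step size, series order, t_end = 1 s
for _ in range(steps):
    term, y_next = y.copy(), y.copy()
    for k in range(1, order + 1):
        term = (A @ term) * (h / k)        # k-th Taylor term: A^k y h^k / k!
        y_next = y_next + term
    y = y_next

n_at_1s = y[0]   # prompt jump ~ beta/(beta - rho), then slow exponential rise
```

Keeping each interval short relative to the fast (prompt) time scale is what lets the truncated series sidestep the stiffness of the system.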

  13. Solution of the neutron point kinetics equations with temperature feedback effects applying the polynomial approach method

    Energy Technology Data Exchange (ETDEWEB)

    Tumelero, Fernanda, E-mail: fernanda.tumelero@yahoo.com.br [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica; Petersen, Claudio Z.; Goncalves, Glenio A.; Lazzari, Luana, E-mail: claudiopeteren@yahoo.com.br, E-mail: gleniogoncalves@yahoo.com.br, E-mail: luana-lazzari@hotmail.com [Universidade Federal de Pelotas (DME/UFPEL), Capao do Leao, RS (Brazil). Instituto de Fisica e Matematica

    2015-07-01

    In this work, we present a solution of the Neutron Point Kinetics Equations with temperature feedback effects applying the Polynomial Approach Method. For the solution, we consider one and six groups of delayed neutron precursors with temperature feedback effects and constant reactivity. The main idea is to expand the neutron density, the delayed neutron precursor concentrations, and the temperature as power series, considering the reactivity as an arbitrary function of time in a relatively short time interval around an ordinary point. In the first interval one applies the initial conditions of the problem, and analytic continuation is used to determine the solutions of the subsequent intervals. With the application of the Polynomial Approach Method it is possible to overcome the stiffness problem of the equations. In this way, one varies the time step size of the Polynomial Approach Method and analyzes the resulting precision and computational time. Moreover, we compare different orders of approximation (linear, quadratic and cubic) of the power series. The neutron density and temperature obtained by numerical simulations with the linear approximation are compared with results in the literature. (author)

  14. Continual reassessment method for dose escalation clinical trials in oncology: a comparison of prior skeleton approaches using AZD3514 data.

    Science.gov (United States)

    James, Gareth D; Symeonides, Stefan N; Marshall, Jayne; Young, Julia; Clack, Glen

    2016-08-31

    The continual reassessment method (CRM) requires an underlying model of the dose-toxicity relationship (the "prior skeleton"), and there is limited guidance on what this should be when little is known about the association. In this manuscript the impact of applying the CRM with different prior skeleton approaches, and of the 3 + 3 method, is compared in terms of the ability to determine the true maximum tolerated dose (MTD) and the number of patients allocated to sub-optimal and toxic doses. Post-hoc dose-escalation analyses were performed on real-life clinical trial data for an early oncology compound (AZD3514), using the 3 + 3 method and the CRM with six different prior skeleton approaches. All methods correctly identified the true MTD. The 3 + 3 method allocated six patients each to sub-optimal and toxic doses. All CRM approaches allocated four patients to sub-optimal doses; the sigmoidal approach allocated no patients to toxic doses, the conservative approach two, and the other approaches five. Prior skeletons for the CRM for phase 1 clinical trials are proposed in this manuscript and applied to a real clinical trial dataset. Highly accurate initial skeleton estimates may not be essential to determine the true MTD and, as expected, all CRM methods out-performed the 3 + 3 method. There were differences in performance between skeletons; the choice of skeleton should depend on whether minimizing the number of patients allocated to suboptimal or to toxic doses is more important. NCT01162395, trial date of first registration: July 13, 2010.
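
A minimal sketch of a one-parameter CRM with a power-model skeleton (a common CRM working model; the skeleton values, prior standard deviation, and interim data below are hypothetical, not the AZD3514 trial's):

```python
import numpy as np

def crm_recommend(skeleton, n_tox, n_pat, target=0.25, sigma=1.34):
    """One-parameter power-model CRM: p_i = skeleton_i ** exp(a) with a
    normal prior on a. The posterior is computed on a grid, and the
    recommended dose has posterior-mean toxicity closest to the target."""
    sk = np.asarray(skeleton, float)
    nt = np.asarray(n_tox, float)
    npat = np.asarray(n_pat, float)
    a = np.linspace(-4.0, 4.0, 2001)
    p = sk[None, :] ** np.exp(a)[:, None]            # grid points x doses
    loglik = (nt * np.log(p) + (npat - nt) * np.log1p(-p)).sum(axis=1)
    post = np.exp(loglik - loglik.max()) * np.exp(-0.5 * (a / sigma) ** 2)
    post /= post.sum()
    p_mean = (post[:, None] * p).sum(axis=0)         # posterior-mean toxicity
    return int(np.argmin(np.abs(p_mean - target)))

# Hypothetical 5-dose trial with interim data (illustrative only)
skeleton = [0.05, 0.12, 0.25, 0.40, 0.55]
dose = crm_recommend(skeleton, n_tox=[0, 0, 1, 2, 0], n_pat=[3, 3, 6, 4, 0])
print(dose)
```

Different skeletons correspond to different `skeleton` vectors here; re-running the recommendation with each vector is how the approaches above can be compared on the same data.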

  15. Development of a high performance liquid chromatography method ...

    African Journals Online (AJOL)

    Development of a high performance liquid chromatography method for simultaneous ... Purpose: To develop and validate a new low-cost high performance liquid chromatography (HPLC) method for ..... Several papers have reported the use of ...

  16. A proposal on evaluation method of neutron absorption performance to substitute conventional neutron attenuation test

    International Nuclear Information System (INIS)

    Kim, Je Hyun; Shim, Chang Ho; Kim, Sung Hyun; Choe, Jung Hun; Cho, In Hak; Park, Hwan Seo; Park, Hyun Seo; Kim, Jung Ho; Kim, Yoon Ho

    2016-01-01

    For the verification of newly developed neutron absorbers, one of the guidelines on the qualification and acceptance of neutron absorbers is the neutron attenuation test. However, this approach has a drawback for qualification: it cannot distinguish how the neutrons are attenuated by the material. In this study, a method for estimating the neutron absorption performance of materials is proposed that detects both directly penetrating and back-scattered neutrons. For the verification of the proposed method, MCNP simulations of the experimental system designed in this study were performed using polyethylene, iron, normal glass, and the vitrified form. The results show that neutron absorption ability can easily be tested using a single-absorber model. From the simulation results of the single- and double-absorber models, it is also verified that the proposed method can evaluate not only the thermal neutrons passing directly through the materials but also the scattered neutrons reflected from them. The neutron absorption performance can therefore be estimated more accurately with the proposed method than with the conventional neutron attenuation test. It is expected that the proposed method can contribute to increasing the reliability of the performance of neutron absorbers
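
The distinction the proposed method draws, between neutrons transmitted through an absorber and those back-scattered from it, can be illustrated with a toy 1-D Monte Carlo slab model (not MCNP; the cross-sections and thickness below are made-up numbers in mean-free-path units):

```python
import math
import random

def slab_mc(n, sigma_a, sigma_s, thickness, seed=5):
    """Count neutrons transmitted through, absorbed in, or back-scattered
    from a 1-D slab. A pure attenuation test only sees the transmitted
    fraction; the proposed method also resolves the reflected one."""
    rng = random.Random(seed)
    sigma_t = sigma_a + sigma_s
    transmitted = absorbed = reflected = 0
    for _ in range(n):
        x, mu = 0.0, 1.0                 # start at the front face, moving inward
        while True:
            x += mu * (-math.log(rng.random()) / sigma_t)  # free flight
            if x < 0.0:
                reflected += 1
                break
            if x > thickness:
                transmitted += 1
                break
            if rng.random() < sigma_a / sigma_t:           # collision: absorbed?
                absorbed += 1
                break
            mu = rng.uniform(-1.0, 1.0)                    # isotropic scatter

    return transmitted, absorbed, reflected

print(slab_mc(20_000, sigma_a=0.5, sigma_s=0.5, thickness=2.0))
```

Two materials with the same transmitted fraction can differ markedly in the absorbed-versus-reflected split, which is exactly the ambiguity of the attenuation test that the record above describes.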

  17. A proposal on evaluation method of neutron absorption performance to substitute conventional neutron attenuation test

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Je Hyun; Shim, Chang Ho [Dept. of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of); Kim, Sung Hyun [Nuclear Fuel Cycle Waste Treatment Research Division, Research Reactor Institute, Kyoto University, Osaka (Japan); Choe, Jung Hun; Cho, In Hak; Park, Hwan Seo [Ionizing Radiation Center, Nuclear Fuel Cycle Waste Treatment Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Park, Hyun Seo; Kim, Jung Ho; Kim, Yoon Ho [Ionizing Radiation Center, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2016-12-15

    For the verification of newly developed neutron absorbers, one of the guidelines on the qualification and acceptance of neutron absorbers is the neutron attenuation test. However, this approach has a drawback for qualification: it cannot distinguish how the neutrons are attenuated by the material. In this study, a method for estimating the neutron absorption performance of materials is proposed that detects both directly penetrating and back-scattered neutrons. For the verification of the proposed method, MCNP simulations of the experimental system designed in this study were performed using polyethylene, iron, normal glass, and the vitrified form. The results show that neutron absorption ability can easily be tested using a single-absorber model. From the simulation results of the single- and double-absorber models, it is also verified that the proposed method can evaluate not only the thermal neutrons passing directly through the materials but also the scattered neutrons reflected from them. The neutron absorption performance can therefore be estimated more accurately with the proposed method than with the conventional neutron attenuation test. It is expected that the proposed method can contribute to increasing the reliability of the performance of neutron absorbers.

  18. An approach to a black carbon emission inventory for Mexico by two methods

    International Nuclear Information System (INIS)

    Cruz-Núñez, Xochitl

    2014-01-01

    A black carbon (BC) emission inventory for Mexico is presented. The estimate was performed using two approaches: the first based on fuel consumption and emission factors in a top-down scheme, and the second based on PM2.5 emission data and their correlation with black carbon by source category, assuming that black carbon = elemental carbon. Results show that black carbon emissions lie in the interval 53–473 Gg using the fuel consumption approach and between 62 and 89 Gg using the sector method. The key black carbon source is biomass burning in the rural sector, with a 47% share of the national total; mobile sources account for 16% of the total. A short-term opportunity to reduce carbon dioxide equivalent (CO2-eq) emissions through black carbon reductions lies mainly in biomass burning in the rural housing sector and diesel emissions in the transport sector, with important co-benefits for direct radiative forcing, public health and air quality. - Highlights: • Black carbon emissions are estimated between 53 and 473 Gg/year with the fuel consumption method. • Black carbon emissions are estimated between 62 and 89 Gg/year with the sector method
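
The top-down, fuel-based estimate described above reduces to multiplying activity data by emission factors; the fuel-use and emission-factor figures below are illustrative placeholders, not the inventory's actual data:

```python
# Hypothetical activity data and emission factors (illustrative only)
fuel_use_pj = {"residential_biomass": 420.0, "diesel_transport": 610.0}   # PJ/year
bc_ef_g_per_mj = {"residential_biomass": 0.10, "diesel_transport": 0.05}  # g BC/MJ

def bc_emissions_gg(fuel_use_pj, ef_g_per_mj):
    """Top-down BC estimate: activity (PJ) x emission factor (g/MJ).
    Since 1 PJ = 1e9 MJ and 1 Gg = 1e9 g, Gg/year = PJ/year * (g/MJ)."""
    return {k: fuel_use_pj[k] * ef_g_per_mj[k] for k in fuel_use_pj}

by_source = bc_emissions_gg(fuel_use_pj, bc_ef_g_per_mj)
print(by_source, sum(by_source.values()))
```

The sector method in the record works analogously, but starts from PM2.5 emissions per source category and applies a BC/PM2.5 ratio instead of a fuel-based factor.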

  19. An approach to a black carbon emission inventory for Mexico by two methods

    Energy Technology Data Exchange (ETDEWEB)

    Cruz-Núñez, Xochitl, E-mail: xcruz@unam.mx

    2014-05-01

    A black carbon (BC) emission inventory for Mexico is presented. The estimate was performed using two approaches: the first based on fuel consumption and emission factors in a top-down scheme, and the second based on PM2.5 emission data and their correlation with black carbon by source category, assuming that black carbon = elemental carbon. Results show that black carbon emissions lie in the interval 53–473 Gg using the fuel consumption approach and between 62 and 89 Gg using the sector method. The key black carbon source is biomass burning in the rural sector, with a 47% share of the national total; mobile sources account for 16% of the total. A short-term opportunity to reduce carbon dioxide equivalent (CO2-eq) emissions through black carbon reductions lies mainly in biomass burning in the rural housing sector and diesel emissions in the transport sector, with important co-benefits for direct radiative forcing, public health and air quality. - Highlights: • Black carbon emissions are estimated between 53 and 473 Gg/year with the fuel consumption method. • Black carbon emissions are estimated between 62 and 89 Gg/year with the sector method.

  20. Advanced nuclear power plant regulation using risk-informed and performance-based methods

    International Nuclear Information System (INIS)

    Modarres, Mohammad

    2009-01-01

    This paper proposes and discusses the implications of a largely probabilistic regulatory framework using best-estimate, goal-driven, risk-informed, and performance-based methods. This framework relies on continuous probabilistic assessment of the performance of a set of time-dependent, safety-critical systems, structures, components, and procedures that assure attainment of a broad set of overarching technology-neutral protective, mitigative, and preventive goals under all phases of plant operations. In this framework, acceptable levels of performance are set through formal apportionment so that they are commensurate with the overarching goals. Regulatory acceptance would be based on the confidence level with which the plant conforms to these goals and performance objectives. The proposed framework uses the traditional defense-in-depth design and operation regulatory philosophy when the uncertainty in conforming to specific goals and objectives is high. Finally, the paper discusses the steps needed to develop a corresponding technology-neutral regulatory approach from the proposed framework

  1. A sequential mixed methods research approach to investigating HIV ...

    African Journals Online (AJOL)

    2016-09-03

    Sep 3, 2016 ... Sequential mixed methods research is an effective approach for ... show the effectiveness of the research method. ... qualitative data before quantitative datasets ..... whereby both types of data are collected simultaneously.

  2. Assessing the stability of free-energy perturbation calculations by performing variations in the method

    Science.gov (United States)

    Manzoni, Francesco; Ryde, Ulf

    2018-03-01

    We have calculated relative binding affinities for eight tetrafluorophenyl-triazole-thiogalactoside inhibitors of galectin-3 with the alchemical free-energy perturbation approach. We obtain a mean absolute deviation from experimental estimates of only 2-3 kJ/mol and a correlation coefficient (R²) of 0.5-0.8 for seven relative affinities spanning a range of up to 11 kJ/mol. We also studied the effect of using different methods to calculate the charges of the inhibitor and different sizes of the perturbed group (the atoms that are described by soft-core potentials and are allowed to have differing coordinates). However, the various approaches gave rather similar results and it is not possible to point out one approach as consistently and significantly better than the others. Instead, we suggest that such small and reasonable variations in the computational method can be used to check how stable the calculated results are and to obtain a more accurate estimate of the uncertainty than if performing only one calculation with a single computational setup.
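
The single-step Zwanzig estimator underlying free-energy perturbation can be sketched and checked against the closed-form result for Gaussian energy differences; the samples below are synthetic, not simulation data:

```python
import numpy as np

def fep_delta_g(delta_u_kj, T=298.15):
    """Zwanzig free-energy perturbation estimate:
    dG = -kT ln < exp(-dU / kT) >, averaged over samples of the
    end-state energy difference dU (kJ/mol)."""
    kT = 8.314462618e-3 * T  # kJ/mol
    du = np.asarray(delta_u_kj)
    shift = du.min()  # subtract before exponentiating, for numerical stability
    return shift - kT * np.log(np.mean(np.exp(-(du - shift) / kT)))

# Synthetic Gaussian dU samples; for a Gaussian, dG = mu - sigma^2 / (2 kT)
rng = np.random.default_rng(1)
mu, sigma = 2.0, 1.0
samples = rng.normal(mu, sigma, 200_000)
kT = 8.314462618e-3 * 298.15
print(fep_delta_g(samples), mu - sigma ** 2 / (2 * kT))
```

Repeating such an estimate under small setup variations (charge model, perturbed-group size), as the paper suggests, gives a spread that serves as an uncertainty estimate for the computed affinity.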

  3. Model Multi Criteria Decision Making with Fuzzy ANP Method for Performance Measurement Small Medium Enterprise (SME)

    Science.gov (United States)

    Rahmanita, E.; Widyaningrum, V. T.; Kustiyahningsih, Y.; Purnama, J.

    2018-04-01

    SMEs play a very important role in the development of the economy in Indonesia: they assist the government in creating new jobs and can support household income. The large number of SMEs in Madura, and of measurement indicators in SME mapping, calls for a systematic method. This research uses the Fuzzy Analytic Network Process (FANP) method for SME performance measurement. The FANP method can handle data that contain uncertainty, and a consistency index supports the determination of decisions. Performance measurement in this study is based on the perspectives of the Balanced Scorecard; the research approach integrates the internal business process perspective, the learning and growth perspective, and the fuzzy Analytic Network Process (FANP). The result of this research is a framework of priority weightings for SME assessment indicators.

  4. A simplified method for evaluating thermal performance of unglazed transpired solar collectors under steady state

    International Nuclear Information System (INIS)

    Wang, Xiaoliang; Lei, Bo; Bi, Haiquan; Yu, Tao

    2017-01-01

    Highlights: • A simplified method for evaluating thermal performance of UTC is developed. • Experiments, numerical simulations, dimensional analysis and data fitting are used. • The correlation of absorber plate temperature for UTC is established. • The empirical correlation of heat exchange effectiveness for UTC is proposed. - Abstract: Due to the advantages of low investment and high energy efficiency, unglazed transpired solar collectors (UTC) have been widely used for heating in buildings. However, it is difficult for designers to quickly evaluate the thermal performance of UTC based on the conventional methods such as experiments and numerical simulations. Therefore, a simple and fast method to determine the thermal performance of UTC is indispensable. The objective of this work is to provide a simplified calculation method to easily evaluate the thermal performance of UTC under steady state. Different parameters are considered in the simplified method, including pitch, perforation diameter, solar radiation, solar absorptivity, approach velocity, ambient air temperature, absorber plate temperature, and so on. Based on existing design parameters and operating conditions, correlations for the absorber plate temperature and the heat exchange effectiveness are developed using dimensional analysis and data fitting, respectively. Results show that the proposed simplified method has a high accuracy and can be employed to evaluate the collector efficiency, the heat exchange effectiveness and the air temperature rise. The proposed method in this paper is beneficial to directly determine design parameters and operating status for UTC.
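
A sketch of the two steady-state figures of merit such a simplified method targets, the heat-exchange effectiveness and the collector efficiency; the operating point and the per-unit-area energy balance below are assumptions for illustration, not the paper's fitted correlations:

```python
def utc_performance(t_amb, t_plate, t_out, v_approach, solar_g,
                    rho=1.2, cp=1006.0):
    """Steady-state UTC figures of merit: heat-exchange effectiveness
    (air temperature rise relative to the plate-ambient difference) and
    collector efficiency (useful heat gain over incident irradiance),
    per unit of collector area. Temperatures in degC, approach velocity
    in m/s, irradiance in W/m^2."""
    effectiveness = (t_out - t_amb) / (t_plate - t_amb)
    q_useful = rho * cp * v_approach * (t_out - t_amb)  # W per m^2 of collector
    efficiency = q_useful / solar_g
    return effectiveness, efficiency

# Illustrative operating point (assumed values)
eff, eta = utc_performance(t_amb=5.0, t_plate=35.0, t_out=25.0,
                           v_approach=0.02, solar_g=600.0)
print(eff, eta)
```

The simplified method in the record replaces the unknown plate temperature and effectiveness with fitted correlations in the design parameters (pitch, perforation diameter, absorptivity, approach velocity), so that these two quantities can be computed without experiments or CFD.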

  5. Effects of Asphalt Mix Design Properties on Pavement Performance: A Mechanistic Approach

    Directory of Open Access Journals (Sweden)

    Ahmad M. Abu Abdo

    2016-01-01

    Full Text Available The main objective of this study was to investigate the effects of hot mix asphalt material properties on the performance of flexible pavements via a mechanistic approach. The 3D Move Analysis software was utilized to determine rutting and cracking distresses in an asphalt concrete (AC) layer. Fourteen different Superpave mixes were evaluated by utilizing results of the Dynamic Modulus (|E⁎|) Test and the Dynamic Shear Modulus (|G⁎|) Test. Results showed that with the increase of binder content, the tendency of rutting in the AC layer increased; however, cracking of the AC layer lessened. Furthermore, when different binder grades were evaluated, results showed that rutting decreased with the increase of the upper binder grade number and increased with the increase of the lower binder grade number. The analysis also showed that a higher lower binder grade number would result in a higher percentage of bottom-up cracks. Consequently, binder grade should not be the sole consideration for cracking in the AC layer; binder content and aggregate structure play a big role. Finally, results illustrated that the mechanistic approach is a better tool for determining the performance of asphalt pavement than commonly used methods.

  6. Performance Optimization in Sport: A Psychophysiological Approach

    Directory of Open Access Journals (Sweden)

    Selenia di Fronso

    2017-11-01

    Full Text Available ABSTRACT In the last 20 years, there has been growing interest in the study of the theoretical and applied issues surrounding the psychophysiological processes underlying performance. Psychophysiological monitoring, which enables the study of these processes, consists of assessing the activation and functioning level of the organism using a multidimensional approach. In sport, it can be used to attain a better understanding of the processes underlying athletic performance and to improve it. The most frequently used ecological techniques include electromyography (EMG), electrocardiography (ECG), electroencephalography (EEG), and the assessment of electrodermal activity and breathing rhythm. The purpose of this paper is to offer an overview of the use of these techniques in applied interventions in sport and physical exercise and to give athletes, coaches and sport psychology experts new insights for performance improvement.

  7. DESIGNING COMPANY PERFORMANCE MEASUREMENT SYSTEM USING BALANCE SCORECARD APPROACH

    Directory of Open Access Journals (Sweden)

    Cecep Mukti Soleh

    2015-05-01

    Full Text Available This research aimed to design a company performance measurement system using the balanced scorecard approach in the coal transportation services industry. Depth interviews were used to obtain qualitative data for determining strategic objectives, key performance indicators, strategic initiatives, and the units in charge for each balanced scorecard perspective, while quantitative data were obtained from weighting through questionnaires and analyzed using paired comparison to identify the perspective that most affects the performance of the company. To measure the achievement of corporate performance, each KPI used (1) a scoring system with higher-is-better, lower-is-better and precise-is-better methods, and (2) a traffic light system using green, yellow and red to flag target achievement. The results show that, among the balanced scorecard perspectives, the greatest influences on the overall performance of the company are the customer perspective (31%), the financial perspective (29%), internal business processes (21%), and learning and growth (19%). Keywords: balance scorecard, paired comparison, coal transportation service
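
The scoring and traffic-light rules described above can be sketched directly; the green/yellow thresholds below are assumed values for illustration, not the company's actual cut-offs:

```python
def kpi_score(actual, target, method):
    """Scoring-system variants from the record: achievement ratio for
    higher-is-better, lower-is-better, and precise-is-better KPIs
    (1.0 means the target is exactly met)."""
    if method == "higher":
        return actual / target
    if method == "lower":
        return target / actual
    if method == "precise":
        return 1.0 - abs(actual - target) / target
    raise ValueError(f"unknown method: {method}")

def traffic_light(score, green=1.0, yellow=0.8):
    # green: target met; yellow: close to target; red: needs attention
    return "green" if score >= green else "yellow" if score >= yellow else "red"

s = kpi_score(90, 100, "higher")
print(s, traffic_light(s))
```

The perspective weights from the paired comparison (31%, 29%, 21%, 19%) would then multiply these per-KPI scores to roll them up into an overall company score.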

  8. An enhanced performance through agent-based secure approach for mobile ad hoc networks

    Science.gov (United States)

    Bisen, Dhananjay; Sharma, Sanjeev

    2018-01-01

    This paper proposes an agent-based secure enhanced performance approach (AB-SEP) for mobile ad hoc network. In this approach, agent nodes are selected through optimal node reliability as a factor. This factor is calculated on the basis of node performance features such as degree difference, normalised distance value, energy level, mobility and optimal hello interval of node. After selection of agent nodes, a procedure of malicious behaviour detection is performed using fuzzy-based secure architecture (FBSA). To evaluate the performance of the proposed approach, comparative analysis is done with conventional schemes using performance parameters such as packet delivery ratio, throughput, total packet forwarding, network overhead, end-to-end delay and percentage of malicious detection.

  9. Approaches and Methods of Periodization in Literary History

    Directory of Open Access Journals (Sweden)

    Naser Gholi Sarli

    2013-10-01

    Full Text Available Abstract One of the most fundamental acts of historiography is to classify historical information on the diachronic axis. The method of this classification, or periodization, shows the theoretical approach of the historian and determines the structure and the form of his history. Because of the multiple criteria of analysis and the variety of literary genres, periodization in literary history is more complicated than that of general history. We can distinguish two approaches to periodization in literary history, although they can be used together: an extrinsic or social-cultural approach (based on criteria extrinsic to literature) and an intrinsic or formalist approach (based on criteria intrinsic to literature). Periodization in literary history can then be formulated by different methods and may be based upon various criteria: chronological units such as century, decade and year; organic patterns of evolution; great poets and writers; literary emblems and evaluations of every period; events, concepts and periods of general or political history; the analogy of literary history with the history of ideas or the history of arts; approaches and styles of language; dominant literary norms. These methods are in practice used together, and each is adequate for a particular kind of literary history. In the periodization of Persian contemporary literature, some methods and models current in the periodization of poetry have been applied identically to the periodization of prose. Periodization based upon century, decade and year is the simplest and most mechanical method, but certain centuries in some countries have symbolic and stylistic meaning, and decades have often been used for subdivisions of literary history, especially nowadays with the fast rhythm of literary change.
Periodization according to organic patterns of evolution equates the changes of literary history with the life phases of an organism, and offers an account of the birth, maturity and death (and sometimes re-birth) of literary genres, but this method has

  10. Approaches and Methods of Periodization in Literary History

    Directory of Open Access Journals (Sweden)

    Dr. N. Gh. Sarli

    Full Text Available One of the most fundamental acts of historiography is to classify historical information on the diachronic axis. The method of this classification, or periodization, shows the theoretical approach of the historian and determines the structure and the form of his history. Because of the multiple criteria of analysis and the variety of literary genres, periodization in literary history is more complicated than that of general history. We can distinguish two approaches to periodization in literary history, although they can be used together: an extrinsic or social-cultural approach (based on criteria extrinsic to literature) and an intrinsic or formalist approach (based on criteria intrinsic to literature). Periodization in literary history can then be formulated by different methods and may be based upon various criteria: chronological units such as century, decade and year; organic patterns of evolution; great poets and writers; literary emblems and evaluations of every period; events, concepts and periods of general or political history; the analogy of literary history with the history of ideas or the history of arts; approaches and styles of language; dominant literary norms. These methods are in practice used together, and each is adequate for a particular kind of literary history. In the periodization of Persian contemporary literature, some methods and models current in the periodization of poetry have been applied identically to the periodization of prose. Periodization based upon century, decade and year is the simplest and most mechanical method, but certain centuries in some countries have symbolic and stylistic meaning, and decades have often been used for subdivisions of literary history, especially nowadays with the fast rhythm of literary change. Periodization according to organic patterns of evolution equates the changes of literary history with the life phases of an organism, and offers an account of the birth, maturity and death (and sometimes re-birth) of literary genres, but this method has

  12. Iterative approach as alternative to S-matrix in modal methods

    Science.gov (United States)

    Semenikhin, Igor; Zanuccoli, Mauro

    2014-12-01

    The continuously increasing complexity of opto-electronic devices and the rising demands on simulation accuracy lead to the need to solve very large systems of linear equations, making iterative methods promising and attractive from the computational point of view with respect to direct methods. In particular, the iterative approach potentially enables a reduction of the computational time required to solve Maxwell's equations by Eigenmode Expansion algorithms. Regardless of the particular eigenmode-finding method used, the expansion coefficients are as a rule computed by the scattering matrix (S-matrix) approach or similar techniques requiring on the order of M³ operations. In this work we consider alternatives to the S-matrix technique that are based on pure iterative or mixed direct-iterative approaches. The possibility of diminishing the impact of the M³-order calculations on the overall time, and in some cases even of reducing the number of arithmetic operations to M², by applying iterative techniques is discussed. Numerical results are presented to illustrate the validity and potential of the proposed approaches.
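
The cost trade-off motivating the iterative alternative, O(M²) per matrix-vector product versus O(M³) for a direct factorization, can be illustrated with a basic conjugate-gradient solve. The SPD test system below is generic; actual modal-method matrices are complex-valued and would need a method such as GMRES:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=500):
    """Iterative solve of A x = b for symmetric positive-definite A.
    Each iteration costs one matrix-vector product (O(M^2)), versus
    the O(M^3) of a direct factorization -- the trade-off discussed
    for S-matrix alternatives above."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Well-conditioned SPD test system (illustrative)
rng = np.random.default_rng(2)
M = 200
Q = rng.normal(size=(M, M))
A = Q @ Q.T + M * np.eye(M)
b = rng.normal(size=M)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))
```

When the iteration count stays small compared to M, the total cost stays near O(M²), which is the regime in which the paper's iterative alternatives beat the S-matrix recursion.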

  13. Alternative test method to assess the energy performance of frost-free refrigerating appliances

    International Nuclear Information System (INIS)

    Hermes, Christian J.L.; Melo, Cláudio; Knabben, Fernando T.

    2013-01-01

    This paper outlines an alternative test method to evaluate the energy consumption of frost-free refrigerators and freezers for residential applications. While the standardized methods require the refrigerating appliance to be kept running according to its onboard control system, which usually drives the refrigerator through an on–off cycling pattern, the proposed approach assesses the refrigerator energy performance in the steady-state regime, being therefore much faster and more reliable. In this procedure, the cooling capacity is matched to the cooling loads by PID-controlled electrical heaters installed within the refrigerated compartments, so that the compartment temperatures are kept at the desired standardized levels. Comparisons between the experimental results obtained using the steady-state energy test and the standardized procedures showed that the former follows closely the trends observed for the latter. - Highlights: ► An alternative test method to assess the energy consumption of refrigerators is proposed. ► PID-controlled electrical heaters were installed within the compartments. ► Steady-state and ISO energy tests were performed and compared. ► Both proposed and standardized test procedures showed similar trends.

  14. Quality by design approach in the development of an ultra-high-performance liquid chromatography method for Bexsero meningococcal group B vaccine.

    Science.gov (United States)

    Nompari, Luca; Orlandini, Serena; Pasquini, Benedetta; Campa, Cristiana; Rovini, Michele; Del Bubba, Massimo; Furlanetto, Sandra

    2018-02-01

    Bexsero is the first approved vaccine for active immunization of individuals from 2 months of age and older to prevent invasive disease caused by Neisseria meningitidis serogroup B. The active components of the vaccine are Neisseria Heparin Binding Antigen, factor H binding protein, and Neisseria adhesin A, produced in Escherichia coli cells by recombinant DNA technology, and Outer Membrane Vesicles (expressing Porin A and Porin B), produced by fermentation of Neisseria meningitidis strain NZ98/254. All the Bexsero active components are adsorbed on aluminum hydroxide, and the unadsorbed antigen content is a critical quality attribute of the product. In this paper the development of a fast, selective and sensitive ultra-high-performance liquid chromatography (UHPLC) method for the determination of the Bexsero antigens in the vaccine supernatant is presented. For the first time in the literature, Quality by Design (QbD) principles were applied to the development of an analytical method aimed at the quality control of a vaccine product. The UHPLC method was fully developed within the QbD framework, the new paradigm of quality outlined in the International Conference on Harmonisation guidelines. The critical method attributes (CMAs) identified were the capacity factor of Neisseria Heparin Binding Antigen, the antigen resolutions, and the peak areas. After a scouting phase, aimed at selecting a suitable and fast UHPLC operative mode for the separation of the vaccine antigens, risk assessment tools were employed to define the critical method parameters to be considered in the screening phase. Screening designs were applied to investigate first the effects of vial type and sample concentration, and then the effects of injection volume, column type, organic phase starting concentration, ramp time and temperature.
Response Surface Methodology pointed out the presence of several significant interaction effects, and with the support of Monte-Carlo simulations led to map out the design space, at

  15. A new approach to developing and optimizing organization strategy based on stochastic quantitative model of strategic performance

    Directory of Open Access Journals (Sweden)

    Marko Hell

    2014-03-01

Full Text Available This paper presents a highly formalized approach to strategy formulation and optimization of strategic performance through proper resource allocation. A stochastic quantitative model of strategic performance (SQMSP) is used to evaluate the efficiency of the strategy developed. The SQMSP follows the theoretical notions of the balanced scorecard (BSC) and strategy map methodologies, initially developed by Kaplan and Norton. Parameters of the SQMSP are treated as random variables, evaluated by experts who give two-point (optimistic and pessimistic) and three-point (optimistic, most probable and pessimistic) estimates. The Monte-Carlo method is used to simulate strategic performance. Having been implemented within a computer application and applied to a real problem (planning of an IT strategy at the Faculty of Economics, University of Split), the proposed approach demonstrated its high potential as a basis for the development of decision support tools for strategic planning.
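The simulation core of such a model can be sketched in a few lines. This is an illustrative reading of the abstract, not the authors' implementation: each indicator's three-point (pessimistic, most probable, optimistic) expert estimate is modelled as a triangular distribution, and the indicator values and weights below are invented for the example.

```python
import random

def simulate_performance(indicators, weights, n_sims=10_000, seed=42):
    """Monte-Carlo aggregate of weighted strategic-performance indicators.

    Each indicator is a three-point expert estimate
    (pessimistic, most probable, optimistic) sampled from a triangular
    distribution (one plausible reading of the SQMSP setup).
    Returns the mean score and an empirical 5th-95th percentile interval.
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(n_sims):
        samples.append(sum(
            w * rng.triangular(low, high, mode)
            for (low, mode, high), w in zip(indicators, weights)))
    samples.sort()
    mean = sum(samples) / n_sims
    return mean, (samples[int(0.05 * n_sims)], samples[int(0.95 * n_sims)])

# Invented indicators on a 0-100 scale and normalised weights
indicators = [(40, 60, 80), (30, 55, 90), (50, 70, 85)]
weights = [0.5, 0.3, 0.2]
mean_score, interval = simulate_performance(indicators, weights)
```

The percentile interval plays the role of the uncertainty band that a decision-support tool would report alongside the expected strategic performance.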

  16. Comparing performances of Clements, Box-Cox and Johnson methods with Weibull distributions for assessing process capability

    Directory of Open Access Journals (Sweden)

    Ozlem Senvar

    2016-08-01

Full Text Available Purpose: This study examines the Clements' Approach (CA), Box-Cox transformation (BCT) and Johnson transformation (JT) methods for process capability assessment with Weibull-distributed data with different parameters, in order to determine the effects of tail behaviour on process capability, and compares their estimation performances in terms of accuracy and precision. Design/methodology/approach: The process performance index (PPI) Ppu is used for process capability analysis (PCA) because the comparisons are performed on generated Weibull data without subgroups. Box plots, descriptive statistics, the root-mean-square deviation (RMSD), used as a measure of error, and a radar chart are utilized together to evaluate the performances of the methods. In addition, the bias of the estimated values is as important as the efficiency measured by the mean square error; in this regard, the relative bias (RB) and the relative root-mean-square error (RRMSE) are also considered. Findings: The results reveal that the performance of a method depends on its capability to fit the tail behaviour of the Weibull distribution and on the targeted values of the PPIs. The effect of tail behaviour is observed to be more significant when the process is more capable. Research limitations/implications: Some other methods, such as the weighted variance method, which also gives good results, were examined as well; however, they were omitted to keep the comparisons between methods consistent and interpretable. Practical implications: The Weibull distribution covers a wide class of non-normal processes due to its capability to yield a variety of distinct curves based on its parameters. Weibull distributions are known to have significantly different tail behaviours, which greatly affect process capability. 
In quality and reliability applications, they are widely used for the analyses of failure data in order to understand how
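The contrast between a normality-based estimate and a percentile (Clements-style) estimate of Ppu on Weibull data can be illustrated as follows. The shape, scale and specification limit are arbitrary example values, and the Clements computation is reduced here to known distribution percentiles rather than the full quantile-fitting procedure used in the paper:

```python
import math
import random

def weibull_ppf(p, shape, scale):
    """Inverse CDF of the two-parameter Weibull distribution."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

def ppu_normal(data, usl):
    """Conventional normality-based Ppu = (USL - mean) / (3 * sigma)."""
    n = len(data)
    mean = sum(data) / n
    sigma = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    return (usl - mean) / (3.0 * sigma)

def ppu_clements(shape, scale, usl):
    """Clements-style Ppu from percentiles: (USL - median) / (P99.865 - median)."""
    median = weibull_ppf(0.5, shape, scale)
    upper = weibull_ppf(0.99865, shape, scale)
    return (usl - median) / (upper - median)

rng = random.Random(0)
shape, scale, usl = 1.5, 2.0, 8.0   # arbitrary example process and spec limit
data = [rng.weibullvariate(scale, shape) for _ in range(5000)]
ppu_n = ppu_normal(data, usl)
ppu_c = ppu_clements(shape, scale, usl)
```

For this right-skewed example the two estimates disagree noticeably, which is exactly the tail-behaviour effect the study quantifies.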

  17. Linear Motion Systems. A Modular Approach for Improved Straightness Performance

    NARCIS (Netherlands)

    Nijsse, G.J.P.

    2001-01-01

    This thesis deals with straight motion systems. A modular approach has been applied in order to find ways to improve the performance. The main performance parameters that are considered are position accuracy, repeatability and, to a lesser extent, cost. Because of the increasing requirements to

  18. A Dynamic Fuzzy Approach Based on the EDAS Method for Multi-Criteria Subcontractor Evaluation

    Directory of Open Access Journals (Sweden)

    Mehdi Keshavarz-Ghorabaee

    2018-03-01

Full Text Available Selection of appropriate subcontractors for outsourcing is very important for the success of construction projects. It can improve the overall quality of projects and promote the qualification and reputation of the main contractors. The evaluation of subcontractors can be made by several experts or decision-makers with respect to a set of criteria. If this process is carried out over different time periods, it can be defined as a dynamic multi-criteria group decision-making (MCGDM) problem. In this study, we propose a new fuzzy dynamic MCGDM approach based on the EDAS (Evaluation based on Distance from Average Solution) method for subcontractor evaluation. In the procedure of the proposed approach, the sets of alternatives, criteria and decision-makers can change across time periods. Also, the proposed approach gives more weight to newer decision information when aggregating the overall performance of alternatives. A numerical example is used to illustrate the proposed approach and demonstrate its application to subcontractor evaluation. The results demonstrate that the proposed approach is efficient and useful in real-world decision-making problems.
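A crisp (non-fuzzy, single-period) version of the EDAS scoring scheme illustrates the core idea of measuring distance from the average solution. The decision matrix and weights below are hypothetical, and all criteria are treated as benefit criteria:

```python
def edas(matrix, weights):
    """Single-period crisp EDAS (benefit criteria only): score alternatives
    by weighted positive/negative distances from the average solution."""
    m, n = len(matrix), len(weights)
    av = [sum(row[j] for row in matrix) / m for j in range(n)]  # average solution
    sp, sn = [], []
    for row in matrix:
        pda = [max(0.0, row[j] - av[j]) / av[j] for j in range(n)]
        nda = [max(0.0, av[j] - row[j]) / av[j] for j in range(n)]
        sp.append(sum(w * d for w, d in zip(weights, pda)))
        sn.append(sum(w * d for w, d in zip(weights, nda)))
    max_sp = max(sp) or 1.0
    max_sn = max(sn) or 1.0
    # Appraisal score: average of normalised SP and (1 - normalised SN)
    return [0.5 * (sp[i] / max_sp + 1.0 - sn[i] / max_sn) for i in range(m)]

# Hypothetical scores of three subcontractors on three weighted criteria
scores = edas([[7, 9, 6], [8, 7, 8], [6, 8, 9]], [0.5, 0.3, 0.2])
```

The alternative with the highest appraisal score is ranked first; the fuzzy dynamic variant in the paper additionally aggregates scores over periods with recency weighting.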

  19. Why Did the Bear Cross the Road? Comparing the Performance of Multiple Resistance Surfaces and Connectivity Modeling Methods

    Directory of Open Access Journals (Sweden)

    Samuel A. Cushman

    2014-12-01

    Full Text Available There have been few assessments of the performance of alternative resistance surfaces, and little is known about how connectivity modeling approaches differ in their ability to predict organism movements. In this paper, we evaluate the performance of four connectivity modeling approaches applied to two resistance surfaces in predicting the locations of highway crossings by American black bears in the northern Rocky Mountains, USA. We found that a resistance surface derived directly from movement data greatly outperformed a resistance surface produced from analysis of genetic differentiation, despite their heuristic similarities. Our analysis also suggested differences in the performance of different connectivity modeling approaches. Factorial least cost paths appeared to slightly outperform other methods on the movement-derived resistance surface, but had very poor performance on the resistance surface obtained from multi-model landscape genetic analysis. Cumulative resistant kernels appeared to offer the best combination of high predictive performance and sensitivity to differences in resistance surface parameterization. Our analysis highlights that even when two resistance surfaces include the same variables and have a high spatial correlation of resistance values, they may perform very differently in predicting animal movement and population connectivity.

  20. Effect of the sequence data deluge on the performance of methods for detecting protein functional residues.

    Science.gov (United States)

    Garrido-Martín, Diego; Pazos, Florencio

    2018-02-27

The exponential accumulation of new sequences in public databases is expected to improve the performance of all approaches for predicting protein structural and functional features. Nevertheless, this has never been assessed or quantified for some widely used methodologies, such as those aimed at detecting functional sites and functional subfamilies in protein multiple sequence alignments. Using raw protein sequences as the only input, these approaches can detect fully conserved positions, as well as those with a family-dependent conservation pattern. Both types of residues are routinely used as predictors of functional sites and, consequently, understanding how the sequence content of the databases affects them is relevant and timely. In this work we evaluate how the growth and change over time in the content of sequence databases affect five sequence-based approaches for detecting functional sites and subfamilies. We do that by recreating historical versions of the multiple sequence alignments that would have been obtained in the past based on the database contents at different time points, covering a period of 20 years. Applying the methods to these historical alignments allows us to quantify the temporal variation in their performance. Our results show that the number of families to which these methods can be applied sharply increases with time, while their ability to detect potentially functional residues remains almost constant. These results are informative for the methods' developers and end users, and may have implications for the design of new sequencing initiatives.

  1. Cache and memory hierarchy design a performance directed approach

    CERN Document Server

    Przybylski, Steven A

    1991-01-01

    An authoritative book for hardware and software designers. Caches are by far the simplest and most effective mechanism for improving computer performance. This innovative book exposes the characteristics of performance-optimal single and multi-level cache hierarchies by approaching the cache design process through the novel perspective of minimizing execution times. It presents useful data on the relative performance of a wide spectrum of machines and offers empirical and analytical evaluations of the underlying phenomena. This book will help computer professionals appreciate the impact of ca

  2. Understanding employee motivation and organizational performance: Arguments for a set-theoretic approach

    Directory of Open Access Journals (Sweden)

    Michael T. Lee

    2016-09-01

Full Text Available Empirical evidence demonstrates that motivated employees mean better organizational performance. The objective of this conceptual paper is to articulate the progress that has been made in understanding employee motivation and organizational performance, and to suggest how theory concerning employee motivation and organizational performance may be advanced. We acknowledge the existing limitations of theory development and suggest an alternative research approach. Current motivation theory development is based on conventional quantitative analysis (e.g., multiple regression analysis, structural equation modeling). Since researchers are interested in context and in understanding these social phenomena holistically, they think in terms of combinations and configurations of a set of pertinent variables. We suggest that researchers take a set-theoretic approach to complement existing conventional quantitative analysis. To advance current thinking, we propose a set-theoretic approach to leverage employee motivation for organizational performance.
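The set-theoretic reasoning the authors advocate is typically operationalised, for example in fuzzy-set QCA, through consistency and coverage measures on fuzzy memberships. A minimal sketch with invented membership scores:

```python
def consistency(x, y):
    """Consistency of 'X is sufficient for Y' in fuzzy-set terms:
    sum(min(x_i, y_i)) / sum(x_i)."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(x)

def coverage(x, y):
    """Coverage: how much of the outcome Y is accounted for by X:
    sum(min(x_i, y_i)) / sum(y_i)."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(y)

# Invented fuzzy membership scores for five cases
motivation  = [0.9, 0.7, 0.3, 0.8, 0.2]   # combined motivation condition
performance = [0.8, 0.9, 0.4, 0.7, 0.3]   # organizational performance outcome
cons = consistency(motivation, performance)
cov = coverage(motivation, performance)
```

High consistency with reasonable coverage would support treating the configuration of motivation conditions as sufficient for high performance, which is the kind of configurational claim regression-based analysis cannot express directly.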

  3. Understanding Performance Management in Schools: A Dialectical Approach

    Science.gov (United States)

    Page, Damien

    2016-01-01

    Purpose: The purpose of this paper is to provide a dialectical framework for the examination of performance management in schools. Design/Methodology/Approach: The paper is based upon a qualitative study of ten headteachers that involved in-depth semi-structured interviews. Findings: The findings identified four dialectical tensions that underpin…

  4. An Efficient Approach for Solving Mesh Optimization Problems Using Newton’s Method

    Directory of Open Access Journals (Sweden)

    Jibum Kim

    2014-01-01

Full Text Available We present an efficient approach for solving various mesh optimization problems. Our approach is based on Newton's method, which uses both first-order (gradient) and second-order (Hessian) derivatives of the nonlinear objective function. The volume and surface mesh optimization algorithms are developed such that mesh validity and surface constraints are satisfied. We also propose several Hessian modification methods for cases where the Hessian matrix is not positive definite. We demonstrate our approach by comparing our method with the nonlinear conjugate gradient and steepest descent methods in terms of both efficiency and mesh quality.
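One common Hessian modification, adding a multiple of the identity until the matrix is positive definite, can be sketched for a two-dimensional objective. This is a generic illustration of the technique, not the authors' mesh-specific algorithm; the objective below is an invented nonconvex example:

```python
def newton_minimize(grad, hess, x, iters=50, tau0=1e-3):
    """Newton's method with a simple Hessian modification: when the 2x2
    Hessian is not positive definite (checked via its leading principal
    minors), add tau * I with increasing tau until it is."""
    for _ in range(iters):
        g = grad(x)
        H = [row[:] for row in hess(x)]
        tau = 0.0
        while H[0][0] <= 0.0 or H[0][0] * H[1][1] - H[0][1] * H[1][0] <= 0.0:
            tau = max(2.0 * tau, tau0)
            H = [[hess(x)[i][j] + (tau if i == j else 0.0)
                  for j in range(2)] for i in range(2)]
        det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
        # Solve H dx = -g explicitly for the 2x2 case
        dx0 = -(H[1][1] * g[0] - H[0][1] * g[1]) / det
        dx1 = -(-H[1][0] * g[0] + H[0][0] * g[1]) / det
        x = [x[0] + dx0, x[1] + dx1]
        if max(abs(v) for v in g) < 1e-10:
            break
    return x

# Invented nonconvex objective f(x, y) = x**4 - 2*x**2 + y**2
grad = lambda v: [4.0 * v[0] ** 3 - 4.0 * v[0], 2.0 * v[1]]
hess = lambda v: [[12.0 * v[0] ** 2 - 4.0, 0.0], [0.0, 2.0]]
x_min = newton_minimize(grad, hess, [0.5, 1.0])
```

Starting inside the nonconvex region (where the unmodified Hessian is indefinite), the modification keeps the step a descent direction and the iteration still reaches the local minimizer near (1, 0).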

  5. New approach to equipment quality evaluation method with distinct functions

    Directory of Open Access Journals (Sweden)

    Milisavljević Vladimir M.

    2016-01-01

Full Text Available The paper presents a new approach for improving a method for the quality evaluation and selection of equipment (devices and machinery) by applying distinct functions. Quality evaluation and selection of devices and machinery is a multi-criteria problem which involves the consideration of numerous parameters of various origins. The original selection method with distinct functions is based on technical parameters with arbitrary weighting of each parameter's importance. The improvement of this method presented in this paper addresses the weighting of parameters by using the Delphi method. Finally, two case studies are provided, covering quality evaluation of standard heating boilers and of load-haul-dump (LHD) machines, to demonstrate the applicability of this approach. The Analytic Hierarchy Process (AHP) is used as a control method.

  6. How can machine-learning methods assist in virtual screening for hyperuricemia? A healthcare machine-learning approach.

    Science.gov (United States)

    Ichikawa, Daisuke; Saito, Toki; Ujita, Waka; Oyama, Hiroshi

    2016-12-01

    Our purpose was to develop a new machine-learning approach (a virtual health check-up) toward identification of those at high risk of hyperuricemia. Applying the system to general health check-ups is expected to reduce medical costs compared with administering an additional test. Data were collected during annual health check-ups performed in Japan between 2011 and 2013 (inclusive). We prepared training and test datasets from the health check-up data to build prediction models; these were composed of 43,524 and 17,789 persons, respectively. Gradient-boosting decision tree (GBDT), random forest (RF), and logistic regression (LR) approaches were trained using the training dataset and were then used to predict hyperuricemia in the test dataset. Undersampling was applied to build the prediction models to deal with the imbalanced class dataset. The results showed that the RF and GBDT approaches afforded the best performances in terms of sensitivity and specificity, respectively. The area under the curve (AUC) values of the models, which reflected the total discriminative ability of the classification, were 0.796 [95% confidence interval (CI): 0.766-0.825] for the GBDT, 0.784 [95% CI: 0.752-0.815] for the RF, and 0.785 [95% CI: 0.752-0.819] for the LR approaches. No significant differences were observed between pairs of each approach. Small changes occurred in the AUCs after applying undersampling to build the models. We developed a virtual health check-up that predicted the development of hyperuricemia using machine-learning methods. The GBDT, RF, and LR methods had similar predictive capability. Undersampling did not remarkably improve predictive power. Copyright © 2016 Elsevier Inc. All rights reserved.
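Two of the building blocks mentioned above, random undersampling of the majority class and the AUC as a discrimination measure, are easy to sketch in isolation (the model-training step itself is omitted, and the score lists in the example are invented):

```python
import random

def undersample(pos, neg, seed=0):
    """Random undersampling: shrink the majority class to the minority size."""
    rng = random.Random(seed)
    if len(neg) > len(pos):
        neg = rng.sample(neg, len(pos))
    elif len(pos) > len(neg):
        pos = rng.sample(pos, len(neg))
    return pos, neg

def auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney statistic: the probability that a random
    positive case is scored above a random negative case (ties count 0.5)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Balance a toy imbalanced dataset, then score hypothetical model outputs
pos, neg = undersample(list(range(10)), list(range(200)))
example_auc = auc([0.9, 0.8, 0.7], [0.1, 0.2, 0.8])
```

In the study's terms, an AUC near 0.8 (as reported for GBDT, RF and LR) means the virtual check-up ranks a random hyperuricemia case above a random non-case about 80% of the time.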

  7. A multi-method approach to evaluate health information systems.

    Science.gov (United States)

    Yu, Ping

    2010-01-01

Systematic evaluation of the introduction and impact of health information systems (HIS) is a challenging task. As implementation is a dynamic process, with diverse issues emerging at various stages of system introduction, it is difficult to weigh the contribution of the various factors and to differentiate the critical ones. A conceptual framework is helpful in guiding the evaluation effort; otherwise data collection may not be comprehensive and accurate, which may in turn lead to inadequate interpretation of the phenomena under study. Based on comprehensive literature research and her own practice of evaluating health information systems, the author proposes a multi-method approach that incorporates both quantitative and qualitative measurement and is centered on the DeLone and McLean Information System Success Model. This approach aims to quantify the performance of HIS and its impact, and to provide comprehensive and accurate explanations of the causal relationships among the different factors. It will provide decision makers with accurate and actionable information for improving the performance of the introduced HIS.

  8. A new approach for peat inventory methods; Turvetutkimusten menetelmaekehitystarkastelu

    Energy Technology Data Exchange (ETDEWEB)

    Laatikainen, M.; Leino, J.; Lerssi, J.; Torppa, J.; Turunen, J. Email: jukka.turunen@gtk.fi

    2011-07-01

Development of the new peatland inventory method started in 2009. There was a need to investigate whether new methods and tools could be developed cost-effectively, so that field inventory work would more completely cover the whole peatland area while the quality and reliability of the final results remained at a high level. The old inventory method in place at the Geological Survey of Finland (GTK) is based on the main transect and cross transect approach across a peatland area. The goal of this study was to find a practical grid-based method, linked to the geographic information system, suitable for field conditions. The triangle-grid method, with an even distance between study points, was found to be the most suitable approach. A new Ramac ground penetrating radar was obtained by the GTK in 2009 and included in the study of new peatland inventory methods. This radar model is relatively light and very suitable, for example, for forestry-drained peatlands, which are often difficult to cross because of the intensive ditch network. The goal was to investigate the best working methods for the ground penetrating radar to optimize its use in large-scale peatland inventory. Together with the new field inventory methods, a novel interpolation-based method (MITTI) for modelling peat depths was developed. MITTI makes it possible to take advantage of all the available peat-depth data, at present comprising aerogeophysical and ground penetrating radar measurements, drilling data and the mire outline. The characteristic uncertainties of each data type are taken into account and, in addition to the depth model itself, an uncertainty map of the model is computed. Combined with the grid-based field inventory method, this multi-source approach provides better tools to more accurately estimate peat depths, peat amounts and peat type distributions. The development of the new peatland inventory method was divided into four separate sections: (1) Development of new field

  9. Clustering based gene expression feature selection method: A computational approach to enrich the classifier efficiency of differentially expressed genes

    KAUST Repository

    Abusamra, Heba; Bajic, Vladimir B.

    2016-01-01

    decrease the computational time and cost, but also improve the classification performance. Among different approaches of feature selection methods, however most of them suffer from several problems such as lack of robustness, validation issues etc. Here, we

  10. [Bath Plug Closure Method for Cerebrospinal Fluid Leakage by Endoscopic Endonasal Approach:Cooperative Treatment by Neurosurgeons and Otolaryngologists].

    Science.gov (United States)

    Kawaguchi, Tomohiro; Arakawa, Kazuya; Nomura, Kazuhiro; Ogawa, Yoshikazu; Katori, Yukio; Tominaga, Teiji

    2017-12-01

Endoscopic endonasal surgery, an innovative surgical technique, is used to approach sinus lesions, lesions of the skull base, and intradural tumors. The cooperation of experienced otolaryngologists and neurosurgeons is important to achieve safe and reliable surgical results. The bath plug closure method is a treatment option for patients with cerebrospinal fluid (CSF) leakage. Although it includes dural and/or intradural procedures, surgery tends to be performed by otolaryngologists because its indications, detailed maneuvers, and pitfalls are not well recognized by neurosurgeons. We reviewed the cases of patients with CSF leakage treated by using the bath plug closure method with an endoscopic endonasal approach at our institution. Three patients were treated using the bath plug closure method. CSF leakage was caused by a meningocele in two cases and trauma in one case. No postoperative intracranial complications or recurrence of CSF leakage were observed. The bath plug closure method is an effective treatment strategy and allows neurosurgeons to gain in-depth knowledge of the treatment options for CSF leakage by using an endoscopic endonasal approach.

  11. A METHOD TO ESTIMATE TEMPORAL INTERACTION IN A CONDITIONAL RANDOM FIELD BASED APPROACH FOR CROP RECOGNITION

    Directory of Open Access Journals (Sweden)

    P. M. A. Diaz

    2016-06-01

Full Text Available This paper presents a method to estimate the temporal interaction in a Conditional Random Field (CRF)-based approach for crop recognition from multitemporal remote sensing image sequences. This approach models the phenology of different crop types as a CRF. Interaction potentials are assumed to depend only on the class labels of an image site at two consecutive epochs. In the proposed method, the estimation of temporal interaction parameters is cast as an optimization problem, whose goal is to find the transition matrix that maximizes the CRF performance upon a set of labelled data. The objective functions underlying the optimization procedure can be formulated in terms of different accuracy metrics, such as overall and average class accuracy per crop or phenological stage. To validate the proposed approach, experiments were carried out upon a dataset consisting of 12 co-registered LANDSAT images of a region in the southeast of Brazil. Pattern Search was used as the optimization algorithm. The experimental results demonstrated that the proposed method was able to substantially outperform estimates based on joint or conditional class transition probabilities, which rely on training samples.
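Pattern Search itself is a simple derivative-free scheme: poll points around the current iterate along coordinate directions and shrink the step when nothing improves. A generic maximisation sketch, with an invented smooth objective standing in for the CRF accuracy metric over two transition-matrix parameters:

```python
def pattern_search(f, x, step=0.25, tol=1e-4):
    """Derivative-free compass/pattern search (maximisation):
    poll +/- step along each coordinate and keep any improving point;
    halve the step when no polled point improves the objective."""
    fx = f(x)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                y = x[:]
                y[i] += d
                fy = f(y)
                if fy > fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5
    return x, fx

# Invented smooth "accuracy surface" peaking at (0.7, 0.2)
accuracy = lambda v: -((v[0] - 0.7) ** 2 + (v[1] - 0.2) ** 2)
best, best_val = pattern_search(accuracy, [0.5, 0.5])
```

In the paper's setting each objective evaluation would instead run the CRF classifier with a candidate transition matrix and return the chosen accuracy metric, which is why a derivative-free method is appropriate.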

  12. Performance-Based Financing to Strengthen the Health System in Benin: Challenging the Mainstream Approach

    Directory of Open Access Journals (Sweden)

    Elisabeth Paul

    2018-01-01

Full Text Available Background Performance-based financing (PBF) is often proposed as a way to improve health system performance. In Benin, PBF was launched in 2012 through a World Bank-supported project. The Belgian Development Agency (BTC) followed suit through a health system strengthening (HSS) project. This paper analyses and draws lessons from the experience of the BTC-supported alternative PBF approach, especially with regard to institutional aspects, the role of demand-side actors, ownership, and cost-effectiveness, and explores the mechanisms at stake so as to better understand how the "PBF package" functions and produces effects. Methods An exploratory, theory-driven evaluation approach was adopted. Causal mechanisms through which PBF is hypothesised to affect results were singled out and explored. This paper stems from the co-authors' capitalisation of experiences; mixed methods were used to collect, triangulate and analyse information. Results are structured along the framework of Witter et al. Results The influence of context over PBF in Benin is strong; the policy is donor-driven. BTC did not adopt the World Bank's mainstream PBF model, but developed an alternative approach in line with its HSS support programme, which is grounded in existing domestic institutions. The main features of this approach are described (decentralised governance, peer-review verification, counter-verification entrusted to health service users' platforms), as well as its adaptive process. PBF has contributed to strengthening various aspects of the health system and has led to modest progress in the utilisation of health services, but noticeable improvements in healthcare quality. Three mechanisms explaining the observed outcomes within the context are described: comprehensive HSS at district level; acting on health workers' motivation through a complex package of incentives; and increased accountability through reinforced dialogue with demand-side actors. Cost-effectiveness and

  13. Systematic approaches to data analysis from the Critical Decision Method

    Directory of Open Access Journals (Sweden)

    Martin Sedlár

    2015-01-01

Full Text Available The aim of the present paper is to introduce how to analyse qualitative data from the Critical Decision Method. First, a characterization of the method provides a meaningful introduction to the issue. This method, used in naturalistic decision making research, is one of the cognitive task analysis methods; it is based on a retrospective semi-structured interview about a critical incident from work and may be applied in various domains such as emergency services, the military, transport, sport or industry. Researchers can make two types of methodological adaptation. Within-method adaptations modify the way the interviews are conducted, and cross-method adaptations combine this method with other related methods. There are many descriptions of how to conduct the interview, but descriptions of how the data should be analysed are rare. Some researchers use conventional approaches such as content analysis, grounded theory or individual procedures tailored to the objectives of the research project. Wong (2004) describes two approaches to data analysis proposed for this method of data collection, which are described and reviewed here in detail. They enable systematic work with a large amount of data. The structured approach organizes the data according to an a priori analysis framework and is suitable for a clearly defined object of research. Each incident is studied separately. First, a decision chart showing the main decision points is made, followed by an incident summary. The decision points are used to identify the relevant statements from the transcript, which are analysed in terms of the Recognition-Primed Decision Model. Finally, the results from all the analysed incidents are integrated. The limitation of the structured approach is that it may not reveal some interesting concepts. The emergent themes approach helps to identify these concepts while maintaining a systematic framework for analysis, and it is used for exploratory research designs. 
It

  14. Identifying Key Performance Indicators for Holistic Hospital Management with a Modified DEMATEL Approach.

    Science.gov (United States)

    Si, Sheng-Li; You, Xiao-Yue; Liu, Hu-Chen; Huang, Jia

    2017-08-19

Performance analysis is an important way for hospitals to achieve higher efficiency and effectiveness in providing services to their customers. The performance of the healthcare system can be measured by many indicators, but it is difficult to improve them all simultaneously due to limited resources. A feasible way is to identify the central and influential indicators and improve healthcare performance in a stepwise manner. In this paper, we propose a hybrid multiple criteria decision making (MCDM) approach to identify key performance indicators (KPIs) for holistic hospital management. First, by integrating the evidential reasoning approach and interval 2-tuple linguistic variables, the various assessments of performance indicators provided by healthcare experts are modeled. Then, the decision making trial and evaluation laboratory (DEMATEL) technique is adopted to build an interactive network and visualize the causal relationships between the performance indicators. Finally, an empirical case study is provided to demonstrate the proposed approach for improving the efficiency of healthcare management. The results show that "accidents/adverse events", "nosocomial infection", "incidents/errors" and "number of operations/procedures" are significantly influential indicators. Also, the indicators "length of stay", "bed occupancy" and "financial measures" play important roles in the performance evaluation of the healthcare organization. The proposed decision making approach can serve as a reference for healthcare administrators seeking to enhance the performance of their healthcare institutions.
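The DEMATEL computation behind such an analysis can be sketched directly: normalise the direct-influence matrix D, form the total-relation matrix T = D(I - D)^-1, and read off each indicator's prominence (R + C) and relation (R - C). The 3 x 3 direct-influence matrix below is invented; real applications aggregate expert judgements and use many more indicators:

```python
def mat_inv(M):
    """Gauss-Jordan inverse of a small square matrix (partial pivoting)."""
    n = len(M)
    A = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        p = A[col][col]
        A[col] = [v / p for v in A[col]]
        for r in range(n):
            if r != col and A[r][col] != 0.0:
                f = A[r][col]
                A[r] = [v - f * w for v, w in zip(A[r], A[col])]
    return [row[n:] for row in A]

def dematel(A):
    """Prominence (R + C) and relation (R - C) from a direct-influence
    matrix A; normalisation by the maximum row sum (one common convention)."""
    n = len(A)
    s = max(sum(row) for row in A)
    D = [[v / s for v in row] for row in A]
    ImD = [[(1.0 if i == j else 0.0) - D[i][j] for j in range(n)]
           for i in range(n)]
    inv = mat_inv(ImD)
    # Total-relation matrix T = D (I - D)^-1
    T = [[sum(D[i][k] * inv[k][j] for k in range(n)) for j in range(n)]
         for i in range(n)]
    R = [sum(T[i][j] for j in range(n)) for i in range(n)]  # influence given
    C = [sum(T[i][j] for i in range(n)) for j in range(n)]  # influence received
    return [r + c for r, c in zip(R, C)], [r - c for r, c in zip(R, C)]

prominence, relation = dematel([[0.0, 3.0, 2.0],
                                [1.0, 0.0, 2.0],
                                [2.0, 1.0, 0.0]])
```

Indicators with high prominence are central; a positive relation marks a net cause (an indicator worth targeting first), while a negative relation marks a net effect.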

  15. Review of life-cycle approaches coupled with data envelopment analysis: launching the CFP + DEA method for energy policy making.

    Science.gov (United States)

    Vázquez-Rowe, Ian; Iribarren, Diego

    2015-01-01

Life-cycle (LC) approaches play a significant role in energy policy making to determine the environmental impacts associated with the choice of energy source. Data envelopment analysis (DEA) can be combined with LC approaches to provide quantitative benchmarks that orientate the performance of energy systems towards environmental sustainability, with different implications depending on the selected LC + DEA method. The present paper examines currently available LC + DEA methods and develops a novel method combining carbon footprinting (CFP) and DEA. Thus, the CFP + DEA method is proposed: a five-step structure including data collection for multiple homogeneous entities, calculation of target operating points, evaluation of current and target carbon footprints, and result interpretation. As the current context for energy policy implies an anthropocentric perspective focused on the global warming impact of energy systems, the CFP + DEA method is foreseen to be the most consistent LC + DEA approach for providing benchmarks for energy policy making. The fact that this method relies on the definition of operating points with optimised resource intensity helps to moderate concerns about the omission of other environmental impacts. Moreover, the CFP + DEA method benefits from CFP specifications in terms of flexibility, understanding, and reporting.

  16. Technical Approach for Determining Key Parameters Needed for Modeling the Performance of Cast Stone for the Integrated Disposal Facility Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Yabusaki, Steven B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Serne, R. Jeffrey [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rockhold, Mark L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Guohui [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Westsik, Joseph H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-03-30

    Washington River Protection Solutions (WRPS) and its contractors at Pacific Northwest National Laboratory (PNNL) and Savannah River National Laboratory (SRNL) are conducting a development program to develop / refine the cementitious waste form for the wastes treated at the ETF and to provide the data needed to support the IDF PA. This technical approach document is intended to provide guidance to the cementitious waste form development program with respect to the waste form characterization and testing information needed to support the IDF PA. At the time of the preparation of this technical approach document, the IDF PA effort is just getting started and the approach to analyze the performance of the cementitious waste form has not been determined. Therefore, this document looks at a number of different approaches for evaluating the waste form performance and describes the testing needed to provide data for each approach. Though the approach addresses a cementitious secondary aqueous waste form, it is applicable to other waste forms such as Cast Stone for supplemental immobilization of Hanford LAW. The performance of Cast Stone as a physical and chemical barrier to the release of contaminants of concern (COCs) from solidification of Hanford liquid low activity waste (LAW) and secondary wastes processed through the Effluent Treatment Facility (ETF) is of critical importance to the Hanford Integrated Disposal Facility (IDF) total system performance assessment (TSPA). The effectiveness of cementitious waste forms as a barrier to COC release is expected to evolve with time. PA modeling must therefore anticipate and address processes, properties, and conditions that alter the physical and chemical controls on COC transport in the cementitious waste forms over time. 
Most organizations responsible for disposal facility operation and their regulators support an iterative hierarchical safety/performance assessment approach with a general philosophy that modeling provides

  17. A dynamic approach to real-time performance measurement in design projects

    DEFF Research Database (Denmark)

    Skec, Stanko; Cash, Philip; Storga, Mario

    2017-01-01

    Recent developments in engineering design management point to the need for more dynamic, fine-grain measurement approaches able to deal with multi-dimensional, cross-level process performance in product design. Thus, this paper proposes a new approach to the measurement and management of individu...

  18. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    Science.gov (United States)

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies, which unlike PCR, work at a single temperature. These 'isothermal' methods, reportedly offer potential advantages over PCR such as simplicity, speed and resistance to inhibitors and could also be used for quantitative molecular analysis. However there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays that could assist assay development and validation activities.
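The IDT metric above can be illustrated with a short sketch. Under an assumed linear standard-curve model with made-up example values, IDT is estimated here as the magnitude of the slope of time-to-positivity regressed on log2 of template copies (minutes per doubling), analogous to how PCR efficiency is derived from a Cq standard curve; the numbers are illustrative, not from the study.

```python
import numpy as np

# Hypothetical qLAMP standard-curve data (illustrative values only):
# template copies per reaction and observed time-to-positivity in minutes.
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
time_to_positivity = np.array([28.0, 24.5, 21.0, 17.5, 14.0])

def isothermal_doubling_time(copies, tt):
    """Estimate IDT (minutes per doubling) as the magnitude of the slope
    of time-to-positivity regressed on log2(template copies)."""
    slope, _ = np.polyfit(np.log2(copies), tt, 1)
    return abs(slope)

idt = isothermal_doubling_time(copies, time_to_positivity)
```

A smaller IDT means faster amplification, so the metric lets qLAMP assays be compared on the same footing as PCR efficiencies.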

  19. Peak detection method evaluation for ion mobility spectrometry by using machine learning approaches.

    Science.gov (United States)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna; Baumbach, Jörg Ingo; Rahmann, Sven; Baumbach, Jan

    2013-04-16

    Ion mobility spectrometry with pre-separation by multi-capillary columns (MCC/IMS) has become an established inexpensive, non-invasive bioanalytics technology for detecting volatile organic compounds (VOCs) with various metabolomics applications in medical research. To pave the way for this technology towards daily usage in medical practice, different steps still have to be taken. With respect to modern biomarker research, one of the most important tasks is the automatic classification of patient-specific data sets into different groups, healthy or not, for instance. Although sophisticated machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generate a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods and systematically study their classification performance based on the four peak detectors' results. Second, we investigate the classification variance and robustness regarding perturbation and overfitting. Our main finding is that the classification accuracy is almost equally good for all methods, the manually created gold standard as well as the four automatic peak finding methods. In addition, we note that all tools, manual and automatic, are similarly robust against perturbations. However, the classification performance is more robust against overfitting when using the PME as peak calling preprocessor. In summary, we conclude that all methods, though small differences exist, are largely reliable and enable a wide spectrum of real-world biomedical applications.
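The simplest of the four detectors, local maxima search, can be sketched in a few lines. This is an illustrative toy version, not one of the evaluated tools: a cell in the 2-D IMS intensity matrix is reported as a peak if it exceeds a noise threshold and is strictly larger than its eight neighbours.

```python
import numpy as np

def local_maxima_peaks(intensity, threshold):
    """Toy local-maxima peak detector for a 2-D intensity matrix:
    report (row, col) where the value exceeds `threshold` and is
    strictly greater than all 8 neighbours (borders are skipped)."""
    peaks = []
    rows, cols = intensity.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            v = intensity[r, c]
            if v <= threshold:
                continue
            window = intensity[r - 1:r + 2, c - 1:c + 2].copy()
            window[1, 1] = -np.inf  # exclude the centre itself
            if v > window.max():
                peaks.append((r, c))
    return peaks
```

Real detectors add smoothing and merging of nearby candidates; this sketch only shows the core neighbourhood test.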

  20. A novel approach to describing and detecting performance anti-patterns

    Science.gov (United States)

    Sheng, Jinfang; Wang, Yihan; Hu, Peipei; Wang, Bin

    2017-08-01

    Anti-pattern, as an extension to pattern, describes a widely used poor solution which can bring negative influence to application systems. To address the shortcomings of existing anti-pattern descriptions, an anti-pattern description method based on first-order predicates is proposed. This method synthesizes anti-pattern forms and symptoms, making the description more accurate while retaining good scalability and versatility. In order to improve the accuracy of anti-pattern detection, a Bayesian classification method is applied to validate detection results, which can reduce the false negatives and false positives of anti-pattern detection. Finally, the proposed approach is applied to a small e-commerce system; the feasibility and effectiveness of the approach are further demonstrated through experiments.
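The validation idea, filtering candidate detections with a Bayesian classifier, can be sketched as follows. The Bernoulli naive Bayes below and its boolean "symptom" features are illustrative assumptions, not the paper's implementation.

```python
import math

def train_nb(X, y):
    """Train a Bernoulli naive Bayes model with Laplace smoothing.
    X: list of 0/1 feature vectors (anti-pattern symptoms present/absent),
    y: list of 0/1 labels (1 = genuine anti-pattern, 0 = false alarm)."""
    n = len(X[0])
    model = {}
    for cls in (0, 1):
        rows = [x for x, lbl in zip(X, y) if lbl == cls]
        log_prior = math.log((len(rows) + 1) / (len(X) + 2))
        theta = [(sum(r[j] for r in rows) + 1) / (len(rows) + 2)
                 for j in range(n)]
        model[cls] = (log_prior, theta)
    return model

def predict_nb(model, x):
    """Return the class with the highest posterior log-probability."""
    def log_post(cls):
        lp, theta = model[cls]
        return lp + sum(math.log(t if xi else 1.0 - t)
                        for xi, t in zip(x, theta))
    return max(model, key=log_post)
```

A detection whose symptom vector is classified as 0 would be discarded as a likely false positive.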

  1. On analyzing colour constancy approach for improving SURF detector performance

    Science.gov (United States)

    Zulkiey, Mohd Asyraf; Zaki, Wan Mimi Diyana Wan; Hussain, Aini; Mustafa, Mohd. Marzuki

    2012-04-01

    Robust key point detection plays a crucial role in obtaining a good tracking feature. The main challenge in outdoor tracking is illumination change due to causes such as weather fluctuation and occlusion. This paper approaches the illumination change problem by transforming the input image with a colour constancy algorithm before applying the SURF detector. The masked grey world approach is chosen because of its ability to perform well under local as well as global illumination change. Every image is transformed to imitate the canonical illuminant, and a Gaussian distribution is used to model the global change. The simulation results show that the average number of detected key points has increased by 69.92%. Moreover, the improved cases far outweigh the degraded ones, with performance in the former improving by 215.23%. The approach is suitable for tracking applications where sudden illumination change occurs frequently and robust key point detection is needed.
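The unmasked grey-world step can be sketched as follows; this is a simplified stand-in for the masked variant used in the paper. Each channel is rescaled so its mean matches the global mean grey level, discounting a colour cast from the illuminant.

```python
import numpy as np

def grey_world(img):
    """Grey-world colour constancy for an H x W x 3 image: scale each
    channel so its mean equals the mean grey level of the whole image."""
    img = img.astype(float)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    grey = channel_means.mean()
    balanced = img * (grey / channel_means)  # broadcasts over the channel axis
    return np.clip(balanced, 0.0, 255.0)
```

The masked variant applies the same correction using statistics from a local region rather than the whole frame, which is what lets it handle local illumination change.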

  2. Improving the road wear performance of heavy vehicles in South Africa using a performance-based standards approach

    CSIR Research Space (South Africa)

    Nordengen, Paul A

    2010-05-01

    Full Text Available of the world to achieve regional harmonisation and effective road use have had limited success. Another approach is to consider performance-based standards (PBS); in this case standards specify the performance required from the operation of a vehicle on a...

  3. Perceptual-cognitive expertise in sport: some considerations when applying the expert performance approach.

    Science.gov (United States)

    Williams, A Mark; Ericsson, K Anders

    2005-06-01

    The number of researchers studying perceptual-cognitive expertise in sport is increasing. The intention in this paper is to review the currently accepted framework for studying expert performance and to consider implications for undertaking research work in the area of perceptual-cognitive expertise in sport. The expert performance approach presents a descriptive and inductive approach for the systematic study of expert performance. The nature of expert performance is initially captured in the laboratory using representative tasks that identify reliably superior performance. Process-tracing measures are employed to determine the mechanisms that mediate expert performance on the task. Finally, the specific types of activities that lead to the acquisition and development of these mediating mechanisms are identified. General principles and mechanisms may be discovered and then validated by more traditional experimental designs. The relevance of this approach to the study of perceptual-cognitive expertise in sport is discussed and suggestions for future work highlighted.

  4. Parallel implementation and performance optimization of the configuration-interaction method

    Energy Technology Data Exchange (ETDEWEB)

    Shan, H; Williams, S; Johnson, C; McElvain, K; Ormand, WE

    2015-11-20

    The configuration-interaction (CI) method, long a popular approach to describing quantum many-body systems, is cast as a very large sparse matrix eigenpair problem with matrices whose dimension can exceed one billion. Such formulations place high demands on memory capacity and memory bandwidth, two quantities at a premium today. In this paper, we describe an efficient, scalable implementation, BIGSTICK, which, by factorizing both the basis and the interaction into two levels, can reconstruct the nonzero matrix elements on the fly, reduce the memory requirements by one or two orders of magnitude, and enable researchers to trade reduced resources for increased computational time. We optimize BIGSTICK on two leading HPC platforms, the Cray XC30 and the IBM Blue Gene/Q. Specifically, we not only develop an empirically-driven load balancing strategy that can evenly distribute the matrix-vector multiplication across 256K threads, but also develop techniques that improve the performance of the Lanczos reorthogonalization. Combined, these optimizations improved performance by 1.3-8× depending on platform and configuration.
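The numerical kernel named in the abstract, Lanczos iteration with reorthogonalization on a matrix-free operator, can be sketched as follows (an illustrative toy, nothing like BIGSTICK's scale): only a matrix-vector product is needed, so the matrix never has to be stored explicitly.

```python
import numpy as np

def lanczos_lowest(matvec, n, k=30, seed=0):
    """Estimate the lowest eigenvalue of a symmetric n x n operator given
    only its matrix-vector product, using k Lanczos steps with full
    reorthogonalization against all previous basis vectors."""
    rng = np.random.default_rng(seed)
    Q = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    steps = k
    for j in range(k):
        Q[:, j] = q
        w = matvec(q)
        alpha[j] = q @ w
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)  # full reorthogonalization
        beta[j] = np.linalg.norm(w)
        if beta[j] < 1e-12:  # invariant subspace found early
            steps = j + 1
            break
        q = w / beta[j]
    # Eigenvalues of the small tridiagonal matrix approximate extremal
    # eigenvalues of the full operator.
    T = (np.diag(alpha[:steps])
         + np.diag(beta[:steps - 1], 1)
         + np.diag(beta[:steps - 1], -1))
    return np.linalg.eigvalsh(T)[0]
```

Production CI codes replace the dense `matvec` with on-the-fly reconstruction of nonzero matrix elements, which is exactly the memory saving the paper describes.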

  5. Unsupervised machine-learning method for improving the performance of ambulatory fall-detection systems

    Directory of Open Access Journals (Sweden)

    Yuwono Mitchell

    2012-02-01

    Full Text Available Background: Falls can cause trauma, disability and death among older people. Ambulatory accelerometer devices are currently capable of detecting falls in a controlled environment. However, research suggests that most current approaches can have insufficient sensitivity and specificity in non-laboratory environments, in part because impacts can be experienced as part of ordinary daily living activities. Method: We used a waist-worn wireless tri-axial accelerometer combined with digital signal processing, clustering and neural network classifiers. The method includes the application of the Discrete Wavelet Transform, Regrouping Particle Swarm Optimization, Gaussian Distribution of Clustered Knowledge, and an ensemble of classifiers including a multilayer perceptron (MLP) and Augmented Radial Basis Function (ARBF) neural networks. Results: Preliminary testing with 8 healthy individuals in a home environment yields 98.6% sensitivity to falls and 99.6% specificity for routine Activities of Daily Living (ADL) data. Single ARBF and MLP classifiers were compared with a combined classifier. The combined classifier offers the greatest sensitivity, with a slight reduction in specificity for routine ADL and an increased specificity for exercise activities. In preliminary tests, the approach achieves 100% sensitivity on in-group falls, 97.65% on out-group falls, 99.33% specificity on routine ADL, and 96.59% specificity on exercise ADL. Conclusion: The pre-processing and feature-extraction steps appear to simplify the signal while successfully extracting the essential features required to characterize a fall. The results suggest this combination of classifiers can perform better than an MLP alone. Preliminary testing suggests these methods may be useful for researchers attempting to improve the performance of ambulatory fall-detection systems.
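The headline figures above are sensitivity and specificity, which for a fall detector reduce to recall on falls and recall on non-falls. A minimal sketch of how such figures are computed from labelled test events:

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN) over true falls (label 1);
    specificity = TN/(TN+FP) over true non-fall ADL events (label 0)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)
```

Reporting both is essential here because everyday impacts make false alarms (low specificity) the dominant failure mode outside the laboratory.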

  6. Explaining the Evolution of Performance Measures - A Dual Case-Study Approach

    Directory of Open Access Journals (Sweden)

    Mohammed Salloum

    2013-07-01

    Full Text Available Few empirical studies have examined how performance measures change in practice and the driving forces behind this change. The existing body of literature has taken a prescriptive approach to how managers and organisations ought to manage change in performance measures without any concern for studying the phenomenon itself and thus a theoretical gap exists. With this gap in mind, the purpose of this paper is to outline how and why the performance measures have changed at two case companies over the time period 2008-2011. In order to fulfil the purpose of this paper two case studies at two different case companies have been conducted. The choice of data collection method is justified by the ambition to attain an in-depth and holistic understanding of the phenomenon. For each case, the data collection was based on four components: an interview study, analysis of archived data, documentation and direct observations. In total, 28 interviews were conducted, 14 at each case company. The empirical findings exhibit that the performance measures are exposed to continuous and considerable change from several perspectives. The measurement scopes at both case companies are steadily expanding, the individual performance measures are constantly replaced and their characteristics are continuously altered. An array of change triggers has been identified in the empirical findings. In contrast to what is advocated in literature, the findings illustrate that the most frequent reason for change is the will to improve the performance measures, the measurement process and the overall performance rather than changing internal and external environments. There are several challenges that need to be addressed in the future research agenda.

  7. Newton’s method an updated approach of Kantorovich’s theory

    CERN Document Server

    Ezquerro Fernández, José Antonio

    2017-01-01

    This book shows the importance of studying semilocal convergence in iterative methods through Newton's method and addresses the most important aspects of Kantorovich's theory, including related studies. Kantorovich's theory for Newton's method used techniques of functional analysis to prove the semilocal convergence of the method by means of the well-known majorant principle. To gain a deeper understanding of these techniques, the authors return to the beginning and present a detailed treatment of Kantorovich's theory for Newton's method: they include old results, for historical perspective and for comparison with new results; refine old results; and prove their most relevant results, giving alternative approaches that lead to new sufficient semilocal convergence criteria for Newton's method. The book contains many numerical examples involving nonlinear integral equations, two boundary value problems and systems of nonlinear equations related to numerous physical phenomena. The book i...
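The iteration at the heart of the theory is compact enough to sketch. The comment states the classical Kantorovich condition informally; the code itself is just plain Newton iteration on a scalar equation.

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton iteration x_{n+1} = x_n - f(x_n)/f'(x_n).
    Kantorovich's theorem guarantees semilocal convergence when
    h = L * b * eta <= 1/2, where b bounds |1/f'(x0)|,
    eta = |f(x0)/f'(x0)| is the first step, and L is a Lipschitz
    constant for f' near x0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x
```

For example, solving x^2 - 2 = 0 from x0 = 1 converges to sqrt(2) in a handful of iterations, illustrating the quadratic convergence the majorant principle certifies.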

  8. Comparing the performance of biomedical clustering methods

    DEFF Research Database (Denmark)

    Wiwie, Christian; Baumbach, Jan; Röttger, Richard

    2015-01-01

    Identifying groups of similar objects is a popular first step in biomedical data analysis, but it is error-prone and impossible to perform manually. Many computational methods have been developed to tackle this problem. Here we assessed 13 well-known methods using 24 data sets ranging from gene expression to protein domains. Performance was judged on the basis of 13 common cluster validity indices. We developed a clustering analysis platform, ClustEval (http://clusteval.mpi-inf.mpg.de), to promote streamlined evaluation, comparison and reproducibility of clustering results in the future. This allowed us to objectively evaluate the performance of all tools on all data sets with up to 1,000 different parameter sets each, resulting in a total of more than 4 million calculated cluster validity indices. We observed that there was no universal best performer, but on the basis of this wide...

  9. The Meaning of Anti-Americanism: A Performative Approach to Anti-American Prejudice

    Directory of Open Access Journals (Sweden)

    Felix Knappertsbusch

    2013-06-01

    Full Text Available A contribution to the ongoing debate on how anti-Americanism can be adequately conceptualized and how such prejudice can be distinguished from legitimate criticism, arguing that part of these conceptual problems arises from a too-narrow focus on defining anti-Americanism and from the use of standardized empirical operationalizations. Such approaches exhibit severe limitations in grasping the flexibility of the phenomenon in everyday discourse and often underestimate or ignore the interpretive aspect involved in identifying utterances as anti-American prejudice. Alternatively, a performative approach is proposed, understanding anti-Americanism as a network of speech acts bound by family resemblance rather than identical features. In combination with qualitative empirical research methods, such a conceptualization is especially suited to accounting for the flexible, situated use of anti-American utterances. At the same time it grants reflexivity to the research concept, in the sense of a close description of the scientific application of the notion of anti-Americanism. Two empirical examples from an interview study on anti-American speech in Germany illustrate the potential of such an approach, providing insight into how anti-Americanism is incorporated into the construction and expression of racist and revisionist national identifications in everyday discourse.

  10. METHOD FOR SELECTION OF PROJECT MANAGEMENT APPROACH BASED ON FUZZY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Igor V. KONONENKO

    2017-03-01

    Full Text Available An analysis is given of the literature devoted to researching the selection of a project management approach and to developing effective methods for solving this problem. A mathematical model and method for selecting a project management approach with fuzzy concepts of the applicability of existing approaches are proposed. The selection is made among approaches such as the PMBOK Guide, the ISO 21500 standard, the PRINCE2 methodology, the SWEBOK Guide, and the agile methodologies Scrum, XP, and Kanban. The project parameters that most strongly influence the result of the selection, and the measure of their impact, are determined. The project parameters relate to information about the project, the team, communication, and critical project risks. They include the number of people involved in the project, the customer's experience with this project team, the project team's experience in this field, the project team's understanding of the requirements, adaptability, initiative, and others. The suggested method is illustrated by applying it to the selection of a project management approach for a software development project.

  11. Stability over Time of Different Methods of Estimating School Performance

    Science.gov (United States)

    Dumay, Xavier; Coe, Rob; Anumendem, Dickson Nkafu

    2014-01-01

    This paper aims to investigate how stability varies with the approach used in estimating school performance in a large sample of English primary schools. The results show that (a) raw performance is considerably more stable than adjusted performance, which in turn is slightly more stable than growth model estimates; (b) schools' performance…

  12. A Comprehensive Approach in Assessing the Performance of an Automobile Closed-Loop Supply Chain

    Directory of Open Access Journals (Sweden)

    Ezutah Udoncy Olugu

    2010-03-01

    Full Text Available The ecological issues arising from manufacturing operations have led to a focus on environmental sustainability in manufacturing. This can be addressed adequately using a closed-loop supply chain (CLSC). To attain an effective and efficient CLSC, it is necessary to adopt a holistic performance measurement approach. In order to achieve this, a specific approach is needed for a particular product rather than a generic one. Since sustainability has direct environmental footprints that involve organizational stakeholders, suppliers, customers and society at large, the complexities surrounding supply chain performance measurement have multiplied. In this study, a suitable approach is proposed for CLSC performance measurement in the automotive industry, based on the reviewed literature. It is believed that this approach will result in increased effectiveness and efficiency in CLSC performance measurement.

  13. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods

    Directory of Open Access Journals (Sweden)

    Gavin J. Nixon

    2014-12-01

    Full Text Available Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays that could assist assay development and validation activities.

  14. A Novel Method of Adrenal Venous Sampling via an Antecubital Approach

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Xiongjing, E-mail: jxj103@hotmail.com; Dong, Hui; Peng, Meng; Che, Wuqiang; Zou, Yubao; Song, Lei; Zhang, Huimin; Wu, Haiying [Chinese Academy of Medical Sciences and Peking Union Medical College, Department of Cardiology, Fuwai Hospital, National Center for Cardiovascular Disease (China)

    2017-03-15

    Purpose: Currently, almost all adrenal venous sampling (AVS) procedures are performed via femoral vein access. The purpose of this study was to establish the technique of AVS via an antecubital approach and evaluate its safety and feasibility. Materials and Methods: From January 2012 to June 2015, 194 consecutive patients diagnosed with primary aldosteronism underwent AVS via an antecubital approach without ACTH stimulation. The catheters used for bilateral adrenal cannulations were recorded. The success rate of bilateral adrenal sampling, operation time, fluoroscopy time, dosage of contrast, and incidence of complications were calculated. Results: A 5F MPA1 catheter was first used to attempt right adrenal cannulation in all patients. Cannulation of the right adrenal vein was successfully performed in 164 (84.5%) patients. The 5F JR5, Cobra2, and TIG catheters were the ultimate catheters for right adrenal cannulation in 16 (8.2%), 5 (2.6%), and 9 (4.6%) patients, respectively. For left adrenal cannulation, JR5 and Cobra2 catheters were used in 19 (9.8%) and 10 (5.2%) patients, respectively, while TIG catheters were used in the remaining 165 (85.1%) patients. The rate of successful adrenal sampling on the right, left, and bilateral sides was 91.8%, 93.3%, and 87.6%, respectively. The mean operation time was 16.3 ± 4.3 minutes, the mean fluoroscopy time was 4.7 ± 1.3 minutes, and the mean contrast use was 14.3 ± 4.7 ml. The incidence of adrenal hematoma was 1.0%. Conclusions: This study showed that AVS via an antecubital approach is safe and feasible, with a high rate of successful sampling.

  15. Helicopter Gas Turbine Engine Performance Analysis : A Multivariable Approach

    NARCIS (Netherlands)

    Arush, Ilan; Pavel, M.D.

    2017-01-01

    Helicopter performance relies heavily on the available output power of the engine(s) installed. A simplistic single-variable analysis approach is often used within the flight-testing community to reduce raw flight-test data in order to predict the available output power under different atmospheric

  16. Disentangling task and contextual performance : a multitrait-multimethod approach

    NARCIS (Netherlands)

    Demerouti, E.; Xanthopoulou, D.; Tsaousis, I.; Bakker, A.B.

    2014-01-01

    This study among 244 employees and their colleagues working in various sectors investigated the dimensionality of self-ratings and peer-ratings of task and contextual performance, using the scales of Goodman and Svyantek (1999). By applying the multitrait-multimethod approach, we examined the degree

  17. A method for optimizing the performance of buildings

    Energy Technology Data Exchange (ETDEWEB)

    Pedersen, Frank

    2006-07-01

    This thesis describes a method for optimizing the performance of buildings. Design decisions made in early stages of the building design process have a significant impact on the performance of buildings, for instance, performance with respect to energy consumption, economical aspects, and the indoor environment. The method is intended to support design decisions for buildings by combining methods for calculating the performance of buildings with numerical optimization methods. The method is able to find optimum values of decision variables representing different features of the building, such as its shape, the amount and type of windows used, and the amount of insulation used in the building envelope. The parties who influence design decisions for buildings, such as building owners, building users, architects, consulting engineers, contractors, etc., often have different and to some extent conflicting requirements for buildings. For instance, the building owner may be more concerned about the cost of constructing the building than about the quality of the indoor climate, which is more likely to be a concern of the building user. In order to support the different types of requirements made by decision-makers for buildings, an optimization problem is formulated, intended to represent a wide range of design decision problems for buildings. The problem formulation involves so-called performance measures, which can be calculated with simulation software for buildings. For instance, the annual amount of energy required by the building, the cost of constructing the building, and the annual number of hours where overheating occurs can be used as performance measures. The optimization problem enables the decision-makers to specify many different requirements on the decision variables, as well as on the performance of the building. Performance measures can, for instance, be required to assume their minimum or maximum value; they can be subjected to upper or
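The formulation above, decision variables, performance measures, and decision-maker constraints, can be sketched as a tiny constrained search. Every model below (energy, cost, overheating) is a made-up surrogate for illustration, not the thesis's simulation software, and the variable ranges and weights are invented.

```python
from itertools import product

# Toy surrogate performance measures (illustrative assumptions only).
def annual_energy(insul_mm, window_m2):
    return 120 - 0.15 * insul_mm - 0.8 * window_m2   # kWh/m2 per year

def construction_cost(insul_mm, window_m2):
    return 900 + 2.0 * insul_mm + 150 * window_m2    # cost units

def overheating_hours(insul_mm, window_m2):
    return 10 + 3.0 * window_m2                      # h/year

# Exhaustive search over discrete decision variables: insulation thickness
# and window area, subject to a decision-maker's cap on overheating hours.
best = None
for insul, win in product(range(50, 301, 50), range(5, 26, 5)):
    if overheating_hours(insul, win) > 80:           # constraint on a measure
        continue
    score = annual_energy(insul, win) + 0.01 * construction_cost(insul, win)
    if best is None or score < best[0]:
        best = (score, insul, win)
```

In the thesis the surrogates are replaced by building simulation and the search by a numerical optimizer, but the structure, objective over performance measures plus constraints on them, is the same.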

  18. Book Review: Comparative Education Research: Approaches and Methods

    Directory of Open Access Journals (Sweden)

    Noel Mcginn

    2014-10-01

    Full Text Available Book Review: Comparative Education Research: Approaches and Methods (2nd edition), by Mark Bray, Bob Adamson and Mark Mason (Eds.) (2014, 453 pp.), ISBN: 978-988-17852-8-2, Hong Kong: Comparative Education Research Centre and Springer

  19. Methodical approaches to development of classification state methods of regulation business activity in fishery

    OpenAIRE

    She Son Gun

    2014-01-01

    Approaches to developing a classification of state methods of regulating the economy are considered. On the basis of the review provided, a complex method of state regulation of business activity is substantiated. The principles offered allow public administration to be improved and can be used in industry concepts and state programs supporting small business in fishery.

  20. Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method

    Science.gov (United States)

    Lee, G.; Jun, K. S.; Chung, E.-S.

    2015-04-01

    This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with data fuzzification to quantify spatial flood vulnerability across multiple criteria. In general, the GDM method is an effective tool for formulating a compromise solution involving various decision makers, since stakeholders may have different perspectives on their flood risk/vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-based method can be used to obtain a nearly ideal solution according to all established criteria. Combining the GDM method with the fuzzy VIKOR method can thus yield effective compromise decisions. The spatial flood vulnerability of the southern Han River obtained using the GDM approach combined with the fuzzy VIKOR method was compared with that obtained using general MCDM methods, such as fuzzy TOPSIS and classical GDM methods (i.e., Borda, Condorcet, and Copeland). As a result, the proposed fuzzy GDM approach can reduce uncertainty in data confidence and in weight derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
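The crisp (non-fuzzy) core of VIKOR can be sketched as follows. Fuzzification and group aggregation are omitted, so this is only the compromise-ranking skeleton that the paper's fuzzy GDM framework builds on; all names and the example weights are illustrative.

```python
import numpy as np

def vikor(X, weights, v=0.5):
    """Classical VIKOR ranking. X[i, j] is the score of alternative i on
    benefit criterion j (higher = better); weights sum to 1; v balances
    group utility (S) against individual regret (R). Returns Q values;
    the lowest Q marks the compromise solution."""
    X = np.asarray(X, float)
    weights = np.asarray(weights, float)
    f_best, f_worst = X.max(axis=0), X.min(axis=0)
    span = np.where(f_best == f_worst, 1.0, f_best - f_worst)
    D = weights * (f_best - X) / span         # weighted normalized distances
    S, R = D.sum(axis=1), D.max(axis=1)       # group utility, max regret

    def scale(a):
        rng = a.max() - a.min()
        return np.zeros_like(a) if rng == 0 else (a - a.min()) / rng

    return v * scale(S) + (1 - v) * scale(R)
```

In the paper each expert's fuzzified decision matrix would feed a fuzzy version of these S, R and Q computations before the group's rankings are reconciled.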

  1. Statistical learning methods: Basics, control and performance

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de

    2006-04-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms.

  2. Statistical learning methods: Basics, control and performance

    International Nuclear Information System (INIS)

    Zimmermann, J.

    2006-01-01

    The basics of statistical learning are reviewed with a special emphasis on general principles and problems for all different types of learning methods. Different aspects of controlling these methods in a physically adequate way will be discussed. All principles and guidelines will be exercised on examples for statistical learning methods in high energy and astrophysics. These examples prove in addition that statistical learning methods very often lead to a remarkable performance gain compared to the competing classical algorithms

  3. Safety assessment in plant layout design using indexing approach: implementing inherent safety perspective. Part 1 - guideword applicability and method description.

    Science.gov (United States)

    Tugnoli, Alessandro; Khan, Faisal; Amyotte, Paul; Cozzani, Valerio

    2008-12-15

    Layout planning plays a key role in the inherent safety performance of process plants, since this design feature controls the possibility of accidental chain-events and the magnitude of possible consequences. A lack of suitable methods to promote the effective implementation of inherent safety in layout design calls for the development of new techniques and methods. In the present paper, a safety assessment approach suitable for layout design in the critical early phase is proposed. The concept of inherent safety is implemented within this safety assessment; the approach is based on an integrated assessment of inherent safety guideword applicability within the constraints typically present in layout design. Application of these guidewords is evaluated along with unit hazards and control devices to quantitatively map the safety performance of different layout options. Moreover, the economic aspects related to safety and inherent safety are evaluated by the method. Specific sub-indices are developed within the integrated safety assessment system to analyze and quantify the hazard related to domino effects. The proposed approach is quick to apply, auditable, and shares a common framework applicable in other phases of the design lifecycle (e.g. process design). The present work is divided into two parts: Part 1 (the current paper) presents the application of inherent safety guidelines in layout design and the index method for safety assessment; Part 2 (the accompanying paper) describes the domino hazard sub-index and demonstrates the proposed approach with a case study, evidencing the introduction of inherent safety features in layout design.

  4. Gradient High Performance Liquid Chromatography Method ...

    African Journals Online (AJOL)

    Purpose: To develop a gradient high performance liquid chromatography (HPLC) method for the simultaneous determination of phenylephrine (PHE) and ibuprofen (IBU) in solid ..... nimesulide, phenylephrine hydrochloride, chlorpheniramine maleate and caffeine anhydrous in pharmaceutical dosage form. Acta Pol.

  5. Striving for Excellence Sometimes Hinders High Achievers: Performance-Approach Goals Deplete Arithmetical Performance in Students with High Working Memory Capacity

    Science.gov (United States)

    Crouzevialle, Marie; Smeding, Annique; Butera, Fabrizio

    2015-01-01

    We tested whether the goal to attain normative superiority over other students, referred to as performance-approach goals, is particularly distractive for high-Working Memory Capacity (WMC) students, that is, those who are used to being high achievers. Indeed, WMC is positively related to high-order cognitive performance and academic success, a record of success that confers benefits on high-WMC as compared to low-WMC students. We tested whether such benefits may turn into a burden under performance-approach goal pursuit. Indeed, for high achievers, aiming to rise above others may represent an opportunity to reaffirm their positive status, a stake liable to trigger disruptive outcome concerns that interfere with task processing. Results revealed that with performance-approach goals, as compared to goals with no emphasis on social comparison, the higher the students' WMC, the lower their performance on a complex arithmetic task (Experiment 1). Crucially, this pattern appeared to be driven by uncertainty regarding the chances of outclassing others (Experiment 2). Moreover, an accessibility measure suggested the mediational role played by status-related concerns in the observed disruption of performance. We discuss why high-stakes situations can paradoxically lead high achievers to perform sub-optimally when high-order cognitive performance is at play. PMID:26407097

  6. Multicriterial ranking approach for evaluating bank branch performance

    NARCIS (Netherlands)

    Aleskerov, F; Ersel, H; Yolalan, R

    Fourteen ranking methods based on multiple criteria are suggested for evaluating the performance of bank branches. The methods are explained via an illustrative example, and some of them are applied to real-life data for 23 retail branches of a large-scale private Turkish commercial bank.

  7. Climate Prediction for Brazil's Nordeste: Performance of Empirical and Numerical Modeling Methods.

    Science.gov (United States)

    Moura, Antonio Divino; Hastenrath, Stefan

    2004-07-01

    Comparisons of performance of climate forecast methods require consistency in the predictand and a long common reference period. For Brazil's Nordeste, empirical methods developed at the University of Wisconsin use preseason (October–January) rainfall and January indices of the fields of meridional wind component and sea surface temperature (SST) in the tropical Atlantic and the equatorial Pacific as input to stepwise multiple regression and neural networking. These are used to predict the March–June rainfall at a network of 27 stations. An experiment at the International Research Institute for Climate Prediction, Columbia University, with a numerical model (ECHAM4.5) used global SST information through February to predict the March–June rainfall at three grid points in the Nordeste. The predictands for the empirical and numerical model forecasts are correlated at +0.96, and the period common to the independent portion of record of the empirical prediction and the numerical modeling is 1968–99. Over this period, predicted versus observed rainfall are evaluated in terms of correlation, root-mean-square error, absolute error, and bias. Performance is high for both approaches. Numerical modeling produces a correlation of +0.68, moderate errors, and strong negative bias. For the empirical methods, errors and bias are small, and correlations of +0.73 and +0.82 are reached between predicted and observed rainfall.

  8. Diagnosis of Feedwater Heater Performance Degradation using Fuzzy Approach

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Kang, Yeon Kwan; Heo, Gyunyoung; Song, Seok Yoon

    2014-01-01

    Degradation is unavoidable for a component that operates continuously for a long time in a harsh environment. Since this degradation causes economic and human losses, it is important to monitor and diagnose the degradation of the component, and the diagnosis requires a systematic method for timely decisions. Previously, methods using a regression model and a diagnosis table were proposed for the diagnosis of thermal efficiency in Nuclear Power Plants (NPPs). Since the regression model was numerically less stable under changes of operating variables, it was difficult to obtain good results in operating plants. The diagnosis table, in turn, was hard to use because of ambiguous points and the difficulty of detecting how each symptom affects degradation. To address the issues of the previous research, we propose a fuzzy approach and apply it to diagnosing Feedwater Heater (FWH) degradation to check its feasibility. The degradation of FWHs is not easy to observe, while trouble such as tube leakage may bring simultaneous damage to the tube bundle. This study explains the steps of diagnosing typical failure modes of FWHs: fuzzy logic is adopted to suggest a diagnosis algorithm for the degradation of FWHs, and a feasibility study is performed. In total, 7 FWH degradation modes are considered: High Drain Level, Low Shell Pressure, Tube Pressure Increase, Tube Fouling, Pass Partition Plate Leakage, Tube Leakage, and Abnormal Venting. From a literature survey and simulation, a diagnosis table for the FWH is made, and fuzzy logic is applied based on this table. The authors verify the fuzzy diagnosis of FWH degradation using random input sets synthesized from the diagnosis table. Compared with the previous research, the suggested method is more stable under changes of operating variables than the regression model, and it resolves the problems of the diagnosis table, namely its ambiguous points and the difficulty of detecting how each symptom affects degradation.
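The table-plus-fuzzy-logic idea above can be sketched in a few lines: fuzzy membership functions replace the hard yes/no entries of a diagnosis table, and each degradation mode is scored by how well the fuzzified symptoms match its signature. The signatures, symptom names, and thresholds below are illustrative assumptions, not the paper's actual diagnosis table.

```python
# Illustrative fuzzy-diagnosis sketch; signatures and thresholds are invented.
SIGNATURES = {
    "Tube Fouling":     {"TTD_high": 1.0, "drain_level_high": 0.2},
    "Tube Leakage":     {"TTD_high": 0.3, "drain_level_high": 1.0},
    "Abnormal Venting": {"TTD_high": 0.6, "shell_pressure_low": 1.0},
}

def ramp(x, lo, hi):
    """Fuzzy membership: 0 below lo, 1 above hi, linear in between."""
    return min(1.0, max(0.0, (x - lo) / (hi - lo)))

def diagnose(measurements):
    """Score each degradation mode by how well the fuzzified symptoms
    match its signature (normalized similarity; higher = more likely)."""
    scores = {}
    for mode, sig in SIGNATURES.items():
        match = sum(min(sig[s], measurements.get(s, 0.0)) for s in sig)
        scores[mode] = match / sum(sig.values())
    return max(scores, key=scores.get), scores

# Fuzzify raw readings, e.g. terminal temperature difference (TTD) in K
obs = {
    "TTD_high": ramp(7.5, lo=3.0, hi=8.0),          # strongly elevated TTD
    "drain_level_high": ramp(0.1, lo=0.2, hi=0.6),  # drain level normal
}
mode, scores = diagnose(obs)
print(mode)  # Tube Fouling
```

Unlike a crisp diagnosis table, intermediate membership degrees let the scores rank competing failure modes even when symptoms are ambiguous.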

  9. Diagnosis of Feedwater Heater Performance Degradation using Fuzzy Approach

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Kang, Yeon Kwan; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Song, Seok Yoon [Korea Hydro and Nuclear Power, Daejeon (Korea, Republic of)

    2014-05-15

    Degradation is unavoidable for a component that operates continuously for a long time in a harsh environment. Since this degradation causes economic and human losses, it is important to monitor and diagnose the degradation of the component, and the diagnosis requires a systematic method for timely decisions. Previously, methods using a regression model and a diagnosis table were proposed for the diagnosis of thermal efficiency in Nuclear Power Plants (NPPs). Since the regression model was numerically less stable under changes of operating variables, it was difficult to obtain good results in operating plants. The diagnosis table, in turn, was hard to use because of ambiguous points and the difficulty of detecting how each symptom affects degradation. To address the issues of the previous research, we propose a fuzzy approach and apply it to diagnosing Feedwater Heater (FWH) degradation to check its feasibility. The degradation of FWHs is not easy to observe, while trouble such as tube leakage may bring simultaneous damage to the tube bundle. This study explains the steps of diagnosing typical failure modes of FWHs: fuzzy logic is adopted to suggest a diagnosis algorithm for the degradation of FWHs, and a feasibility study is performed. In total, 7 FWH degradation modes are considered: High Drain Level, Low Shell Pressure, Tube Pressure Increase, Tube Fouling, Pass Partition Plate Leakage, Tube Leakage, and Abnormal Venting. From a literature survey and simulation, a diagnosis table for the FWH is made, and fuzzy logic is applied based on this table. The authors verify the fuzzy diagnosis of FWH degradation using random input sets synthesized from the diagnosis table. Compared with the previous research, the suggested method is more stable under changes of operating variables than the regression model, and it resolves the problems of the diagnosis table, namely its ambiguous points and the difficulty of detecting how each symptom affects degradation.

  10. An Efficient Taguchi Approach for the Performance Optimization of Health, Safety, Environment and Ergonomics in Generation Companies.

    Science.gov (United States)

    Azadeh, Ali; Sheikhalishahi, Mohammad

    2015-06-01

    A unique framework for the performance optimization of generation companies (GENCOs) based on health, safety, environment, and ergonomics (HSEE) indicators is presented. To rank this sector of industry, a combination of data envelopment analysis (DEA), principal component analysis (PCA), and the Taguchi method is used for all branches of the GENCOs. These methods are applied in an integrated manner to measure the performance of the GENCOs. The preferred model among DEA, PCA, and Taguchi is selected based on sensitivity analysis and the maximum correlation between rankings. To achieve the stated objectives, noise is introduced into the input data. The results show that Taguchi outperforms the other methods. Moreover, a comprehensive experiment is carried out to identify the most influential factor for ranking GENCOs. The approach developed in this study could be used for the continuous assessment and improvement of GENCOs' performance in supplying energy with respect to HSEE factors. The results of such studies would help managers to better understand weak and strong points in terms of HSEE factors.

  11. The integrated application of the AHP and DEA methods in evaluating the performances of higher education institutions in the Republic of Serbia

    Directory of Open Access Journals (Sweden)

    Predrag Mimovic

    2016-04-01

    The measurement and evaluation of performance are critical for the efficient and effective functioning of the economic system, because they allow for an analysis of the extent to which the defined objectives are achieved. Organizational performance is measured by different methods, both quantitative and qualitative. Many of the known methods for the evaluation and measurement of organizational performance take into account only financial indicators, while ignoring the non-financial ones. The integration of both kinds of indicators, through the combined application of multiple methods and the comparison of their results, should provide a more complete and objective picture of organizational performance. The Analytic Hierarchy Process (AHP) is a formal framework for solving complex decision-making problems, as well as a systemic procedure for the hierarchical presentation of the problem elements. Data Envelopment Analysis (DEA) is a non-parametric approach based on linear programming, which allows for the calculation of the efficiency of decision-making units within a group of organizations. The work illustrates the method and framework of the combined use of multi-criteria analysis methods for the measurement and evaluation of the performance of higher education institutions in the Republic of Serbia. The advantages of this approach lie in overcoming the shortcomings of a partial application of the AHP and DEA methods by utilizing a new, hybrid DEAHP (Data Envelopment Analytic Hierarchy Process) method. Performance evaluation through an integrated application of the AHP and DEA methods provides more objective results and more reliable solutions to the observed problem, thus creating a valuable information base for high-quality strategic decision making in higher education institutions, both at the national level and at the level of individual institutions.

  12. Validated High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

    Purpose: To develop a simple, rapid and sensitive high performance liquid chromatography (HPLC) method for the determination of cefadroxil monohydrate in human plasma. Methods: A Shimadzu HPLC with LC solution software was used with a Waters Spherisorb C18 (5 μm, 150 mm × 4.5 mm) column. The mobile phase ...

  13. Qualitative approaches to use of the RE-AIM framework: rationale and methods.

    Science.gov (United States)

    Holtrop, Jodi Summers; Rabin, Borsika A; Glasgow, Russell E

    2018-03-13

    There have been over 430 publications using the RE-AIM model for the planning and evaluation of health programs and policies, as well as numerous applications of the model in grant proposals and national programs. Full use of the model includes the use of qualitative methods to understand why and how results were obtained on different RE-AIM dimensions; however, recent reviews have revealed that qualitative methods have been used infrequently. Having quantitative and qualitative methods and results iteratively inform each other should enhance understanding and the lessons learned. Because there have been few published examples of qualitative approaches and methods using RE-AIM for planning or assessment, and no guidance on how qualitative approaches can inform these processes, we provide guidance on qualitative methods to address the RE-AIM model and its various dimensions. The intended audience is researchers interested in applying RE-AIM or similar implementation models, but the methods discussed should also be relevant to those in community or clinical settings. We present directions for, examples of, and guidance on how qualitative methods can be used to address each of the five RE-AIM dimensions. Formative qualitative methods can be helpful in planning interventions and designing for dissemination. Summative qualitative methods are useful when used in an iterative, mixed-methods approach for understanding how and why different patterns of results occur. In summary, qualitative and mixed-methods approaches to RE-AIM help in understanding complex situations and results, why and how outcomes were obtained, and contextual factors not easily assessed using quantitative measures.

  14. Improved power performance assessment methods

    Energy Technology Data Exchange (ETDEWEB)

    Frandsen, S; Antoniou, I; Dahlberg, J A [and others]

    1999-03-01

    The uncertainty of presently-used methods for retrospective assessment of the productive capacity of wind farms is unacceptably large. The possibilities of improving the accuracy have been investigated and are reported. A method is presented that includes an extended power curve and site calibration. In addition, blockage effects with respect to reference wind speed measurements are analysed. It is found that significant accuracy improvements are possible by the introduction of more input variables such as turbulence and wind shear, in addition to mean wind speed and air density. Also, the testing of several or all machines in the wind farm - instead of only one or two - may provide a better estimate of the average performance. (au)

  15. Performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data.

    Science.gov (United States)

    Yelland, Lisa N; Salter, Amy B; Ryan, Philip

    2011-10-15

    Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
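The method being evaluated above can be sketched directly: fit a log-link Poisson model to the binary outcome by Newton/IRLS, then replace the naive variance with a sandwich estimator whose middle term sums the score contributions within clusters (the working-independence GEE analogue). A minimal numpy sketch on simulated data; the data-generating values are illustrative, not from the article.

```python
import numpy as np

def modified_poisson(X, y, groups, n_iter=25):
    """Fit a log-Poisson model to binary y and return coefficients plus
    cluster-robust (sandwich) standard errors: a minimal sketch of the
    'modified Poisson' idea for estimating relative risks."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):                      # Newton / IRLS iterations
        mu = np.exp(X @ beta)
        A = X.T @ (mu[:, None] * X)              # information matrix X'WX
        beta += np.linalg.solve(A, X.T @ (y - mu))
    mu = np.exp(X @ beta)
    A_inv = np.linalg.inv(X.T @ (mu[:, None] * X))
    B = np.zeros((p, p))                         # "meat" of the sandwich
    for g in np.unique(groups):
        idx = groups == g
        s = X[idx].T @ (y[idx] - mu[idx])        # per-cluster score
        B += np.outer(s, s)
    cov = A_inv @ B @ A_inv
    return beta, np.sqrt(np.diag(cov))

# Simulated clustered binary data with a true relative risk of 2
rng = np.random.default_rng(0)
n_clusters, m = 1000, 4
g = np.repeat(np.arange(n_clusters), m)
u = rng.choice([0.8, 1.2], size=n_clusters)[g]   # shared cluster effect
x = rng.integers(0, 2, size=n_clusters * m)
risk = 0.15 * u * np.where(x == 1, 2.0, 1.0)
y = (rng.random(n_clusters * m) < risk).astype(float)
X = np.column_stack([np.ones_like(x, dtype=float), x])
beta, se = modified_poisson(X, y, g)
print(np.exp(beta[1]))  # estimated relative risk, close to 2
```

Because the outcome is binary rather than Poisson, the model-based variance is wrong; the sandwich estimator repairs it, and summing scores by cluster handles the within-cluster correlation.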

  16. Automated Extraction of Cranial Landmarks from Computed Tomography Data using a Combined Method of Knowledge and Pattern Based Approaches

    Directory of Open Access Journals (Sweden)

    Roshan N. RAJAPAKSE

    2016-03-01

    Accurate identification of anatomical structures from medical imaging data is a significant and critical function in the medical domain. Past studies in this context have mainly utilized two approaches: knowledge-based and learning-based methods. Further, most previously reported studies have focused on the identification of landmarks from lateral X-ray Computed Tomography (CT) data, particularly in the field of orthodontics. This study, however, focused on extracting cranial landmarks from large sets of cross-sectional CT slices using a combination of the two aforementioned approaches. The proposed method is centered mainly on template data sets, which were created using the actual contour patterns extracted from CT cases for each of the landmarks in consideration. Firstly, these templates were used to devise rules, which is characteristic of the knowledge-based method. Secondly, the same template sets were employed to perform template matching, related to the learning-based approach. The proposed method was tested on two landmarks, the Dorsum sellae and the Pterygoid plate, using CT cases of 5 subjects. The results indicate that, out of the 10 tests, the output images were within the expected range (desired accuracy) in 7 instances and the acceptable range (near accuracy) in 2 instances, thus verifying the effectiveness of the combined, template-set-centric approach proposed in this study.
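The template-matching step of such a combined approach can be illustrated with plain normalized cross-correlation over a 2D slice. This is a generic sketch on synthetic data, not the authors' CT pipeline; a real system would add the rule-based (knowledge) step on top.

```python
import numpy as np

def match_template(image, template):
    """Locate `template` in `image` by exhaustive normalized
    cross-correlation; returns (row, col) of the best match and its score."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    tn = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            wc = w - w.mean()
            denom = np.sqrt((wc ** 2).sum()) * tn
            score = (wc * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best

# Toy example: plant a bright cross in a noisy "slice" and recover it
rng = np.random.default_rng(1)
img = rng.random((40, 40))
cross = np.zeros((7, 7))
cross[3, :] = 5.0
cross[:, 3] = 5.0
img[12:19, 20:27] += cross
pos, score = match_template(img, cross)
print(pos)  # (12, 20)
```

Normalization by the window and template energies makes the score insensitive to local brightness, which is why this measure is a common default for template matching.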

  17. Neural Networks Method in modeling of the financial company’s performance

    Directory of Open Access Journals (Sweden)

    I. P. Kurochkina

    2017-01-01

    The content of modern management accounting is formed in conjunction with the rapid development of information technology, using complex algorithms of economic analysis. This makes possible the practical realization of an effective management idea: management by key performance indicators, which certainly include the indicators of the financial performance of economic entities. An important place in this process is given to the construction and calculation of factorial systems of economic indicators, and substantial theoretical and empirical experience has been accumulated in solving the problems that arise. The aim of this study is to develop a universal modern model for the factor analysis of financial results, allowing multivariate solutions of both a current and a prospective character, with monitoring in real time. This goal is achievable by using artificial neural networks (ANN) in an appropriate simulation; ANNs are increasingly used in the economy as a tool for supporting management decision-making. In comparison with classical deterministic and stochastic models, ANNs bring an intellectual component to the modeling process: they are able to learn to function based on gained experience, with the result that they make fewer and fewer mistakes. The article proposes an alternative approach to factor analysis, based on the method of neural networks, and marks the advantages of this approach. The paper presents a phased algorithm for modeling complex cause-and-effect relationships, including the selection of factors for the studied result, the creation of the neural network architecture, and its training. The universality of such modeling lies in the fact that it can be used for any resulting indicator. The authors propose and describe a mathematical model of the factor analysis of financial indicators; importantly, the model includes factors of both direct and indirect action.
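As a toy illustration of the kind of modeling described, the sketch below trains a one-hidden-layer network by backpropagation to map three explanatory factors to a resulting indicator. The architecture, data, and factor structure are invented for illustration; they are not the authors' model.

```python
import numpy as np

# Minimal one-hidden-layer network trained by full-batch gradient descent.
# Factors and the "financial result" below are synthetic assumptions.
rng = np.random.default_rng(3)
X = rng.random((200, 3))                              # three explanatory factors
y = (2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 2] ** 2)[:, None]  # resulting indicator

W1, b1 = rng.normal(0, 0.5, (3, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
lr = 0.1
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)                          # hidden layer
    pred = H @ W2 + b2                                # linear output
    err = pred - y
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)                  # backprop through tanh
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
print(mse)  # small after training, well below the variance of y
```

The learned network plays the role of the factor model: once trained, perturbing one input factor while holding the others fixed shows its contribution to the resulting indicator.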

  18. Experiential Approach to Teaching Statistics and Research Methods ...

    African Journals Online (AJOL)

    Statistics and research methods are among the more demanding topics for students of education to master at both the undergraduate and postgraduate levels. It is our conviction that teaching these topics should be combined with real practical experiences. We discuss an experiential teaching/learning approach that ...

  19. A Web-Based Peer-Assessment Approach to Improving Junior High School Students' Performance, Self-Efficacy and Motivation in Performing Arts Courses

    Science.gov (United States)

    Hsia, Lu-Ho; Huang, Iwen; Hwang, Gwo-Jen

    2016-01-01

    In this paper, a web-based peer-assessment approach is proposed for conducting performing arts activities. A peer-assessment system was implemented and applied to a junior high school performing arts course to evaluate the effectiveness of the proposed approach. A total of 163 junior high students were assigned to an experimental group and a…

  20. Actively Teaching Research Methods with a Process Oriented Guided Inquiry Learning Approach

    Science.gov (United States)

    Mullins, Mary H.

    2017-01-01

    Active learning approaches have shown to improve student learning outcomes and improve the experience of students in the classroom. This article compares a Process Oriented Guided Inquiry Learning style approach to a more traditional teaching method in an undergraduate research methods course. Moving from a more traditional learning environment to…

  1. Performance Evaluation of an Object Management Policy Approach for P2P Networks

    Directory of Open Access Journals (Sweden)

    Dario Vieira

    2012-01-01

    The increasing popularity of network-based multimedia applications poses many challenges for content providers to supply efficient and scalable services. Peer-to-peer (P2P) systems have been shown to be a promising approach to providing large-scale video services over the Internet since, by nature, these systems show high scalability and robustness. In this paper, we propose and analyze an object management policy approach for video web caches in a P2P context, taking advantage of objects' metadata, for example, video popularity, and objects' encoding techniques, for example, scalable video coding (SVC). We carry out trace-driven simulations so as to evaluate the performance of our approach and compare it against traditional object management policy approaches. In addition, we also study the impact of churn on our approach and on other object management policies that implement different caching strategies. A YouTube video collection which records logs of over 1.6 million videos was used in our experimental studies. The experimental results show that our proposed approach can improve the performance of the cache substantially. Moreover, we have found that neither the simple enlargement of peers' storage capacity nor a zero-replication strategy is an effective action for improving the performance of an object management policy.
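As a generic illustration of an object management policy that exploits popularity metadata (not the paper's actual policy), the sketch below evicts the least-requested video when the cache overflows:

```python
# Popularity-aware cache sketch: on overflow, evict the cached video with
# the lowest observed request count. Names and the fetch callback are
# illustrative; a P2P client would fetch misses from peers.
class PopularityCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}   # video_id -> cached object
        self.hits = {}    # video_id -> request count (popularity metadata)

    def request(self, video_id, fetch):
        self.hits[video_id] = self.hits.get(video_id, 0) + 1
        if video_id in self.store:
            return self.store[video_id]           # cache hit
        obj = fetch(video_id)                     # cache miss
        if len(self.store) >= self.capacity:      # evict least popular victim
            victim = min(self.store, key=lambda v: self.hits[v])
            del self.store[victim]
        self.store[video_id] = obj
        return obj

cache = PopularityCache(capacity=2)
for vid in ["a", "a", "a", "b", "c"]:             # "a" is popular, stays cached
    cache.request(vid, fetch=lambda v: f"data:{v}")
print(sorted(cache.store))  # ['a', 'c']
```

A trace-driven evaluation like the paper's would replay a request log through such a policy and compare hit ratios against LRU-style baselines.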

  2. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has been limited to qualitative analysis; the current trend is toward quantitative analysis, which attempts to characterize in detail the defect examined, for a range of object sizes. The growing commercial demand for quantitative analysis in NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources that are a function of the interferometer. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the factor of divergent illumination and other geometrical factors. The difference between the measurement systems can be associated with the error factor. (Author)

  3. Performance prediction method for a multi-stage Knudsen pump

    Science.gov (United States)

    Kugimoto, K.; Hirota, Y.; Kizaki, Y.; Yamaguchi, H.; Niimi, T.

    2017-12-01

    In this study, a novel method to predict the performance of a multi-stage Knudsen pump is proposed. The performance prediction is carried out numerically in two steps, with the assistance of a simple experimental result. In the first step, the performance of a single-stage Knudsen pump was measured experimentally under various pressure conditions, and the relationship of the mass flow rate to the average pressure between the inlet and outlet of the pump and the pressure difference between them was obtained. In the second step, the performance of a multi-stage pump was analyzed by a one-dimensional model derived from the mass conservation law. The performances predicted by the 1D model for 1-stage, 2-stage, 3-stage, and 4-stage pumps were validated by the experimental results for the corresponding numbers of stages. It was concluded that the proposed prediction method works properly.
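The two-step idea can be sketched as follows: take a single-stage characteristic relating mass flow rate to average pressure and pressure difference, then march it through the stages using the fact that, at steady state, the same mass flow crosses every stage. The linear characteristic and its coefficients below are assumptions for illustration, not measured values from the paper.

```python
# 1D steady-state sketch: chain an assumed single-stage characteristic
# mdot = drive(p_avg) - c * dp through N stages. Coefficients are invented.
def drive(p_avg, a=0.02):
    """Transpiration-driven flow term; a linear fit is an assumption here,
    standing in for the experimentally measured relationship."""
    return a * p_avg

def stage_outlet(p_in, mdot, c=1.0):
    """Solve mdot = drive((p_in + p_out)/2) - c*(p_out - p_in) for p_out."""
    p_out = p_in
    for _ in range(100):                         # fixed-point iteration
        p_out = p_in + (drive((p_in + p_out) / 2) - mdot) / c
    return p_out

def multistage_pressures(p_inlet, n_stages, mdot=0.0):
    """March the mass-conservation model through the stages: at steady
    state the same mdot crosses every stage."""
    pressures = [p_inlet]
    for _ in range(n_stages):
        pressures.append(stage_outlet(pressures[-1], mdot))
    return pressures

p = multistage_pressures(p_inlet=100.0, n_stages=4)  # zero flow: max compression
print(p[-1])  # each stage multiplies pressure by (1 + a/2c)/(1 - a/2c)
```

At nonzero mdot the same march gives the operating pressure profile, so sweeping mdot traces the pump curve of an N-stage assembly from a single-stage measurement.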

  4. Evaluating the performance of the particle finite element method in parallel architectures

    Science.gov (United States)

    Gimenez, Juan M.; Nigro, Norberto M.; Idelsohn, Sergio R.

    2014-05-01

    This paper presents a high performance implementation of the particle-mesh based method called the particle finite element method two (PFEM-2). It consists of a material-derivative based formulation of the equations with a hybrid spatial discretization which uses an Eulerian mesh and Lagrangian particles. The main aim of PFEM-2 is to solve transport equations as fast as possible while keeping some level of accuracy. The method was found to be competitive with classical Eulerian alternatives for these targets, even in their range of optimal application. To evaluate the goodness of the method with large simulations, it is imperative to use parallel environments. Parallel strategies for the Finite Element Method have been widely studied and many libraries can be used to solve the Eulerian stages of PFEM-2. However, Lagrangian stages, such as streamline integration, must be developed considering the parallel strategy selected. The main drawback of PFEM-2 is the large amount of memory needed, which limits its application to large problems on a single computer. Therefore, a distributed-memory implementation is urgently needed. Unlike a shared-memory approach, using domain decomposition the memory is automatically isolated, thus avoiding race conditions; however, new issues appear due to the data distribution over the processes. Thus, a domain decomposition strategy for both particles and mesh is adopted, which minimizes the communication between processes. Finally, performance analyses running over multicore and multinode architectures are presented. The Courant-Friedrichs-Lewy number used influences the efficiency of the parallelization and, in some cases, a weighted partitioning can be used to improve the speed-up. However, the total CPU time for the cases presented is lower than that obtained when using classical Eulerian strategies.

  5. Research design: qualitative, quantitative and mixed methods approaches. Creswell, John W. Sage, 320 pp, £29, ISBN 0761924426.

    Science.gov (United States)

    2004-09-01

    The second edition of Creswell's book has been significantly revised and updated. The author clearly sets out three approaches to research: quantitative, qualitative and mixed methods. As someone who has used mixed methods in my research, it is refreshing to read a textbook that addresses this. The differences between the approaches are clearly identified and a rationale for using each methodological stance is provided.

  6. A PRACTICAL APPROACH TO THE GROUND OSCILLATION VELOCITY MEASUREMENT METHOD

    Directory of Open Access Journals (Sweden)

    Siniša Stanković

    2017-01-01

    The use of an explosive's energy during blasting involves undesired effects on the environment. The seismic influence of a blast, a major undesired effect, is addressed by many national standards, recommendations and calculations in which the main parameter is the ground oscillation velocity at the field measurement location. There are a few approaches and methods for calculating the expected ground oscillation velocities from the charge weight per delay and the distance from the blast to the point of interest. These methods and formulas do not always provide satisfactory results: the values measured at various distances from the blast field differ more or less from the values given by the prior calculations. Since blasting works are executed in diverse geological conditions, the aim of this research is the development of a practical and reliable approach that yields a different model for each construction site where blasting works have been or will be executed. The approach is based on a greater number of measuring points in a line from the blast field at predetermined distances. This new approach has been compared with other generally used methods and formulas through measurements taken during the research, along with measurements from several previously executed projects. The results confirmed that the suggested model gives more accurate values.
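A common way to build such a site-specific model from measurements taken in line from the blast field is the scaled-distance law v = K * (D / sqrt(Q)) ** (-n), fitted by least squares in log-log space. The sketch below uses synthetic data; K, n, and the measurements are illustrative, not values from the paper.

```python
import numpy as np

def fit_ppv(distances_m, charges_kg, velocities_mms):
    """Fit peak particle velocity v = K * (D/sqrt(Q))**(-n) by linear
    regression on log(v) vs log(scaled distance); returns (K, n)."""
    sd = np.asarray(distances_m) / np.sqrt(np.asarray(charges_kg))
    slope, intercept = np.polyfit(np.log(sd), np.log(velocities_mms), 1)
    return np.exp(intercept), -slope

def predict_ppv(K, n, distance_m, charge_kg):
    return K * (distance_m / np.sqrt(charge_kg)) ** (-n)

# Synthetic measurements from an assumed "true" site law K=700, n=1.6
rng = np.random.default_rng(2)
D = np.array([50.0, 100.0, 150.0, 200.0, 300.0])   # measuring points, m
Q = np.full_like(D, 25.0)                          # charge per delay, kg
v = 700.0 * (D / np.sqrt(Q)) ** (-1.6) * np.exp(rng.normal(0, 0.05, D.size))
K, n = fit_ppv(D, Q, v)
print(round(n, 2))  # close to the true exponent 1.6
```

Refitting K and n per site, from several measuring points in line from the blast, is exactly what makes the model site-specific rather than a one-size-fits-all formula.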

  7. An artificial neural network approach for aerodynamic performance retention in airframe noise reduction design of a 3D swept wing model

    Directory of Open Access Journals (Sweden)

    Tao Jun

    2016-10-01

    With the progress of high-bypass turbofans and the innovation of silencing nacelles in engine noise reduction, airframe noise has now become another important sound source besides engine noise. Thus, reducing airframe noise makes a great contribution to the overall noise reduction of a civil aircraft. However, reducing airframe noise often leads to aerodynamic performance loss in the meantime. In this case, an approach based on an artificial neural network is introduced. An established database serves as the basis and the training sample of a back-propagation (BP) artificial neural network, which later uses a confidence-coefficient reasoning method for optimization. Then the most satisfactory configuration is selected for validating computations through the trained BP network. On the basis of the artificial neural network approach, an optimization process of the slat cove filler (SCF) for high-lift devices (HLD) on the Trap Wing is presented. The aerodynamic performance of both the baseline and optimized configurations is investigated through unsteady detached eddy simulations (DES), and a hybrid method, which combines the unsteady DES method with acoustic analogy theory, is employed to validate the noise reduction effect. The numerical results indicate not merely a significant airframe noise reduction effect but also excellent aerodynamic performance retention at the same time.

  8. Investigation of Thermal Performance for Atria: a Method Overview

    Directory of Open Access Journals (Sweden)

    Moosavi Leila

    2016-01-01

    Full Text Available The importance of low energy design in large buildings has encouraged researchers to implement different methods for predicting a building’s thermal performance. Atria, as energy efficient features, have been implemented to improve the indoor thermal environment in large modern buildings. Though widely implemented, thorough study of atrium performance is restricted by its large size, complex thermodynamic behavior, and the inaccuracies and limitations of available prediction tools. This study reviews the most common research tools implemented in previous research on atrium thermal performance, to explore the advantages and limitations of different methods for future studies. The methods reviewed are analytical, experimental, computer modelling, and combinations of any or all of these methods. The findings showed that CFD (computational fluid dynamics) models are the most popular recent tools due to their higher accuracy, capabilities and ease of modification. Although the experimental methods were reliable for predicting atrium thermal and ventilation performance, they have mostly been used to provide data for validation of CFD models. Furthermore, coupling CFD with other experimental models could increase the reliability and accuracy of the models and provide a more comprehensive analysis.

  9. Easy and difficult performance-approach goals : Their moderating effect on the link between task interest and performance attainment

    NARCIS (Netherlands)

    Blaga, Monica; Van Yperen, N.W.

    2008-01-01

    The purpose of this study was to demonstrate that the positive link between task interest and performance attainment can be negatively affected by the pursuit of difficult performance-approach goals. This was tested in a sample of 60 undergraduate students at a Dutch university. In line with

  10. An Integrated Lumped Parameter-CFD approach for off-design ejector performance evaluation

    International Nuclear Information System (INIS)

    Besagni, Giorgio; Mereu, Riccardo; Chiesa, Paolo; Inzoli, Fabio

    2015-01-01

    Highlights: • We validate a CFD approach for a convergent nozzle ejector using global and local measurements. • We evaluate seven RANS turbulence models for the convergent nozzle ejector. • We introduce a lumped parameter model for on-design and off-design ejector performance evaluation. • We analyze the relationship between local flow behavior and the lumped parameters of the model. • We discuss how to improve the predictive capabilities of the model via variable parameters calibrated on CFD simulations. - Abstract: This paper presents an Integrated Lumped Parameter Model-Computational Fluid-Dynamics approach for off-design ejector performance evaluation. The purpose of this approach is to evaluate the entrainment ratio, for a fixed geometry, in both on-design and off-design operating conditions. The proposed model is based on a Lumped Parameter Model (LPM) with variable ejector component efficiencies provided by CFD simulations. The CFD results are used to develop maps of ejector component efficiencies over a broad range of operating conditions. The ejector component efficiency maps couple the CFD and LPM techniques to build an Integrated LPM-CFD approach. The proposed approach is demonstrated for a convergent nozzle ejector and the paper is structured in four parts. First, the CFD approach is validated against global and local data and seven Reynolds Averaged Navier Stokes (RANS) turbulence models are compared: the k–ω SST model showed good performance and was selected for the rest of the analysis. Second, a Lumped Parameter Model (LPM) for a subsonic ejector is developed and the ejector component efficiencies are defined. Third, the CFD approach is used to investigate the flow field, to analyze its influence on ejector component efficiencies and to propose efficiency correlations and maps linking ejector component efficiencies and local flow quantities. In the last part, the efficiency maps are embedded into the lumped parameter model, thus creating

  11. Personality, Assessment Methods and Academic Performance

    Science.gov (United States)

    Furnham, Adrian; Nuygards, Sarah; Chamorro-Premuzic, Tomas

    2013-01-01

    This study examines the relationship between personality and two different academic performance (AP) assessment methods, namely exams and coursework. It aimed to determine whether the relationship between traits and AP was consistent across self-reported versus documented exam results, across two different assessment techniques and across different…

  12. Knowledge-based biomedical word sense disambiguation: comparison of approaches

    Directory of Open Access Journals (Sweden)

    Aronson Alan R

    2010-11-01

    Full Text Available Abstract Background Word sense disambiguation (WSD) algorithms attempt to select the proper sense of ambiguous terms in text. Resources like the UMLS provide a reference thesaurus used to annotate the biomedical literature. Statistical learning approaches have produced good results, but the size of the UMLS makes it infeasible to produce training data covering the whole domain. Methods We present research on existing WSD approaches based on knowledge bases, which complements the studies performed on statistical learning. We compare four approaches which rely on the UMLS Metathesaurus as the source of knowledge. The first approach compares the overlap of the context of the ambiguous word with the candidate senses, based on a representation built out of the definitions, synonyms and related terms. The second approach collects training data for each of the candidate senses using queries built from monosemous synonyms and related terms. These queries are used to retrieve MEDLINE citations, and a machine learning approach is then trained on this corpus. The third approach is a graph-based method which exploits the structure of the Metathesaurus network of relations to perform unsupervised WSD; it ranks nodes in the graph according to their relative structural importance. The last approach uses the semantic types assigned to the concepts in the Metathesaurus to perform WSD: the context of the ambiguous word and the semantic types of the candidate concepts are mapped to Journal Descriptors, and these mappings are compared to decide among the candidate concepts. Results are provided estimating the accuracy of the different methods on the WSD test collection available from the NLM. Conclusions We have found that the last approach achieves better results compared to the other methods. The graph-based approach, using the structure of the Metathesaurus network to estimate the relevance of the Metathesaurus concepts, does not perform well
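    The first, overlap-based approach in this record can be sketched in a few lines: each candidate sense is represented by a bag of words drawn from its definitions, synonyms and related terms, and the sense whose profile overlaps most with the ambiguous word's context is selected. The sense profiles below are invented for illustration, not drawn from the UMLS.

```python
def disambiguate(context_words, sense_profiles):
    """Return the sense whose word profile overlaps most with the context."""
    context = set(w.lower() for w in context_words)
    best_sense, best_overlap = None, -1
    for sense, profile in sense_profiles.items():
        overlap = len(context & set(w.lower() for w in profile))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

# "cold" as a disorder vs. a temperature, with made-up sense profiles:
profiles = {
    "common_cold": ["virus", "infection", "cough", "nasal", "symptom"],
    "low_temperature": ["temperature", "degrees", "weather", "freezing"],
}
context = "patient presented with cough and nasal symptom".split()
print(disambiguate(context, profiles))  # common_cold
```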

  13. Statistical multi-model approach for performance assessment of cooling tower

    International Nuclear Information System (INIS)

    Pan, Tian-Hong; Shieh, Shyan-Shu; Jang, Shi-Shang; Tseng, Wen-Hung; Wu, Chan-Wei; Ou, Jenq-Jang

    2011-01-01

    This paper presents a data-driven model-based assessment strategy to investigate the performance of a cooling tower. To achieve this objective, the operations of a cooling tower are first characterized using a data-driven multiple-model method, which represents the system as a set of local models in the form of linear equations. A fuzzy c-means clustering algorithm is used to classify operating data into several groups, from which the local models are built. The developed models are then applied to predict the performance of the system based on design input parameters provided by the manufacturer. The tower characteristics are also investigated using the proposed models via the effect of the water/air flow ratio. The predicted results agree well with the tower characteristics calculated from actual operating data measured at an industrial plant. By comparison with the design characteristic curve provided by the manufacturer, the effectiveness of the cooling tower is then obtained. A case study conducted in a commercial plant demonstrates the validity of the proposed approach. It should be noted that this is the first attempt to assess, using operating data from an industrial-scale process, how far cooling efficiency has deviated from its original design value. Moreover, the evaluation need not interrupt the normal operation of the cooling tower. This should be of particular interest in industrial applications.
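    The multiple-model idea in this record (fuzzy c-means clustering followed by local linear models, with predictions formed as membership-weighted combinations) can be sketched as follows. This is a minimal one-dimensional illustration with invented data, not the paper's implementation.

```python
import random

def memberships(xs, centres, m=2.0):
    """Fuzzy membership of each point in each cluster (rows sum to 1)."""
    u = []
    for x in xs:
        dists = [abs(x - c) or 1e-12 for c in centres]
        u.append([1.0 / sum((dj / dk) ** (2 / (m - 1)) for dk in dists)
                  for dj in dists])
    return u

def fuzzy_c_means(xs, c=2, m=2.0, iters=100):
    """Minimal 1-D fuzzy c-means; returns the cluster centres."""
    centres = random.sample(xs, c)
    for _ in range(iters):
        u = memberships(xs, centres, m)
        for j in range(c):
            wj = [u[i][j] ** m for i in range(len(xs))]
            centres[j] = sum(w * x for w, x in zip(wj, xs)) / sum(wj)
    return centres

def fit_local_lines(xs, ys, u, m=2.0):
    """Weighted least-squares line per cluster: the local linear models."""
    models = []
    for j in range(len(u[0])):
        w = [u[i][j] ** m for i in range(len(xs))]
        sw = sum(w)
        xb = sum(wi * x for wi, x in zip(w, xs)) / sw
        yb = sum(wi * y for wi, y in zip(w, ys)) / sw
        slope = (sum(wi * (x - xb) * (y - yb) for wi, x, y in zip(w, xs, ys))
                 / (sum(wi * (x - xb) ** 2 for wi, x in zip(w, xs)) or 1e-12))
        models.append((yb - slope * xb, slope))  # (intercept, slope)
    return models

def predict(x, centres, models, m=2.0):
    """Membership-weighted combination of the local models."""
    u = memberships([x], centres, m)[0]
    return sum(uj * (a + b * x) for uj, (a, b) in zip(u, models))

# invented operating data: x = water/air flow ratio, y = a performance index
random.seed(1)
xs = [i / 10 for i in range(1, 21)]
ys = [2 * x for x in xs]
centres = fuzzy_c_means(xs)
u = memberships(xs, centres)
models = fit_local_lines(xs, ys, u)
print(round(predict(1.5, centres, models), 3))  # 3.0 (the data are exactly linear)
```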

  14. Hourly forecasting of global solar radiation based on multiscale decomposition methods: A hybrid approach

    International Nuclear Information System (INIS)

    Monjoly, Stéphanie; André, Maïna; Calif, Rudy; Soubdhan, Ted

    2017-01-01

    This paper introduces a new approach for forecasting solar radiation series 1 h ahead. We investigated several techniques for multiscale decomposition of clear sky index K_c data, such as Empirical Mode Decomposition (EMD), Ensemble Empirical Mode Decomposition (EEMD) and Wavelet Decomposition. From these different methods, we built 11 decomposition components and one residual signal representing different time scales. We applied classic forecasting models based on a linear method (autoregressive process, AR) and a nonlinear method (neural network model). The choice of forecasting method is adapted to the characteristics of each component. Hence, we proposed a modeling process built from a hybrid structure according to the defined flowchart. An analysis of predictive performance for solar forecasting from the different multiscale decompositions and forecast models is presented. With multiscale decomposition, the solar forecast accuracy is significantly improved, particularly using the wavelet decomposition method. Moreover, multistep forecasting with the proposed hybrid method resulted in additional improvement. For example, in terms of RMSE, the error obtained with the classical NN model is about 25.86%; this error decreases to 16.91% with the EMD-hybrid model, 14.06% with the EEMD-hybrid model, and 7.86% with the WD-hybrid model. - Highlights: • Hourly forecasting of GHI in tropical climate with many cloud formation processes. • Clear sky index decomposition using three multiscale decomposition methods. • Combination of multiscale decomposition methods with AR-NN models to predict GHI. • Comparison of the proposed hybrid model with the classical models (AR, NN). • Best results using the Wavelet-hybrid model in comparison with classical models.
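    The hybrid scheme described above (decompose the series by scale, forecast each component with a model suited to it, then recombine) can be illustrated with a dependency-free sketch. A moving average stands in for one scale of the decomposition (the paper uses EMD, EEMD and wavelets), and a least-squares AR(1) fit stands in for the per-component forecasters; the series is invented.

```python
def moving_average(series, window=5):
    """Smooth (trend) component; a stand-in for one scale of the
    decomposition used in the paper (EMD/EEMD/wavelets)."""
    half = window // 2
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def ar1_forecast(component):
    """One-step-ahead forecast from a least-squares AR(1) fit."""
    num = sum(a * b for a, b in zip(component[1:], component[:-1]))
    den = sum(a * a for a in component[:-1]) or 1e-12
    return (num / den) * component[-1]

def hybrid_forecast(series, window=5):
    """Decompose into trend + residual, forecast each scale separately,
    then recombine, mirroring the hybrid structure described above."""
    trend = moving_average(series, window)
    residual = [s - t for s, t in zip(series, trend)]
    return ar1_forecast(trend) + ar1_forecast(residual)

# a slowly rising clear-sky-index-like series (invented numbers)
series = [0.1 * i for i in range(1, 21)]
print(round(hybrid_forecast(series), 2))
```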

  15. Diagnostic Performance of a Rapid Magnetic Resonance Imaging Method of Measuring Hepatic Steatosis

    Science.gov (United States)

    House, Michael J.; Gan, Eng K.; Adams, Leon A.; Ayonrinde, Oyekoya T.; Bangma, Sander J.; Bhathal, Prithi S.; Olynyk, John K.; St. Pierre, Tim G.

    2013-01-01

    Objectives Hepatic steatosis is associated with an increased risk of developing serious liver disease and other clinical sequelae of the metabolic syndrome. However, visual estimates of steatosis from histological sections of biopsy samples are subjective and reliant on an invasive procedure with associated risks. The aim of this study was to test the ability of a rapid, routinely available, magnetic resonance imaging (MRI) method to diagnose clinically relevant grades of hepatic steatosis in a cohort of patients with diverse liver diseases. Materials and Methods Fifty-nine patients with a range of liver diseases underwent liver biopsy and MRI. Hepatic steatosis was quantified firstly using an opposed-phase, in-phase gradient echo, single breath-hold MRI methodology and secondly, using liver biopsy with visual estimation by a histopathologist and by computer-assisted morphometric image analysis. The area under the receiver operating characteristic (ROC) curve was used to assess the diagnostic performance of the MRI method against the biopsy observations. Results The MRI approach had high sensitivity and specificity at all hepatic steatosis thresholds. Areas under ROC curves were 0.962, 0.993, and 0.972 at thresholds of 5%, 33%, and 66% liver fat, respectively. MRI measurements were strongly associated with visual (r2 = 0.83) and computer-assisted morphometric (r2 = 0.84) estimates of hepatic steatosis from histological specimens. Conclusions This MRI approach, using a conventional, rapid, gradient echo method, has high sensitivity and specificity for diagnosing liver fat at all grades of steatosis in a cohort with a range of liver diseases. PMID:23555650
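    The ROC analysis used above to assess the MRI method can be reproduced with the rank (Mann-Whitney) formulation of the area under the curve: the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. The scores and labels below are illustrative, not the study's data.

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the rank formulation: the probability
    that a random positive scores above a random negative (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# MRI-style fat-fraction scores vs. biopsy-derived labels (invented data)
scores = [0.02, 0.04, 0.10, 0.31, 0.45, 0.07, 0.52, 0.03]
labels = [0, 0, 1, 1, 1, 0, 1, 0]
print(roc_auc(scores, labels))  # 1.0: perfect separation in this toy set
```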

  16. The effect of increasing strength and approach velocity on triple jump performance.

    Science.gov (United States)

    Allen, Sam J; Yeadon, M R Fred; King, Mark A

    2016-12-08

    The triple jump is an athletic event comprising three phases in which the optimal phase ratio (the proportion of each phase to the total distance jumped) is unknown. This study used a planar whole body torque-driven computer simulation model of the ground contact parts of all three phases of the triple jump to investigate the effect of strength and approach velocity on optimal performance. The strength and approach velocity of the simulation model were each increased by up to 30% in 10% increments from baseline data collected from a national standard triple jumper. Increasing strength always resulted in an increased overall jump distance. Increasing approach velocity also typically resulted in an increased overall jump distance but there was a point past which increasing approach velocity without increasing strength did not lead to an increase in overall jump distance. Increasing both strength and approach velocity by 10%, 20%, and 30% led to roughly equivalent increases in overall jump distances. Distances ranged from 14.05 m with baseline strength and approach velocity, up to 18.49 m with 30% increases in both. Optimal phase ratios were either hop-dominated or balanced, and typically became more balanced when the strength of the model was increased by a greater percentage than its approach velocity. The range of triple jump distances that resulted from the optimisation process suggests that strength and approach velocity are of great importance for triple jump performance. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Energy management in production: A novel method to develop key performance indicators for improving energy efficiency

    International Nuclear Information System (INIS)

    May, Gökan; Barletta, Ilaria; Stahl, Bojan; Taisch, Marco

    2015-01-01

    Highlights: • We propose a 7-step methodology to develop firm-tailored energy-related KPIs (e-KPIs). • We provide a practical guide for companies to identify their most important e-KPIs. • e-KPIs support identification of energy efficiency improvement areas in production. • The method employs an action plan for achieving energy saving targets. • The paper strengthens the theoretical base for energy-based decision making in manufacturing. - Abstract: Measuring the energy efficiency performance of equipment, processes and factories is the first step to effective energy management in production. The resulting energy-related information allows manufacturing companies to assess their progress toward their energy efficiency goals. Current industrial approaches, however, lack the means and appropriate performance indicators to compare the energy-use profiles of machines and processes, or to compare their energy efficiency performance to that of competitors. Focusing on this challenge, the main objective of the paper is to present a method which supports manufacturing companies in the development of energy-based performance indicators. For this purpose, we provide a 7-step method to develop production-tailored and energy-related key performance indicators (e-KPIs). These indicators allow the interpretation of cause-effect relationships and therefore support companies in their operative decision-making process. Consequently, the proposed method supports the identification of weaknesses and areas for energy efficiency improvements related to the management of production and operations. The study therefore aims to strengthen the theoretical base necessary to support energy-based decision making in manufacturing industries.
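    A production-tailored e-KPI of the kind the 7-step method is meant to yield can be as simple as specific energy consumption per good part, compared against a target to flag improvement areas. The machines, figures and target below are invented for illustration.

```python
# hypothetical machine-level records: (machine, kWh consumed, good parts made)
records = [
    ("press_1", 420.0, 1200),
    ("press_2", 510.0, 1150),
    ("oven_1", 980.0, 1200),
]

def specific_energy(records):
    """e-KPI: energy per good part, for each machine."""
    return {name: kwh / parts for name, kwh, parts in records}

def flag_improvement_areas(kpis, target):
    """Machines whose e-KPI exceeds the target are candidate areas for
    energy efficiency improvement."""
    return sorted(name for name, value in kpis.items() if value > target)

kpis = specific_energy(records)
print(flag_improvement_areas(kpis, target=0.45))  # ['oven_1']
```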

  18. Performing dynamic time history analyses by extension of the response spectrum method

    International Nuclear Information System (INIS)

    Hulbert, G.M.

    1983-01-01

    A method is presented to calculate the dynamic time history response of finite-element models using results from response spectrum analyses. The proposed modified time history method does not represent a new mathematical approach to dynamic analysis but suggests a more efficient ordering of the analytical equations and procedures. The modified time history method is considerably faster and less expensive to use than normal time history methods. This paper presents the theory and implementation of the modified time history approach along with comparisons of the modified and normal time history methods for a prototypic seismic piping design problem.

  19. Resource Isolation Method for Program’S Performance on CMP

    Science.gov (United States)

    Guan, Ti; Liu, Chunxiu; Xu, Zheng; Li, Huicong; Ma, Qiang

    2017-10-01

    Data centers and cloud computing are increasingly popular, bringing benefits to both customers and providers. However, in data centers or clusters there is commonly more than one program running on a server, and programs may interfere with each other. The interference may be slight, but it may also cause a serious drop in performance. To avoid this interference problem, isolating resources for different programs is a better choice. In this paper we propose a low-cost resource isolation method to improve program performance. This method uses Cgroups to set dedicated CPU and memory resources for a program in order to guarantee its performance. Three engines realize this method: the Program Monitor Engine monitors the program's CPU and memory usage and transfers the information to the Resource Assignment Engine; the Resource Assignment Engine calculates the amount of CPU and memory resource that should be assigned to the program; and the Cgroups Control Engine partitions resources with the Linux tool Cgroups and places the program in a control group for execution. The experimental results show that, using the proposed resource isolation method, program performance can be improved.
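    The writes a Cgroups control engine would perform can be sketched under cgroups v2 (the record does not specify the cgroup version or file layout; the interface files below are the standard v2 ones, while the group name, limits and PID are illustrative):

```python
def isolation_plan(group, cpus, mem_bytes, pid, root="/sys/fs/cgroup"):
    """Build the (interface file, value) writes a control engine would
    perform under cgroups v2 to isolate one program. The interface files
    (cpu.max, memory.max, cgroup.procs) are standard v2 ones; the group
    name, limits and PID are illustrative."""
    cg = f"{root}/{group}"
    period = 100_000                 # scheduling period in microseconds
    quota = int(cpus * period)       # CPU time allowed per period
    return [
        (f"{cg}/cpu.max", f"{quota} {period}"),   # dedicated CPU share
        (f"{cg}/memory.max", str(mem_bytes)),     # hard memory cap
        (f"{cg}/cgroup.procs", str(pid)),         # move program into group
    ]

# isolate a (hypothetical) program with PID 4321 onto 2 CPUs and 1 GiB:
for path, value in isolation_plan("demo_app", cpus=2, mem_bytes=1 << 30, pid=4321):
    print(path, "<-", value)
```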

  20. Research Methods for Business : A Skill Building Approach (5th Edition)

    NARCIS (Netherlands)

    Sekaran, U.; Bougie, J.R.G.

    2009-01-01

    Research Methods for Business: A Skill-Building Approach is a concise and straightforward introduction for students to the world of business research. The skill-building approach provides students with practical perspectives on how research can be applied in real business situations. Maintaining Uma

  1. Analysis of international approaches used in the development of operational safety performance indicators

    International Nuclear Information System (INIS)

    Lyigots'kij, O.Yi.; Nosovs'kij, A.V.; Chemeris, Yi.O.

    2009-01-01

    A description of international approaches and experience in the use of operational safety performance indicator systems is provided, for estimating current status and making decisions on corrections to operating practice. The state of development of the operational safety performance indicator system by the operating organization is overviewed. The possibility of applying international approaches during development of an integral safety performance indicator system is analyzed. Aims and tasks of future research are formulated in relation to the development of the integral safety performance indicator system.

  2. A New Approach to Performing Bundle Adjustment for Time Series UAV Images 3D Building Change Detection

    Directory of Open Access Journals (Sweden)

    Wenzhuo Li

    2017-06-01

    Full Text Available Successful change detection in multi-temporal images relies on high spatial co-registration accuracy. However, co-registration accuracy alone cannot meet the needs of change detection when using several ground control points to separately geo-reference multi-temporal images from unmanned aerial vehicles (UAVs. This letter reports on a new approach to perform bundle adjustment—named united bundle adjustment (UBA—to solve this co-registration problem for change detection in multi-temporal UAV images. In UBA, multi-temporal UAV images are matched with each other to construct a unified tie point net. One single bundle adjustment process is performed on the unified tie point net, placing every image into the same coordinate system and thus automatically accomplishing spatial co-registration. We then perform change detection using both orthophotos and three-dimensional height information derived from dense image matching techniques. Experimental results show that UBA co-registration accuracy is higher than the accuracy of commonly-used approaches for multi-temporal UAV images. Our proposed preprocessing method extends the capacities of consumer-level UAVs so they can eventually meet the growing need for automatic building change detection and dynamic monitoring using only RGB band images.

  3. A Generalized Pivotal Quantity Approach to Analytical Method Validation Based on Total Error.

    Science.gov (United States)

    Yang, Harry; Zhang, Jianchun

    2015-01-01

    The primary purpose of method validation is to demonstrate that the method is fit for its intended use. Traditionally, an analytical method is deemed valid if its performance characteristics such as accuracy and precision are shown to meet prespecified acceptance criteria. However, these acceptance criteria are not directly related to the method's intended purpose, which is usually a guarantee that a high percentage of the test results of future samples will be close to their true values. Alternate "fit for purpose" acceptance criteria based on the concept of total error have been increasingly used. Such criteria allow for assessing method validity, taking into account the relationship between accuracy and precision. Although several statistical test methods have been proposed in the literature to test the "fit for purpose" hypothesis, the majority of the methods are not designed to protect against the risk of accepting unsuitable methods, thus having the potential to cause uncontrolled consumer's risk. In this paper, we propose a test method based on generalized pivotal quantity inference. Through simulation studies, the performance of the method is compared to five existing approaches. The results show that both the new method and the method based on a β-content tolerance interval with a confidence level of 90%, hereafter referred to as the β-content (0.9) method, control Type I error and thus consumer's risk, while the other existing methods do not. It is further demonstrated that the generalized pivotal quantity method is less conservative than the β-content (0.9) method when the analytical methods are biased, whereas it is more conservative when the analytical methods are unbiased. Therefore, selection of either the generalized pivotal quantity or β-content (0.9) method for an analytical method validation depends on the accuracy of the analytical method. It is also shown that the generalized pivotal quantity method has better asymptotic properties than all of the current
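    A Monte Carlo version of the generalized pivotal quantity idea can be sketched for a normal measurement model: draw pivotal quantities for the unknown mean and standard deviation, convert each draw into the proportion of future results falling within the acceptance limits of the true value, and take a lower percentile as the confidence bound. This is a sketch of the general technique with invented numbers; the paper's exact construction may differ in detail.

```python
import math
import random

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def gpq_lower_bound(xbar, s, n, true_value, lam, draws=5000, alpha=0.05):
    """Monte Carlo generalized-pivotal-quantity lower confidence bound on
    the proportion of future results within +/- lam of the true value
    (a total-error style 'fit for purpose' quantity)."""
    random.seed(0)  # deterministic for the demo
    pis = []
    for _ in range(draws):
        chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(n - 1))
        sigma = s * math.sqrt((n - 1) / chi2)                  # GPQ for sigma
        mu = xbar - random.gauss(0, 1) * sigma / math.sqrt(n)  # GPQ for mu
        pis.append(phi((true_value + lam - mu) / sigma)
                   - phi((true_value - lam - mu) / sigma))
    pis.sort()
    return pis[int(alpha * draws)]

# accept the method if, with 95% confidence, at least 80% of future results
# fall within +/- 10 units of the reference value (illustrative numbers):
lb = gpq_lower_bound(xbar=101.0, s=3.0, n=12, true_value=100.0, lam=10.0)
print(lb >= 0.80)  # True for this precise, nearly unbiased method
```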

  4. Integrated Approach Towards the Application of Horizontal Wells to Improve Waterflooding Performance

    Energy Technology Data Exchange (ETDEWEB)

    Kelkar, Mohan; Liner, Chris; Kerr, Dennis

    1999-10-15

    This final report describes the progress during the six years of the project on ''Integrated Approach Towards the Application of Horizontal Wells to Improve Waterflooding Performance.'' This project is funded under the Department of Energy's (DOE's) Class I program, which is targeted toward improving the reservoir performance of mature oil fields located in fluvially-dominated deltaic deposits. The project involves using an integrated approach to characterize the reservoir, followed by drilling of horizontal injection wells to improve production performance. The project was divided into two budget periods. In the first budget period, many modern technologies were used to develop a detailed reservoir management plan; in the second budget period, conventional data were used to develop a reservoir management plan. The idea was to determine the cost effectiveness of various technologies in improving the performance of mature oil fields.

  5. Roka Listeria detection method using transcription mediated amplification to detect Listeria species in select foods and surfaces. Performance Tested Method(SM) 011201.

    Science.gov (United States)

    Hua, Yang; Kaplan, Shannon; Reshatoff, Michael; Hu, Ernie; Zukowski, Alexis; Schweis, Franz; Gin, Cristal; Maroni, Brett; Becker, Michael; Wisniewski, Michele

    2012-01-01

    The Roka Listeria Detection Assay was compared to the reference culture methods for nine select foods and three select surfaces. The Roka method used Half-Fraser Broth for enrichment at 35 +/- 2 degrees C for 24-28 h. Comparison of Roka's method to the reference methods required an unpaired approach. Each method had a total of 545 samples inoculated with a Listeria strain; each food and surface was inoculated with a different strain of Listeria at two different levels per method. For the dairy products (Brie cheese, whole milk, and ice cream), the method was compared to AOAC Official Method(SM) 993.12. For the ready-to-eat meats (deli chicken, cured ham, chicken salad, and hot dogs) and environmental surfaces (sealed concrete, stainless steel, and plastic), the samples were compared to the U.S. Department of Agriculture/Food Safety and Inspection Service-Microbiology Laboratory Guidebook (USDA/FSIS-MLG) method MLG 8.07. Cold-smoked salmon and romaine lettuce were compared to the U.S. Food and Drug Administration/Bacteriological Analytical Manual, Chapter 10 (FDA/BAM) method. Roka's method had 358 positives out of 545 total inoculated samples, compared to 332 positives for the reference methods. Overall, the probability of detection analysis of the results showed better or equivalent performance compared to the reference methods.
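    The detection counts reported above (358 of 545 for Roka, 332 of 545 for the reference methods) can be turned into rates with confidence intervals; a Wilson score interval is used here for illustration (the AOAC probability-of-detection analysis itself is more involved).

```python
import math

def wilson_interval(k, n, z=1.96):
    """95% Wilson score interval for a detection proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# detections out of 545 inoculated samples per method, from the record above
for name, k in [("Roka", 358), ("reference", 332)]:
    lo, hi = wilson_interval(k, 545)
    print(f"{name}: {k / 545:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```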

  6. Performance assessment of semiempirical molecular orbital methods in describing halogen bonding: quantum mechanical and quantum mechanical/molecular mechanical-molecular dynamics study.

    Science.gov (United States)

    Ibrahim, Mahmoud A A

    2011-10-24

    The performance of semiempirical molecular-orbital methods--MNDO, MNDO-d, AM1, RM1, PM3 and PM6--in describing halogen bonding was evaluated, and the results were compared with molecular mechanical (MM) and quantum mechanical (QM) data. Three types of performance were assessed: (1) geometrical optimizations and binding energy calculations for 27 halogen-containing molecules complexed with various Lewis bases (two of the tested methods, AM1 and RM1, gave results that agree with the QM data); (2) charge distribution calculations for halobenzene molecules, determined by calculating the solvation free energies of the molecules relative to benzene in explicit and implicit generalized Born (GB) solvents (none of the methods gave results that agree with the experimental data); and (3) appropriateness of the semiempirical methods in the hybrid quantum-mechanical/molecular-mechanical (QM/MM) scheme, investigated by studying the molecular inhibition of CK2 protein by eight halobenzimidazole and -benzotriazole derivatives using hybrid QM/MM molecular-dynamics (MD) simulations with the inhibitor described at the QM level by the AM1 method and the rest of the system described at the MM level. The pure MM approach, with inclusion of an extra point of positive charge on the halogen atom, gave better results than the hybrid QM/MM approach involving the AM1 method. Also, in comparison with the pure MM-GBSA (generalized Born surface area) binding energies and experimental data, the calculated QM/MM-GBSA binding energies of the inhibitors were improved by replacing the G(GB,QM/MM) solvation term with the corresponding G(GB,MM) term.

  7. The balanced scorecard: an integrative approach to performance evaluation.

    Science.gov (United States)

    Oliveira, J

    2001-05-01

    In addition to strict financial outcomes, healthcare financial managers should assess intangible assets that affect the organization's bottom line, such as clinical processes, staff skills, and patient satisfaction and loyalty. The balanced scorecard, coupled with data-warehousing capabilities, offers a way to measure an organization's performance against its strategic objectives while focusing on building capabilities to achieve these objectives. The balanced scorecard examines performance related to finance, human resources, internal processes, and customers. Because the balanced scorecard requires substantial amounts of data, it is necessary to establish an organizational data warehouse of clinical, operational, and financial data that can be used in decision support. Because it presents indicators that managers and staff can influence directly by their actions, the balanced-scorecard approach to performance measurement encourages behavioral changes aimed at achieving corporate strategies.

  8. REVIEW OF MECHANISTIC UNDERSTANDING AND MODELING AND UNCERTAINTY ANALYSIS METHODS FOR PREDICTING CEMENTITIOUS BARRIER PERFORMANCE

    Energy Technology Data Exchange (ETDEWEB)

    Langton, C.; Kosson, D.

    2009-11-30

    Cementitious barriers for nuclear applications are one of the primary controls for preventing or limiting radionuclide release into the environment. At the present time, performance and risk assessments do not fully incorporate the effectiveness of engineered barriers because the processes that influence performance are coupled and complicated. Better understanding of the behavior of cementitious barriers is necessary to evaluate and improve the design of materials and structures used for radioactive waste containment, life extension of current nuclear facilities, and design of future nuclear facilities, including those needed for nuclear fuel storage and processing, nuclear power production and waste management. The focus of the Cementitious Barriers Partnership (CBP) literature review is to document the current level of knowledge with respect to: (1) mechanisms and processes that directly influence the performance of cementitious materials, (2) methodologies for modeling the performance of these mechanisms and processes, and (3) approaches to addressing and quantifying uncertainties associated with performance predictions. This will serve as an important reference document for the professional community responsible for the design and performance assessment of cementitious materials in nuclear applications. This review also provides a multi-disciplinary foundation for identification, research, development and demonstration of improvements in conceptual understanding, measurements and performance modeling that would lead to significant reductions in the uncertainties and improved confidence in estimating the long-term performance of cementitious materials in nuclear applications. This report identifies: (1) technology gaps that may be filled by the CBP project and also (2) information and computational methods that are currently being applied in related fields but have not yet been incorporated into performance assessments of cementitious barriers.
The various

  9. Review Of Mechanistic Understanding And Modeling And Uncertainty Analysis Methods For Predicting Cementitious Barrier Performance

    International Nuclear Information System (INIS)

    Langton, C.; Kosson, D.

    2009-01-01

    Cementitious barriers for nuclear applications are one of the primary controls for preventing or limiting radionuclide release into the environment. At present, performance and risk assessments do not fully incorporate the effectiveness of engineered barriers because the processes that influence performance are coupled and complicated. A better understanding of the behavior of cementitious barriers is necessary to evaluate and improve the design of materials and structures used for radioactive waste containment, life extension of current nuclear facilities, and design of future nuclear facilities, including those needed for nuclear fuel storage and processing, nuclear power production, and waste management. The focus of the Cementitious Barriers Partnership (CBP) literature review is to document the current level of knowledge with respect to: (1) mechanisms and processes that directly influence the performance of cementitious materials; (2) methodologies for modeling the performance of these mechanisms and processes; and (3) approaches to addressing and quantifying uncertainties associated with performance predictions. This will serve as an important reference document for the professional community responsible for the design and performance assessment of cementitious materials in nuclear applications. This review also provides a multi-disciplinary foundation for the identification, research, development and demonstration of improvements in conceptual understanding, measurements and performance modeling that would lead to significant reductions in uncertainty and improved confidence in estimating the long-term performance of cementitious materials in nuclear applications. This report identifies: (1) technology gaps that may be filled by the CBP project and (2) information and computational methods that are currently being applied in related fields but have not yet been incorporated into performance assessments of cementitious barriers.
The various

  10. Agile Service Development: A Rule-Based Method Engineering Approach

    NARCIS (Netherlands)

    dr. Martijn Zoet; Stijn Hoppenbrouwers; Inge van de Weerd; Johan Versendaal

    2011-01-01

    Agile software development has evolved into an increasingly mature software development approach and has been applied successfully in many software vendors’ development departments. In this position paper, we address the broader agile service development. Based on method engineering principles we

  11. The Impact of a Multifaceted Approach to Teaching Research Methods on Students' Attitudes

    Science.gov (United States)

    Ciarocco, Natalie J.; Lewandowski, Gary W., Jr.; Van Volkom, Michele

    2013-01-01

    A multifaceted approach to teaching five experimental designs in a research methodology course was tested. Participants included 70 students enrolled in an experimental research methods course in the semester both before and after the implementation of instructional change. When using a multifaceted approach to teaching research methods that…

  12. Link Performance Analysis and monitoring - A unified approach to divergent requirements

    Science.gov (United States)

    Thom, G. A.

    Link Performance Analysis and real-time monitoring are generally covered by a wide range of equipment. Bit Error Rate testers provide digital link performance measurements but are not useful during real-time data flows. Real-time performance monitors utilize the fixed overhead content but vary widely from format to format. Link quality information is also present from signal reconstruction equipment in the form of receiver AGC, bit synchronizer AGC, and bit synchronizer soft decision level outputs, but no general approach to utilizing this information exists. This paper presents an approach to link tests, real-time data quality monitoring, and results presentation that utilizes a set of general purpose modules in a flexible architectural environment. The system operates over a wide range of bit rates (up to 150 Mbit/s) and employs several measurement techniques, including P/N code errors or fixed PCM format errors, derived real-time BER from frame sync errors, and Data Quality Analysis derived by counting significant sync status changes. The architecture performs with a minimum of elements in place to permit a phased update of the user's unit in accordance with his needs.
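The "derived real-time BER from frame sync errors" technique mentioned above can be sketched as follows: because the frame sync word is a known pattern, bit errors within it can be counted on every frame and extrapolated into a running bit-error-rate estimate. The 16-bit sync word and frame layout below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: estimating link BER from frame sync word errors.
SYNC = 0b1111101011001100  # hypothetical 16-bit frame sync word


def sync_bit_errors(received_sync, expected_sync=SYNC, width=16):
    """Count bit errors in one received sync word via XOR + popcount."""
    return bin((received_sync ^ expected_sync) & ((1 << width) - 1)).count("1")


def estimate_ber(received_syncs, width=16):
    """Derive a running BER estimate from the sync words of many frames."""
    errors = sum(sync_bit_errors(s, width=width) for s in received_syncs)
    return errors / (len(received_syncs) * width)


# Clean link: every sync word arrives intact.
assert estimate_ber([SYNC] * 1000) == 0.0
# One flipped bit across 1000 frames -> BER estimate of 1/16000.
corrupted = [SYNC] * 999 + [SYNC ^ 0b1]
print(estimate_ber(corrupted))  # 6.25e-05
```

The estimate only samples the sync-word bits, so it assumes errors are uniformly distributed across the frame, which is why the paper supplements it with other measures such as sync status changes.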

  13. Parallel performance and accuracy of lattice Boltzmann and traditional finite difference methods for solving the unsteady two-dimensional Burger's equation

    Science.gov (United States)

    Velivelli, A. C.; Bryden, K. M.

    2006-03-01

    Lattice Boltzmann methods are gaining recognition in the field of computational fluid dynamics due to their computational efficiency. In order to quantify the computational efficiency and accuracy of the lattice Boltzmann method, it is compared with efficient traditional finite difference methods such as the alternating direction implicit scheme. The lattice Boltzmann algorithm implemented in previous studies does not approach peak performance for simulations where the data involved in computation per time step is more than the cache size. Due to this, data is obtained from the main memory and this access is much slower than access to cache memory. Using a cache-optimized lattice Boltzmann algorithm, this paper takes into account the full computational strength of the lattice Boltzmann method. The comparison is performed on both a single processor and multiple processors.
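As a point of reference for the kind of finite-difference solver being compared, here is a minimal explicit scheme for the 1-D viscous Burgers equation u_t + u u_x = nu u_xx. The paper treats the 2-D case with an ADI scheme and a cache-optimized lattice Boltzmann method; this sketch, with invented grid and viscosity parameters, only illustrates the baseline discretization style.

```python
import numpy as np


def burgers_step(u, dx, dt, nu):
    """One explicit step: upwind convection + central diffusion (interior)."""
    un = u.copy()
    u[1:-1] = (un[1:-1]
               - dt / dx * un[1:-1] * (un[1:-1] - un[:-2])
               + nu * dt / dx**2 * (un[2:] - 2 * un[1:-1] + un[:-2]))
    return u


nx, nu = 101, 0.07
dx = 2.0 / (nx - 1)
dt = 0.2 * dx**2 / nu           # small step keeps the explicit scheme stable
u = np.ones(nx)
u[40:60] = 2.0                  # square pulse initial condition
for _ in range(200):
    u = burgers_step(u, dx, dt, nu)

# the monotone scheme keeps the solution bounded by its initial extremes
assert 1.0 - 1e-9 <= u.min() and u.max() <= 2.0 + 1e-9
```

The stability restriction on `dt` is exactly the kind of cost that motivates implicit ADI schemes and, alternatively, the lattice Boltzmann method's locally explicit but cache-friendly update.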

  14. A new approach to the Darboux-Baecklund transformation versus the standard dressing method

    International Nuclear Information System (INIS)

    Cieslinski, Jan L; Biernacki, Waldemar

    2005-01-01

    We present a new approach to the construction of the Darboux matrix. This is a generalization of a recently formulated method based on the assumption that the square of the Darboux matrix vanishes for some values of the spectral parameter. We consider the multisoliton case, the reduction problem and the discrete case. The relationships between our approach, the Zakharov-Shabat dressing method and the Neugebauer-Meinel method are discussed in detail

  15. A linear regression approach to evaluate the green supply chain management impact on industrial organizational performance.

    Science.gov (United States)

    Mumtaz, Ubaidullah; Ali, Yousaf; Petrillo, Antonella

    2018-05-15

    The increase in environmental pollution is one of the most important topics in today's world. In this context, industrial activities can pose a significant threat to the environment. To manage problems associated with industrial activities, several methods, techniques and approaches have been developed. Green supply chain management (GSCM) is considered one of the most important environmental management approaches. In developing countries such as Pakistan, the implementation of GSCM practices is still in its initial stages; a lack of knowledge about their effects on economic performance is the reason industries fear to implement these practices. The aim of this research is to assess the effects of GSCM practices on organizational performance in Pakistan. The GSCM practices considered are internal practices, external practices, investment recovery and eco-design, while the performance parameters considered are environmental pollution, operational cost and organizational flexibility. A set of hypotheses proposes the effect of each GSCM practice on the performance parameters. Factor analysis and linear regression are used to analyze survey data from Pakistani industries in order to test these hypotheses. The findings of this research indicate a decrease in environmental pollution and operational cost with the implementation of GSCM practices, whereas organizational flexibility has not improved for Pakistani industries. These results aim to help managers with their decision to implement GSCM practices in the industrial sector of Pakistan. Copyright © 2017 Elsevier B.V. All rights reserved.
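The regression step of such a study can be sketched with ordinary least squares: practice scores are the predictors and a performance measure is the response. The data below are synthetic stand-ins for survey scores; the actual items, coefficients and significance tests are in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# four practice scores (internal, external, investment recovery, eco-design)
X = rng.uniform(1, 5, size=(n, 4))
# synthetic "environmental pollution" outcome: falls as the first two
# practices rise (illustrative effect sizes, not the study's results)
y = 6.0 - 0.5 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(0, 0.2, n)

A = np.column_stack([np.ones(n), X])          # add intercept column
beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # OLS fit

# negative coefficients = pollution decreases as the practice score rises
assert beta[1] < 0 and beta[2] < 0
```

In the study itself the predictors are factor scores extracted from questionnaire items, and each hypothesis corresponds to the sign and significance of one such coefficient.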

  16. Buffer-Free High Performance Liquid Chromatography Method for ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a simple, economical and reproducible high performance liquid chromatographic (HPLC) method for the determination of theophylline in pharmaceutical dosage forms. Method: Caffeine was used as the internal standard and reversed phase C-18 column was used to elute the drug and ...

  17. Magnetotomography—a new method for analysing fuel cell performance and quality

    Science.gov (United States)

    Hauer, Karl-Heinz; Potthast, Roland; Wüster, Thorsten; Stolten, Detlef

    Magnetotomography is a new method for the measurement and analysis of the current density distribution of fuel cells. The method is based on the measurement of the magnetic flux surrounding the fuel cell stack caused by the current inside the stack. As it is non-invasive, magnetotomography overcomes the shortcomings of traditional methods for the determination of current density in fuel cells [J. Stumper, S.A. Campell, D.P. Wilkinson, M.C. Johnson, M. Davis, In situ methods for the determination of current distributions in PEM fuel cells, Electrochem. Acta 43 (1998) 3773; S.J.C. Cleghorn, C.R. Derouin, M.S. Wilson, S. Gottesfeld, A printed circuit board approach to measuring current distribution in a fuel cell, J. Appl. Electrochem. 28 (1998) 663; Ch. Wieser, A. Helmbold, E. Gülzow, A new technique for two-dimensional current distribution measurements in electro-chemical cells, J. Appl. Electrochem. 30 (2000) 803; Grinzinger, Methoden zur Ortsaufgelösten Strommessung in Polymer Elektrolyt Brennstoffzellen, Diploma thesis, TU-München, 2003; Y.-G. Yoon, W.-Y. Lee, T.-H. Yang, G.-G. Park, C.-S. Kim, Current distribution in a single cell of PEMFC, J. Power Sources 118 (2003) 193-199; M.M. Mench, C.Y. Wang, An in situ method for determination of current distribution in PEM fuel cells applied to a direct methanol fuel cell, J. Electrochem. Soc. 150 (2003) A79-A85; S. Schönbauer, T. Kaz, H. Sander, E. Gülzow, Segmented bipolar plate for the determination of current distribution in polymer electrolyte fuel cells, in: Proceedings of the Second European PEMFC Forum, vol. 1, Lucerne/Switzerland, 2003, pp. 231-237; G. Bender, S.W. Mahlon, T.A. Zawodzinski, Further refinements in the segmented cell approach to diagnosing performance in polymer electrolyte fuel cells, J. Power Sources 123 (2003) 163-171]. After several years of research a complete prototype system is now available for research on single cells and stacks. This paper describes the basic system (fundamentals

  18. Advanced fabrication method for the preparation of MOF thin films: Liquid-phase epitaxy approach meets spin coating method.

    KAUST Repository

    Chernikova, Valeriya

    2016-07-14

    Here we report a new and advanced method for the fabrication of highly oriented/polycrystalline metal-organic framework (MOF) thin films. Building on the attractive features of the liquid-phase epitaxy (LPE) approach, a facile spin coating method was implemented to generate MOF thin films in a high-throughput fashion. Advantageously, this approach offers great potential to cost-effectively construct thin films with a significantly shortened preparation time and reduced chemical and solvent consumption compared to the conventional LPE process. Certainly, this new spin-coating approach has been implemented successfully to construct various MOF thin films, ranging in thickness from a few micrometers down to the nanometer scale, spanning 2-D and 3-D benchmark MOF materials including Cu2(bdc)2•xH2O, Zn2(bdc)2•xH2O, HKUST-1 and ZIF-8. This method was appraised and proved effective on a variety of substrates comprising functionalized gold, silicon, glass, porous stainless steel and aluminum oxide. The facile, high-throughput and cost-effective nature of this approach, coupled with the successful thin film growth and substrate versatility, represents the next generation of methods for MOF thin film fabrication, thereby paving the way for these unique MOF materials to address a wide range of challenges in the areas of sensing devices and membrane technology.

  19. Sustainable Supplier Performance Evaluation and Selection with Neofuzzy TOPSIS Method.

    Science.gov (United States)

    Chaharsooghi, S K; Ashrafi, Mehdi

    2014-01-01

    Supplier selection plays an important role in supply chain management, and traditional criteria such as price, quality, and flexibility are considered for supplier performance evaluation in research. In recent years sustainability has received more attention in the supply chain management literature, with the triple bottom line (TBL) describing sustainability in supply chain management through social, environmental, and economic initiatives. This paper explores sustainability in supply chain management and examines the problem of identifying a new model for supplier selection based on an extended TBL approach in the supply chain by presenting a fuzzy multicriteria method. Linguistic values of experts' subjective preferences are expressed with fuzzy numbers, and Neofuzzy TOPSIS is proposed for finding the best solution of the supplier selection problem. Numerical results show that the proposed model is efficient for integrating sustainability into the supplier selection problem. The importance of using complementary aspects of sustainability and the Neofuzzy TOPSIS concept in the sustainable supplier selection process is shown with a sensitivity analysis.
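The core of any TOPSIS variant is the closeness-to-ideal ranking, which the paper extends with fuzzy linguistic scores. A minimal crisp sketch of that core, with made-up supplier scores and weights:

```python
import numpy as np


def topsis(matrix, weights):
    """Rank alternatives by relative closeness to the ideal solution.
    Rows are alternatives, columns are benefit criteria."""
    m = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize columns
    v = m * weights                               # weighted normalized matrix
    ideal, anti = v.max(axis=0), v.min(axis=0)    # ideal / anti-ideal points
    d_pos = np.linalg.norm(v - ideal, axis=1)     # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)      # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                # closeness coefficient


scores = topsis(
    np.array([[7.0, 9.0, 8.0],    # supplier A
              [8.0, 7.0, 6.0],    # supplier B
              [9.0, 8.0, 9.0]]),  # supplier C
    np.array([0.5, 0.3, 0.2]))
best = int(np.argmax(scores))
print(best)  # supplier C (index 2) dominates on the heaviest criterion
```

The fuzzy variant replaces the crisp matrix entries with fuzzy numbers aggregated from the experts' linguistic judgments before the same distance computation.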

  20. A method to evaluate process performance by integrating time and resources

    Science.gov (United States)

    Wang, Yu; Wei, Qingjie; Jin, Shuang

    2017-06-01

    The purpose of process mining is to improve the existing processes of an enterprise, so how to measure process performance is particularly important. However, current research on performance evaluation methods is still insufficient: existing methods rely mainly on time or resource statistics alone, and such basic statistics cannot evaluate process performance well. In this paper, a method of evaluating process performance based on both the time dimension and the resource dimension is proposed. This method can be used to measure the utilization and redundancy of resources in the process. This paper introduces the design principle and formula of the evaluation algorithm, followed by the design and implementation of the evaluation method. Finally, we use the evaluation method to analyse the event log from a telephone maintenance process and propose an optimization plan.
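The idea of combining the time and resource dimensions can be sketched from an event log: each event carries a resource and a duration, from which a per-resource utilization over an observation window follows directly. Field names and the utilization formula here are illustrative assumptions, not the paper's exact algorithm.

```python
# Hypothetical event log: (case id, activity, resource, start hour, end hour)
log = [
    ("c1", "register", "alice", 0, 1),
    ("c1", "repair",   "bob",   1, 4),
    ("c2", "register", "alice", 1, 2),
    ("c2", "repair",   "bob",   5, 7),
]


def resource_utilization(log, horizon):
    """Fraction of the observation window each resource spent working."""
    busy = {}
    for _case, _act, res, start, end in log:
        busy[res] = busy.get(res, 0) + (end - start)
    return {res: t / horizon for res, t in busy.items()}


util = resource_utilization(log, horizon=8)
print(util)  # {'alice': 0.25, 'bob': 0.625}
```

Low utilization for a resource over many cases would indicate the redundancy the method is designed to surface, while time statistics alone would miss it.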

  1. Methods of Evaluating Performances for Marketing Strategies

    OpenAIRE

    Ioan Cucu

    2005-01-01

    There are specific methods for assessing and improving the effectiveness of a marketing strategy. A marketer should state in the marketing plan what a marketing strategy is supposed to accomplish. These statements should set forth performance standards, which usually are stated in terms of profits, sales, or costs. Actual performance must be measured in similar terms so that comparisons are possible. This paper describes sales analysis and cost analysis, two general ways of evaluating the act...

  2. A Literature Review Fuzzy Pay-Off-Method – A Modern Approach in Valuation

    Directory of Open Access Journals (Sweden)

    Daniel Manaţe

    2015-01-01

    Full Text Available This article presents a modern approach to the analysis of discounted cash flows. The approach is based on the Fuzzy Pay-Off Method (FPOM) for Real Option Valuation (ROV). This article describes a few types of models for the valuation of real options currently in use. In support of the chosen FPOM method, we include the mathematical model that stands at the basis of this method and a case study.
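The pay-off method's central computation can be sketched numerically: given a triangular fuzzy NPV built from pessimistic, base and optimistic scenarios, the real option value is the mean of the positive side weighted by the share of the pay-off area above zero. The NPV scenarios below are invented, and a simple center-of-gravity mean stands in for the exact possibilistic mean of the literature.

```python
import numpy as np

lo, mode, hi = -20.0, 30.0, 100.0   # pessimistic / base / optimistic NPV

# triangular membership function of the fuzzy pay-off, sampled densely
x = np.linspace(lo, hi, 120001)
A = np.where(x <= mode, (x - lo) / (mode - lo), (hi - x) / (hi - mode))

pos = x > 0
weight = A[pos].sum() / A.sum()                    # share of pay-off area > 0
mean_pos = (x[pos] * A[pos]).sum() / A[pos].sum()  # mean of the positive side
rov = weight * mean_pos                            # real option value

print(round(weight, 3), round(rov, 1))
```

A project with a negative base-case NPV can still carry a positive `rov` as long as part of the pay-off triangle lies above zero, which is the method's key intuition.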

  3. Activity Based Costing (ABC as an Approach to Optimize Purchasing Performance in Hospitality Industry

    Directory of Open Access Journals (Sweden)

    Mohamed S. El-Deeb

    2011-07-01

    Full Text Available ABC (Activity Based Costing has proved successful for both products and services. The researchers propose a new model applying the ABC approach in the purchasing department, one of the most dynamic departments in the service sector, to optimize the performance of purchasing activities. The researchers propose purchasing measures targeting customers' loyalty and ensuring the continuous flow of supplies. A questionnaire was used as the data collection tool for verifying the hypothesis of the research, and the data obtained were analyzed using the Statistical Package for Social Sciences (SPSS). The results of the research are based on a limited survey distributed to a number of hotels in the Greater Cairo region, targeting three hundred purchasing managers and staff in five-star hotels. It is recognized that further research is necessary to establish the exact nature of the causal linkages between the proposed performance measures and strategic intent in order to gain insights into practice elsewhere.

  4. A high-performance liquid chromatography method for the serotonin release assay is equivalent to the radioactive method.

    Science.gov (United States)

    Sono-Koree, N K; Crist, R A; Frank, E L; Rodgers, G M; Smock, K J

    2016-02-01

    The serotonin release assay (SRA) is considered the gold standard laboratory test for heparin-induced thrombocytopenia (HIT). The historic SRA method uses platelets loaded with radiolabeled serotonin to evaluate platelet activation by HIT immune complexes; however, a nonradioactive method is desirable. We report the performance characteristics of a high-performance liquid chromatography (HPLC) SRA method, including correlation with a reference laboratory using the radioactive method. Serotonin released from reagent platelets was quantified by HPLC using fluorescent detection. Results were expressed as % release and classified as positive, negative, or indeterminate based on previously published cutoffs. Serum samples from 250 subjects with suspected HIT were tested in the HPLC-SRA and with the radioactive method. Concordant classifications were observed in 230 samples (92%). Sera from 41 healthy individuals tested negative, and between-run imprecision studies showed acceptable standard deviations. The HPLC-SRA has performance characteristics equivalent to the historic radioactive method but avoids the complexities of working with radioactivity. © 2015 John Wiley & Sons Ltd.

  5. Operator performance evaluation using multi criteria decision making methods

    Science.gov (United States)

    Rani, Ruzanita Mat; Ismail, Wan Rosmanira; Razali, Siti Fatihah

    2014-06-01

    Operator performance evaluation is a very important operation in the labor-intensive manufacturing industry because the company's productivity depends on the performance of its operators. The aims of operator performance evaluation are to give feedback to operators on their performance, to increase the company's productivity and to identify the strengths and weaknesses of each operator. In this paper, six multi criteria decision making methods, Analytical Hierarchy Process (AHP), fuzzy AHP (FAHP), ELECTRE, PROMETHEE II, Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) and VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR), are used to evaluate the operators' performance and to rank the operators. The performance evaluation is based on six main criteria: competency, experience and skill, teamwork and time punctuality, personal characteristics, capability and outcome. The study was conducted at one of the SME food manufacturing companies in Selangor. From the study, it is found that AHP and FAHP yielded the "outcome" criterion as the most important criterion. The results of the operator performance evaluation showed that the same operator is ranked first using all six methods.
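One of the six methods listed, AHP, derives criterion weights from a pairwise comparison matrix; the geometric-mean approximation to the principal eigenvector is the common hand computation. The comparison values below are illustrative, not the study's survey data.

```python
import numpy as np

# Hypothetical pairwise comparisons for three criteria, where the first
# criterion (say, "outcome") is judged moderately-to-strongly dominant.
P = np.array([[1.0,   3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

gm = P.prod(axis=1) ** (1.0 / P.shape[0])  # row geometric means
w = gm / gm.sum()                          # normalized priority weights

assert abs(w.sum() - 1.0) < 1e-12
assert w[0] == w.max()  # the dominant criterion receives the top weight
print(np.round(w, 3))
```

In the study this weighting step would be followed by scoring each operator against the six weighted criteria and aggregating into a final ranking.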

  6. Modeling Nanoscale FinFET Performance by a Neural Network Method

    Directory of Open Access Journals (Sweden)

    Jin He

    2017-07-01

    Full Text Available This paper presents a neural network method to model nanometer FinFET performance. The principle of this method is first introduced, and its application in modeling the DC and conductance characteristics of a nanoscale FinFET transistor is demonstrated in detail. It is shown that this method does not need a parameter extraction routine, while its prediction of the transistor performance has a small relative error within 1% compared with measured data; thus this new method is as accurate as the physics-based surface potential model.

  7. Performance Approach, Performance Avoidance and Depth of Information Processing: A Fresh Look at Relations between Students' Academic Motivation and Cognition.

    Science.gov (United States)

    Barker, Katrina L.; McInerney, Dennis M.; Dowson, Martin

    2002-01-01

    Examines effects of the motivational approach on the recall of verbal information processed at shallow and deep levels. Explains that students were assigned to a mastery focused condition, performance approach condition, or a control group. Reports that students remembered more stimulus words during cued recall than free recall. Includes…

  8. Higher-order differencing method with a multigrid approach for the solution of the incompressible flow equations at high Reynolds numbers

    International Nuclear Information System (INIS)

    Tzanos, C.P.

    1992-01-01

    A higher-order differencing method was recently proposed for the convection-diffusion equation, which even with a coarse mesh gives oscillation-free solutions that are far more accurate than those of the upwind scheme. In this paper, the performance of this method is investigated in conjunction with the performance of different iterative solvers for the solution of the Navier-Stokes equations in the vorticity-streamfunction formulation for incompressible flow at high Reynolds numbers. Flow in a square cavity with a moving lid was chosen as a model problem. Solvers that performed well at low Re numbers either failed to converge or had a computationally prohibitive convergence rate at high Re numbers. The additive correction method of Settari and Aziz and an iterative incomplete LU (ILU) factorization solver were used in a multigrid approach that performed well in the whole range of Re numbers considered (from 1000 to 10,000) and for uniform as well as nonuniform grids. At high Re numbers, point or line Gauss-Seidel solvers converged with uniform grids, but failed to converge with nonuniform grids
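The multigrid idea underlying the approach can be sketched on a much simpler problem: one two-grid cycle for 1-D Poisson, with Gauss-Seidel smoothing and a direct coarse-grid solve of the error equation. The paper's additive-correction/ILU multigrid for the Navier-Stokes equations is far more involved; this sketch only shows the smooth-restrict-correct-smooth cycle.

```python
import numpy as np


def gauss_seidel(u, f, h, sweeps):
    """Smoother for -u'' = f with central differences."""
    for _ in range(sweeps):
        for i in range(1, len(u) - 1):
            u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
    return u


def two_grid(u, f, h):
    u = gauss_seidel(u, f, h, sweeps=3)            # pre-smooth
    r = np.zeros_like(u)                           # residual r = f + u''
    r[1:-1] = f[1:-1] + (u[2:] - 2 * u[1:-1] + u[:-2]) / h**2
    rc = r[::2].copy()                             # restrict (injection)
    n_c = len(rc)
    # direct solve of the coarse error equation -e'' = r on spacing 2h
    A = (np.diag(np.full(n_c - 2, 2.0)) +
         np.diag(np.full(n_c - 3, -1.0), 1) +
         np.diag(np.full(n_c - 3, -1.0), -1)) / (2 * h) ** 2
    ec = np.zeros(n_c)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])
    # prolong the coarse correction back to the fine grid and apply it
    u += np.interp(np.arange(len(u)), np.arange(0, len(u), 2), ec)
    return gauss_seidel(u, f, h, sweeps=3)         # post-smooth


n = 65
h = 1.0 / (n - 1)
x = np.linspace(0, 1, n)
f = np.pi**2 * np.sin(np.pi * x)                   # exact solution sin(pi*x)
u = np.zeros(n)
for _ in range(20):
    u = two_grid(u, f, h)
assert np.max(np.abs(u - np.sin(np.pi * x))) < 1e-2
```

The coarse-grid correction removes exactly the smooth error components that stall plain Gauss-Seidel, which is the same division of labor the paper exploits at high Reynolds numbers.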

  9. A sequential mixed methods research approach to investigating HIV ...

    African Journals Online (AJOL)

    Sequential mixed methods research is an effective approach for investigating complex problems, but it has not been extensively used in construction management research. In South Africa, the HIV/AIDS pandemic has seen construction management taking on a vital responsibility since the government called upon the ...

  10. Energy Performance of Buildings - The European Approach to Sustainability

    DEFF Research Database (Denmark)

    Heiselberg, Per

    2006-01-01

    This paper presents the European approach to improve sustainability in the building sector, which has a very high potential for considerable reduction of energy consumption in the coming years. By approving the Energy Performance in Buildings Directive the European Union has taken a strong...... leadership role in promoting energy efficiency in buildings in Europe, that will be the most powerful instrument developed to date for the building sector in Europe....

  11. TH-AB-201-10: Portal Dosimetry with Elekta IViewDose:Performance of the Simplified Commissioning Approach Versus Full Commissioning

    Energy Technology Data Exchange (ETDEWEB)

    Kydonieos, M; Folgueras, A; Florescu, L; Cybulski, T; Marinos, N; Thompson, G; Sayeed, A [Elekta Limited, Crawley, West Sussex (United Kingdom); Rozendaal, R; Olaciregui-Ruiz, I [Netherlands Cancer Institute - Antoni van Leeuwenhoek, Amsterdam, Noord-Holland (Netherlands); Subiel, A; Patallo, I Silvestre [National Physical Laboratory, London (United Kingdom)

    2016-06-15

    Purpose: Elekta recently developed a solution for in-vivo EPID dosimetry (iViewDose, Elekta AB, Stockholm, Sweden) in conjunction with the Netherlands Cancer Institute (NKI). This uses a simplified commissioning approach via Template Commissioning Models (TCMs), consisting of a subset of linac-independent pre-defined parameters. This work compares the performance of iViewDose using a TCM commissioning approach with that corresponding to full commissioning. Additionally, the dose reconstruction based on the simplified commissioning approach is validated via independent dose measurements. Methods: Measurements were performed at the NKI on a VersaHD™ (Elekta AB, Stockholm, Sweden). Treatment plans were generated with Pinnacle 9.8 (Philips Medical Systems, Eindhoven, The Netherlands). A farmer chamber dose measurement and two EPID images were used to create a linac-specific commissioning model based on a TCM. A complete set of commissioning measurements was collected and a full commissioning model was created.The performance of iViewDose based on the two commissioning approaches was compared via a series of set-to-work tests in a slab phantom. In these tests, iViewDose reconstructs and compares EPID to TPS dose for square fields, IMRT and VMAT plans via global gamma analysis and isocentre dose difference. A clinical VMAT plan was delivered to a homogeneous Octavius 4D phantom (PTW, Freiburg, Germany). Dose was measured with the Octavius 1500 array and VeriSoft software was used for 3D dose reconstruction. EPID images were acquired. TCM-based iViewDose and 3D Octavius dose distributions were compared against the TPS. Results: For both the TCM-based and the full commissioning approaches, the pass rate, mean γ and dose difference were >97%, <0.5 and <2.5%, respectively. Equivalent gamma analysis results were obtained for iViewDose (TCM approach) and Octavius for a VMAT plan. Conclusion: iViewDose produces similar results with the simplified and full commissioning
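The global gamma analysis used above to compare EPID and TPS doses can be sketched in 1-D: for each reference point, gamma is the minimum over evaluated points of the combined dose-difference and distance-to-agreement metric, and the pass rate is the fraction of points with gamma at most 1. The profiles and the 3%/3 mm criterion below are illustrative.

```python
import numpy as np


def gamma_pass_rate(x, ref, ev, dd=0.03, dta=3.0):
    """Global gamma: dose differences normalized to the reference maximum."""
    gammas = []
    for xi, ri in zip(x, ref):
        dose_term = (ev - ri) / (dd * ref.max())   # global dose normalization
        dist_term = (x - xi) / dta                 # distance in units of DTA
        gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
    gammas = np.asarray(gammas)
    return 100.0 * np.mean(gammas <= 1.0), gammas


x = np.linspace(0, 100, 101)                       # positions in mm
ref = np.exp(-((x - 50) / 20) ** 2)                # reference dose profile
ev = 1.01 * np.exp(-((x - 50.5) / 20) ** 2)        # 1% scaled, 0.5 mm shifted
rate, _ = gamma_pass_rate(x, ref, ev)
assert rate > 97.0
```

A small dose rescaling plus a sub-millimeter shift passes comfortably under 3%/3 mm, which is consistent with the >97% pass rates reported for both commissioning approaches.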

  12. 40 CFR 63.344 - Performance test requirements and test methods.

    Science.gov (United States)

    2010-07-01

    ... electroplating tanks or chromium anodizing tanks. The sampling time and sample volume for each run of Methods 306... Chromium Anodizing Tanks § 63.344 Performance test requirements and test methods. (a) Performance test... Emissions From Decorative and Hard Chromium Electroplating and Anodizing Operations,” appendix A of this...

  13. 46 CFR 57.06-3 - Method of performing production testing.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Method of performing production testing. 57.06-3 Section 57.06-3 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING WELDING AND BRAZING Production Tests § 57.06-3 Method of performing production testing. (a) Except as...

  14. Creative Approaches to Teaching Graduate Research Methods Workshops

    OpenAIRE

    Peter Reilly

    2017-01-01

    Engagement and deeper learning were enhanced by developing several innovative teaching strategies delivered in Research Methods workshops to graduate business students, focusing primarily on students adopting a creative approach to formulating a valid research question for undertaking a dissertation successfully. These techniques are applicable to most subject domains to ensure student engagement, addressing the various multiple intelligences and learning styles existing within groups while...

  15. Methods and approaches to prediction in the meat industry

    Directory of Open Access Journals (Sweden)

    A. B. Lisitsyn

    2016-01-01

    Full Text Available The modern stage of the agro-industrial complex is characterized by increasing complexity and intensification of the technological processes for the complex processing of materials of animal origin, as well as by the need for a systematic analysis of the many determining factors and the relationships between them, the complexity of the objective function of product quality, and severe restrictions on technological regimes. One of the main tasks facing employees of agro-industrial enterprises engaged in processing biotechnological raw materials is, beyond increasing production volume, the further organizational improvement of work at all stages of the food chain. The meat industry, as a part of the agro-industrial complex, has to use biological raw materials with maximum efficiency, reducing and even eliminating losses at all stages of processing; rationally use raw material when selecting a type of processing; steadily increase the quality and the biological and food value of products; and broaden the assortment of manufactured products in order to satisfy increasing consumer requirements and extend the market for their realization under the uncertainty of the external environment caused by the uneven receipt of raw materials, variations in their properties and parameters, limited sales time and fluctuations in demand. The challenges facing the meat industry cannot be solved without changes to the strategy for scientific and technological development of the industry. To achieve these tasks, it is necessary to use prediction as a method of continuous improvement of all technological processes and their performance under rational and optimal regimes, while constantly controlling the quality of raw materials, semi-prepared products and finished products at all stages of technological processing by physico-chemical, physico-mechanical (rheological, microbiological and organoleptic methods.
The paper

  16. A practical approach to perform graded verification and validation

    International Nuclear Information System (INIS)

    Terrado, Carlos; Woolley, J.

    2000-01-01

    Modernization of instrumentation and control (I and C) systems in nuclear power plants often implies going from analog to digital systems. One condition for the upgrade to be successful is that the new systems achieve at least the same quality level as the analog systems they replace. The most important part of digital systems quality assurance (QA) is verification and validation (V and V). V and V is concerned with the process as much as the product; it is a systematic program of review and testing activities performed throughout the system development life cycle. Briefly, we can say that verification is building the product correctly, and validation is building the correct product. Since V and V is necessary but costly, it is helpful to tailor the effort to the quality goal of each particular case. To do this, an accepted practice is to establish different V and V levels, each one with a proper degree of stringency or rigor. This paper shows a practical approach to estimate the appropriate level of V and V, and the resulting V and V techniques recommended for each specific system. The first step proposed is to determine 'What to do', that is, the selection of the V and V class. The main factors considered here are: required integrity, functional complexity, defense in depth and development environment. A guideline is presented to classify the particular system using these factors and to show how they lead to the selection of the V and V class. The second step is to determine 'How to do it', that is, to choose an appropriate set of V and V methods according to the attributes of the system and the V and V class already selected. A list of possible V and V methods recommended for each V and V level during different stages of the development life cycle is included. As a result of the application of this procedure, solutions are found for generalists interested in 'What to do', as well as for specialists interested in 'How to do it'.
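The two-step procedure described above (select a V and V class from the four factors, then map the class to a method set) can be sketched as follows. The factor names come from the abstract, but the scoring rule, thresholds, class labels, and method lists are invented for illustration only:

```python
# Hypothetical sketch of the paper's two-step selection; weights and
# thresholds are invented, not taken from the actual guideline.

def select_vv_class(integrity, complexity, defense_in_depth, dev_environment):
    """Step 1 ('What to do'): pick a V&V class from the four factors.

    Each factor is rated 1 (least demanding) to 3 (most demanding);
    higher required integrity or complexity pushes toward a stricter
    class, while strong defense in depth or a mature development
    environment relaxes the required rigor.
    """
    score = 2 * integrity + complexity - defense_in_depth - dev_environment
    if score >= 4:
        return "A"  # most rigorous V&V
    if score >= 1:
        return "B"
    return "C"      # least rigorous V&V

# Step 2 ('How to do it'): each class maps to a recommended technique set.
VV_METHODS = {
    "A": ["formal inspection", "static analysis", "MC/DC structural testing"],
    "B": ["walkthrough", "static analysis", "branch-coverage testing"],
    "C": ["peer review", "functional testing"],
}

cls = select_vv_class(integrity=3, complexity=2, defense_in_depth=1, dev_environment=1)
print(cls, VV_METHODS[cls])  # "A" with the most rigorous technique set
```

The point of the sketch is the structure, not the numbers: a generalist only needs the class returned by step 1, while a specialist consults the method list in step 2.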

  17. Variable Pitch Approach for Performance Improving of Straight-Bladed VAWT at Rated Tip Speed Ratio

    Directory of Open Access Journals (Sweden)

    Zhenzhou Zhao

    2018-06-01

    Full Text Available This paper presents a new variable pitch (VP) approach to increase the peak power coefficient of the straight-bladed vertical-axis wind turbine (VAWT) by widening the azimuthal angle band of the blade with the highest aerodynamic torque, instead of increasing the highest torque. The new VP approach provides a curve of pitch angle designed for the blade operating at the rated tip speed ratio (TSR) corresponding to the peak power coefficient of the fixed-pitch (FP) VAWT. The effects of the new approach are investigated by using the double multiple stream tube (DMST) model and Prandtl's mathematics to evaluate the blade tip loss. The research describes the effects from six aspects, including the lift, drag, angle of attack (AoA), resultant velocity, torque, and power output, through a comparison between VP-VAWTs and FP-VAWTs working at four TSRs: 4, 4.5, 5, and 5.5. Compared with the FP-blade, the VP-blade has a wider azimuthal zone with the maximum AoA, lift, drag, and torque in the upwind half-cycle, and yields two new, larger maximum values in the downwind half-cycle. The power distribution in the swept area of the turbine changes from the arched shape of the FP-VAWT into the rectangular shape of the VP-VAWT. The new VP approach markedly widens the highest-performance zone of the blade in a revolution, and ultimately achieves an 18.9% growth of the peak power coefficient of the VAWT at the optimum TSR. Besides achieving this growth, the new pitching method will enhance the performance at TSRs that are higher than current optimal values, and an increase of torque is also generated.

  18. Multicore Performance of Block Algebraic Iterative Reconstruction Methods

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik B.; Hansen, Per Christian

    2014-01-01

    Algebraic iterative methods are routinely used for solving the ill-posed sparse linear systems arising in tomographic image reconstruction. Here we consider the algebraic reconstruction technique (ART) and the simultaneous iterative reconstruction techniques (SIRT), both of which rely on semiconvergence. Block versions of these methods, based on a partitioning of the linear system, are able to combine the fast semiconvergence of ART with the better multicore properties of SIRT. These block methods separate into two classes: those that, in each iteration, access the blocks in a sequential manner ... a fixed relaxation parameter in each method, namely, the one that leads to the fastest semiconvergence. Computational results show that for multicore computers, the sequential approach is preferable.
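The contrast the abstract draws between ART's sequential row access and SIRT's simultaneous (hence parallelizable) update can be seen in a minimal unblocked sketch; the SIRT variant shown is a Cimmino-style update, one of several SIRT schemes:

```python
import numpy as np

def art_sweep(A, b, x, relax=1.0):
    """One ART (Kaczmarz) sweep: project onto each row's hyperplane
    in sequence; rows must be visited one after another."""
    for i in range(A.shape[0]):
        ai = A[i]
        x = x + relax * (b[i] - ai @ x) / (ai @ ai) * ai
    return x

def sirt_step(A, b, x, relax=1.0):
    """One SIRT (Cimmino-style) step: all row residuals are computed
    simultaneously, so the work parallelizes trivially over rows."""
    row_norms = np.sum(A * A, axis=1)          # ||a_i||^2 for each row
    residual = (b - A @ x) / row_norms
    return x + relax * (A.T @ residual) / A.shape[0]

# Tiny consistent system with solution x* = [1, 2]
A = np.array([[2.0, 0.0], [0.0, 3.0], [1.0, 1.0]])
b = np.array([2.0, 6.0, 3.0])
x = np.zeros(2)
for _ in range(300):
    x = art_sweep(A, b, x)
print(x)  # close to [1. 2.]
```

Block methods, as described above, partition the rows and apply ART-like sweeps within blocks while treating the blocks either sequentially or simultaneously.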

  19. Communities of Practice: A Research Paradigm for the Mixed Methods Approach

    Science.gov (United States)

    Denscombe, Martyn

    2008-01-01

    The mixed methods approach has emerged as a "third paradigm" for social research. It has developed a platform of ideas and practices that are credible and distinctive and that mark the approach out as a viable alternative to quantitative and qualitative paradigms. However, there are also a number of variations and inconsistencies within the mixed…

  20. Spatially adaptive hp refinement approach for PN neutron transport equation using spectral element method

    International Nuclear Information System (INIS)

    Nahavandi, N.; Minuchehr, A.; Zolfaghari, A.; Abbasi, M.

    2015-01-01

    Highlights: • A powerful hp-SEM refinement approach for the PN neutron transport equation is presented. • The method provides great geometrical flexibility at lower computational cost. • Arbitrarily high-order and non-uniform meshes can be used. • Both a posteriori and a priori local error estimation approaches have been employed. • Highly accurate results are compared against other common adaptive and uniform grids. - Abstract: In this work we present the adaptive hp-SEM approach, obtained from the incorporation of the spectral element method (SEM) and adaptive hp refinement. The SEM nodal discretization and hp-adaptive grid refinement for the even-parity Boltzmann neutron transport equation create a powerful grid refinement approach with highly accurate solutions. To this end, a computer code has been developed to solve the multi-group neutron transport equation in one-dimensional geometry using even-parity transport theory. The spatial dependence of the flux is developed via the SEM with Lobatto orthogonal polynomials. Two common error estimation approaches, a posteriori and a priori, have been implemented. The incorporation of SEM nodal discretization and adaptive hp grid refinement leads to highly accurate solutions. The efficiency of coarser meshes and the significant reduction of program runtime, in comparison with other common refinement methods and uniform meshing approaches, are tested on several well-known transport benchmarks
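The Lobatto (Gauss-Lobatto-Legendre) nodes used for the SEM nodal discretization are the element endpoints plus the roots of the derivative of the Legendre polynomial; a minimal sketch of computing them with NumPy's Legendre-series utilities:

```python
import numpy as np

def lobatto_nodes(n):
    """Gauss-Lobatto-Legendre nodes on [-1, 1] for polynomial degree n:
    the endpoints plus the n-1 roots of P_n'(x), where P_n is the
    Legendre polynomial of degree n."""
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0                                   # P_n in the Legendre basis
    dPn = np.polynomial.legendre.Legendre(coeffs).deriv()
    interior = dPn.roots()
    return np.concatenate(([-1.0], np.sort(interior), [1.0]))

print(lobatto_nodes(4))  # 5 nodes: -1, -sqrt(3/7), 0, sqrt(3/7), 1
```

In an hp-adaptive scheme, raising n (p-refinement) adds such nodes within an element, while splitting elements (h-refinement) adds new element boundaries; the error estimators decide which to do locally.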

  1. Assessing vocal performance in complex birdsong: a novel approach.

    Science.gov (United States)

    Geberzahn, Nicole; Aubin, Thierry

    2014-08-06

    Vocal performance refers to the ability to produce vocal signals close to physical limits. Such motor skills can be used by conspecifics to assess a signaller's competitive potential. For example, it is difficult for birds to produce repeated syllables both rapidly and with a broad frequency bandwidth. Deviation from an upper-bound regression of frequency bandwidth on trill rate has been widely used to assess vocal performance. This approach is, however, only applicable to simple trilled songs, and even then may be affected by differences in syllable complexity. Using skylarks (Alauda arvensis) as a birdsong model with a very complex song structure, we detected another performance trade-off: minimum gap duration between syllables was longer when the frequency ratio between the end of one syllable and the start of the next syllable (inter-syllable frequency shift) was large. This allowed us to apply a novel measure of vocal performance, vocal gap deviation: the deviation from a lower-bound regression of gap duration on inter-syllable frequency shift. We show that skylarks increase vocal performance in an aggressive context, suggesting that this trait might serve as a signal of competitive potential. We suggest using vocal gap deviation in future studies to assess vocal performance in songbird species with complex song structure.
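One simple way to realize a lower-bound regression of the kind described (this is an illustrative sketch, not the authors' exact fitting procedure) is to bin the inter-syllable frequency shifts, regress through the per-bin minimum gap durations, and score each transition by its deviation from that line:

```python
import numpy as np

def vocal_gap_deviation(freq_shift, gap, n_bins=10):
    """Illustrative lower-bound regression: fit a line through the
    minimum gap duration observed in each frequency-shift bin, then
    return each transition's deviation from that lower bound."""
    bins = np.linspace(freq_shift.min(), freq_shift.max(), n_bins + 1)
    xs, ys = [], []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (freq_shift >= lo) & (freq_shift <= hi)
        if mask.any():
            i = np.argmin(np.where(mask, gap, np.inf))  # index of bin minimum
            xs.append(freq_shift[i])
            ys.append(gap[i])
    slope, intercept = np.polyfit(xs, ys, 1)
    return gap - (slope * np.asarray(freq_shift) + intercept)
```

Transitions lying on the performance limit get deviations near zero; larger positive deviations indicate slower, lower-performance transitions.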

  2. How could the replica method improve accuracy of performance assessment of channel coding?

    Energy Technology Data Exchange (ETDEWEB)

    Kabashima, Yoshiyuki [Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Yokohama 226-8502 (Japan)], E-mail: kaba@dis.titech.ac.jp

    2009-12-01

    We explore the relation between the techniques of statistical mechanics and information theory for assessing the performance of channel coding. We base our study on a framework developed by Gallager in IEEE Trans. Inform. Theory IT-11, 3 (1965), where the minimum decoding error probability is upper-bounded by an average of a generalized Chernoff's bound over a code ensemble. We show that the resulting bound in the framework can be directly assessed by the replica method, which has been developed in statistical mechanics of disordered systems, whereas in Gallager's original methodology further replacement by another bound utilizing Jensen's inequality is necessary. Our approach associates a seemingly ad hoc restriction with respect to an adjustable parameter for optimizing the bound with a phase transition between two replica symmetric solutions, and can improve the accuracy of performance assessments of general code ensembles including low density parity check codes, although its mathematical justification is still open.
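For concreteness, Gallager's framework bounds the error probability of a random code ensemble as P_e ≤ 2^(-n·E_r(R)) with E_r(R) = max over ρ in [0, 1] of E0(ρ) - ρR. For a binary symmetric channel (BSC) with uniform inputs, E0 has a closed form; the sketch below (standard textbook material, not from this paper) computes the exponent and shows it is positive below capacity and zero above:

```python
import math

def e0_bsc(rho, p):
    """Gallager's E0 function for a BSC with crossover probability p
    and uniform inputs, in bits: E0 = rho - (1+rho)*log2(a), with
    a = p^(1/(1+rho)) + (1-p)^(1/(1+rho))."""
    a = p ** (1 / (1 + rho)) + (1 - p) ** (1 / (1 + rho))
    return rho - (1 + rho) * math.log2(a)

def random_coding_exponent(rate, p, steps=1000):
    """Er(R) = max over rho in [0,1] of E0(rho) - rho*R, by grid search."""
    return max(e0_bsc(i / steps, p) - (i / steps) * rate
               for i in range(steps + 1))

p = 0.1
capacity = 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)  # about 0.531 bits
print(random_coding_exponent(0.3, p))  # positive: reliable below capacity
print(random_coding_exponent(0.6, p))  # zero: rate above capacity
```

The "adjustable parameter" mentioned in the abstract is exactly this ρ; the replica analysis reinterprets the restriction ρ ∈ [0, 1] in terms of a phase transition between replica symmetric solutions.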

  3. Regulatory approach to enhanced human performance during accidents

    International Nuclear Information System (INIS)

    Palla, R.L. Jr.

    1990-01-01

    It has become increasingly clear in recent years that the risk associated with nuclear power is driven by human performance. Although human errors have contributed heavily to the two core-melt events that have occurred at power reactors, effective performance during an event can also prevent a degraded situation from progressing to a more serious accident, as in the loss-of-feedwater event at Davis-Besse. Sensitivity studies in which human error rates for various categories of errors in a probabilistic risk assessment (PRA) were varied confirm the importance of human performance. Moreover, these studies suggest that actions taken during an accident are at least as important as errors that occur prior to an initiating event. A program that will lead to enhanced accident management capabilities in the nuclear industry is being developed by the US Nuclear Regulatory Commission (NRC) and industry and is a key element in NRC's integration plan for closure of severe-accident issues. The focus of the accident management (AM) program is on human performance during accidents, with emphasis on in-plant response. The AM program extends the defense-in-depth principle to plant operating staff. The goal is to take advantage of existing plant equipment and operator skills and creativity to find ways to terminate accidents that are beyond the design basis. The purpose of this paper is to describe the NRC's objectives and approach in AM as well as to discuss several human performance issues that are central to AM

  4. Human performance assessment: methods and measures

    International Nuclear Information System (INIS)

    Andresen, Gisle; Droeivoldsmo, Asgeir

    2000-10-01

    The Human Error Analysis Project (HEAP) was initiated in 1994. The aim of the project was to acquire insights on how and why cognitive errors occur when operators are engaged in problem solving in advanced integrated control rooms. Since human error had not been studied in the HAlden Man-Machine LABoratory (HAMMLAB) before, it was also necessary to carry out research in methodology. In retrospect, it is clear that much of the methodological work is relevant to human-machine research in general, and not only to research on human error. The purpose of this report is, therefore, to give practitioners and researchers an overview of the methodological parts of HEAP. The scope of the report is limited to methods used throughout the data acquisition process, i.e., data-collection methods, data-refinement methods, and measurement methods. The data-collection methods include various types of verbal protocols, simulator logs, questionnaires, and interviews. Data-refinement methods involve different applications of the Eyecon system, a flexible data-refinement tool, and small computer programs used for rearranging, reformatting, and aggregating raw data. Measurement methods involve assessment of diagnostic behaviour, erroneous actions, complexity, task/system performance, situation awareness, and workload. The report concludes that the data-collection methods are generally both reliable and efficient. The data-refinement methods, however, should be easier to use in order to facilitate explorative analyses. Although the series of experiments provided an opportunity for measurement validation, there are still uncertainties connected to several measures, due to their reliability still being unknown. (Author). 58 refs., 7 tabs.

  5. Using Financial Information in Continuing Education. Accepted Methods and New Approaches.

    Science.gov (United States)

    Matkin, Gary W.

    This book, which is intended as a resource/reference guide for experienced financial managers and course planners, examines accepted methods and new approaches for using financial information in continuing education. The introduction reviews theory and practice, traditional and new methods, planning and organizational management, and technology.…

  6. A high-performance spatial database based approach for pathology imaging algorithm evaluation

    Directory of Open Access Journals (Sweden)

    Fusheng Wang

    2013-01-01

    Full Text Available Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high-performance computation capacity via a parallel data management infrastructure, with parallel data loading and spatial indexing optimizations. Materials and Methods: We have considered two scenarios for algorithm evaluation: (1) algorithm comparison, where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation, where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The
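The validation scenario above ultimately reduces to spatial overlap between algorithm results and human annotations. A minimal stand-in for the database's polygon-level spatial queries (illustrative only; the platform operates on vector boundaries, not rasters) is a Jaccard overlap between segmentations rasterized to boolean masks:

```python
import numpy as np

def jaccard(mask_a, mask_b):
    """Jaccard index between two boolean segmentation masks:
    |A ∩ B| / |A ∪ B|. Used here as a simple overlap score for
    comparing an algorithm result against a human annotation."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union else 1.0

# algorithm result vs. human annotation on a toy 8x8 tile
algo = np.zeros((8, 8), bool); algo[2:6, 2:6] = True     # 16 pixels
human = np.zeros((8, 8), bool); human[3:7, 3:7] = True   # 16 pixels
print(jaccard(algo, human))  # 9 / 23, about 0.391
```

In the actual platform such comparisons run as spatial joins inside the database, which is what makes evaluation over whole-slide images tractable.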

  7. Design and Optimization of Composite Automotive Hatchback Using Integrated Material-Structure-Process-Performance Method

    Science.gov (United States)

    Yang, Xudong; Sun, Lingyu; Zhang, Cheng; Li, Lijun; Dai, Zongmiao; Xiong, Zhenkai

    2018-03-01

    The application of polymer composites as a substitute for metal is an effective approach to reduce vehicle weight. However, the final performance of composite structures is determined not only by the material types, structural designs, and manufacturing process, but also by their mutual constraints. Hence, an integrated "material-structure-process-performance" method is proposed for the conceptual and detail design of composite components. The material selection is based on principles of composite mechanics such as the rule of mixtures for laminates. The design of component geometry, dimensions, and stacking sequence is determined by parametric modeling and size optimization. The selection of process parameters is based on multi-physical-field simulation. The stiffness and modal constraint conditions were obtained from numerical analysis of the metal benchmark under typical load conditions. The optimal design was found by multi-disciplinary optimization. Finally, the proposed method was validated by an application case of an automotive hatchback using carbon-fiber-reinforced polymer. Compared with the metal benchmark, the weight of the composite hatchback is reduced by 38.8%; simultaneously, its torsion and bending stiffness increase by 3.75% and 33.23%, respectively, and the first natural frequency increases by 44.78%.
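The rule of mixtures mentioned for material selection has a simple closed form for the longitudinal modulus of a unidirectional ply; a minimal sketch with illustrative carbon/epoxy values (not figures from the paper):

```python
def rule_of_mixtures(e_fiber, e_matrix, v_fiber):
    """Longitudinal modulus of a unidirectional ply by the rule of
    mixtures: E1 = Ef*Vf + Em*(1 - Vf), where Vf is the fiber
    volume fraction. Moduli in Pa."""
    return e_fiber * v_fiber + e_matrix * (1 - v_fiber)

# illustrative numbers for a carbon/epoxy ply (not from the paper)
e1 = rule_of_mixtures(e_fiber=230e9, e_matrix=3.5e9, v_fiber=0.6)
print(e1 / 1e9)  # about 139.4 GPa
```

Relations like this feed the laminate-level stiffness estimates used during material screening, before the parametric size optimization refines geometry and stacking sequence.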

  8. Advanced fabrication method for the preparation of MOF thin films: Liquid-phase epitaxy approach meets spin coating method.

    KAUST Repository

    Chernikova, Valeriya; Shekhah, Osama; Eddaoudi, Mohamed

    2016-01-01

    Here we report a new and advanced method for the fabrication of highly oriented/polycrystalline metal-organic framework (MOF) thin films. Building on the attractive features of the liquid-phase epitaxy (LPE) approach, a facile spin coating method

  9. Towards a realistic approach to validation of reactive transport models for performance assessment

    International Nuclear Information System (INIS)

    Siegel, M.D.

    1993-01-01

    Performance assessment calculations are based on geochemical models that assume that interactions among radionuclides, rocks and groundwaters under natural conditions, can be estimated or bound by data obtained from laboratory-scale studies. The data include radionuclide distribution coefficients, measured in saturated batch systems of powdered rocks, and retardation factors measured in short-term column experiments. Traditional approaches to model validation cannot be applied in a straightforward manner to the simple reactive transport models that use these data. An approach to model validation in support of performance assessment is described in this paper. It is based on a recognition of different levels of model validity and is compatible with the requirements of current regulations for high-level waste disposal. Activities that are being carried out in support of this approach include (1) laboratory and numerical experiments to test the validity of important assumptions inherent in current performance assessment methodologies, (2) integrated transport experiments, and (3) development of a robust coupled reaction/transport code for sensitivity analyses using massively parallel computers
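The batch-measured distribution coefficients and column-measured retardation factors mentioned above are linked, under the linear-sorption assumption of these simple transport models, by a standard relation; a minimal sketch with illustrative values (not data from the paper):

```python
def retardation_factor(kd, bulk_density, porosity):
    """Linear-sorption retardation factor R = 1 + (rho_b / n) * Kd,
    relating a batch distribution coefficient to the retardation
    used in simple reactive transport models.
    Units: kd in mL/g, bulk_density in g/cm^3, porosity dimensionless."""
    return 1.0 + (bulk_density / porosity) * kd

# illustrative values (not from the paper)
print(retardation_factor(kd=5.0, bulk_density=1.6, porosity=0.3))  # about 27.7
```

Much of the validation concern described in the abstract is precisely whether this laboratory-derived R bounds the retardation that actually occurs under natural field conditions.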

  10. An efficient Bayesian inference approach to inverse problems based on an adaptive sparse grid collocation method

    International Nuclear Information System (INIS)

    Ma Xiang; Zabaras, Nicholas

    2009-01-01

    A new approach to modeling inverse problems using a Bayesian inference method is introduced. The Bayesian approach considers the unknown parameters as random variables and seeks the probabilistic distribution of the unknowns. By introducing the concept of the stochastic prior state space to the Bayesian formulation, we reformulate the deterministic forward problem as a stochastic one. The adaptive hierarchical sparse grid collocation (ASGC) method is used for constructing an interpolant to the solution of the forward model in this prior space which is large enough to capture all the variability/uncertainty in the posterior distribution of the unknown parameters. This solution can be considered as a function of the random unknowns and serves as a stochastic surrogate model for the likelihood calculation. Hierarchical Bayesian formulation is used to derive the posterior probability density function (PPDF). The spatial model is represented as a convolution of a smooth kernel and a Markov random field. The state space of the PPDF is explored using Markov chain Monte Carlo algorithms to obtain statistics of the unknowns. The likelihood calculation is performed by directly sampling the approximate stochastic solution obtained through the ASGC method. The technique is assessed on two nonlinear inverse problems: source inversion and permeability estimation in flow through porous media
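The key computational idea above, sampling the posterior with MCMC while evaluating the likelihood on a cheap surrogate of the forward model, can be illustrated in one dimension. The "surrogate" here is just a stand-in callable (the actual paper builds it with adaptive sparse grid collocation), and the problem setup is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the ASGC interpolant of the forward model (hypothetical).
surrogate = lambda theta: theta ** 2
data, sigma = 4.0, 0.5          # observation and noise level (illustrative)

def log_post(theta):
    """Gaussian likelihood on the surrogate + flat prior on [0, 10]."""
    if not 0.0 <= theta <= 10.0:
        return -np.inf
    return -0.5 * ((surrogate(theta) - data) / sigma) ** 2

# Random-walk Metropolis exploring the posterior of theta
theta, chain = 1.0, []
for _ in range(20000):
    prop = theta + 0.3 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
print(np.mean(chain[5000:]))  # concentrates near 2.0, since 2**2 = 4
```

Because every likelihood evaluation calls only the surrogate, the expensive forward solver is never run inside the chain; this is what makes the many-sample MCMC exploration affordable.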

  11. Methodical approach to financial stimulation of logistics managers

    OpenAIRE

    Melnykova Kateryna V.

    2014-01-01

    The article offers a methodical approach to financial stimulation of logistics managers, which allows calculating the incentive amount with consideration of the profit obtained from introducing optimisation logistics solutions. The author generalises measures that would allow enterprise top managers to strengthen incentives for the work of logistics managers. The article identifies motivation factors that influence logistics managers' attitude towards the execution of optimisatio...

  12. Application of Quality by Design Approach to Bioanalysis: Development of a Method for Elvitegravir Quantification in Human Plasma.

    Science.gov (United States)

    Baldelli, Sara; Marrubini, Giorgio; Cattaneo, Dario; Clementi, Emilio; Cerea, Matteo

    2017-10-01

    The application of Quality by Design (QbD) principles in clinical laboratories can help to develop an analytical method through a systematic approach, providing a significant advance over the traditional heuristic and empirical methodology. In this work, we applied for the first time the QbD concept in the development of a method for drug quantification in human plasma using elvitegravir as the test molecule. The goal of the study was to develop a fast and inexpensive quantification method, with precision and accuracy as requested by the European Medicines Agency guidelines on bioanalytical method validation. The method was divided into operative units, and for each unit critical variables affecting the results were identified. A risk analysis was performed to select critical process parameters that should be introduced in the design of experiments (DoEs). Different DoEs were used depending on the phase of advancement of the study. Protein precipitation and high-performance liquid chromatography-tandem mass spectrometry were selected as the techniques to be investigated. For every operative unit (sample preparation, chromatographic conditions, and detector settings), a model based on factors affecting the responses was developed and optimized. The obtained method was validated and clinically applied with success. To the best of our knowledge, this is the first investigation thoroughly addressing the application of QbD to the analysis of a drug in a biological matrix applied in a clinical laboratory. The extensive optimization process generated a robust method compliant with its intended use. The performance of the method is continuously monitored using control charts.
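A core QbD ingredient named above is the design of experiments over the critical process parameters of each operative unit. As a generic sketch (the factor names and levels below are hypothetical, not the study's actual parameters, and the study also used other DoE types depending on phase), a full-factorial design simply enumerates all level combinations:

```python
from itertools import product

def full_factorial(factors):
    """Enumerate a full-factorial design from a dict mapping factor
    names to their levels; returns one dict per experimental run."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

# hypothetical critical process parameters for the sample-preparation unit
runs = full_factorial({
    "precipitant_ratio": (2, 3, 4),     # precipitant:plasma, v/v (assumed)
    "centrifugation_min": (5, 10),
    "vortex_s": (30, 60),
})
print(len(runs))  # 3 * 2 * 2 = 12 runs
```

Each run is then executed and its responses (e.g., recovery, precision) modeled against the factors, which is how the per-unit models described in the abstract are built and optimized.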

  13. Performance of the lot quality assurance sampling method compared to surveillance for identifying inadequately-performing areas in Matlab, Bangladesh.

    Science.gov (United States)

    Bhuiya, Abbas; Hanifi, S M A; Roy, Nikhil; Streatfield, P Kim

    2007-03-01

    This paper compared the performance of the lot quality assurance sampling (LQAS) method in identifying inadequately-performing health work-areas with that of using health and demographic surveillance system (HDSS) data and examined the feasibility of applying the method by field-level programme supervisors. The study was carried out in Matlab, the field site of ICDDR,B, where a HDSS has been in place for over 30 years. The LQAS method was applied in 57 work-areas of community health workers in ICDDR,B-served areas in Matlab during July-September 2002. The performance of the LQAS method in identifying work-areas with adequate and inadequate coverage of various health services was compared with those of the HDSS. The health service-coverage indicators included coverage of DPT, measles, BCG vaccination, and contraceptive use. It was observed that the difference in the proportion of work-areas identified to be inadequately performing using the LQAS method with less than 30 respondents, and the HDSS was not statistically significant. The consistency between the LQAS method and the HDSS in identifying work-areas was greater for adequately-performing areas than inadequately-performing areas. It was also observed that the field managers could be trained to apply the LQAS method in monitoring their performance in reaching the target population.
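The LQAS classification described above rests on binomial probabilities: sample n respondents in a work-area and flag it as inadequate if fewer than a decision value d are covered. A minimal sketch (n = 19 and d = 13 are conventional LQAS illustrations, not the Matlab study's actual parameters):

```python
from math import comb

def lqas_accept_probability(n, d, p):
    """Probability that a work-area is classified as adequately
    performing: at least d of n sampled respondents are covered,
    given true coverage p. P(X >= d) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d, n + 1))

# With n = 19 and decision value d = 13, a work-area with 80% true
# coverage is rarely flagged, while one with 50% coverage usually is.
print(lqas_accept_probability(19, 13, 0.80))  # high acceptance probability
print(lqas_accept_probability(19, 13, 0.50))  # low acceptance probability
```

The comparison in the paper is essentially whether these small-sample classifications agree with the "true" coverage known from the HDSS, and the result that agreement was higher for adequately-performing areas reflects the asymmetry of such decision rules.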

  14. Environmental investment and firm performance: A network approach

    International Nuclear Information System (INIS)

    Bostian, Moriah; Färe, Rolf; Grosskopf, Shawna; Lundgren, Tommy

    2016-01-01

    This study examines the role of investment in environmental production practices for both environmental performance and energy efficiency over time. We employ a network DEA approach that links successive production technologies through intertemporal investment decisions with a period by period estimation. This allows us to estimate energy efficiency and environmental performance separately, as well as productivity change and its associated decompositions into efficiency change and technology change. Incorporating a network model also allows us to account for both short-term environmental management practices and long-term environmental investments in each of our productivity measures. We apply this framework to a panel of detailed plant-level production data for Swedish manufacturing firms covering the years 2002–2008. - Highlights: • We use a network DEA model to account for intertemporal environmental investment decisions in measures of firm productivity. • We apply our network technology model to a panel of firms in Sweden's pulp and paper industry for the years 2002–2008. • We model environmental investments and expenditures separately from other production-oriented inputs. • We find evidence of positive relationships between energy efficiency, environmental performance, and firm productivity.
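For readers unfamiliar with DEA, the building block of such models is a linear program per decision-making unit (DMU). The sketch below is a basic single-period, input-oriented CCR model, not the paper's network formulation, and it assumes SciPy's `linprog` is available:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.
    X: inputs (n_dmu x n_in), Y: outputs (n_dmu x n_out).
    LP variables: [theta, lambda_1, ..., lambda_n]; minimize theta
    s.t. sum_j lambda_j x_j <= theta * x_o and sum_j lambda_j y_j >= y_o."""
    n = X.shape[0]
    c = np.concatenate(([1.0], np.zeros(n)))               # minimize theta
    A_in = np.hstack((-X[o].reshape(-1, 1), X.T))          # X^T lam - theta x_o <= 0
    A_out = np.hstack((np.zeros((Y.shape[1], 1)), -Y.T))   # -Y^T lam <= -y_o
    A_ub = np.vstack((A_in, A_out))
    b_ub = np.concatenate((np.zeros(X.shape[1]), -Y[o]))
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# two firms, one input, one output: firm 0 dominates firm 1
X = np.array([[1.0], [2.0]]); Y = np.array([[1.0], [1.0]])
print(dea_efficiency(X, Y, 0))  # 1.0: on the frontier
print(dea_efficiency(X, Y, 1))  # 0.5: could produce the same output with half the input
```

The network extension used in the paper chains such technologies across periods, with investment acting as an intermediate output of one period and an input to the next.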

  15. Creative Approaches to Teaching Graduate Research Methods Workshops

    Directory of Open Access Journals (Sweden)

    Peter Reilly

    2017-06-01

    Full Text Available Engagement and deeper learning were enhanced by developing several innovative teaching strategies delivered in Research Methods workshops to graduate business students. The strategies focus primarily on students adopting a creative approach to formulating a valid research question for undertaking a dissertation successfully, and they are applicable to most subject domains to ensure student engagement. They address the various multiple intelligences and learning styles existing within groups while ensuring the sessions are student-centred and conducive to a collaborative learning environment. Blogs, interactive tutorials, online videos, games and posters are used to develop students' cognitive and metacognitive abilities. Using novelty images appeals to a group's intellectual curiosity, acting as an interpretive device to explain the value of adopting a holistic rather than analytic approach towards a topic.

  16. A diagnosis method for physical systems using a multi-modeling approach

    International Nuclear Information System (INIS)

    Thetiot, R.

    2000-01-01

    In this thesis we propose a method for diagnosis problem solving. The method is based on a multi-modeling approach describing both normal and abnormal behavior of a system. This modeling approach allows a system to be represented at different abstraction levels (behavioral, functional, and teleological). Fundamental knowledge is described using a bond-graph representation. We show that the bond-graph representation can be exploited to generate, completely or partially, the functional models. The different models of the multi-modeling approach allow the functional state of a system to be defined at different abstraction levels. We exploit this property to exonerate sub-systems for which the expected behavior is observed. The behavioral and functional descriptions of the remaining sub-systems are exploited hierarchically in a two-step process. In the first step, the abnormal behaviors explaining some observations are identified. In the second step, the remaining unexplained observations are used to generate conflict sets and thus the consistency-based diagnoses. The modeling method and the diagnosis process have been applied to a Reactor Coolant Pump Set. This application illustrates the concepts described in this thesis and shows its potential. (authors)

  17. An analysis of clinical transition stresses experienced by dental students: A qualitative methods approach.

    Science.gov (United States)

    Botelho, M; Gao, X; Bhuyan, S Y

    2018-04-17

    Stress in dental students is well established with potential psychological distress, emotional exhaustion and burnout-related symptoms. Little attention has been given to the problems encountered by dental students during the transition from theoretical or paraclinical training to the clinical environment. The aim of this study was to adopt a qualitative research methods approach to understand the perceived stressors during students' clinical transition and provide insights for curriculum planners to enhance learning. This study analysed four groups of 2nd- and 3rd-year BDS students' experiences in focus group interviews relating to their pre-clinical and clinical transitions. The interviews were recorded and transcribed verbatim, and a thematic analysis was performed using an inductive qualitative approach. Key overlapping domains identified were the transition gap and stresses. The transition gap was subclassified into knowledge and skill (hard and soft), and stresses were subcategorised into internal and external stresses. On first coming to clinics, students experienced knowledge gaps concerning unfamiliar clinical treatments, with mismatches between knowledge acquisition and clinical exposure. Students felt incompetent owing to stresses attributable to curriculum design, staff and the patient. This negatively affected their confidence and clinical performance. A range of challenges has been identified that will allow curriculum designers to plan a more supportive learning experience to help students during their transition to clinical practice, giving them timely knowledge, confidence and clinical performance to better prepare them for entering clinics. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Investigation of Industrialised Building System Performance in Comparison to Conventional Construction Method

    Directory of Open Access Journals (Sweden)

    Othuman Mydin M.A.

    2014-03-01

    Full Text Available Conventional construction methods are still widely practised, although many studies have indicated that they are less effective than the IBS construction method. The emergence of the IBS has added to the range of techniques within the construction industry. This study is aimed at comparing the two construction approaches. Case studies were conducted at four sites in the state of Penang, Malaysia. Two projects were IBS-based while the remaining two deployed the conventional method of construction. Based on an analysis of the results, it can be concluded that the IBS approach has more to offer than the conventional method. Among these advantages are shorter construction periods, reduced overall costs, lower labour needs, better site conditions and the production of higher-quality components.

  19. Performance Poetry as a Method to Understand Disability

    Directory of Open Access Journals (Sweden)

    Lee-Ann Fenge

    2016-03-01

    Full Text Available The Seen but Seldom Heard project was a performative social science (PSS project which used performance poetry to illuminate the experiences of young people with physical impairments. Two performance poets, a group of young people with physical impairments, and academics from social science and media/communication backgrounds worked together to explore various aspects of the lived experience of disability exploring issues associated with identity, stereotypes, stigma and representation. In this article, we will present an overview of the project and consider how PSS offers a method to engage seldom heard voices, and illustrate this through two poems which shed light on the lived experience of disability. The article will consider the impact of these poems as PSS, and how this method allows the audience to develop a deeper understanding of the "lived" experience of disability and to reflect upon their own understandings of disability and discrimination. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1602118

  20. Towards better environmental performance of wastewater sludge treatment using endpoint approach in LCA methodology

    Directory of Open Access Journals (Sweden)

    Isam Alyaseri

    2017-03-01

    Full Text Available The aim of this study is to use the life cycle assessment (LCA) method to measure the environmental performance of the sludge incineration process in a wastewater treatment plant and to propose an alternative that can reduce the environmental impact. To show the damages caused by the treatment processes, the study used an endpoint approach to evaluate the impacts of the processes on human health, ecosystem quality, and resources. A case study was conducted at the Bissell Point Wastewater Treatment Plant in Saint Louis, Missouri, U.S. Plant-specific data along with literature data from technical publications were used to build an inventory and then to analyze the environmental burdens from the sludge handling unit in the year 2011. The impact assessment method chosen was ReCiPe 2008. The existing scenario (dewatering, multiple hearth incineration, ash to landfill) was evaluated, and three alternative scenarios (fluid bed incineration and anaerobic digestion with and without land application, with energy recovery from heat or biogas) were proposed and analyzed to find the one with the least environmental impact. The existing scenario shows that the most significant impacts are related to depletion of resources and damage to human health. These impacts mainly came from the operation phase (electricity and fuel consumption and emissions related to combustion. Alternatives showed better performance than the existing scenario. Using the ReCiPe endpoint methodology, anaerobic digestion had the best overall environmental performance among the three alternatives tested. It is recommended to convert to fluid bed incineration if the concerns are mainly about human health, or to anaerobic digestion if the concerns are mainly about depletion of resources.
The endpoint approach may simplify the outcomes of this study as follows: if the plant is converted to fluid bed incineration, it could prevent an average of 43.2 DALYs in human life, save 0.059 species in the area

  1. Which Cooperative Ownership Model Performs Better? A Financial-Decision Aid Approach

    NARCIS (Netherlands)

    Kalogeras, N.; Pennings, J.M.E.; Benos, T.; Doumpos, M.

    2013-01-01

    In this article the financial/ownership structures of agribusiness cooperatives are analyzed to examine whether new cooperative models perform better than the more traditional ones. The assessment procedure introduces a new financial decision-aid approach, which is based on data-analysis techniques

  2. The New Performance Calculation Method of Fouled Axial Flow Compressor

    Directory of Open Access Journals (Sweden)

    Huadong Yang

    2014-01-01

    Full Text Available Fouling is the most important performance degradation factor, so it is necessary to accurately predict the effect of fouling on engine performance. In previous research, it has been very difficult to accurately model a fouled axial flow compressor. This paper develops a new performance calculation method for fouled multistage axial flow compressors based on experimental results and operating data. For a multistage compressor, the whole compressor is decomposed into two sections. The first section comprises the first 50% of the stages, which reflect the fouling level; the second section comprises the last 50% of the stages, which are treated as clean because of lighter deposits. In this model, the performance of the first section is obtained by combining the scaling-law method and a linear progression model with the traditional stage-stacking method, while ambient conditions and engine configuration are also considered. The performance of the second section is calculated by an averaged infinitesimal stage method based on Reynolds' law of similarity. Finally, the model is successfully applied to an 8-stage axial flow compressor and the 16-stage LM2500-30 compressor. The variation of thermodynamic parameters such as pressure ratio and efficiency with operating time and stage number is analyzed in detail.
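The two-section idea above can be sketched in a few lines: stage pressure ratios are stacked multiplicatively, with only the first half of the stages scaled to represent fouling. The stage pressure ratios, the single fouling scale factor, and the 50/50 split coefficients below are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical sketch of the two-section stacking model for a fouled
# multistage compressor. All numbers are illustrative.

def overall_pressure_ratio(stage_prs, fouling_scale=1.0):
    """Stack stage pressure ratios; the first 50% of stages are scaled
    to represent fouling deposits, the last 50% are treated as clean."""
    n = len(stage_prs)
    fouled_section = stage_prs[:n // 2]   # first 50% of stages
    clean_section = stage_prs[n // 2:]    # last 50% of stages
    pr = 1.0
    for p in fouled_section:
        # scaling-law style degradation: fouling reduces each stage's
        # pressure-rise capability by a common factor
        pr *= 1.0 + (p - 1.0) * fouling_scale
    for p in clean_section:
        pr *= p
    return pr

stages = [1.25] * 8  # an 8-stage compressor, as in the paper's test case
clean = overall_pressure_ratio(stages, fouling_scale=1.0)
fouled = overall_pressure_ratio(stages, fouling_scale=0.95)  # 5% degradation
```

Stacking the clean stages recovers the design pressure ratio, while the fouled run shows the compound loss from degrading only the front stages.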

  3. Meta-control of combustion performance with a data mining approach

    Science.gov (United States)

    Song, Zhe

    Large-scale combustion processes are complex and pose challenges for optimizing their performance. Traditional approaches based on thermal dynamics have limitations in finding optimal operational regions due to the time-shift nature of the process. Recent advances in information technology enable people to collect large volumes of process data easily and continuously. The collected process data contain rich information about the process and, to some extent, represent a digital copy of the process over time. Although large volumes of data exist in industrial combustion processes, they are not fully utilized to the level where the process can be optimized. Data mining is an emerging science which finds patterns or models in large data sets. It has found many successful applications in business marketing, medical, and manufacturing domains. The focus of this dissertation is on applying data mining to industrial combustion processes, and ultimately optimizing combustion performance. However, the philosophy, methods and frameworks discussed in this research can also be applied to other industrial processes. Optimizing an industrial combustion process poses two major challenges. One is that the underlying process model changes over time, so obtaining an accurate process model is nontrivial. The other is that a high-fidelity process model is usually highly nonlinear, so solving the optimization problem requires efficient heuristics. This dissertation is set to solve these two major challenges. The major contribution of this four-year research is a data-driven solution for optimizing the combustion process, in which a process model or knowledge is identified from the process data, and optimization is then executed by evolutionary algorithms to search for optimal operating regions.
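The final step described above — an evolutionary algorithm searching a data-driven process model for good operating regions — can be sketched as follows. The quadratic surrogate stands in for a model mined from process data, and the setpoint names and evolutionary strategy parameters are assumptions for illustration, not the dissertation's actual models.

```python
import random

# Sketch: a simple (1+lambda) evolutionary strategy maximizes a
# data-driven surrogate model of combustion efficiency. The surrogate
# below is a made-up stand-in, peaking at oxygen setpoint 3.0, load 0.8.

def surrogate_efficiency(x):
    return 1.0 - (x[0] - 3.0) ** 2 - 2.0 * (x[1] - 0.8) ** 2

def evolve(f, x0, sigma=0.5, offspring=20, generations=50, seed=1):
    rng = random.Random(seed)
    best, best_f = list(x0), f(x0)
    for _ in range(generations):
        for _ in range(offspring):
            # Gaussian mutation of the current best operating point
            cand = [xi + rng.gauss(0, sigma) for xi in best]
            fc = f(cand)
            if fc > best_f:
                best, best_f = cand, fc
        sigma *= 0.95  # anneal the mutation step size
    return best, best_f

opt, val = evolve(surrogate_efficiency, [1.0, 0.2])
```

Starting from a poor operating point, the search climbs toward the surrogate's optimum; in practice the surrogate would be refitted as new process data arrive, addressing the time-shift issue the dissertation raises.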

  4. Performance-based parameter tuning method of model-driven PID control systems.

    Science.gov (United States)

    Zhao, Y M; Xie, W F; Tu, X W

    2012-05-01

    In this paper, a performance-based parameter tuning method for the model-driven Two-Degree-of-Freedom PID (MD TDOF PID) control system is proposed to enhance the control performance of a process. Known for its ability to stabilize unstable processes, track set-point changes quickly, and reject disturbances, the MD TDOF PID has gained research interest recently. The tuning methods reported for the MD TDOF PID are based on the internal model control (IMC) method rather than on optimizing performance indices. In this paper, an Integral of Time Absolute Error (ITAE) zero-position-error optimal tuning and noise-effect minimizing method is proposed for tuning two parameters in the MD TDOF PID control system to achieve the desired regulating and disturbance rejection performance. Comparison with a Two-Degree-of-Freedom control scheme by modified Smith predictor (TDOF CS MSP) and with the MD TDOF PID tuned by the IMC tuning method demonstrates the effectiveness of the proposed tuning method. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
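The ITAE criterion at the heart of the paper can be illustrated with a much simpler controller than the MD TDOF PID: below, a plain PI controller for a first-order plant is tuned by grid search over the ITAE index. The plant, the gain grid, and the simulation settings are assumptions for illustration, not the paper's control structure.

```python
# Illustrative ITAE-based tuning sketch (not the paper's MD TDOF PID):
# grid-search PI gains that minimize the Integral of Time-weighted
# Absolute Error for a unit set-point step.

def itae(kp, ki, dt=0.01, t_end=5.0):
    """ITAE cost of a PI loop around the plant dy/dt = -y + u."""
    y, integ, cost, t = 0.0, 0.0, 0.0, 0.0
    while t < t_end:
        e = 1.0 - y               # unit set-point step
        integ += e * dt
        u = kp * e + ki * integ   # PI control law
        y += dt * (-y + u)        # forward-Euler plant update
        t += dt
        cost += t * abs(e) * dt   # time-weighted absolute error
    return cost

best = min(((itae(kp, ki), kp, ki)
            for kp in (0.5, 1, 2, 4, 8)
            for ki in (0.5, 1, 2, 4, 8)),
           key=lambda r: r[0])
```

The time weighting penalizes errors that persist late in the response, so ITAE-optimal gains favor fast settling with little sustained offset; the paper applies the same index to its two MD TDOF PID parameters.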

  5. Basis set approach in the constrained interpolation profile method

    International Nuclear Information System (INIS)

    Utsumi, T.; Koga, J.; Yabe, T.; Ogata, Y.; Matsunaga, E.; Aoki, T.; Sekine, M.

    2003-07-01

    We propose a simple polynomial basis set that is easily extendable to any desired higher-order accuracy. This method is based on the Constrained Interpolation Profile (CIP) method, in which the profile is chosen so that the subgrid-scale solution approaches the real solution under constraints from the spatial derivative of the original equation. Thus the solution even on the subgrid scale becomes consistent with the master equation. By increasing the order of the polynomial, this solution converges quickly. Third- and fifth-order polynomials are tested on the one-dimensional Schroedinger equation and are shown to give solutions a few orders of magnitude more accurate than conventional methods for lower-lying eigenstates. (author)
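The core CIP idea — constraining the subgrid profile with both the function value and its spatial derivative at the cell ends — reduces, for a single cell and a cubic profile, to Hermite interpolation. The sketch below shows this on one cell for a known function; it is a minimal illustration of the constraint idea, not the paper's higher-order basis-set construction.

```python
import math

# One-cell sketch of the CIP-style constraint: the cubic profile is
# pinned to the value AND the derivative at both ends of the cell, so
# the subgrid profile matches the available slope information.

def hermite_cell(f0, f1, d0, d1, h, x):
    """Cubic on [0, h] with f(0)=f0, f(h)=f1, f'(0)=d0, f'(h)=d1."""
    s = x / h
    h00 = (1 + 2 * s) * (1 - s) ** 2   # value basis at left end
    h10 = s * (1 - s) ** 2             # slope basis at left end
    h01 = s ** 2 * (3 - 2 * s)         # value basis at right end
    h11 = s ** 2 * (s - 1)             # slope basis at right end
    return h00 * f0 + h10 * h * d0 + h01 * f1 + h11 * h * d1

# interpolate sin on the cell [0, 0.5] and evaluate at the midpoint
h = 0.5
approx = hermite_cell(math.sin(0.0), math.sin(h),
                      math.cos(0.0), math.cos(h), h, h / 2)
```

Even on this coarse cell the midpoint error is on the order of 1e-5, far better than a value-only linear interpolant; raising the polynomial order with further derivative constraints is what gives the paper's rapid convergence.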

  6. MODERN APPROACHES REGARDING THE ASSESSMENT OF THE COMPANY’S OVERALL PERFORMANCES

    Directory of Open Access Journals (Sweden)

    Pintea Mirela-Oana

    2010-07-01

    Full Text Available The importance of this research is supported by the ambitious goals attributed to the performance measurement and assessment of a company: improving company performance, optimizing the company's management, and motivating staff. In the current context of sustainable development, defining, measuring, and maximizing the performance of companies has become complex. The modern business environment demands a multi-goal orientation. Profit theory is no longer a valid measure of organizational performance, and neither are other approaches that take only the interests of shareholders (the owners of a company) into account. Today's business environment is characterized by the increasing importance and strength of various stakeholder groups. This paper captures the current changes in developing a modern system for assessing a company's performance.

  7. A method for performance assessment of medical radioisotope equipment

    International Nuclear Information System (INIS)

    Kerin, T.; Slavtchev, Ath.; Nedeltchev, M.; Kjurktchiev, T.

    1984-01-01

    A variety of tests and procedures exist for the performance assessment of radioisotope diagnostic equipment. The complex performance indices introduced to date are based on a heuristic approach. The present work attempts to interconnect algorithmically the most important factors, such as the influence of the measurement geometry, the statistical peculiarities at lower activities, and the information loss at high count rates. All this is reflected in a criterion which integrates the spatial resolution, the detector's effective field of view, the sensitivity to the radionuclide, the background count rate, and the effective dead time of the system under investigation. (Auth.)

  8. Validated high performance liquid chromatographic (HPLC) method ...

    African Journals Online (AJOL)

    STORAGESEVER

    2010-02-22

    Feb 22, 2010 ... specific and accurate high performance liquid chromatographic method for determination of ZER in micro-volumes ... tional medicine as a cure for swelling, sores, loss of appetite and ... Receptor Activator for Nuclear Factor κ B Ligand .... The effect of ... be suitable for preclinical pharmacokinetic studies. The.

  9. Numerical Methods Application for Reinforced Concrete Elements-Theoretical Approach for Direct Stiffness Matrix Method

    Directory of Open Access Journals (Sweden)

    Sergiu Ciprian Catinas

    2015-07-01

    Full Text Available A detailed theoretical and practical investigation of reinforced concrete elements is warranted by the recent techniques and methods implemented in the construction market. Moreover, a theoretical study offering a better and faster approach is in demand nowadays due to the rapid development of computational techniques. This paper presents a study on implementing the direct stiffness matrix method in a static calculus capable of addressing phenomena related to different stages of loading, rapid changes of cross-section area, and physical properties. Such a method is in demand because, at present, the FEM (Finite Element Method) is the only alternative for such a calculus, and FEM is considered expensive in terms of time and computing resources. The main goal of the method is to create the moment-curvature diagram in the cross section being analyzed. The paper presents some of the most important techniques, as well as new ideas, for creating the moment-curvature graph in the cross sections considered.
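The direct stiffness method the paper builds on can be sketched in its simplest form: assemble element stiffness matrices into a global matrix and solve for nodal displacements. The axial-bar idealization, the concrete-like modulus, the section areas, and the tip load below are illustrative assumptions (the paper's target is moment-curvature analysis of beam sections, which uses the same assembly machinery).

```python
import numpy as np

# Minimal direct stiffness sketch: a chain of axial bar elements with a
# change of cross-section area mid-span (the "rapid change of cross
# section" the paper mentions). All numbers are illustrative.

def assemble(E, areas, lengths):
    """Assemble the global stiffness matrix for a chain of bar elements."""
    n = len(areas) + 1                    # number of nodes
    K = np.zeros((n, n))
    for i, (A, L) in enumerate(zip(areas, lengths)):
        k = E * A / L                     # element stiffness EA/L
        K[i:i + 2, i:i + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    return K

E = 30e9                                  # concrete-like modulus, Pa
areas = [0.04, 0.02]                      # m^2, section change mid-span
lengths = [1.0, 1.0]                      # m
K = assemble(E, areas, lengths)

# fix node 0, apply a 100 kN axial load at the free end
F = np.array([0.0, 100e3])
u = np.linalg.solve(K[1:, 1:], F)         # displacements of the free nodes
```

Each element elongates by PL/EA, so the solved displacements accumulate along the chain; the same assemble-and-solve pattern extends to beam elements with moment-curvature behavior.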

  10. Ensemble of trees approaches to risk adjustment for evaluating a hospital's performance.

    Science.gov (United States)

    Liu, Yang; Traskin, Mikhail; Lorch, Scott A; George, Edward I; Small, Dylan

    2015-03-01

    A commonly used method for evaluating a hospital's performance on an outcome is to compare the hospital's observed outcome rate to its expected outcome rate given its patient (case) mix and service. The process of calculating the hospital's expected outcome rate given its patient mix and service is called risk adjustment (Iezzoni 1997). Risk adjustment is critical for accurately evaluating and comparing hospitals' performances, since we would not want to unfairly penalize a hospital just because it treats sicker patients. The key to risk adjustment is accurately estimating the probability of an outcome given patient characteristics. For cases with binary outcomes, the method commonly used in risk adjustment is logistic regression. In this paper, we consider ensemble-of-trees methods as alternatives for risk adjustment, including random forests and Bayesian additive regression trees (BART). Both random forests and BART are modern machine learning methods that have recently been shown to have excellent performance for prediction of outcomes in many settings. We apply these methods to carry out risk adjustment for the performance of neonatal intensive care units (NICUs). We show that these ensemble-of-trees methods outperform logistic regression in predicting mortality among babies treated in NICUs, and provide a superior method of risk adjustment compared to logistic regression.
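The risk-adjustment logic described above can be sketched with the logistic-regression baseline (not the paper's BART or random-forest models): fit P(outcome | severity) on reference data, then score a hospital by its observed-to-expected (O/E) outcome ratio. The synthetic severity data, the single-predictor model, and the hypothetical hospital below are illustrative assumptions.

```python
import numpy as np

# Sketch of risk adjustment with the logistic-regression baseline.
# Synthetic data: one severity score drives the true outcome probability.
rng = np.random.default_rng(0)
severity = rng.normal(size=2000)
p_true = 1 / (1 + np.exp(-(-1.0 + 1.5 * severity)))
outcome = rng.binomial(1, p_true)

# fit logistic regression with a few Newton-Raphson (IRLS) steps
X = np.column_stack([np.ones_like(severity), severity])
beta = np.zeros(2)
for _ in range(10):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (outcome - p))

# risk-adjusted score for a hypothetical hospital's five patients
hosp = np.column_stack([np.ones(5), np.array([2.0, 1.5, 0.5, -0.2, 1.0])])
expected = (1 / (1 + np.exp(-hosp @ beta))).sum()
observed = 4                       # outcomes actually observed there
oe_ratio = observed / expected     # > 1: worse than its case mix predicts
```

The paper's point is that swapping the logistic model for random forests or BART yields better probability estimates, and hence fairer O/E comparisons, while the surrounding O/E logic stays the same.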

  11. CONTEMPORARY APPROACHES OF COMPANY PERFORMANCE ANALYSIS BASED ON RELEVANT FINANCIAL INFORMATION

    Directory of Open Access Journals (Sweden)

    Sziki Klara

    2012-12-01

    Full Text Available In this paper we present two components of the financial statements: the profit and loss account and the cash flow statement. These summary documents, and the different indicators calculated from them, allow us to formulate assessments of the performance and profitability of various functions and levels of the company's activity. This paper supports the hypothesis that the accounting information presented in the profit and loss account and in the cash flow statement is an appropriate source for assessing company performance. The purpose of this research is to answer the question linked to the main hypothesis: is it the profit and loss account or the cash flow statement that better reflects the performance of a business? Based on the specialty literature studied, we attempt a conceptual, analytical and practical approach to the term performance, reviewing some terminological acceptations of the term as well as the main indicators of performance analysis based on the profit and loss account and the cash flow statement: aggregated indicators (also known as intermediary balances of administration), economic rate of return, rate of financial profitability, rate of return through cash flows, operating cash flow rate, and the rate of generating operating cash out of gross operating result. At the same time, we take a comparative approach to the profit and loss account and the cash flow statement, outlining the main advantages and disadvantages of these documents. To demonstrate the above theoretical assessments, we analyze these indicators based on information from the financial statements of SC Sinteza SA, a company in Bihor county listed on the Bucharest Stock Exchange.

  12. A Method To Modify/Correct The Performance Of Amplifiers

    Directory of Open Access Journals (Sweden)

    Rohith Krishnan R

    2015-01-01

    Full Text Available Abstract The actual response of an amplifier may vary with the replacement of aged or damaged components, and this method compensates for that problem. Here we use the op-amp fixator as the design tool. The tool helps us to isolate the selected circuit component from the rest of the circuit, adjust its operating point to correct performance deviations, and modify the circuit without changing its other parts. A method to modify/correct the performance of amplifiers by properly redesigning the circuit is presented in this paper.

  13. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product's environmental impacts across the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and its results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach which can reflect the relationship between the design parameters and the product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors, which have significant influence on the product's environmental impacts, can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted, in which eight design parameters are analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
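The approach above can be sketched in its simplest one-at-a-time form: perturb each design parameter and rank parameters by the relative change in an impact model. The linear impact model, its coefficients, and the parameter names below are illustrative assumptions, not LCA data from the PCB case study.

```python
# One-at-a-time sensitivity sketch: rank design parameters by how much
# a +10% change in each one moves the (hypothetical) impact score.

def impact(params):
    # toy impact model: CO2-eq dominated by the PCB panel area
    return (12.0 * params["panel_area"]
            + 0.8 * params["copper_mass"]
            + 0.1 * params["solder_mass"])

base = {"panel_area": 1.0, "copper_mass": 1.0, "solder_mass": 1.0}

def sensitivities(model, base, delta=0.1):
    """Relative impact change for a +10% change in each parameter."""
    ref = model(base)
    out = {}
    for name in base:
        perturbed = dict(base, **{name: base[name] * (1 + delta)})
        out[name] = (model(perturbed) - ref) / ref
    return out

ranked = sorted(sensitivities(impact, base).items(),
                key=lambda kv: -kv[1])
```

The ranking tells a designer without LCA expertise which parameter to attack first — here, by construction, the panel area, mirroring the case study's finding.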

  14. [Biocybernetic approach to the thermometric methods of blood supply measurements of periodontal tissues].

    Science.gov (United States)

    Pastusiak, J; Zakrzewski, J

    1988-11-01

    A specific biocybernetic approach to the determination of the blood supply of periodontal tissues by means of thermometric methods is presented in the paper. Compartment models of the measuring procedure are given, and dilutodynamic methodology and classification are applied. Such an approach enables the selection of appropriate biophysical parameters describing the state of the blood supply of periodontal tissues, and the optimal design of transducers and measuring methods.

  15. Performance evaluation methods and instrumentation for mine ventilation fans

    Institute of Scientific and Technical Information of China (English)

    LI Man; WANG Xue-rong

    2009-01-01

    Ventilation fans are among the most important pieces of equipment in coal mines. Their performance plays an important role in the safety of staff and production. Given the actual requirements of coal mine production, we instituted a research project on measurement methods for key performance parameters such as wind pressure, air volume and power. Finally, a virtual instrument for evaluating the performance of mine ventilation fans was developed using a USB interface. The practical performance and analytical results of our experiments show that it is feasible, reliable and effective to use the proposed instrumentation for mine ventilation performance evaluation.

  16. Performance and scaling of locally-structured grid methods for partial differential equations

    Energy Technology Data Exchange (ETDEWEB)

    Colella, Phillip; Bell, John; Keen, Noel; Ligocki, Terry; Lijewski, Michael; Van Straalen, Brian

    2007-07-19

    In this paper, we discuss some of the issues in obtaining high performance for block-structured adaptive mesh refinement software for partial differential equations. We show examples in which AMR scales to thousands of processors. We also discuss a number of metrics for performance and scalability that can provide a basis for understanding the advantages and disadvantages of this approach.

  17. A service based estimation method for MPSoC performance modelling

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer; Madsen, Jan; Jensen, Bjørn Sand

    2008-01-01

    This paper presents an abstract service-based estimation method for MPSoC performance modelling which allows fast, cycle-accurate design space exploration of complex architectures, including multiprocessor configurations, at a very early stage in the design phase. The modelling method uses a service-oriented model of computation based on Hierarchical Colored Petri Nets and allows the modelling of both software and hardware in one unified model. To illustrate the potential of the method, a small MPSoC system, developed at Bang & Olufsen ICEpower a/s, is modelled and performance estimates are produced…

  18. A machine learning approach for efficient uncertainty quantification using multiscale methods

    Science.gov (United States)

    Chan, Shing; Elsheikh, Ahmed H.

    2018-02-01

    Several multiscale methods account for sub-grid scale features using coarse scale basis functions. For example, in the Multiscale Finite Volume method the coarse scale basis functions are obtained by solving a set of local problems over dual-grid cells. We introduce a data-driven approach for the estimation of these coarse scale basis functions. Specifically, we employ a neural network predictor fitted using a set of solution samples from which it learns to generate subsequent basis functions at a lower computational cost than solving the local problems. The computational advantage of this approach is realized for uncertainty quantification tasks where a large number of realizations has to be evaluated. We attribute the ability to learn these basis functions to the modularity of the local problems and the redundancy of the permeability patches between samples. The proposed method is evaluated on elliptic problems yielding very promising results.
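The data-driven idea above — fit a predictor to samples of basis functions so that new ones can be generated without solving local problems — can be sketched with a small random-feature network. The parametric family standing in for "solved" basis functions, the random-feature fitting (an extreme-learning-machine style shortcut rather than the paper's trained neural network), and all sizes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_solution(x, a):
    # stand-in for a 1D basis function obtained by solving a local
    # problem; the parameter a mimics the effect of the permeability patch
    return x ** a / (x ** a + (1 - x) ** a)

# training samples: (position, patch parameter) -> basis function value
X = rng.uniform(0.01, 0.99, size=(2000, 2))
X[:, 1] = X[:, 1] * 3 + 0.5            # patch parameter a in [0.5, 3.5]
y = local_solution(X[:, 0], X[:, 1])

# random-feature network: fixed random tanh hidden layer, output weights
# fitted in closed form by least squares
W1 = rng.normal(0, 2, (2, 64))
b1 = rng.normal(0, 2, 64)
H = np.tanh(X @ W1 + b1)
w2, *_ = np.linalg.lstsq(H, y, rcond=None)

pred = np.tanh(X @ W1 + b1) @ w2       # cheap "generated" basis values
mse = float(((pred - y) ** 2).mean())
```

Once fitted, evaluating the network is far cheaper than solving a local problem, which is exactly the advantage the paper exploits when thousands of uncertainty-quantification realizations must be processed.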

  19. Comparison of wind mill cluster performance: A multicriteria approach

    Energy Technology Data Exchange (ETDEWEB)

    Rajakumar, D.G.; Nagesha, N. [Visvesvaraya Technological Univ., Karnataka (India)

    2012-07-01

    Energy is a crucial input for the economic and social development of any nation. Both renewable and non-renewable energy contribute to meeting the total requirement of the economy. As an affordable and clean energy source, wind energy is among the world's fastest growing forms of renewable energy. Though there are several wind-mill clusters producing energy in different geographical locations, evaluating their performance is a complex task, and little literature is available in this area. Against this backdrop, the current paper attempts to estimate the performance of wind-mill clusters through an index called the Cluster Performance Index (CPI), adopting a multi-criteria approach. The proposed CPI comprises four criteria: Technical Performance Indicators (TePI), Economic Performance Indicators (EcPI), Environmental Performance Indicators (EnPI), and Sociological Performance Indicators (SoPI). Under each performance criterion a total of ten parameters are considered, with five subjective and five objective responses. The methodology is implemented by collecting empirical data from three wind-mill clusters located at Chitradurga, Davangere, and Gadag in the southern Indian state of Karnataka. In total, fifteen different stakeholders were consulted through a set of structured, researcher-administered questionnaires to collect the relevant data in each wind farm. Stakeholders included engineers working in wind farms, wind farm developers, government officials from the energy department, and selected residents near the wind farms. The results of the study revealed that the Chitradurga wind farm performed much better, with a CPI of 45.267, compared to the Gadag (CPI of 28.362) and Davangere (CPI of 19.040) wind farms. (Author)
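A multi-criteria index like the CPI can be sketched as a weighted aggregate of the four criteria named above. The weights and the per-cluster criterion scores below are illustrative assumptions chosen only to reproduce the reported ranking; the paper's actual CPIs come from its questionnaire data.

```python
# Sketch of a Cluster Performance Index as a weighted sum of the four
# criteria (TePI, EcPI, EnPI, SoPI). Weights and scores are made up.

WEIGHTS = {"TePI": 0.3, "EcPI": 0.3, "EnPI": 0.2, "SoPI": 0.2}

def cpi(scores, weights=WEIGHTS):
    """Weighted sum of per-criterion scores."""
    return sum(weights[c] * scores[c] for c in weights)

clusters = {
    "Chitradurga": {"TePI": 52, "EcPI": 48, "EnPI": 40, "SoPI": 38},
    "Gadag":       {"TePI": 30, "EcPI": 29, "EnPI": 27, "SoPI": 25},
    "Davangere":   {"TePI": 21, "EcPI": 18, "EnPI": 19, "SoPI": 17},
}

ranking = sorted(clusters, key=lambda c: cpi(clusters[c]), reverse=True)
```

With these illustrative inputs the ranking matches the study's: Chitradurga first, then Gadag, then Davangere. The weighting step is where a real study would encode stakeholder priorities among the technical, economic, environmental, and sociological criteria.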

  20. Influence of discretization method on the digital control system performance

    Directory of Open Access Journals (Sweden)

    Futás József

    2003-12-01

    Full Text Available The design of a control system can be divided into two steps. First, the process or plant has to be converted into mathematical model form so that its behavior can be analyzed. Then an appropriate controller has to be designed in order to get the desired response of the controlled system. In the continuous time domain the system is represented by differential equations. Converting a continuous system into discrete-time form is always an approximation of the continuous system, and different discretization methods give different digital controller performance. The methods presented in the paper are the Step Invariant or Zero Order Hold (ZOH) method, the Matched Pole-Zero method, the Backward Difference method, and the Bilinear transformation. These discretization methods are used in developing a PI position controller for a DC motor; the motor model was converted by the ZOH method. The performances of the different methods are compared and the results are presented.
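The difference between the discretization rules above can be seen by mapping a single continuous pole to the discrete domain. For the first-order plant dy/dt = -a·y + a·u, ZOH maps the pole exactly via z = e^{sT}, while the backward-difference and bilinear rules are approximations. The pole value and sample time below are illustrative.

```python
import math

# Map the continuous pole s = -a to a discrete pole z under three of the
# discretization rules discussed above.

a, Ts = 2.0, 0.1   # plant pole and sample time (illustrative values)

pole_zoh = math.exp(-a * Ts)                        # ZOH: z = e^{sT} (exact)
pole_backward = 1.0 / (1.0 + a * Ts)                # backward diff: z = 1/(1 - sT)
pole_bilinear = (1 - a * Ts / 2) / (1 + a * Ts / 2) # bilinear: z = (1 + sT/2)/(1 - sT/2)
```

All three poles land inside the unit circle (the discrete loop stays stable), but the bilinear pole sits closer to the exact ZOH value than the backward-difference pole, which is one reason the rules yield different digital controller performance.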

  1. A Comparison of Student Academic Performance with Traditional, Online, And Flipped Instructional Approaches in a C# Programming Course

    Directory of Open Access Journals (Sweden)

    Jason H. Sharp

    2017-08-01

    Full Text Available Aim/Purpose: Compared student academic performance on specific course requirements in a C# programming course across three instructional approaches: traditional, online, and flipped. Background: Addressed the following research question: When compared to the online and traditional instructional approaches, does the flipped instructional approach have a greater impact on student academic performance with specific course requirements in a C# programming course? Methodology: Quantitative research design conducted over eight 16-week semesters among a total of 271 participants who were undergraduate students enrolled in a C# programming course. Data collected were grades earned from specific course requirements and were analyzed with the nonparametric Kruskal-Wallis H-test using IBM SPSS Statistics, Version 23. Contribution: Provides empirical findings related to the impact that different instructional approaches have on student academic performance in a C# programming course. Also describes implications and recommendations for instructors of programming courses regarding instructional approaches that facilitate active learning, student engagement, and self-regulation. Findings: Resulted in four statistically significant findings, indicating that the online and flipped instructional approaches had a greater impact on student academic performance than the traditional approach. Recommendations for Practitioners: Implement instructional approaches such as online, flipped, or blended which foster active learning, student engagement, and self-regulation to increase student academic performance. Recommendation for Researchers: Build upon this study and others similar to it to include factors such as gender, age, ethnicity, and previous academic history. Impact on Society: Acknowledge the growing influence of technology on society as a whole. Higher education coursework and programs are evolving to encompass more digitally-based learning contexts, thus
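The Kruskal-Wallis H-test used in the study (there via SPSS) can be computed from scratch for three independent grade samples. The three small samples below are made-up illustrations with no ties, not the study's data.

```python
# Kruskal-Wallis H for k independent samples, computed from scratch
# (no tie correction; the toy samples below contain no ties).

def kruskal_h(*groups):
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(pooled)
    rank_sums = [0.0] * len(groups)
    for rank, (_, gi) in enumerate(pooled, start=1):
        rank_sums[gi] += rank          # sum of pooled ranks per group
    h = 12.0 / (n * (n + 1)) * sum(
        rs ** 2 / len(g) for rs, g in zip(rank_sums, groups))
    return h - 3 * (n + 1)

traditional = [68, 72, 75, 70, 66]     # made-up grade samples
online      = [78, 81, 74, 80, 77]
flipped     = [85, 83, 88, 79, 86]
H = kruskal_h(traditional, online, flipped)
```

Here H comes out to 11.06, above the chi-squared critical value of 5.99 for 2 degrees of freedom at the 0.05 level, so these illustrative samples would (like the study's data) show a significant difference among the three approaches.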

  2. A Combined Social Action, Mixed Methods Approach to Vocational Guidance Efficacy Research

    Science.gov (United States)

    Perry, Justin C.

    2009-01-01

    This article proposes a social action, mixed methods approach to verifying the efficacy of vocational guidance programs. Research strategies are discussed in the context of how the processes and purposes of efficacy research have been conceptualized and studied in vocational psychology. Examples of how to implement this approach in future efficacy…

  3. Implementing a Flipped Classroom Approach in a University Numerical Methods Mathematics Course

    Science.gov (United States)

    Johnston, Barbara M.

    2017-01-01

    This paper describes and analyses the implementation of a "flipped classroom" approach, in an undergraduate mathematics course on numerical methods. The approach replaced all the lecture contents by instructor-made videos and was implemented in the consecutive years 2014 and 2015. The sequential case study presented here begins with an…

  4. Developing a methodology to assess the impact of research grant funding: a mixed methods approach.

    Science.gov (United States)

    Bloch, Carter; Sørensen, Mads P; Graversen, Ebbe K; Schneider, Jesper W; Schmidt, Evanthia Kalpazidou; Aagaard, Kaare; Mejlgaard, Niels

    2014-04-01

    This paper discusses the development of a mixed methods approach to analyse research funding. Research policy has taken on an increasingly prominent role in the broader political scene, where research is seen as a critical factor in maintaining and improving growth, welfare and international competitiveness. This has motivated growing emphasis on the impacts of science funding, and how funding can best be designed to promote socio-economic progress. Meeting these demands for impact assessment involves a number of complex issues that are difficult to fully address in a single study or in the design of a single methodology. However, they point to some general principles that can be explored in methodological design. We draw on a recent evaluation of the impacts of research grant funding, discussing both key issues in developing a methodology for the analysis and subsequent results. The case of research grant funding, involving a complex mix of direct and intermediate effects that contribute to the overall impact of funding on research performance, illustrates the value of a mixed methods approach to provide a more robust and complete analysis of policy impacts. Reflections on the strengths and weaknesses of the methodology are used to examine refinements for future work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.

    Science.gov (United States)

    Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R

    2014-03-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.

  6. Method to Increase Undergraduate Laboratory Student Confidence in Performing Independent Research

    Directory of Open Access Journals (Sweden)

    Colton E. Kempton

    2017-05-01

    Full Text Available The goal of an undergraduate laboratory course should be not only to introduce the students to biology methodologies and techniques, but also to teach them independent analytical thinking skills and proper experiment design.  This is especially true for advanced biology laboratory courses that undergraduate students typically take as a junior or senior in college.  Many courses achieve the goal of teaching techniques, but fail to approach the larger goal of teaching critical thinking, experimental design, and student independence.  Here we describe a study examining the application of the scaffolding instructional philosophy in which students are taught molecular techniques with decreasing guidance to force the development of analytical thinking skills and prepare undergraduate students for independent laboratory research. This method was applied to our advanced molecular biology laboratory class and resulted in an increase of confidence among the undergraduate students in their abilities to perform independent research.

  7. A scalable and accurate method for classifying protein-ligand binding geometries using a MapReduce approach.

    Science.gov (United States)

    Estrada, T; Zhang, B; Cicotti, P; Armen, R S; Taufer, M

    2012-07-01

    We present a scalable and accurate method for classifying protein-ligand binding geometries in molecular docking. Our method is a three-step process: the first step encodes the geometry of a three-dimensional (3D) ligand conformation into a single 3D point in the space; the second step builds an octree by assigning an octant identifier to every single point in the space under consideration; and the third step performs an octree-based clustering on the reduced conformation space and identifies the most dense octant. We adapt our method for MapReduce and implement it in Hadoop. The load-balancing, fault-tolerance, and scalability in MapReduce allow screening of very large conformation spaces not approachable with traditional clustering methods. We analyze results for docking trials for 23 protein-ligand complexes for HIV protease, 21 protein-ligand complexes for Trypsin, and 12 protein-ligand complexes for P38alpha kinase. We also analyze cross docking trials for 24 ligands, each docking into 24 protein conformations of the HIV protease, and receptor ensemble docking trials for 24 ligands, each docking in a pool of HIV protease receptors. Our method demonstrates significant improvement over energy-only scoring for the accurate identification of native ligand geometries in all these docking assessments. The advantages of our clustering approach make it attractive for complex applications in real-world drug design efforts. We demonstrate that our method is particularly useful for clustering docking results using a minimal ensemble of representative protein conformational states (receptor ensemble docking), which is now a common strategy to address protein flexibility in molecular docking. Copyright © 2012 Elsevier Ltd. All rights reserved.
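    The octant-encoding and clustering steps above can be sketched as follows; the fixed recursion depth, unit bounding box, and all names are illustrative assumptions rather than the authors' implementation (a real run would distribute the map and reduce steps over Hadoop):

```python
from collections import Counter

def octant_id(point, lo, hi, depth):
    """Encode a 3D point as a single octant identifier by halving the
    bounding box `depth` times; the 3-bit digits form the octree path."""
    lo, hi = list(lo), list(hi)
    path = 0
    for _ in range(depth):
        octant = 0
        for axis in range(3):
            mid = (lo[axis] + hi[axis]) / 2.0
            if point[axis] >= mid:
                octant |= 1 << axis
                lo[axis] = mid
            else:
                hi[axis] = mid
        path = (path << 3) | octant
    return path

def densest_octant(points, lo=(0.0, 0.0, 0.0), hi=(1.0, 1.0, 1.0), depth=3):
    """Map: emit (octant_id, 1) per conformation; Reduce: count per
    octant and return the most populated one."""
    counts = Counter(octant_id(p, lo, hi, depth) for p in points)
    return counts.most_common(1)[0]
```

    In a MapReduce setting, `octant_id` would be the map step's key and the reduce step would sum the per-octant counts; here a `Counter` stands in for both.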

  8. Parametric Approach to Assessing Performance of High-Lift Device Active Flow Control Architectures

    Directory of Open Access Journals (Sweden)

    Yu Cai

    2017-02-01

    Full Text Available Active Flow Control is at present an area of considerable research, with multiple potential aircraft applications. While the majority of research has focused on the performance of the actuators themselves, a system-level perspective is necessary to assess the viability of proposed solutions. This paper demonstrates such an approach, in which major system components are sized based on system flow and redundancy considerations, with the impacts linked directly to the mission performance of the aircraft. Considering the case of a large twin-aisle aircraft, four distinct active flow control architectures that facilitate the simplification of the high-lift mechanism are investigated using the demonstrated approach. The analysis indicates a very strong influence of system total mass flow requirement on architecture performance, both for a typical mission and also over the entire payload-range envelope of the aircraft.

  9. Approaching Pomeranchuk instabilities from ordered phase: A crossing-symmetric equation method

    International Nuclear Information System (INIS)

    Reidy, Kelly; Quader, Khandker; Bedell, Kevin

    2014-01-01

    We explore features of a 3D Fermi liquid near generalized Pomeranchuk instabilities using a tractable crossing-symmetric equation method. We approach the instabilities from the ordered ferromagnetic phase. We find “quantum multi-criticality,” as the approach to the ferromagnetic instability drives instability in other channel(s). It is found that a charge nematic instability precedes, and is driven by, Pomeranchuk instabilities in both the ℓ=0 spin and density channels.
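    For context, the generalized Pomeranchuk stability criterion of Landau Fermi-liquid theory, which the instabilities above violate, is the textbook condition (not taken from this record):

```latex
1 + \frac{F_\ell^{s,a}}{2\ell + 1} > 0
```

    where $F_\ell^{s}$ and $F_\ell^{a}$ are the symmetric (density) and antisymmetric (spin) Landau parameters in the $\ell$-th angular momentum channel; channel $\ell$ becomes unstable when $F_\ell \to -(2\ell + 1)$, the ferromagnetic case being $F_0^{a} \to -1$.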

  10. Methods for Evaluating the Performance and Human Stress-Factors of Percussive Riveting

    Science.gov (United States)

    Ahn, Jonathan Y.

    The aerospace industry automates portions of its manufacturing and assembly processes. However, mechanics remain vital to production, especially in areas where automated machines cannot fit or have yet to match the quality of human craftsmanship. One such task is percussive riveting. Because percussive riveting is associated with a high risk of injury, these tools must be certified prior to release. The major contribution of this thesis is a test bench capable of percussive riveting for ergonomic evaluation purposes. The major issues investigated are: (i) automating the tool evaluation method so that it is repeatable; (ii) demonstrating the use of displacement and force sensors; and (iii) correlating the performance and risk exposure of percussive tools. A test bench equipped with servomotors and pneumatic cylinders to control the xyz-position of a rivet gun and bucking bar simultaneously is used to explore this evaluation approach.

  11. Combining Qualitative and Quantitative Approaches: Some Arguments for Mixed Methods Research

    Science.gov (United States)

    Lund, Thorleif

    2012-01-01

    One purpose of the present paper is to elaborate 4 general advantages of the mixed methods approach. Another purpose is to propose a 5-phase evaluation design, and to demonstrate its usefulness for mixed methods research. The account is limited to research on groups in need of treatment, i.e., vulnerable groups, and the advantages of mixed methods…

  12. Development of multi-functional streetscape green infrastructure using a performance index approach

    International Nuclear Information System (INIS)

    Tiwary, A.; Williams, I.D.; Heidrich, O.; Namdeo, A.; Bandaru, V.; Calfapietra, C.

    2016-01-01

    This paper presents a performance evaluation framework for streetscape vegetation. A performance index (PI) is conceived using the following seven traits, specific to street environments – Pollution Flux Potential (PFP), Carbon Sequestration Potential (CSP), Thermal Comfort Potential (TCP), Noise Attenuation Potential (NAP), Biomass Energy Potential (BEP), Environmental Stress Tolerance (EST) and Crown Projection Factor (CPF). Its application is demonstrated through a case study using fifteen street vegetation species from the UK, utilising a combination of direct field measurements and inventoried literature data. Our results indicate a greater preference for small-to-medium size trees and evergreen shrubs over larger trees for streetscaping. The proposed PI approach can potentially be applied two-fold: one, for evaluating the performance of existing street vegetation, facilitating the prospects for further improving it through management strategies and better species selection; two, for planning new streetscapes and multi-functional biomass as part of extending the green urban infrastructure. - Highlights: • A performance evaluation framework for streetscape vegetation is presented. • Seven traits, relevant to street vegetation, are included in a performance index (PI). • The PI approach is applied to quantify and rank fifteen street vegetation species. • Medium size trees and evergreen shrubs are found more favourable for streetscapes. • The PI offers a metric for developing sustainable streetscape green infrastructure. - A performance index is developed and applied to fifteen vegetation species, indicating a greater preference for medium size trees and evergreen shrubs for streetscaping.
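    One simple way to aggregate the seven traits into a PI is a normalized weighted sum; the equal weights and the [0, 1] trait normalization below are illustrative assumptions, since the record does not give the paper's exact aggregation scheme:

```python
TRAITS = ["PFP", "CSP", "TCP", "NAP", "BEP", "EST", "CPF"]

def performance_index(scores, weights=None):
    """Aggregate the seven streetscape traits into a single PI.
    `scores` maps each trait to a value normalized to [0, 1];
    equal weighting is an illustrative default."""
    if weights is None:
        weights = {t: 1.0 / len(TRAITS) for t in TRAITS}
    missing = set(TRAITS) - set(scores)
    if missing:
        raise ValueError(f"missing traits: {sorted(missing)}")
    return sum(weights[t] * scores[t] for t in TRAITS)

def rank_species(species_scores):
    """Rank candidate species by descending PI."""
    return sorted(species_scores,
                  key=lambda s: performance_index(species_scores[s]),
                  reverse=True)
```

    With measured trait scores per species, `rank_species` reproduces the kind of ranking the case study reports (e.g. shrubs ahead of large trees when the shrubs score higher across traits).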

  13. A Co-Precursor Approach Coupled with a Supercritical Modification Method for Constructing Highly Transparent and Superhydrophobic Polymethylsilsesquioxane Aerogels.

    Science.gov (United States)

    Lei, Chaoshuai; Li, Junning; Sun, Chencheng; Yang, Hailong; Xia, Tao; Hu, Zijun; Zhang, Yue

    2018-03-30

    Polymethylsilsesquioxane (PMSQ) aerogels obtained from methyltrimethoxysilane (MTMS) are well-known high-performance porous materials. Highly transparent and hydrophobic PMSQ aerogels could play an important role in transparent vacuum insulation panels. Herein, a co-precursor approach and a supercritical modification method were developed to prepare PMSQ aerogels with high transparency and superhydrophobicity. First, benefiting from the introduction of tetramethoxysilane (TMOS) in the precursor, the pore structure became more uniform and the particle size decreased. As the TMOS content increased, the light transmittance increased gradually from 54.0% to 81.2%, whereas the contact angle of a water droplet decreased from 141° to 99.9°, ascribed to the increase of hydroxyl groups on the skeleton surface. Hence, a supercritical modification method utilizing hexamethyldisilazane was also introduced to increase the hydrophobic methyl groups on the aerogel's surface. As a result, the obtained aerogels exhibited superhydrophobicity with a contact angle of 155°. Meanwhile, the developed surface modification method did not lead to any significant changes in the pore structure, resulting in a superhydrophobic aerogel with a high transparency of 77.2%. The proposed co-precursor approach and supercritical modification method provide a new horizon in the fabrication of highly transparent and superhydrophobic PMSQ aerogels.

  14. Adopting a blended learning approach to teaching evidence based medicine: a mixed methods study

    OpenAIRE

    Ilic, Dragan; Hart, William; Fiddes, Patrick; Misso, Marie; Villanueva, Elmer

    2013-01-01

    Background Evidence Based Medicine (EBM) is a core unit delivered across many medical schools. Few studies have investigated the most effective method of teaching a course in EBM to medical students. The objective of this study was to identify whether a blended-learning approach to teaching EBM is more effective than a didactic-based approach at increasing medical student competency in EBM. Methods A mixed-methods study was conducted consisting of a controlled trial and focus groups with second ye...

  15. Global approach for the validation of an in-line Raman spectroscopic method to determine the API content in real-time during a hot-melt extrusion process.

    Science.gov (United States)

    Netchacovitch, L; Thiry, J; De Bleye, C; Dumont, E; Cailletaud, J; Sacré, P-Y; Evrard, B; Hubert, Ph; Ziemons, E

    2017-08-15

    Since the Food and Drug Administration (FDA) published a guidance based on the Process Analytical Technology (PAT) approach, real-time analyses during manufacturing processes have expanded rapidly. In this study, in-line Raman spectroscopic analyses were performed during a Hot-Melt Extrusion (HME) process to determine the Active Pharmaceutical Ingredient (API) content in real-time. The method was validated based on univariate and multivariate approaches, and the analytical performances of the resulting models were compared. Moreover, on one hand, in-line data were correlated with the real API concentration in the sample, as quantified by a previously validated off-line confocal Raman microspectroscopic method. On the other hand, in-line data were also treated as a function of the concentration based on the weighing of the components in the prepared mixture. The importance of developing quantitative methods based on the use of a reference method was thus highlighted. The method was validated according to the total error approach, fixing the acceptance limits at ±15% and the α risk at ±5%. This method meets the requirements of the European Pharmacopoeia for the uniformity of content of single-dose preparations. The validation proves that future results will fall within the acceptance limits with a previously defined probability. Finally, the in-line validated method was compared with the off-line one to demonstrate its suitability for routine analyses. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. METHODICAL APPROACHES TO THE COST MANAGEMENT OF INDUSTRIAL ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Trunina Iryna

    2018-03-01

    Full Text Available Introduction. The paper deals with the topical issues of managing the costs of industrial enterprises, because in an unstable market environment the financial performance, competitiveness, financial sustainability and investment attractiveness of any subject of economic activity depend on the efficiency of the cost management system. Purpose of the article is the analysis of approaches to cost management, theoretical substantiation and development of recommendations regarding the formation of strategic cost management. Results. The economic content of cost management is considered as treated by different authors and under different approaches: the functional, process-oriented and system approaches. Their essence and features, their orientation toward operational or strategic management of enterprise expenses, and the ways of managing expenses under each approach are determined. It is noted that all the considered approaches to cost management are aimed at the optimal use of resources and at ensuring growth in enterprise efficiency. Conclusions. Based on the review of methodological approaches to cost management, recommendations are developed for expanding the implementation of cost management at various levels of enterprise management and for the formation of strategic cost management within the framework of strategic management of an enterprise. Strategic cost management is a complex category aimed at achieving a rational level of costs in the long run, which allows for the realization of competitive cost advantages and an increase in the competitiveness of an industrial enterprise. The implementation of cost reduction strategies should be a constant and important part of the company’s work, while the strategy of cost reduction should be integrated into the overall business strategy of the enterprise.

  17. Inverse kinetics method with source term for subcriticality measurements during criticality approach in the IPEN/MB-01 research reactor

    International Nuclear Information System (INIS)

    Loureiro, Cesar Augusto Domingues; Santos, Adimir dos

    2009-01-01

    In reactor physics tests performed at startup after refueling commercial PWRs, it is important to monitor subcriticality continuously during the criticality approach. Reactivity measurements by the inverse kinetics method are widely used during the operation of a nuclear reactor, and it is possible to perform an online reactivity measurement based on the point reactor kinetics equations. This technique is successfully applied at sufficiently high power levels, or to a core without an external neutron source, where the neutron source term in the point reactor kinetics equations may be neglected. For operation at low power levels, the contribution of the neutron source must be taken into account; this implies knowledge of a quantity proportional to the source strength, which must therefore be determined. Experiments have been performed in the IPEN/MB-01 Research Reactor for the determination of the source term, using the Least Squares Inverse Kinetics Method (LSIKM). A digital reactivity meter which neglects the source term is used to calculate the reactivity, and then the source term can be determined by the LSIKM. After the source term is determined, its value can be added to the algorithm and the reactivity determined again, now considering the source term. The new digital reactivity meter can then be used to monitor reactivity during the criticality approach, and the measured reactivity is more precise than that from the meter which neglects the source term. (author)
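    The effect of including the source term can be illustrated with a minimal inverse point-kinetics sketch. The one-group delayed-neutron constants and the explicit-Euler precursor integration are illustrative assumptions, not the IPEN/MB-01 parameters, and the least-squares fit of the source strength itself is omitted — the sketch only shows how a known source term changes the reactivity reading:

```python
import numpy as np

# One-group delayed-neutron constants (illustrative values only).
BETA, LAMBDA_D, GEN_TIME = 0.0075, 0.08, 1e-4  # beta, decay const (1/s), Lambda (s)

def inverse_kinetics(t, n, source=0.0):
    """Inverse point kinetics: recover reactivity rho(t) (in dollars)
    from a measured neutron density n(t), optionally including a
    constant source term S (same units as n / Lambda)."""
    c = np.empty_like(n)
    c[0] = BETA * n[0] / (GEN_TIME * LAMBDA_D)     # equilibrium precursors
    dt = np.diff(t)
    for k in range(1, len(n)):                     # integrate precursor balance
        dcdt = BETA * n[k - 1] / GEN_TIME - LAMBDA_D * c[k - 1]
        c[k] = c[k - 1] + dt[k - 1] * dcdt
    dndt = np.gradient(n, t)
    rho = BETA + GEN_TIME / n * (dndt - LAMBDA_D * c - source)
    return rho / BETA                              # dollars
```

    For a subcritical core held steady by a source, calling `inverse_kinetics` without `source` reads about zero dollars, while passing the fitted source strength recovers the negative reactivity — the distinction the abstract's two reactivity meters illustrate.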

  18. The Feldenkrais Method: A Dynamic Approach to Changing Motor Behavior.

    Science.gov (United States)

    Buchanan, Patricia A.; Ulrich, Beverly D.

    2001-01-01

    Describes the Feldenkrais Method of somatic education, noting parallels with a dynamic systems theory (DST) approach to motor behavior. Feldenkrais uses movement and perception to foster individualized improvement in function. DST explains that a human-environment system continually adapts to changing conditions and assembles behaviors…

  19. A Novel Data Hierarchical Fusion Method for Gas Turbine Engine Performance Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Feng Lu

    2016-10-01

    Full Text Available Gas path fault diagnosis involves the effective utilization of condition-based sensor signals along the engine gas path to accurately identify engine performance failures. The rapid development of information processing technology has led to the use of multiple-source information fusion for fault diagnostics. Numerous efforts have been made to develop data-based fusion methods, such as neural-network fusion, while little research has focused on fusion architecture or on fusing different kinds of methods. In this paper, a data hierarchical fusion using improved weighted Dempster–Shafer evidence theory (WDS) is proposed, and the integration of data-based and model-based methods is presented for engine gas-path fault diagnosis. For the purpose of simplifying learning machine topology, a recursive reduced kernel based extreme learning machine (RR-KELM) is developed to produce the fault probability, which is considered as the data-based evidence. Meanwhile, the model-based evidence is obtained using a particle filter-fuzzy logic algorithm (PF-FL) by engine health estimation and component fault location at the feature level. The outputs of the two evidences are integrated using WDS evidence theory at the decision level to reach a final recognition of the gas-path fault pattern. The characteristics and advantages of the two evidences are analyzed and used as guidelines for the data hierarchical fusion framework. Our goal is for the proposed methodology to provide much better gas-path fault diagnosis performance than reliance on a data-based or model-based method alone. The hierarchical fusion framework is evaluated in terms of fault diagnosis accuracy and robustness through a case study involving the fault mode dataset of a turbofan engine generated by a general gas turbine simulation. These applications confirm the effectiveness and usefulness of the proposed approach.
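    The decision-level combination of the two evidences can be sketched with Dempster's rule plus classical Shafer discounting (one common reading of "weighted" Dempster–Shafer; the paper's improved WDS weighting may differ, and the fault-hypothesis names are made up):

```python
def discount(m, alpha, theta):
    """Shafer discounting: scale each mass by reliability alpha and
    transfer the remainder to the full frame theta."""
    out = {h: alpha * v for h, v in m.items()}
    out[theta] = out.get(theta, 0.0) + (1.0 - alpha)
    return out

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments; hypotheses are frozensets so intersections are easy."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    if conflict >= 1.0:
        raise ValueError("total conflict: evidences are incompatible")
    norm = 1.0 - conflict
    return {h: v / norm for h, v in combined.items()}
```

    In the paper's setting, `m1` would come from the RR-KELM fault probabilities and `m2` from the PF-FL model-based evidence, each discounted by its estimated reliability before combination.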

  20. Evaluating health worker performance in Benin using the simulated client method with real children.

    Science.gov (United States)

    Rowe, Alexander K; Onikpo, Faustin; Lama, Marcel; Deming, Michael S

    2012-10-08

    The simulated client (SC) method for evaluating health worker performance utilizes surveyors who pose as patients to make surreptitious observations during consultations. Compared to conspicuous observation (CO) by surveyors, which is commonly done in developing countries, SC data better reflect usual health worker practices. This information is important because CO can cause performance to be better than usual. Despite this advantage of SCs, the method's full potential has not been realized for evaluating performance for pediatric illnesses because real children have not been utilized as SCs. Previous SC studies used scenarios of ill children that were not actually brought to health workers. During a trial that evaluated a quality improvement intervention in Benin (the Integrated Management of Childhood Illness [IMCI] strategy), we conducted an SC survey with adult caretakers as surveyors and real children to evaluate the feasibility of this approach and used the results to assess the validity of CO. We conducted an SC survey and a CO survey (one right after the other) of health workers in the same 55 health facilities. A detailed description of the SC survey process was produced. Results of the two surveys were compared for 27 performance indicators using logistic regression modeling. SC and CO surveyors observed 54 and 185 consultations, respectively. No serious problems occurred during the SC survey. Performance levels measured by CO were moderately higher than those measured by SCs (median CO - SC difference = 16.4 percentage-points). Survey differences were sometimes much greater for IMCI-trained health workers (median difference = 29.7 percentage-points) than for workers without IMCI training (median difference = 3.1 percentage-points). SC surveys can be done safely with real children if appropriate precautions are taken. CO can introduce moderately large positive biases, and these biases might be greater for health workers exposed to quality improvement

  1. Multi-method and innovative approaches to researching the learning and social practices of young digital users

    DEFF Research Database (Denmark)

    Vittadini, Nicoletta; Carlo, Simone; Gilje, Øystein

    2014-01-01

    One of the most significant challenges in researching the social aspects of contemporary societies is to adapt the methodological approach to complex digital media environments. Learning processes take place in this complex environment, and they include formal and informal experiences (learning in school, home, and real-virtual communities), peer cultures and inter-generational connections, production and creation as relevant activities, and personal interests as a focal point. Methods used in the study of learning and the social practices of young people must take into account four key issues: boundaries between online and offline experiences are blurring; young people act performatively; they act knowingly or reflexively; and their activities cannot be understood through the use of a single method, but require the use of multiple tools of investigation. The article discusses three methodological issues...

  3. The Relationship between Motivation, Learning Approaches, Academic Performance and Time Spent

    Science.gov (United States)

    Everaert, Patricia; Opdecam, Evelien; Maussen, Sophie

    2017-01-01

    Previous literature calls for further investigation in terms of precedents and consequences of learning approaches (deep learning and surface learning). Motivation as precedent and time spent and academic performance as consequences are addressed in this paper. The study is administered in a first-year undergraduate course. Results show that the…

  4. A new approach to enhance the performance of decision tree for classifying gene expression data.

    Science.gov (United States)

    Hassan, Md; Kotagiri, Ramamohanarao

    2013-12-20

    Gene expression data classification is a challenging task due to the large dimensionality and very small number of samples. The decision tree is one of the popular machine learning approaches for addressing such classification problems. However, existing decision tree algorithms use a single gene feature at each node to split the data into its child nodes and hence might suffer from poor performance, especially when classifying gene expression datasets. By using a new decision tree algorithm in which each node of the tree consists of more than one gene, we enhance the classification performance of traditional decision tree classifiers. Our method selects suitable genes that are combined using a linear function to form a derived composite feature. To determine the structure of the tree we use the area under the Receiver Operating Characteristics curve (AUC). Experimental analysis demonstrates higher classification accuracy using the new decision tree compared to the other decision trees in the literature. We experimentally compare the effect of our scheme against other well-known decision tree techniques. Experiments show that our algorithm can substantially boost the classification performance of the decision tree.
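    The composite-feature construction at a node can be sketched as a grid search over gene pairs and linear weights scored by AUC; the candidate weight grid, the exhaustive pair search, and all names are illustrative assumptions rather than the authors' exact algorithm:

```python
import itertools

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney rank identity."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def best_composite(X, y, weights=(-1.0, -0.5, 0.5, 1.0)):
    """Search gene pairs (i, j) and weights (a, b) for the derived
    feature a*g_i + b*g_j with maximal AUC -- the composite feature a
    tree node would then split on."""
    best_score, best_combo = -1.0, None
    for i, j in itertools.combinations(range(len(X[0])), 2):
        for a, b in itertools.product(weights, repeat=2):
            feature = [a * row[i] + b * row[j] for row in X]
            score = auc(feature, y)
            if score > best_score:
                best_score, best_combo = score, (i, j, a, b)
    return best_score, best_combo
```

    A full tree builder would threshold the best composite feature at each node and recurse on the two partitions; only the node-level search is shown here.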

  5. A Bio-inspired Approach for Power and Performance Aware Resource Allocation in Clouds

    Directory of Open Access Journals (Sweden)

    Kumar Rajesh

    2016-01-01

    Full Text Available In order to cope with increasing demand, cloud market players such as Amazon, Microsoft, Google, Gogrid, Flexiant, etc. have set up large data centers. The monotonically increasing size of data centers and the heterogeneity of their resources have made resource allocation a challenging task. A large percentage of the total energy consumption of data centers is wasted because of under-utilization of resources. Thus, there is a need for a resource allocation technique that improves the utilization of resources without affecting the performance of the services being delivered to end users. In this work, a bio-inspired resource allocation approach is proposed with the aim of improving utilization, and hence the energy efficiency, of the cloud infrastructure. The proposed approach makes use of Cuckoo search for power- and performance-aware allocation of resources to the services hired by end users. The proposed approach is implemented in CloudSim. The simulation results show approximately 12% savings in energy consumption.
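    A minimal Cuckoo Search loop of the kind the abstract describes might look like this; the Lévy-step scaling, nest count, abandonment fraction, and the idea of scoring a candidate allocation by an energy-cost function are illustrative assumptions, not the authors' CloudSim implementation:

```python
import math
import random

def levy_step(beta=1.5):
    """Mantegna's algorithm for a Levy-distributed step length."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(cost, dim, bounds, n_nests=15, pa=0.25, iters=200, seed=1):
    """Minimal Cuckoo Search: Levy-flight exploration around the best
    nest plus abandonment of the worst pa fraction each generation.
    `cost` would score a candidate allocation, e.g. by estimated
    energy use (a placeholder objective here)."""
    random.seed(seed)
    lo, hi = bounds

    def clip(x):
        return [min(hi, max(lo, v)) for v in x]

    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    best = min(nests, key=cost)
    for _ in range(iters):
        for k in range(n_nests):                   # Levy flight toward/away from best
            step = 0.01 * levy_step()
            cand = clip([x + step * (x - b) for x, b in zip(nests[k], best)])
            if cost(cand) < cost(nests[k]):        # greedy replacement
                nests[k] = cand
        nests.sort(key=cost)                       # abandon the worst pa fraction
        for k in range(int((1 - pa) * n_nests), n_nests):
            nests[k] = [random.uniform(lo, hi) for _ in range(dim)]
        best = min(nests + [best], key=cost)
    return best, cost(best)
```

    Abandonment plus the Lévy flights gives the global exploration the algorithm is known for; the greedy replacement keeps the best allocations found so far.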

  6. Novel approaches in analysis of Fusarium mycotoxins in cereals employing ultra performance liquid chromatography coupled with high resolution mass spectrometry

    International Nuclear Information System (INIS)

    Zachariasova, M.; Lacina, O.; Malachova, A.; Kostelanska, M.; Poustka, J.; Godula, M.; Hajslova, J.

    2010-01-01

    Rapid, simple and cost-effective analytical methods with performance characteristics matching regulatory requirements are needed for effective control of the occurrence of Fusarium toxins in cereals and the cereal-based products to which they might be transferred during processing. Within this study, two alternative approaches enabling retrospective data analysis and identification of unknown signals in sample extracts were implemented and validated for the determination of 11 major Fusarium toxins. In both cases, ultra-high performance liquid chromatography (U-HPLC) coupled with high resolution mass spectrometry (HR MS) was employed. 13C-isotopically labeled surrogates as well as matrix-matched standards were employed for quantification. When a time-of-flight mass analyzer (TOF-MS) was the detection tool, the use of a modified QuEChERS (quick, easy, cheap, effective, rugged and safe) sample preparation procedure, widely employed in multi-pesticide residue analysis, was shown to be the optimal approach for obtaining low detection limits. The second, challenging alternative, enabling direct analysis of the crude extract, was the use of a mass analyzer based on Orbitrap technology. In addition to demonstrating full compliance of the new methods with Commission Regulation (EC) No. 401/2006, their potential to be used for confirmatory purposes according to Commission Decision 2002/657/EC has also been critically assessed.

  7. Methodical approaches to assessment of quality of the bank loan portfolio

    Directory of Open Access Journals (Sweden)

    Tysyachna Yunna S.

    2014-01-01

    Full Text Available The goal of the article lies in the study of the basic methodical approaches to assessing the quality of a bank loan portfolio, identification of the specific features of their practical application, and justification of the selection of the most appropriate for modern economic conditions. The article considers three main groups of methods for assessing the quality of the bank loan portfolio: expert evaluation methods, statistical methods, and analytical methods. It goes without saying that in order to obtain an objective assessment of the quality of the bank loan portfolio it is necessary to apply a complex approach; however, owing to the advantages and shortcomings of the studied methods, the author notes the expediency of building an integral indicator, a taxonomic one in particular, in order to obtain a complex, objective and efficient assessment of the bank loan portfolio. Prospects for further studies in this direction are the assessment of the quality of the loan portfolios of the first group of banks by the size of their assets through building integral taxonomic indicators and identifying, on this basis, the factors that influence the quality of the loan portfolio with the aim of improving the mechanism of management of bank lending activity.
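    An integral taxonomic indicator of the kind the abstract proposes is commonly built with Hellwig's taxonomic measure of development; the sketch below uses that standard formulation (all indicators treated as stimulants, d0 = mean + 2·sd), which may differ in detail from the author's variant:

```python
import math

def taxonomic_indicator(objects):
    """Hellwig-style taxonomic measure: standardize each indicator,
    form an ideal 'pattern' object from the best standardized value of
    each (all treated as stimulants), then score each object by its
    Euclidean distance to the pattern, rescaled to (roughly) [0, 1]."""
    n, m = len(objects), len(objects[0])
    means = [sum(o[j] for o in objects) / n for j in range(m)]
    sds = [math.sqrt(sum((o[j] - means[j]) ** 2 for o in objects) / n) or 1.0
           for j in range(m)]
    z = [[(o[j] - means[j]) / sds[j] for j in range(m)] for o in objects]
    pattern = [max(row[j] for row in z) for j in range(m)]
    d = [math.sqrt(sum((row[j] - pattern[j]) ** 2 for j in range(m))) for row in z]
    d_mean = sum(d) / n
    d_sd = math.sqrt(sum((x - d_mean) ** 2 for x in d) / n)
    d0 = d_mean + 2 * d_sd
    return [1.0 - x / d0 for x in d]
```

    Each row of `objects` would hold one bank's loan-portfolio quality indicators; higher scores indicate portfolios closer to the ideal pattern.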

  8. Electron and photon reconstruction and performance in ATLAS using a dynamical, topological cell clustering-based approach

    CERN Document Server

    The ATLAS collaboration

    2017-01-01

    The electron and photon reconstruction in ATLAS has moved towards the use of a dynamical, topological cell-based approach for cluster building, owing to advancements in the calibration procedure which allow such a method to be applied. The move to this new technique allows for improved measurements of electron and photon energies, particularly in situations where an electron radiates a bremsstrahlung photon, or a photon converts to an electron-positron pair. This note details the changes to the ATLAS electron and photon reconstruction software, and assesses its performance under current LHC luminosity conditions using simulated data. Changes to the converted photon reconstruction are also detailed, which improve the reconstruction efficiency of double-track converted photons, as well as reducing the reconstruction of spurious one-track converted photons. The performance of the new reconstruction algorithm is also presented in a number of important topologies relevant to precision Standard Model physics,...

  9. On e-business strategy planning and performance evaluation: An adaptive algorithmic managerial approach

    Directory of Open Access Journals (Sweden)

    Alexandra Lipitakis

    2017-07-01

    Full Text Available A new e-business strategy planning and performance evaluation scheme based on adaptive algorithmic modelling techniques is presented. The effect of the financial and non-financial performance of organizations on e-business strategy planning is investigated. The relationships between the four strategic planning parameters are examined, the directions of these relationships are given, and six additional basic components are also considered. A new conceptual model has been constructed for e-business strategic planning and performance evaluation, and an adaptive algorithmic modelling approach is presented. The new adaptive algorithmic modelling scheme, including eleven dynamic modules, can be optimized and used effectively in the e-business strategic planning, and strategic planning evaluation, of various e-services in very large organizations and businesses. A synoptic statistical analysis and comparative numerical results for the cases of the UK and Greece are given. The proposed e-business models indicate how e-business strategic planning may affect financial and non-financial performance in businesses and organizations by exploring whether models which are used for strategy planning can be applied to e-business planning and whether these models would be valid in different environments. A conceptual model has been constructed and qualitative research methods have been used to test a predetermined number of hypotheses. The proposed models have been tested in the UK and Greece, and the conclusions, including numerical results and statistical analyses, indicated existing relationships between the considered dependent and independent variables. The proposed e-business models are expected to contribute to the e-business strategy planning of businesses and organizations, and managers should consider applying these models to their e-business strategy planning to improve their companies’ performances. This research study brings together elements of e

  10. The RISMC approach to perform advanced PRA analyses - 15332

    International Nuclear Information System (INIS)

    Mandelli, D.; Smith, C.; Riley, T.; Nielsen, J.; Alfonsi, A.; Rabiti, C.; Cogliati, J.

    2015-01-01

    The existing fleet of nuclear power plants is in the process of extending its lifetime and increasing the power generated via power up-rates. In order to evaluate the impact of these two factors on plant safety, the RISMC (Risk Informed Safety Margin Characterization) Pathway aims to develop simulation-based tools and methods to assess risks for existing nuclear power plants in order to optimize safety. By developing new methods, this pathway extends the state-of-the-practice methods that have traditionally been based on logic structures such as Event-Trees and Fault-Trees. These static types of models mimic system response in an inductive and deductive way, respectively, yet are restrictive in the ways they can represent spatial and temporal constructs. RISMC analyses are performed by using a combination of thermal-hydraulic codes and a stochastic analysis tool (RAVEN), currently under development at the Idaho National Laboratory. This paper presents a case study to show the capability of the RISMC methodology to assess the impact of a power up-rate on a boiling water reactor system during a station blackout accident scenario. We employ the system simulator code RELAP5-3D coupled with RAVEN, which performs the stochastic analysis. Our analysis is performed by: 1) sampling values of a set of parameters from the uncertainty space of interest, 2) simulating the system behavior for that specific set of parameter values, and 3) analyzing the set of simulation runs. The results obtained give a detailed investigation of the issues associated with a plant power up-rate, including the effects of station blackout accident scenarios. We are able to quantify how the timing of specific events is impacted by a higher nominal reactor core power. Such safety insights can provide useful information to decision makers performing risk-informed margins management
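
    The three analysis steps listed above (sample, simulate, analyze) can be sketched in miniature. This is illustrative only: `simulate_blackout` is a made-up surrogate standing in for a RELAP5-3D run, and the parameter distributions are invented, not taken from the study.

```python
import random

def simulate_blackout(core_power, battery_life, diesel_start_delay):
    """Toy surrogate for a thermal-hydraulic run (stands in for RELAP5-3D).

    Returns the time (h) at which core damage would occur, or None if
    power recovery happens first. The dynamics are purely illustrative.
    """
    heatup_rate = 0.5 * core_power           # a hotter core heats up faster
    time_to_damage = 10.0 / heatup_rate      # hours until damage with no cooling
    recovery_time = battery_life + diesel_start_delay
    return None if recovery_time < time_to_damage else time_to_damage

def rismc_style_study(n_samples, core_power, seed=42):
    """Steps 1-3 from the abstract: sample, simulate each sample, analyze."""
    rng = random.Random(seed)
    n_damage = 0
    for _ in range(n_samples):
        battery = rng.uniform(4.0, 8.0)       # battery life (h), invented range
        diesel = rng.expovariate(1.0 / 2.0)   # diesel recovery delay (h), invented
        if simulate_blackout(core_power, battery, diesel) is not None:
            n_damage += 1
    return n_damage / n_samples

p_nominal = rismc_style_study(10_000, core_power=1.0)
p_uprated = rismc_style_study(10_000, core_power=1.2)
```

    Because both studies reuse the same sampled parameter values, the up-rated case (which reaches damage sooner) can only fail at least as often as the nominal case, mirroring the kind of comparison the abstract describes.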

  11. Towards integrating environmental performance in divisional performance measurement

    Directory of Open Access Journals (Sweden)

    Collins C Ngwakwe

    2014-08-01

    Full Text Available This paper suggests an integration of environmental performance measurement (EPM) into conventional divisional financial performance measures as a catalyst to enhance managers’ drive toward cleaner production and sustainable development. The approach is conceptual and normative; using a hypothetical firm, it suggests a model to integrate an environmental performance measure as an ancillary to conventional divisional financial performance measures. Vroom’s motivation theory and other literature evidence indicate that corporate goals are achievable in an environment where managers’ efforts are recognised and rewarded. Consequently, the paper suggests that environmentally motivated managers are important to propel corporate sustainability strategy toward the desired corporate environmental governance and sustainable economic development. Thus, this suggested approach modestly adds to existing environmental management accounting (EMA) theory and literature. It is hoped that this paper may provide an agenda for further research toward a practical application of the suggested method in a firm.

  12. Validation approach for a fast and simple targeted screening method for 75 antibiotics in meat and aquaculture products using LC-MS/MS.

    Science.gov (United States)

    Dubreil, Estelle; Gautier, Sophie; Fourmond, Marie-Pierre; Bessiral, Mélaine; Gaugain, Murielle; Verdon, Eric; Pessel, Dominique

    2017-04-01

    An approach is described to validate a fast and simple targeted screening method for antibiotic analysis in meat and aquaculture products by LC-MS/MS. The validation strategy was applied to a panel of 75 antibiotics belonging to different families, i.e., penicillins, cephalosporins, sulfonamides, macrolides, quinolones and phenicols. The samples were extracted once with acetonitrile, concentrated by evaporation and injected into the LC-MS/MS system. The approach chosen for the validation was based on the Community Reference Laboratory (CRL) guidelines for the validation of qualitative screening methods. The aim of the validation was to prove sufficient sensitivity of the method to detect all the targeted antibiotics at the level of interest, generally the maximum residue limit (MRL). A robustness study was also performed to test the influence of different factors. The validation showed that the method can detect and identify 73 of the 75 antibiotics studied in meat and aquaculture products at the validation levels.

  13. Performance assessment and optimisation of a large information system by combined customer relationship management and resilience engineering: a mathematical programming approach

    Science.gov (United States)

    Azadeh, A.; Foroozan, H.; Ashjari, B.; Motevali Haghighi, S.; Yazdanparast, R.; Saberi, M.; Torki Nejad, M.

    2017-10-01

    Information systems (ISs) and information technologies (ITs) play a critical role in large, complex gas corporations. Many factors, such as human, organisational and environmental factors, affect ISs in an organisation; investigating IS success is therefore a complex problem. Moreover, because of the competitive business environment and the high volume of information flow in organisations, new issues such as resilient ISs and successful customer relationship management (CRM) have emerged. A resilient IS provides sustainable delivery of information to internal and external customers. This paper presents an integrated approach to enhance and optimise the performance of each component of a large IS based on CRM and resilience engineering (RE) in a gas company. This performance enhancement can help ISs perform business tasks efficiently. The data are collected from standard questionnaires and then analysed by data envelopment analysis, selecting the optimal mathematical programming approach. The selected model is validated and verified by the principal component analysis method. Finally, CRM and RE factors are identified as influential factors through sensitivity analysis for this particular case study. To the best of our knowledge, this is the first study of performance assessment and optimisation of a large IS by combined RE and CRM.

  14. A Framework for Treating Uncertainty to Facilitate Waste Disposal Decision Making - Application of the Approach to GCD Performance Assessment

    International Nuclear Information System (INIS)

    Brown, T.J.; Cochran, J.R.; Gallegos, D.P.

    1999-01-01

    This paper presents an approach for treating uncertainty in the performance assessment process to efficiently address regulatory performance objectives for radioactive waste disposal and discusses the application of the approach at the Greater Confinement Disposal site. In this approach, the performance assessment methodology uses probabilistic risk assessment concepts to guide effective decisions about site characterization activities and provides a path toward reasonable assurance regarding regulatory compliance decisions. Although the approach is particularly amenable to requirements that are probabilistic in nature, the approach is also applicable to deterministic standards such as the dose-based and concentration-based requirements

  15. Approach to modeling of human performance for purposes of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1983-01-01

    This paper describes the general approach taken in NUREG/CR-1278 to model human performance in sufficient detail to permit probabilistic risk assessments of nuclear power plant operations. To show the basis for the more specific models in the above NUREG, a simplified model of the human component in man-machine systems is presented, the role of performance shaping factors is discussed, and special problems in modeling the cognitive aspects of behavior are described

  16. EEG cross-frequency coupling associated with attentional performance: An RDoC approach to attention

    NARCIS (Netherlands)

    Gerrits, B.J.L.; Vollebregt, M.A.; Olbrich, S.; Kessels, R.P.C.; Palmer, D.; Gordon, E.; Arns, M.W.

    2016-01-01

    19th biennial IPEG Meeting: Nijmegen, The Netherlands, 26-30 October 2016. The quality of attentional performance plays a crucial role in goal-directed behavior in daily life activities, cognitive task performance, and multiple psychiatric illnesses. The Research Domain Criteria (RDoC) approach

  17. Proposed modifications of Environmental Protection Agency Method 1601 for detection of coliphages in drinking water, with same-day fluorescence-based detection and evaluation by the performance-based measurement system and alternative test protocol validation approaches.

    Science.gov (United States)

    Salter, Robert S; Durbin, Gregory W; Conklin, Ernestine; Rosen, Jeff; Clancy, Jennifer

    2010-12-01

    Coliphages are microbial indicators specified in the Ground Water Rule that can be used to monitor for potential fecal contamination of drinking water. The Total Coliform Rule specifies coliform and Escherichia coli indicators for municipal water quality testing; thus, coliphage indicator use is less common and advances in detection methodology are less frequent. Coliphages are viral structures and, compared to bacterial indicators, are more resistant to disinfection and diffuse further distances from pollution sources. Therefore, coliphage presence may serve as a better predictor of groundwater quality. This study describes Fast Phage, a 16- to 24-h presence/absence modification of U.S. Environmental Protection Agency (EPA) Method 1601 for detection of coliphages in 100 ml water. The objective of the study is to demonstrate that the somatic and male-specific coliphage modifications provide results equivalent to those of Method 1601. Five laboratories compared the modifications, featuring same-day fluorescence-based prediction, to Method 1601 by using the performance-based measurement system (PBMS) criterion. This requires a minimum 50% positive response in 10 replicates of 100-ml water samples at coliphage contamination levels of 1.3 to 1.5 PFU/100 ml. The laboratories showed that Fast Phage meets PBMS criteria with 83.5 to 92.1% correlation of the same-day rapid fluorescence-based prediction with the next-day result. Somatic coliphage PBMS data are compared to manufacturer development data that followed the EPA alternative test protocol (ATP) validation approach. Statistical analysis of the data sets indicates that PBMS utilizes fewer samples than does the ATP approach but with similar conclusions. Results support testing the coliphage modifications by using an EPA-approved national PBMS approach with collaboratively shared samples.
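
    The PBMS pass criterion quoted above (at least 50% positives out of 10 replicates) can be examined with a simple binomial model. This calculation is an illustration, not part of the study: it shows how likely a method with a given per-replicate detection probability is to meet the criterion.

```python
from math import comb

def pbms_pass_probability(p_detect, n_replicates=10, min_positive=5):
    """Probability of at least min_positive detections in n_replicates
    independent trials, i.e. of meeting the 50%-of-10 PBMS criterion."""
    return sum(
        comb(n_replicates, k) * p_detect**k * (1 - p_detect)**(n_replicates - k)
        for k in range(min_positive, n_replicates + 1)
    )
```

    Under this model, a method that detects 90% of spiked samples passes almost surely, while one detecting only 20% passes less than 4% of the time, which is why the criterion discriminates sensitive methods from insensitive ones with only 10 replicates.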

  18. A comparison of performance of several artificial intelligence methods for forecasting monthly discharge time series

    Science.gov (United States)

    Wang, Wen-Chuan; Chau, Kwok-Wing; Cheng, Chun-Tian; Qiu, Lin

    2009-08-01

    Developing a hydrological forecasting model based on past records is crucial to effective hydropower reservoir management and scheduling. Traditionally, time series analysis and modeling have been used to build mathematical models for generating hydrologic records in hydrology and water resources. Artificial intelligence (AI), as a branch of computer science, is capable of analyzing long-series and large-scale hydrological data, and applying AI technology to hydrological forecasting modeling has become a leading research topic in recent years. In this paper, autoregressive moving-average (ARMA) models, artificial neural network (ANN) approaches, adaptive neural-based fuzzy inference system (ANFIS) techniques, genetic programming (GP) models and the support vector machine (SVM) method are examined using long-term observations of monthly river flow discharges. Four standard quantitative statistical performance evaluation measures, the coefficient of correlation (R), the Nash-Sutcliffe efficiency coefficient (E), the root mean squared error (RMSE) and the mean absolute percentage error (MAPE), are employed to evaluate the performances of the various models developed. Two case study river sites are also provided to illustrate their respective performances. The results indicate that the best performance can be obtained by ANFIS, GP and SVM, in terms of different evaluation criteria, during the training and validation phases.
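
    The four evaluation measures named above are standard and can be computed directly. The sketch below uses invented discharge values purely for illustration; the function name is mine, not the paper's.

```python
import math

def evaluate_forecast(observed, simulated):
    """Compute the four metrics from the abstract: R, E (Nash-Sutcliffe),
    RMSE and MAPE. Inputs are equal-length sequences of positive flows."""
    n = len(observed)
    mean_obs = sum(observed) / n
    mean_sim = sum(simulated) / n
    cov = sum((o - mean_obs) * (s - mean_sim) for o, s in zip(observed, simulated))
    var_obs = sum((o - mean_obs) ** 2 for o in observed)
    var_sim = sum((s - mean_sim) ** 2 for s in simulated)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    return {
        "R": cov / math.sqrt(var_obs * var_sim),       # correlation
        "E": 1.0 - sse / var_obs,                      # Nash-Sutcliffe efficiency
        "RMSE": math.sqrt(sse / n),                    # root mean squared error
        "MAPE": 100.0 / n * sum(abs((o - s) / o)       # mean absolute % error
                                for o, s in zip(observed, simulated)),
    }

obs = [120.0, 95.0, 60.0, 45.0, 80.0, 150.0]   # invented monthly discharges
sim = [110.0, 100.0, 55.0, 50.0, 85.0, 140.0]
metrics = evaluate_forecast(obs, sim)
```

    A perfect forecast gives R = 1, E = 1, RMSE = 0 and MAPE = 0, which is why these four measures together capture both correlation and magnitude of error.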

  19. Investigation on multi-objective performance optimization algorithm application of fan based on response surface method and entropy method

    Science.gov (United States)

    Zhang, Li; Wu, Kexin; Liu, Yang

    2017-12-01

    A multi-objective performance optimization method is proposed, solving the problem that tuning single structural parameters of a small fan cannot balance its static characteristics against its aerodynamic noise. In this method, three structural parameters are selected as the optimization variables, and the static pressure efficiency and the aerodynamic noise of the fan are regarded as the multi-objective performance measures. The response surface method and the entropy method are used to establish the optimization function linking the optimization variables and the multi-objective performance measures, and the optimized model is found when the optimization function reaches its maximum value. Experimental data show that the optimized model not only enhances the static characteristics of the fan but also markedly reduces the noise. The results of the study will provide a reference for the multi-objective performance optimization of other types of rotating machinery.
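
    The entropy method used here to weight the two objectives can be sketched as follows. The fan data are invented; only the weighting arithmetic is the point, and it assumes positive criterion values with at least one varying column.

```python
import math

def entropy_weights(decision_matrix):
    """Entropy method: derive objective criterion weights from how much each
    criterion (column) varies across candidate designs (rows). A criterion
    whose values barely differ between designs carries little information
    and receives a small weight."""
    m = len(decision_matrix)                  # number of candidate designs
    k = 1.0 / math.log(m)
    n = len(decision_matrix[0])               # number of criteria
    divergences = []
    for j in range(n):
        col = [row[j] for row in decision_matrix]
        total = sum(col)
        p = [x / total for x in col]
        entropy = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        divergences.append(1.0 - entropy)     # more variation -> more information
    total_d = sum(divergences)
    return [d / total_d for d in divergences]

# Invented fan designs: [static pressure efficiency, normalized 1/noise]
designs = [
    [0.55, 0.80],
    [0.62, 0.74],
    [0.70, 0.58],
]
weights = entropy_weights(designs)
```

    The weights sum to one and can then combine the two objectives into the single optimization function that the abstract maximizes.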

  20. New algorithms and methods to estimate maximum-likelihood phylogenies: assessing the performance of PhyML 3.0.

    Science.gov (United States)

    Guindon, Stéphane; Dufayard, Jean-François; Lefort, Vincent; Anisimova, Maria; Hordijk, Wim; Gascuel, Olivier

    2010-05-01

    PhyML is a phylogeny software based on the maximum-likelihood principle. Early PhyML versions used a fast algorithm performing nearest neighbor interchanges to improve a reasonable starting tree topology. Since the original publication (Guindon S., Gascuel O. 2003. A simple, fast and accurate algorithm to estimate large phylogenies by maximum likelihood. Syst. Biol. 52:696-704), PhyML has been widely used (>2500 citations in ISI Web of Science) because of its simplicity and a fair compromise between accuracy and speed. In the meantime, research around PhyML has continued, and this article describes the new algorithms and methods implemented in the program. First, we introduce a new algorithm to search the tree space with user-defined intensity using subtree pruning and regrafting topological moves. The parsimony criterion is used here to filter out the least promising topology modifications with respect to the likelihood function. The analysis of a large collection of real nucleotide and amino acid data sets of various sizes demonstrates the good performance of this method. Second, we describe a new test to assess the support of the data for internal branches of a phylogeny. This approach extends the recently proposed approximate likelihood-ratio test and relies on a nonparametric, Shimodaira-Hasegawa-like procedure. A detailed analysis of real alignments sheds light on the links between this new approach and the more classical nonparametric bootstrap method. Overall, our tests show that the last version (3.0) of PhyML is fast, accurate, stable, and ready to use. A Web server and binary files are available from http://www.atgc-montpellier.fr/phyml/.

  1. Isolating DNA from sexual assault cases: a comparison of standard methods with a nuclease-based approach

    Science.gov (United States)

    2012-01-01

    Background Profiling sperm DNA present on vaginal swabs taken from rape victims often contributes to identifying and incarcerating rapists. Large amounts of the victim’s epithelial cells contaminate the sperm present on swabs, however, and complicate this process. The standard method for obtaining relatively pure sperm DNA from a vaginal swab is to digest the epithelial cells with Proteinase K in order to solubilize the victim’s DNA, and to then physically separate the soluble DNA from the intact sperm by pelleting the sperm, removing the victim’s fraction, and repeatedly washing the sperm pellet. An alternative approach that does not require washing steps is to digest with Proteinase K, pellet the sperm, remove the victim’s fraction, and then digest the residual victim’s DNA with a nuclease. Methods The nuclease approach has been commercialized in a product, the Erase Sperm Isolation Kit (PTC Labs, Columbia, MO, USA), and five crime laboratories have tested it on semen-spiked female buccal swabs in a direct comparison with their standard methods. Comparisons have also been performed on timed post-coital vaginal swabs and evidence collected from sexual assault cases. Results For the semen-spiked buccal swabs, Erase outperformed the standard methods in all five laboratories and in most cases was able to provide a clean male profile from buccal swabs spiked with only 1,500 sperm. The vaginal swabs taken after consensual sex and the evidence collected from rape victims showed a similar pattern of Erase providing superior profiles. Conclusions In all samples tested, STR profiles of the male DNA fractions obtained with Erase were as good as or better than those obtained using the standard methods. PMID:23211019

  2. A task-based parallelism and vectorized approach to 3D Method of Characteristics (MOC) reactor simulation for high performance computing architectures

    Science.gov (United States)

    Tramm, John R.; Gunow, Geoffrey; He, Tim; Smith, Kord S.; Forget, Benoit; Siegel, Andrew R.

    2016-05-01

    In this study we present and analyze a formulation of the 3D Method of Characteristics (MOC) technique applied to the simulation of full core nuclear reactors. Key features of the algorithm include a task-based parallelism model that allows independent MOC tracks to be assigned to threads dynamically, ensuring load balancing, and a wide vectorizable inner loop that takes advantage of modern SIMD computer architectures. The algorithm is implemented in a set of highly optimized proxy applications in order to investigate its performance characteristics on CPU, GPU, and Intel Xeon Phi architectures. Speed, power, and hardware cost efficiencies are compared. Additionally, performance bottlenecks are identified for each architecture in order to determine the prospects for continued scalability of the algorithm on next generation HPC architectures.
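
    The task-based model, in which independent tracks are handed to idle threads dynamically for load balancing, can be illustrated with a thread pool. This is a schematic in Python, not the authors' optimized proxy applications; `sweep_track` is a stand-in for the real transport kernel.

```python
from concurrent.futures import ThreadPoolExecutor
import math

def sweep_track(track_id, n_segments=1000):
    """Stand-in for integrating the MOC transport equation along one
    characteristic track (the real kernel would update angular fluxes)."""
    acc = 0.0
    for s in range(1, n_segments + 1):
        acc += math.exp(-s / n_segments)   # toy attenuation term
    return track_id, acc

tracks = range(64)
with ThreadPoolExecutor(max_workers=4) as pool:
    # map() hands tracks to workers as they become idle, which is the
    # dynamic-assignment / load-balancing idea described in the abstract
    results = dict(pool.map(sweep_track, tracks))
```

    In the real algorithm each task is independent in exactly this way, which is also what allows the wide inner loop over segments to be vectorized separately on each architecture.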

  3. An Efficient Approach for Identifying Stable Lobes with Discretization Method

    Directory of Open Access Journals (Sweden)

    Baohai Wu

    2013-01-01

    Full Text Available This paper presents a new approach for quick identification of chatter stability lobes with the discretization method. Firstly, three different kinds of stability regions are defined: the absolute stable region, the valid region, and the invalid region. Secondly, while identifying the chatter stability lobes, the three different regions within the chatter stability lobes are identified with relatively large time intervals. Thirdly, the stability boundary within the valid regions is finely calculated to obtain the exact chatter stability lobes. The proposed method only needs to test a small portion of the spindle speed and cutting depth set; about 89% of the computation time is saved compared with the full discretization method, and only about 10 minutes are needed to obtain the exact chatter stability lobes. Since it is based on the discretization method, the proposed approach can be used for different immersion cutting conditions, including low-immersion cutting, and can be directly implemented in the workshop to improve the efficiency of machining parameter selection.
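
    The coarse-then-fine idea (classify regions cheaply, then refine only near the stability boundary) can be sketched with a bisection refinement. The stability test below is a made-up analytic stand-in for the expensive discretization-method evaluation, so only the search strategy is illustrative.

```python
import math

def is_stable(speed, depth):
    """Hypothetical stability oracle; in the real method this would be a
    full discretization-method evaluation at (spindle speed, cutting depth)."""
    return depth < 2.0 + 0.5 * math.sin(speed / 500.0)

def refine_boundary(speeds, depth_lo=0.0, depth_hi=4.0, tol=1e-3):
    """For each spindle speed, bisect on cutting depth to locate the
    stability boundary, instead of testing the whole (speed, depth) grid
    at fine resolution."""
    lobes = []
    for s in speeds:
        lo, hi = depth_lo, depth_hi       # lo is stable, hi is unstable
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if is_stable(s, mid):
                lo = mid                  # boundary lies above mid
            else:
                hi = mid                  # boundary lies at or below mid
        lobes.append((s, 0.5 * (lo + hi)))
    return lobes

boundary = refine_boundary(range(1000, 5001, 500))
```

    Each speed needs only about log2((depth_hi - depth_lo) / tol) stability tests, which is the source of the large savings over evaluating every grid point.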

  4. Application of Numerical Optimization Methods to Perform Molecular Docking on Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    M. A. Farkov

    2014-01-01

    Full Text Available An analysis of numerical optimization methods for solving the molecular docking problem has been performed. Additional requirements for optimization methods arising from GPU architecture features were specified. A promising method for implementation on the GPU was selected; its implementation is described, and performance and accuracy tests were performed.

  5. Strategic Approach for Optimizing of Zakah Institution Performance: Customer Relationship Management

    Directory of Open Access Journals (Sweden)

    Estu Widarwati

    2016-12-01

    Full Text Available Zakah is a part of the Indonesian economy that requires development and structuring. Zakah funds must be well managed by an organizational zakah system whose performance should be improved. Therefore, there is a need for a new approach to zakah management based on muzakki’s behavior as an important resource of the zakah institution. This paper explores the role of Customer Relationship Management (CRM) in zakah institutions, given the importance of the contributions of the muzakki who use their services. It then aims to expand the understanding of how CRM, as a strategic approach, can help an organization such as a zakah institution improve its performance through three main aspects of CRM: personnel (staff behavior), business processes, and the use of technology. Furthermore, this paper tries to depict how CRM can raise the collection of zakah funds from Muslim society, especially the Muslim middle class in Indonesia, through customer (muzakki) satisfaction and cost reduction in the zakah institution. DOI: 10.15408/aiq.v9i1.4010

  6. Students’ Approaches to Learning and its Relationship with their Academic Engagement and Qualitative Performance

    Directory of Open Access Journals (Sweden)

    Mohammad Amin Bahrami

    2018-03-01

    Conclusion: The failure to confirm a relationship between students’ approaches to learning and their qualitative performance can be attributed to the students’ performance assessment mechanisms. At the same time, given the heterogeneity in the results of studies in this field, further studies are considered necessary.

  7. The power of simplicity: a fast-and-frugal heuristics approach to performance science.

    Science.gov (United States)

    Raab, Markus; Gigerenzer, Gerd

    2015-01-01

    Performance science is a fairly new multidisciplinary field that integrates performance domains such as sports, medicine, business, and the arts. To give its many branches a structure and its research a direction, it requires a theoretical framework. We demonstrate the applications of this framework with examples from sport and medicine. Because performance science deals mainly with situations of uncertainty rather than known risks, the needed framework can be provided by the fast-and-frugal heuristics approach. According to this approach, experts learn to rely on heuristics in an adaptive way in order to make accurate decisions. We investigate the adaptive use of heuristics in three ways: the descriptive study of the heuristics in the cognitive "adaptive toolbox;" the prescriptive study of their "ecological rationality," that is, the characterization of the situations in which a given heuristic works; and the engineering study of "intuitive design," that is, the design of transparent aids for making better decisions.

  8. The power of simplicity: a fast-and-frugal heuristics approach to performance science

    Science.gov (United States)

    Raab, Markus; Gigerenzer, Gerd

    2015-01-01

    Performance science is a fairly new multidisciplinary field that integrates performance domains such as sports, medicine, business, and the arts. To give its many branches a structure and its research a direction, it requires a theoretical framework. We demonstrate the applications of this framework with examples from sport and medicine. Because performance science deals mainly with situations of uncertainty rather than known risks, the needed framework can be provided by the fast-and-frugal heuristics approach. According to this approach, experts learn to rely on heuristics in an adaptive way in order to make accurate decisions. We investigate the adaptive use of heuristics in three ways: the descriptive study of the heuristics in the cognitive “adaptive toolbox;” the prescriptive study of their “ecological rationality,” that is, the characterization of the situations in which a given heuristic works; and the engineering study of “intuitive design,” that is, the design of transparent aids for making better decisions. PMID:26579051

  9. Investigating the performance of reconstruction methods used in structured illumination microscopy as a function of the illumination pattern's modulation frequency

    Science.gov (United States)

    Shabani, H.; Sánchez-Ortiga, E.; Preza, C.

    2016-03-01

    Surpassing the resolution of optical microscopy defined by the Abbe diffraction limit, while simultaneously achieving optical sectioning, is a challenging problem, particularly for live-cell imaging of thick samples. Among a few developing techniques, structured illumination microscopy (SIM) addresses this challenge by imposing higher-frequency information into the observable frequency band confined by the optical transfer function (OTF) of a conventional microscope, either doubling the spatial resolution or filling the missing cone, depending on the spatial frequency of the pattern, when the patterned illumination is two-dimensional. Standard reconstruction methods for SIM decompose the low- and high-frequency components from the recorded low-resolution images and then combine them to reach a high-resolution image. In contrast, model-based approaches rely on iterative optimization to minimize the error between estimated and forward images. In this paper, we study the performance of both groups of methods by simulating fluorescence microscopy images of different types of objects (ranging from simulated two-point sources to extended objects). These simulations are used to investigate the methods' effectiveness in restoring objects with various types of power spectrum when the modulation frequency of the patterned illumination changes from zero to the incoherent cut-off frequency of the imaging system. Our results show that increasing the amount of imposed information by using a higher modulation frequency of the illumination pattern does not always yield better restoration performance; the outcome was found to depend on the underlying object. Results from model-based restoration show performance improvement with increasing modulation frequency, quantified by up to a 62% drop in the mean square error compared to standard reconstruction. However, we found cases for which results obtained with standard reconstruction methods do not follow the same trend.

  10. Teaching Psychological Research Methods through a Pragmatic and Programmatic Approach

    Science.gov (United States)

    Rosenkranz, Patrick; Fielden, Amy; Tzemou, Effy

    2014-01-01

    Research methods teaching in psychology is pivotal in preparing students for the transition from student as learner to independent practitioner. We took an action research approach to re-design, implement and evaluate a module guiding students through a programmatic and pragmatic research cycle. These revisions allow students to experience how…

  11. Computation of a coastal protection, using classical method, the PIANC-method or a full probabilistic approach ?

    NARCIS (Netherlands)

    Verhagen, H.J.

    2003-01-01

    In a classical design approach to breakwaters, a design wave height is determined and inserted into a design formula, and some undefined margin of safety is added. In the method using partial safety coefficients (as developed by PIANC [1992] and recently also adopted by the Coastal Engineering Manual of the US

  12. Neutron activation analysis of archaeological artifacts using the conventional relative method: a realistic approach for analysis of large samples

    International Nuclear Information System (INIS)

    Bedregal, P.S.; Mendoza, A.; Montoya, E.H.; Cohen, I.M.; Universidad Tecnologica Nacional, Buenos Aires; Oscar Baltuano

    2012-01-01

    A new approach for the analysis of entire potsherds of archaeological interest by INAA, using the conventional relative method, is described. The analytical method proposed involves, primarily, the preparation of replicates of the original archaeological pottery with well-known chemical composition (standards), to be irradiated simultaneously with the original object (sample) in a well-thermalized external neutron beam of the RP-10 reactor. The basic advantage of this proposal is that it avoids complicated corrections for the effects that arise with large samples: neutron self-shielding, neutron self-thermalization and gamma-ray attenuation. In addition, and in contrast with other methods, the main advantages are the possibility of evaluating the uncertainty of the results and, fundamentally, of validating the overall methodology. (author)
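
    In the conventional relative method, the element concentration in the sample follows from comparing specific count rates of sample and standard measured under identical conditions. A minimal sketch (the function name and the simplification that flux, efficiency and decay factors cancel are mine, which is precisely what irradiating replica standards alongside the sample is meant to ensure):

```python
def concentration_by_relative_method(counts_sample, counts_std,
                                     mass_sample, mass_std, conc_std):
    """Conventional relative INAA: with sample and standard irradiated and
    counted under identical conditions, concentrations scale with the
    specific (per-mass) count rates."""
    specific_sample = counts_sample / mass_sample
    specific_std = counts_std / mass_std
    return conc_std * specific_sample / specific_std
```

    For example, a sample giving twice the specific count rate of a standard of known concentration is assigned twice that concentration.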

  13. The performances of R GPU implementations of the GMRES method

    Directory of Open Access Journals (Sweden)

    Bogdan Oancea

    2018-03-01

    Full Text Available Although the performance of commodity computers has improved drastically with the introduction of multicore processors and GPU computing, the standard R distribution is still based on a single-threaded model of computation, using only a small fraction of the computational power now available on most desktops and laptops. Modern statistical software packages rely on high-performance implementations of the linear algebra routines that are at the core of several important leading-edge statistical methods. In this paper we present a GPU implementation of the GMRES iterative method for solving linear systems and compare its performance with a pure single-threaded CPU version. We also investigate the performance of our implementation using the different GPU packages now available for R, such as gmatrix, gputools or gpuR, which are based on the CUDA or OpenCL frameworks.
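
    As context for what is being benchmarked, solving a linear system with GMRES looks like this in Python with SciPy. This is an illustration of the algorithm, not the authors' R or GPU code; the test matrix is invented.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres

# A diagonally dominant tridiagonal system, easy for restarted GMRES
n = 100
A = diags([-1.0, 4.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
x_true = np.ones(n)
b = A @ x_true                      # right-hand side with known solution

x, info = gmres(A, b, atol=1e-10)   # info == 0 signals convergence
```

    GMRES builds a Krylov subspace from repeated sparse matrix-vector products, which is exactly the operation that maps well to GPUs and motivates packages such as gpuR.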

  14. Consistency of extreme flood estimation approaches

    Science.gov (United States)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (the SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested on two different Swiss catchments. The results and some intermediate variables are used to assess potential strengths and weaknesses of each method, as well as to evaluate their consistency.
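
    The purely statistical approach mentioned first typically fits a generalized extreme value (GEV) distribution to annual maxima and reads off return levels. A sketch on synthetic data (the parameter values and series are invented, not taken from either Swiss catchment):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Synthetic 80-year record of annual maximum discharge (m^3/s)
annual_maxima = genextreme.rvs(c=-0.1, loc=300.0, scale=60.0,
                               size=80, random_state=rng)

# Fit the GEV distribution to the annual maxima by maximum likelihood
c, loc, scale = genextreme.fit(annual_maxima)

# 100-year flood: discharge exceeded with probability 1/100 in any year
q100 = genextreme.ppf(1.0 - 1.0 / 100.0, c, loc=loc, scale=scale)
```

    The short record length relative to the 100-year return period is exactly why such estimates are uncertain and why the study compares them with simulation-based and deterministic approaches.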

  15. Advanced methods in NDE using machine learning approaches

    Science.gov (United States)

    Wunderlich, Christian; Tschöpe, Constanze; Duckhorn, Frank

    2018-04-01

    Machine learning (ML) methods and algorithms have recently been applied with great success in quality control and predictive maintenance. Their goal, to build new algorithms and/or leverage existing ones that learn from training data and give accurate predictions, or that find patterns, particularly in new and unseen but similar data, fits non-destructive evaluation perfectly. The advantages of ML in NDE are obvious in tasks such as pattern recognition in acoustic signals or automated processing of images from X-ray, ultrasonic, or optical methods. Fraunhofer IKTS is using machine learning algorithms in acoustic signal analysis, and the approach has been applied to a variety of quality assessment tasks. The principal approach is based on acoustic signal processing with a primary and a secondary analysis step, followed by a cognitive system that creates model data. Already in the secondary analysis step, unsupervised learning algorithms such as principal component analysis are used to simplify data structures. In the cognitive part of the software, further unsupervised and supervised learning algorithms are trained. The sensor signals from unknown samples can then be recognized and classified automatically by the previously trained algorithms. Recently the IKTS team was able to transfer the software for signal processing and pattern recognition to a small printed circuit board (PCB). Algorithms are still trained on an ordinary PC; however, the trained algorithms run on the digital signal processor and the FPGA chip. The identical approach will be used for pattern recognition in image analysis of OCT pictures. Some key requirements have to be fulfilled, however: a sufficiently large set of training data, a high signal-to-noise ratio, and an optimized and exact fixation of components are required. The automated testing can subsequently be done by the machine. By integrating the test data of many components along the value chain further optimization including lifetime and durability
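The unsupervised simplification step mentioned in the record, principal component analysis, can be sketched in a few lines via the SVD (the feature matrix below is a toy stand-in, not IKTS's actual acoustic pipeline):

```python
import numpy as np

def pca(features, n_components):
    """PCA via SVD: returns component scores and the fraction of variance explained."""
    X = features - features.mean(axis=0)          # center each feature
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ Vt[:n_components].T              # project onto leading components
    explained = (S[:n_components] ** 2).sum() / (S ** 2).sum()
    return scores, explained

# toy "acoustic feature" matrix: 200 signals x 10 features, intrinsically 2-D
rng = np.random.default_rng(1)
latent = rng.standard_normal((200, 2))
mixing = rng.standard_normal((2, 10))
X = latent @ mixing + 0.01 * rng.standard_normal((200, 10))
scores, explained = pca(X, 2)   # two components capture nearly all the variance
```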

  16. System and Method for Monitoring Piezoelectric Material Performance

    Science.gov (United States)

    Moses, Robert W. (Inventor); Fox, Christopher L. (Inventor); Fox, Melanie L. (Inventor); Chattin, Richard L. (Inventor); Shams, Qamar A. (Inventor); Fox, Robert L. (Inventor)

    2007-01-01

    A system and method are provided for monitoring performance capacity of a piezoelectric material that may form part of an actuator or sensor device. A switch is used to selectively electrically couple an inductor to the piezoelectric material to form an inductor-capacitor circuit. Resonance is induced in the inductor-capacitor circuit when the switch is operated to create the circuit. The resonance of the inductor-capacitor circuit is monitored with the frequency of the resonance being indicative of performance capacity of the device's piezoelectric material.
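The physics behind the measurement is the LC resonance relation f = 1/(2π√(LC)), which lets a shift in resonant frequency be read as a change in the piezo element's effective capacitance. A small sketch (component values and the direction of the shift are purely illustrative assumptions):

```python
import math

def resonant_frequency(L, C):
    """Resonant frequency in Hz of an ideal LC circuit (L in henries, C in farads)."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

def capacitance_from_frequency(L, f):
    """Invert the relation: effective capacitance implied by a measured resonance."""
    return 1.0 / (L * (2.0 * math.pi * f) ** 2)

L = 10e-3                   # 10 mH inductor (assumed value)
C_healthy = 4.7e-9          # nominal piezo capacitance, 4.7 nF (assumed value)
f0 = resonant_frequency(L, C_healthy)
# a lower measured resonance implies a larger effective capacitance:
C_measured = capacitance_from_frequency(L, 0.9 * f0)
```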

  17. Peyton’s four-step approach: differential effects of single instructional steps on procedural and memory performance – a clarification study

    Directory of Open Access Journals (Sweden)

    Krautter M

    2015-05-01

    Full Text Available Markus Krautter,1 Ronja Dittrich,2 Annette Safi,2 Justine Krautter,1 Imad Maatouk,2 Andreas Moeltner,2 Wolfgang Herzog,2 Christoph Nikendei2 1Department of Nephrology, 2Department of General Internal and Psychosomatic Medicine, University of Heidelberg Medical Hospital, Heidelberg, Germany Background: Although Peyton’s four-step approach is a widely used method for skills-lab training in undergraduate medical education and has been shown to be more effective than standard instruction, it is unclear whether its superiority can be attributed to a specific single step. Purpose: We conducted a randomized controlled trial to investigate the differential learning outcomes of the separate steps of Peyton’s four-step approach. Methods: Volunteer medical students were randomly assigned to four different groups. Step-1 group received Peyton’s Step 1, Step-2 group received Peyton’s Steps 1 and 2, Step-3 group received Peyton’s Steps 1, 2, and 3, and Step-3mod group received Peyton’s Steps 1 and 2, followed by a repetition of Step 2. Following the training, the first independent performance of a central venous catheter (CVC) insertion using a manikin was video-recorded and scored by independent video assessors using binary checklists. The day after the training, memory performance during delayed recall was assessed with an incidental free recall test. Results: A total of 97 participants agreed to participate in the trial. There were no statistically significant group differences with regard to age, sex, completed education in a medical profession, completed medical clerkships, preliminary memory tests, or self-efficacy ratings. Regarding checklist ratings, Step-2 group showed a superior first independent performance of CVC placement compared to Step-1 group (P<0.001), and Step-3 group showed a superior performance to Step-2 group (P<0.009), while Step-2 group and Step-3mod group did not differ (P=0.055). The findings were similar in the incidental

  18. Random spectrum loading of dental implants: An alternative approach to functional performance assessment.

    Science.gov (United States)

    Shemtov-Yona, K; Rittel, D

    2016-09-01

    The fatigue performance of dental implants is usually assessed on the basis of cyclic S/N curves. This neither provides information on the anticipated service performance of the implant, nor does it allow detailed comparisons between implants unless a thorough statistical analysis is performed, of a kind not currently required by certification standards. The notion of an endurance limit is deemed to be of limited applicability, given the unavoidable stress concentrations and random load excursions that characterize dental implants and their service conditions. We propose a completely different approach, based on random spectrum loading, as long used in aeronautical design. The implant is randomly loaded by a sequence of loads encompassing all load levels it would endure during its service life. This approach provides a quantitative and comparable estimate of performance in terms of lifetime, based on the fact that the implant will fracture sooner or later, instead of defining a fatigue endurance limit of limited practical application. Five commercial monolithic Ti-6Al-4V implants were tested under cyclic loading, and another five under spectrum loading conditions, at room temperature in dry air. The failure modes and fracture planes were identical for all implants. The approach is discussed, including its potential applications, for systematic, straightforward and reliable comparisons of various implant designs and environments, without the need for cumbersome statistical analyses. It is believed that spectrum loading can be considered for the generation of new standardization procedures and design applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
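To illustrate lifetime estimation under random spectrum loading, the sketch below draws random load levels and accumulates fatigue damage with the Palmgren-Miner rule on a Basquin-type S-N curve. The damage rule, S-N constants, and load spectrum are illustrative assumptions, not values from the paper:

```python
import numpy as np

def cycles_to_failure(S, C=1e12, m=4.0):
    """Basquin-type S-N curve: N = C * S^-m (illustrative constants)."""
    return C * S ** -m

def spectrum_lifetime(load_sequence):
    """Cycles survived under a load spectrum, via Palmgren-Miner damage summation."""
    damage = np.cumsum(1.0 / cycles_to_failure(load_sequence))
    failed = np.nonzero(damage >= 1.0)[0]
    return failed[0] + 1 if failed.size else None   # None: survived the sequence

levels = np.array([100.0, 150.0, 200.0, 250.0])     # load levels in N (assumed)
rng = np.random.default_rng(7)
seq = rng.choice(levels, size=50_000, p=[0.55, 0.30, 0.10, 0.05])
life_nominal = spectrum_lifetime(seq)
life_severe = spectrum_lifetime(1.2 * seq)          # a 20 % harsher spectrum fails sooner
```

Because every implant eventually fractures under such a spectrum, the lifetime itself becomes the comparable performance metric, which is the paper's central point.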

  19. Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods

    Science.gov (United States)

    2010-01-01

    Background Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. Methods/Design The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. Discussion This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community. PMID:20109202

  20. Optimization of cooling tower performance analysis using Taguchi method

    Directory of Open Access Journals (Sweden)

    Ramkumar Ramakrishnan

    2013-01-01

    Full Text Available This study discusses the application of the Taguchi method in assessing maximum cooling tower effectiveness for a counter-flow cooling tower using expanded wire mesh packing. The experiments were planned based on Taguchi's L27 orthogonal array. The trials were performed under different inlet conditions of water flow rate, air flow rate, and water temperature. Signal-to-noise (S/N) ratio analysis, analysis of variance (ANOVA), and regression were carried out in order to determine the effects of the process parameters on cooling tower effectiveness and to identify optimal factor settings. Finally, confirmation tests verified the reliability of the Taguchi method for optimization of counter-flow cooling tower performance with sufficient accuracy.
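The signal-to-noise ratios at the heart of a Taguchi analysis are simple transformations of the replicated responses. A sketch of the two common cases, with toy effectiveness numbers rather than the study's data:

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi S/N ratio for responses to maximize (e.g. effectiveness)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

def sn_smaller_is_better(y):
    """Taguchi S/N ratio for responses to minimize."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# three replicated measurements of cooling tower effectiveness per trial (toy data)
trial_a = [0.61, 0.63, 0.62]
trial_b = [0.70, 0.71, 0.69]
# the trial with the higher S/N ratio is the more robust factor setting
better = max([trial_a, trial_b], key=sn_larger_is_better)
```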

  1. The Use of AC-DC-AC Methods in Assessing Corrosion Resistance Performance of Coating Systems for Magnesium Alloys

    Science.gov (United States)

    McCune, Robert C.; Upadhyay, Vinod; Wang, Yar-Ming; Battocchi, Dante

    The potential utility of AC-DC-AC electrochemical methods in comparative measures of corrosion-resisting coating system performance for magnesium alloys under consideration for the USAMP "Magnesium Front End Research and Development" project was previously shown in this forum [1]. Additional studies of this approach using statistically-designed experiments have been conducted with focus on alloy types, pretreatment, topcoat material and topcoat thickness as the variables. Additionally, sample coupons made for these designed experiments were also subjected to a typical automotive cyclic corrosion test cycle (SAE J2334) as well as ASTM B117 for comparison of relative performance. Results of these studies are presented along with advantages and limitations of the proposed methodology.

  2. The multiphonon method as a dynamical approach to octupole correlations in deformed nuclei

    International Nuclear Information System (INIS)

    Piepenbring, R.

    1986-09-01

    The octupole correlations in nuclei are studied within the framework of the multiphonon method, which essentially consists of the exact diagonalization of the total Hamiltonian in the space spanned by collective phonons. This treatment properly takes into account the Pauli principle. It is a microscopic approach based on a reflection symmetry of the potential. The spectroscopic properties of doubly even and odd-mass nuclei are nicely reproduced. The multiphonon method appears as a dynamical approach to octupole correlations in nuclei which can be compared to other models based on stable octupole deformation. 66 refs

  3. The EDIE method – towards an approach to collaboration-based persuasive design

    DEFF Research Database (Denmark)

    Hansen, Sandra Burri Gram

    2016-01-01

    This paper presents the initial steps towards a collaboration-based method for persuasive design – the EDIE method (Explore, Design, Implement, Evaluate). The method is inspired by Design-Based Research, but developed to combine different design approaches that have dominated the persuasive technology field over the past decade. The rhetorical notion of Kairos is considered a key element in the EDIE method, resulting in a distinct focus on participatory design and constructive ethics. The method is explained through a practical example of developing persuasive learning designs in collaboration...

  4. The creation of the climategate hype in blogs and newspapers: mixed methods approach

    NARCIS (Netherlands)

    Hellsten, I.; Vasileiadou, E.

    2015-01-01

    Purpose – Research into the emergence of a hype requires a mixed methods approach that takes into account both the evolution over time and mutual influences across different types of media. The purpose of this paper is to present a methodological approach to detect an emerging hype in online

  5. The creation of the climategate hype in blogs and newspapers : mixed methods approach

    NARCIS (Netherlands)

    Hellsten, I.; Vasileiadou, E.

    2015-01-01

    Purpose – Research into the emergence of a hype requires a mixed methods approach that takes into account both the evolution over time and mutual influences across different types of media. The purpose of this paper is to present a methodological approach to detect an emerging hype in online

  6. Performance of wave function and density functional methods for water hydrogen bond spin-spin coupling constants.

    Science.gov (United States)

    García de la Vega, J M; Omar, S; San Fabián, J

    2017-04-01

    Spin-spin coupling constants in the water monomer and dimer have been calculated using several wave function and density functional based methods. CCSD, MCSCF, and SOPPA wave function methods yield similar results, especially when an additive approach is used with the MCSCF. Several functionals were used to analyze their performance along the Jacob's ladder, and a set of functionals with different fractions of HF exchange was tested. Functionals with large HF exchange appropriately predict the 1J(OH), 2J(HH), and 2hJ(OO) couplings, while 1hJ(OH) is better calculated with functionals that include a reduced fraction of HF exchange. Accurate functionals for 1J(OH) and 2J(HH) have been tested in a tetramer water model. The hydrogen bond effects on these intramolecular couplings are additive when they are calculated by the SOPPA(CCSD) wave function and DFT methods. Graphical Abstract: Evaluation of the additive effect of the hydrogen bond on spin-spin coupling constants of water using WF and DFT methods.

  7. Measurement and Comparison of Variance in the Performance of Algerian Universities using models of Returns to Scale Approach

    Directory of Open Access Journals (Sweden)

    Imane Bebba

    2017-08-01

    Full Text Available This study aimed to measure and compare the performance of forty-seven Algerian universities using models of the returns to scale approach, which is based primarily on the Data Envelopment Analysis (DEA) method. In order to achieve the objective of the study, a set of variables was chosen to represent the dimension of teaching. There were three input variables: the total number of students at the undergraduate level, the number of students at the postgraduate level, and the number of permanent professors. The output variable was the total number of students holding degrees at the two levels. Four basic models of the data envelopment analysis method were applied: input-oriented and output-oriented constant returns to scale, and input-oriented and output-oriented variable returns to scale. After the analysis of the data, results revealed that eight universities achieved full efficiency according to constant returns to scale in both input and output orientations. Seventeen universities achieved full efficiency according to the input-oriented variable returns to scale model, and sixteen according to the output-oriented variable returns to scale model. Therefore, during performance measurement, the size of the university, competition, financial and infrastructure constraints, and the process of resource allocation within the university should be taken into consideration. Also, multiple input and output variables reflecting the dimensions of teaching, research, and community service should be included when measuring and assessing the performance of Algerian universities, rather than two variables that do not reflect the actual performance of these universities. Keywords: Performance of Algerian universities, Data envelopment analysis method, Constant returns to scale, Variable returns to scale, Input-orientation, Output-orientation.
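The constant-returns, input-oriented model used in such studies (the CCR model of data envelopment analysis) is one small linear program per decision-making unit. A sketch with SciPy's `linprog` on a toy single-input, single-output data set (illustrative numbers, not the study's data):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o`.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Returns theta in (0, 1]."""
    n, m = X.shape
    s = Y.shape[1]
    # decision variables: [theta, lambda_1 .. lambda_n]; minimize theta
    c = np.zeros(1 + n)
    c[0] = 1.0
    # input rows:  sum_j lambda_j x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # output rows: -sum_j lambda_j y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

# toy data: 3 universities, 1 input (staff), 1 output (graduates)
X = np.array([[2.0], [4.0], [3.0]])
Y = np.array([[2.0], [2.0], [6.0]])
effs = [dea_ccr_input(X, Y, o) for o in range(len(X))]   # -> [0.5, 0.25, 1.0]
```

A score of 1.0 marks a university on the efficient frontier; lower values give the proportional input reduction that would make it efficient.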

  8. Stego Keys Performance on Feature Based Coding Method in Text Domain

    Directory of Open Access Journals (Sweden)

    Din Roshidi

    2017-01-01

    Full Text Available A critical factor in the embedding process of any text steganography method is the key used, known as the stego key. This factor influences the success of the embedding process in hiding a message from third parties or any adversary. One important aspect of the embedding process in a text steganography method is the fitness performance of the stego key, for which three parameters have been identified: capacity ratio, embedded fitness ratio, and saving space ratio. The better the capacity ratio, embedded fitness ratio, and saving space ratio a stego key offers, the more of a message can be hidden. Therefore, the main objective of this paper is to analyze the stego keys of three feature-based coding methods in text steganography, namely CALP, VERT, and QUAD, with respect to their capacity ratio, embedded fitness ratio, and saving space ratio. It is found that the CALP method gives a better performance compared to the VERT and QUAD methods.

  9. Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.

    Science.gov (United States)

    Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen

    2017-11-01

    A new method was developed and implemented into an Excel Visual Basic for Applications (VBA) algorithm utilizing trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of a preceding recession segment. The new method and algorithm continue the development of methods and algorithms for the generation of MRCs, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R2, while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRCs using the trigonometry approach is implemented into a spreadsheet tool (MRCTools v3.0, written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free of charge software. © 2017, National Ground Water Association.
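The core operation, horizontally translating a recession segment so that its vertex lands on the preceding segment, can be sketched with simple interpolation. This is a simplified illustration of the idea, not the authors' exact trigonometric construction or their VBA implementation:

```python
import numpy as np

def translate_segment(t_prev, q_prev, t_seg, q_seg):
    """Shift a recession segment in time so its vertex (first, highest value)
    lies on the preceding segment. q_prev must be monotonically decreasing."""
    # locate the time on the preceding segment where discharge equals the vertex
    # (np.interp needs increasing x, so reverse the decreasing recession)
    t_vertex = np.interp(q_seg[0], q_prev[::-1], t_prev[::-1])
    return t_seg - t_seg[0] + t_vertex

# synthetic recession Q(t) = Q0 * exp(-k t)
k = 0.1
t_prev = np.arange(0.0, 10.5, 0.5)
q_prev = 10.0 * np.exp(-k * t_prev)
t_seg = np.arange(0.0, 5.0, 0.5)          # later segment whose vertex is Q = 5
q_seg = 5.0 * np.exp(-k * t_seg)
t_shifted = translate_segment(t_prev, q_prev, t_seg, q_seg)
# the vertex lands near t = ln(2)/k, where the master curve reaches Q = 5
```

Repeating this for every recorded recession segment and enveloping the overlapped segments yields the master recession curve.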

  10. Preliminary Axial Flow Turbine Design and Off-Design Performance Analysis Methods for Rotary Wing Aircraft Engines. Part 1; Validation

    Science.gov (United States)

    Chen, Shu-cheng, S.

    2009-01-01

    For the preliminary design and the off-design performance analysis of axial flow turbines, a pair of intermediate level-of-fidelity computer codes, TD2-2 (design; reference 1) and AXOD (off-design; reference 2), are being evaluated for use in turbine design and performance prediction for modern high-performance aircraft engines. TD2-2 employs a streamline curvature method for design, while AXOD approaches the flow analysis with an equal radius-height domain decomposition strategy. Both methods resolve only the flows in the annulus region while modeling the impact introduced by the blade rows. The mathematical formulations and derivations involved in both methods are documented in references 3 and 4 (for TD2-2) and in reference 5 (for AXOD). The focus of this paper is to discuss the fundamental issues of applicability and compatibility of the two codes as a pair of companion pieces for performing preliminary design and off-design analysis of modern aircraft engine turbines. Two validation cases for design and off-design prediction using TD2-2 and AXOD, conducted on two existing high-efficiency turbines developed and tested in the NASA/GE Energy Efficient Engine (GE-E3) Program, the High Pressure Turbine (HPT; two stages, air cooled) and the Low Pressure Turbine (LPT; five stages, un-cooled), are provided in support of the analysis and discussion presented in this paper.

  11. Visual art teachers and performance assessment methods in ...

    African Journals Online (AJOL)

    This paper examines the competencies of visual arts teachers in using performance assessment methods, and to ascertain the extent to which the knowledge, skills and experiences of teachers affect their competence in using assessment strategies in their classroom. The study employs a qualitative research design; ...

  12. Sensitive high performance liquid chromatographic method for the ...

    African Journals Online (AJOL)

    A new simple, sensitive, cost-effective and reproducible high performance liquid chromatographic (HPLC) method for the determination of proguanil (PG) and its metabolites, cycloguanil (CG) and 4-chlorophenylbiguanide (4-CPB) in urine and plasma is described. The extraction procedure is a simple three-step process ...

  13. Integrated plasma control for high performance tokamaks

    International Nuclear Information System (INIS)

    Humphreys, D.A.; Deranian, R.D.; Ferron, J.R.; Johnson, R.D.; LaHaye, R.J.; Leuer, J.A.; Penaflor, B.G.; Walker, M.L.; Welander, A.S.; Jayakumar, R.J.; Makowski, M.A.; Khayrutdinov, R.R.

    2005-01-01

    Sustaining high performance in a tokamak requires controlling many equilibrium shape and profile characteristics simultaneously with high accuracy and reliability, while suppressing a variety of MHD instabilities. Integrated plasma control, the process of designing high-performance tokamak controllers based on validated system response models and confirming their performance in detailed simulations, provides a systematic method for achieving and ensuring good control performance. For present-day devices, this approach can greatly reduce the need for machine time traditionally dedicated to control optimization, and can allow determination of high-reliability controllers prior to ever producing the target equilibrium experimentally. A full set of tools needed for this approach has recently been completed and applied to present-day devices including DIII-D, NSTX and MAST. This approach has proven essential in the design of several next-generation devices including KSTAR, EAST, JT-60SC, and ITER. We describe the method, results of design and simulation tool development, and recent research producing novel approaches to equilibrium and MHD control in DIII-D. (author)

  14. A novel approach to the experimental study on methane/steam reforming kinetics using the Orthogonal Least Squares method

    Science.gov (United States)

    Sciazko, Anna; Komatsu, Yosuke; Brus, Grzegorz; Kimijima, Shinji; Szmyd, Janusz S.

    2014-09-01

    For a mathematical model based on the results of physical measurements, it becomes possible to determine their influence on the final solution and its accuracy. In classical approaches, however, the influence of different model simplifications on the reliability of the obtained results is usually not comprehensively discussed. This paper presents a novel approach to the study of methane/steam reforming kinetics based on an advanced methodology called the Orthogonal Least Squares method. The kinetics of the reforming process published earlier are divergent among themselves. To obtain the most probable values of the kinetic parameters and enable direct and objective model verification, an appropriate calculation procedure needs to be proposed. The applied Generalized Least Squares (GLS) method includes all the experimental results in the mathematical model, which becomes internally contradictory, as the number of equations is greater than the number of unknown variables. The GLS method is adopted to select the most probable values of the results and simultaneously determine the uncertainty coupled with all the variables in the system. In this paper, the reaction rate was evaluated after a pre-determination made by preliminary calculations based on experimental results obtained over a nickel/yttria-stabilized zirconia catalyst.
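The least-squares machinery such kinetic studies build on can be illustrated with a weighted linear fit of Arrhenius-type kinetics, k = A·exp(-E/RT), where each measurement carries its own uncertainty. The data and parameter values below are synthetic assumptions; the paper's full GLS formulation also propagates uncertainty to every variable in the overdetermined system:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def weighted_arrhenius_fit(T, k_obs, sigma_lnk):
    """Weighted least squares for ln k = ln A - E/(R T).
    Returns (ln A, E) and their standard errors."""
    X = np.column_stack([np.ones_like(T), -1.0 / (R * T)])  # design matrix
    W = np.diag(1.0 / sigma_lnk ** 2)                       # weight = 1/variance
    cov = np.linalg.inv(X.T @ W @ X)
    params = cov @ X.T @ W @ np.log(k_obs)
    return params, np.sqrt(np.diag(cov))

# synthetic reforming-like data: E = 100 kJ/mol, A = 1e6 (assumed values)
T = np.array([873.0, 923.0, 973.0, 1023.0])                 # temperatures, K
k_true = 1e6 * np.exp(-100e3 / (R * T))
sigma = np.full_like(T, 0.05)                               # 5 % uncertainty in ln k
params, stderr = weighted_arrhenius_fit(T, k_true, sigma)
lnA_hat, E_hat = params
```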

  15. The Presentation of Self in Letters of Application: A Mixed-Method Approach

    Science.gov (United States)

    Soroko, Emilia

    2012-01-01

    The application letter, as the first phase of employment-seeking, is an opportunity for a job applicant to make a favorable impression on a potential employer. In the current study, the author used a mixed-method approach to empirically explore strategies for self-presentation in job application letters and determine the methods used in the…

  16. Evaluation Method for Low-Temperature Performance of Lithium Battery

    Science.gov (United States)

    Wang, H. W.; Ma, Q.; Fu, Y. L.; Tao, Z. Q.; Xiao, H. Q.; Bai, H.; Bai, H.

    2018-05-01

    In this paper, an evaluation method for the low-temperature performance of lithium batteries is established. Low-temperature performance levels were set up to determine the best operating temperature range of lithium batteries using different cathode materials. The results are shared with consumers to promote proper use of lithium batteries, giving them a longer service life and avoiding early rejection.

  17. A Humanistic Approach to Performance-Based Teacher Education. PBTE Series No. 10.

    Science.gov (United States)

    Nash, Paul

    Questions are raised in making performance-based teacher education (PBTE) a more humanistic enterprise. A definition of the term "humanistic" could include such qualities as freedom, uniqueness, creativity, productivity, wholeness, responsibility, and social humanization. As to freedom, a humanistic approach to PBTE would encourage people to act…

  18. Mixing Methods in Organizational Ethics and Organizational Innovativeness Research : Three Approaches to Mixed Methods Analysis

    OpenAIRE

    Riivari, Elina

    2015-01-01

    This chapter discusses three categories of mixed methods analysis techniques: variable-oriented, case-oriented, and process/experience-oriented. All three categories combine qualitative and quantitative approaches to research methodology. The major differences among the categories are the focus of the study, the available analysis techniques, and the temporal aspect of the study. In variable-oriented analysis, the focus of the study is the relationships between the research phenomena. In case-oriente...

  19. Roles and methods of performance evaluation of hospital academic leadership.

    Science.gov (United States)

    Zhou, Ying; Yuan, Huikang; Li, Yang; Zhao, Xia; Yi, Lihua

    2016-01-01

    The rapidly advancing implementation of public hospital reform urgently requires the identification and classification of a pool of exceptional medical specialists, corresponding with incentives to attract and retain them, providing a nucleus of distinguished expertise to ensure public hospital preeminence. This paper examines the significance of academic leadership, from a strategic management perspective, including various tools, methods and mechanisms used in the theory and practice of performance evaluation, and employed in the selection, training and appointment of academic leaders. Objective methods of assessing leadership performance are also provided for reference.

  20. A high-performance spatial database based approach for pathology imaging algorithm evaluation.

    Science.gov (United States)

    Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A D; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J; Saltz, Joel H

    2013-01-01

    Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Its goals are to: (1) develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) develop a set of queries to support data sampling and result comparisons; and (4) achieve high-performance computation capacity via a parallel data management infrastructure, with parallel data loading and spatial indexing optimizations. We have considered two scenarios for algorithm evaluation: (1) algorithm comparison, where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation, where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The validated data were formatted based on the PAIS data model and

  1. Measuring health system strengthening: application of the balanced scorecard approach to rank the baseline performance of three rural districts in Zambia.

    Science.gov (United States)

    Mutale, Wilbroad; Godfrey-Fausset, Peter; Mwanamwenge, Margaret Tembo; Kasese, Nkatya; Chintu, Namwinga; Balabanova, Dina; Spicer, Neil; Ayles, Helen

    2013-01-01

    There is growing interest in health system performance, and recently WHO launched a report on health systems strengthening emphasising the need for close monitoring using system-wide approaches. One recent method is the balanced scorecard system. There is limited application of this method in middle- and low-income countries. This paper applies the concept of the balanced scorecard to describe the baseline status of three intervention districts in Zambia. The Better Health Outcome through Mentoring and Assessment (BHOMA) project is a randomised stepped-wedge community intervention that aims to strengthen the health system in three districts of the Republic of Zambia. To assess the baseline status of the participating districts we used a modified balanced scorecard approach following the domains highlighted in the MOH 2011 Strategic Plan. Differences in performance were noted by district and residence. The finance and service delivery domains performed poorly in all study districts. The proportion of health workers receiving training in the past 12 months was lowest in Kafue (58%) and highest in Luangwa district (77%). Under service capacity, basic equipment and laboratory capacity scores showed major variation, with Kafue and Luangwa having lower scores than Chongwe. The finance domain showed that Kafue and Chongwe had lower scores (44% and 47% respectively). A regression model showed that children's clinical observation scores were negatively correlated with drug availability (coeff -0.40, p = 0.02). Adult clinical observation scores were positively associated with the adult service satisfaction score (coeff 0.82, p = 0.04) and service readiness (coeff 0.54, p = 0.03). The study applied the balanced scorecard to describe the baseline status of 42 health facilities in three districts of Zambia. Differences in performance were noted by district and residence in most domains, with finance and service delivery performing poorly in all study districts. This tool could

  2. An Intelligent Optimization Method for Vortex-Induced Vibration Reducing and Performance Improving in a Large Francis Turbine

    Directory of Open Access Journals (Sweden)

    Xuanlin Peng

    2017-11-01

    Full Text Available In this paper, a new methodology is proposed to reduce the vortex-induced vibration (VIV and improve the performance of the stay vane in a 200-MW Francis turbine. The process can be divided into two parts. Firstly, a diagnosis method for stay vane vibration based on field experiments and a finite element method (FEM is presented. It is found that the resonance between the Kármán vortex and the stay vane is the main cause for the undesired vibration. Then, we focus on establishing an intelligent optimization model of the stay vane’s trailing edge profile. To this end, an approach combining factorial experiments, extreme learning machine (ELM and particle swarm optimization (PSO is implemented. Three kinds of improved profiles of the stay vane are proposed and compared. Finally, the profile with a Donaldson trailing edge is adopted as the best solution for the stay vane, and verifications such as computational fluid dynamics (CFD simulations, structural analysis and fatigue analysis are performed to validate the optimized geometry.
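The optimization stage described above couples a learned surrogate with particle swarm optimization. As a minimal illustrative sketch (not the authors' ELM-based implementation), the following runs a basic PSO loop to minimize a hypothetical surrogate objective over two trailing-edge profile parameters; the objective function, bounds and parameter meanings are assumptions for illustration only.

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over box bounds with a basic particle swarm (a sketch)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # keep the particle inside the design-variable box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical surrogate: penalize deviation from a target trailing-edge
# angle (deg) and thickness (mm); stands in for the trained ELM model.
surrogate = lambda x: (x[0] - 30.0) ** 2 + (x[1] - 4.0) ** 2
best, val = pso_minimize(surrogate, [(10.0, 60.0), (1.0, 10.0)])
```

In the paper's setting, the surrogate would be the ELM model trained on the factorial-experiment CFD results rather than this toy quadratic.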

  3. The comparison of the energy performance of hotel buildings using PROMETHEE decision-making method

    Directory of Open Access Journals (Sweden)

    Vujosevic Milica L.

    2016-01-01

    Full Text Available The annual energy performance of atrium-type hotel buildings under Belgrade climate conditions is analysed in this paper. The objective is to examine the impact of the atrium on the hotel building’s energy needs for space heating and cooling, thus establishing the best design among four proposed alternatives of hotels with an atrium. The energy performance results are obtained using the EnergyPlus simulation engine, taking into account Belgrade climate data and thermal comfort parameters. The selected results are compared and the hotels are ranked according to certain criteria. The decision-making process that resulted in the ranking of the proposed alternatives is conducted using the PROMETHEE method and the Borda model. The methodological approach in this research includes the creation of a hypothetical model of an atrium-type hotel building, numerical simulation of the energy performance of four design alternatives of the hotel building with an atrium, comparative analysis of the obtained results and ranking of the proposed alternatives from the building’s energy performance perspective. The main task of the analysis is to examine the influence of the atrium, in both its shape and position, on the energy performance of the hotel building. Based on the results of the research, it is possible to determine the most energy-efficient model of the hotel building with an atrium for Belgrade climate conditions. [Projekat Ministarstva nauke Republike Srbije: Spatial, Environmental, Energy and Social aspects of the Developing Settlements and Climate Change - Mutual Impacts]
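The PROMETHEE ranking step can be sketched in a few lines. This is a minimal PROMETHEE II implementation using the usual (step) preference function; the hotel variants, criteria and weights below are hypothetical, not the paper's data.

```python
def promethee_ii(matrix, weights, maximize):
    """Rank alternatives by PROMETHEE II net outranking flow.

    matrix[i][j] is alternative i's score on criterion j; 'usual'
    preference function: full preference for any positive difference.
    """
    n = len(matrix)
    phi = [0.0] * n
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            pref_ab = pref_ba = 0.0
            for j, w in enumerate(weights):
                d = matrix[a][j] - matrix[b][j]
                if not maximize[j]:
                    d = -d  # for cost criteria, smaller is better
                pref_ab += w * (1.0 if d > 0 else 0.0)
                pref_ba += w * (1.0 if d < 0 else 0.0)
            phi[a] += pref_ab - pref_ba
    return [p / (n - 1) for p in phi]  # net flow per alternative

# Hypothetical hotel variants: [heating demand, cooling demand] in kWh/m2
# (both criteria minimized).
variants = [[60, 35], [55, 33], [70, 30]]
flows = promethee_ii(variants, [0.5, 0.5], maximize=[False, False])
```

The alternative with the largest net flow is ranked first; a Borda count over several such rankings could then be applied as in the paper.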

  4. Evaluation method for the drying performance of enzyme containing formulations

    DEFF Research Database (Denmark)

    Sloth, Jakob; Bach, P.; Jensen, Anker Degn

    2008-01-01

    A method is presented for fast and cheap evaluation of the performance of enzyme containing formulations in terms of preserving the highest enzyme activity during spray drying. The method is based on modeling the kinetics of the thermal inactivation reaction which occurs during the drying process. Relevant kinetic parameters are determined from differential scanning calorimeter (DSC) experiments and the model is used to simulate the severity of the inactivation reaction for temperatures and moisture levels relevant for spray drying. After conducting experiments and subsequent simulations for a number of different formulations, it may be deduced which formulation performs best. This is illustrated by a formulation design study where 4 different enzyme containing formulations are evaluated. The method is validated by comparison to pilot scale spray dryer experiments.
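The core of such a model is first-order thermal inactivation with an Arrhenius rate constant, integrated along the temperature history a particle experiences during drying. A minimal sketch, with hypothetical kinetic parameters standing in for DSC-fitted values (moisture dependence omitted):

```python
import math

def residual_activity(temps_K, dt, Ea=300e3, A=1e40):
    """Integrate first-order inactivation dA/dt = -k(T) * A along a
    temperature history (one value per time step of length dt seconds).

    k(T) = A * exp(-Ea / (R T)); Ea (J/mol) and A (1/s) are hypothetical
    placeholders for parameters fitted from DSC experiments.
    """
    R = 8.314  # gas constant, J/(mol K)
    ln_a = 0.0
    for T in temps_K:
        k = A * math.exp(-Ea / (R * T))
        ln_a -= k * dt
    return math.exp(ln_a)  # fraction of initial enzyme activity remaining

# A hotter drying profile should inactivate more enzyme than a cooler one.
cool = residual_activity([330.0] * 60, 1.0)  # 60 s at 330 K
hot = residual_activity([360.0] * 60, 1.0)   # 60 s at 360 K
```

Comparing such residual-activity predictions across candidate formulations (each with its own fitted Ea and A) is the screening idea the abstract describes.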

  5. HANDBOOK OF SOCCER MATCH ANALYSIS: A SYSTEMATIC APPROACH TO IMPROVING PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Christopher Carling

    2006-03-01

    Full Text Available DESCRIPTION This book addresses and clearly explains soccer match analysis, looks at the very latest in match analysis research, and at the innovative technologies used by professional clubs. The handbook also bridges the gap between research, theory and practice. Its methods can be used by coaches, sport scientists and fitness coaches to improve: styles of play, technical ability and physical fitness; objective feedback to players; the development of specific training routines; use of available notation software, video analysis and manual systems; and understanding of current academic research in soccer notational analysis. PURPOSE The aim is to provide a practical manual on soccer match analysis for coaches and sport scientists, so that professionals in this field can gather objective data on the players and the team, which in turn can be used by coaches and players to learn more about performance as a whole and gain a competitive advantage as a result. The book efficiently meets these objectives. AUDIENCE The book is targeted at the athlete, the coach, the sports science professional or any sport-conscious person who wishes to analyze relevant soccer performance. The editors and the contributors are authorities in their respective fields, and this handbook draws on their extensive experience and knowledge accumulated over the years. FEATURES The book demonstrates how a notation system can be established to produce data to analyze and improve performance in soccer. It is composed of 9 chapters which present the information in an order that is considered logical and progressive, as in most texts. Chapter headings are: 1. Introduction to Soccer Match Analysis, 2. Developing a Manual Notation System, 3. Video and Computerized Match Analysis Technology, 4. General Advice on Analyzing Match Performance, 5. Analysis and Presentation of the Results, 6. Motion Analysis and Consequences for Training, 7. What Match

  6. Organizational Learning and Strategy: Information Processing Approach of Organizational Learning to Perform Strategic Choice Analysis

    Directory of Open Access Journals (Sweden)

    Agustian Budi Prasetya

    2017-03-01

    Full Text Available The study of organizational learning needs to engage with strategy in order to understand a company’s organizational knowledge and how the company applies that knowledge in response to a changing environment. The analysis in this research was based on a thorough desk review of the existing literature. The research analyzed the viewpoints of different researchers in organizational learning and elaborates on the information-processing approach to Organizational Learning (OL). Based on this desk research, the paper discusses the information-processing approach to explaining organizational learning and strategy choice by describing the importance of information and assumptions; the activities of knowledge acquisition, interpretation and distribution of knowledge; and the typology of exploitation and exploration learning. It proposes that it is important for the company to align its internal managerial process arrangements with the external environment while learning, based on the strategic choice space as a theoretical clustering map of the learning, fit, alignment and alliances of the organization. The research finds that the strategic choice space can help in balancing exploitation and exploration learning when analyzing varied firm characteristics, strategic orientations and industrial environments.

  7. Performance of particle in cell methods on highly concurrent computational architectures

    International Nuclear Information System (INIS)

    Adams, M.F.; Ethier, S.; Wichmann, N.

    2009-01-01

    Particle in cell (PIC) methods are effective in computing the Vlasov-Poisson system of equations used in simulations of magnetic fusion plasmas. PIC methods use grid based computations, for solving Poisson's equation or more generally Maxwell's equations, as well as Monte-Carlo type methods to sample the Vlasov equation. The presence of two types of discretizations, deterministic field solves and Monte-Carlo methods for the Vlasov equation, poses challenges in understanding and optimizing performance on today's large scale computers which require high levels of concurrency. These challenges arise from the need to optimize two very different types of processes and the interactions between them. Modern cache based high-end computers have very deep memory hierarchies and high degrees of concurrency which must be utilized effectively to achieve good performance. The effective use of these machines requires maximizing concurrency by eliminating serial or redundant work and minimizing global communication. A related issue is minimizing the memory traffic between levels of the memory hierarchy because performance is often limited by the bandwidths and latencies of the memory system. This paper discusses some of the performance issues, particularly in regard to parallelism, of PIC methods. The gyrokinetic toroidal code (GTC) is used for these studies and a new radial grid decomposition is presented and evaluated. Scaling of the code is demonstrated on ITER-sized plasmas with up to 16K Cray XT3/4 cores.
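The grid half of a PIC step, where particle data meets grid data and memory traffic becomes the bottleneck, can be illustrated with cloud-in-cell (linear-weighting) charge deposition on a periodic 1-D grid. This is a generic textbook sketch, not GTC's gyrokinetic implementation:

```python
def deposit_cic(positions, charge, ngrid, length):
    """Cloud-in-cell charge deposition onto a periodic 1-D grid.

    Each particle's charge is split linearly between the two nearest
    grid points; rho is charge density (charge per unit length).
    """
    dx = length / ngrid
    rho = [0.0] * ngrid
    for x in positions:
        xg = (x % length) / dx       # position in grid units
        i = int(xg)
        frac = xg - i                # fractional distance to next node
        rho[i % ngrid] += charge * (1.0 - frac) / dx
        rho[(i + 1) % ngrid] += charge * frac / dx
    return rho

rho = deposit_cic([0.5, 1.5, 2.5], 1.0, ngrid=8, length=8.0)
```

Scattered updates like `rho[i] += ...` are exactly the irregular memory accesses (and, in parallel, the write conflicts) that the paper's decomposition and cache-optimization discussion targets.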

  8. Performance of particle in cell methods on highly concurrent computational architectures

    International Nuclear Information System (INIS)

    Adams, M F; Ethier, S; Wichmann, N

    2007-01-01

    Particle in cell (PIC) methods are effective in computing the Vlasov-Poisson system of equations used in simulations of magnetic fusion plasmas. PIC methods use grid based computations, for solving Poisson's equation or more generally Maxwell's equations, as well as Monte-Carlo type methods to sample the Vlasov equation. The presence of two types of discretizations, deterministic field solves and Monte-Carlo methods for the Vlasov equation, poses challenges in understanding and optimizing performance on today's large scale computers which require high levels of concurrency. These challenges arise from the need to optimize two very different types of processes and the interactions between them. Modern cache based high-end computers have very deep memory hierarchies and high degrees of concurrency which must be utilized effectively to achieve good performance. The effective use of these machines requires maximizing concurrency by eliminating serial or redundant work and minimizing global communication. A related issue is minimizing the memory traffic between levels of the memory hierarchy because performance is often limited by the bandwidths and latencies of the memory system. This paper discusses some of the performance issues, particularly in regard to parallelism, of PIC methods. The gyrokinetic toroidal code (GTC) is used for these studies and a new radial grid decomposition is presented and evaluated. Scaling of the code is demonstrated on ITER-sized plasmas with up to 16K Cray XT3/4 cores.

  9. Methods of evaluating performance in controlling marketing activities

    OpenAIRE

    Codruţa Dura

    2002-01-01

    There are specific methods for assessing and improving the effectiveness of a marketing strategy. A marketer should state in the marketing plan what a marketing strategy is supposed to accomplish. These statements should set forth performance standards, which usually are stated in terms of profits, sales, or costs.

  10. Implementing a flipped classroom approach in a university numerical methods mathematics course

    Science.gov (United States)

    Johnston, Barbara M.

    2017-05-01

    This paper describes and analyses the implementation of a 'flipped classroom' approach, in an undergraduate mathematics course on numerical methods. The approach replaced all the lecture contents by instructor-made videos and was implemented in the consecutive years 2014 and 2015. The sequential case study presented here begins with an examination of the attitudes of the 2014 cohort to the approach in general as well as analysing their use of the videos. Based on these responses, the instructor makes a number of changes (for example, the use of 'cloze' summary notes and the introduction of an extra, optional tutorial class) before repeating the 'flipped classroom' approach the following year. The attitudes to the approach and the video usage of the 2015 cohort are then compared with the 2014 cohort and further changes that could be implemented for the next cohort are suggested.

  11. Intelligent Knowledge Recommendation Methods for R&D Knowledge Portals

    Institute of Scientific and Technical Information of China (English)

    KIM Jongwoo; LEE Hongjoo; PARK Sungjoo

    2004-01-01

    Personalization in knowledge portals and knowledge management systems is mainly performed based on users' explicitly specified categories and keywords. The explicit specification approach requires users' participation to start personalization services and has limited ability to adapt to changes in users' preferences. This paper suggests two implicit personalization approaches: an automatic user category assignment method and an automatic keyword profile generation method. The performance of the implicit personalization approaches is compared with a traditional personalization approach in an Internet news site experiment. The result of the experiment shows that the suggested personalization approaches provide sufficient recommendation effectiveness while lessening users' unwanted involvement in the personalization process.
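An implicit keyword-profile approach of this general kind can be sketched as follows: aggregate term frequencies from articles the user has read, then recommend the candidate article most similar (by cosine similarity) to that profile. This is a simplified stand-in, not the paper's actual algorithm:

```python
import math
from collections import Counter

def build_profile(read_docs):
    """Implicit keyword profile: term frequencies over articles a user has read."""
    prof = Counter()
    for doc in read_docs:
        prof.update(doc.lower().split())
    return prof

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(profile, candidates):
    """Return the candidate article closest to the user's implicit profile."""
    scored = [(cosine(profile, Counter(c.lower().split())), c) for c in candidates]
    return max(scored)[1]

profile = build_profile(["fusion plasma simulation", "plasma turbulence"])
best = recommend(profile, ["plasma confinement results", "stock market news"])
```

The point of the implicit approach is that the profile is built from reading behavior alone, with no explicit category or keyword input from the user.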

  12. Are the new automated methods for bone age estimation advantageous over the manual approaches?

    Science.gov (United States)

    De Sanctis, Vincenzo; Soliman, Ashraf T; Di Maio, Salvatore; Bedair, Said

    2014-12-01

    Bone Age Assessment (BAA) is performed worldwide for the evaluation of endocrine, genetic and chronic diseases, to monitor response to medical therapy and to determine the growth potential of children and adolescents. It is also used for consultation in planning orthopedic procedures, for determination of chronological age for adopted children, youth sports participation and in forensic settings. The main clinical methods for skeletal bone age estimation are the Greulich and Pyle (GP) and the Tanner and Whitehouse (TW) methods. Seventy-six per cent (76%) of radiologists or pediatricians usually use the GP method, 20% the TW method and 4% other methods. The advantages of using the TW method, as opposed to the GP method, are that it overcomes the subjectivity problem and its results are more reproducible. However, it is complex and time consuming; for this reason its usage is only about 20% on a world-wide scale. Moreover, there is some evidence that bone age assignments by different physicians can differ significantly. Computerized and Quantitative Ultrasound Technologies (QUS) for assessing skeletal maturity have been developed with the aim of reducing many of the inconsistencies associated with radiographic investigations. In spite of the fact that the volume of automated methods for BAA has increased, the majority of them are still in an early phase of development. QUS is comparable to the GP based method, but there is not enough established data yet for the healthy population. The authors wish to draw attention to the accuracy, reliability and consistency of BAA and to initiate a debate on manual versus automated approaches, to enhance our assessment of skeletal maturation in children and adolescents.

  13. Fuzzy methods in decision making process - A particular approach in manufacturing systems

    Science.gov (United States)

    Coroiu, A. M.

    2015-11-01

    We are living in a competitive environment, so we can see and understand that the most of manufacturing firms do the best in order to accomplish meeting demand, increasing quality, decreasing costs, and delivery rate. In present a stake point of interest is represented by the development of fuzzy technology. A particular approach for this is represented through the development of methodologies to enhance the ability to managed complicated optimization and decision making aspects involving non-probabilistic uncertainty with the reason to understand, development, and practice the fuzzy technologies to be used in fields such as economic, engineering, management, and societal problems. Fuzzy analysis represents a method for solving problems which are related to uncertainty and vagueness; it is used in multiple areas, such as engineering and has applications in decision making problems, planning and production. As a definition for decision making process we can use the next one: result of mental processes based upon cognitive process with a main role in the selection of a course of action among several alternatives. Every process of decision making can be represented as a result of a final choice and the output can be represented as an action or as an opinion of choice. Different types of uncertainty can be discovered in a wide variety of optimization and decision making problems related to planning and operation of power systems and subsystems. The mixture of the uncertainty factor in the construction of different models serves for increasing their adequacy and, as a result, the reliability and factual efficiency of decisions based on their analysis. Another definition of decision making process which came to illustrate and sustain the necessity of using fuzzy method: the decision making is an approach of choosing a strategy among many different projects in order to achieve some purposes and is formulated as three different models: high risk decision, usual risk

  14. Materials and methods for higher performance screen-printed flexible MRI receive coils.

    Science.gov (United States)

    Corea, Joseph R; Lechene, P Balthazar; Lustig, Michael; Arias, Ana C

    2017-08-01

    To develop methods for characterizing materials used in screen-printed MRI coils and improve signal-to-noise ratio (SNR) with new lower-loss materials. An experimental apparatus was created to characterize dielectric properties of plastic substrates used in receive coils. Coils were fabricated by screen printing conductive ink onto several plastic substrates. Unloaded and sample-loaded quality factor (Q_unloaded/Q_loaded) measurements and scans on a 3T scanner were used to characterize coil performance. An experimental method was developed to describe the relationship between a coil's Q_unloaded and the SNR it provides in images of a phantom. In addition, 3T scans of a phantom and the head of a volunteer were obtained with a proof-of-concept printed eight-channel array, and the results were compared with a commercial 12-channel array. Printed coils with optimized substrates exhibited up to 97% of the image SNR when compared with a traditional coil on a loading phantom. Q_unloaded and the SNR of coils were successfully correlated. The printed array resulted in images comparable to the quality given by the commercial array. Using the proposed methods and materials, the SNR of printed coils approached that of commercial coils while using a new fabrication technique that provided more flexibility and close contact with the patient's body. Magn Reson Med 78:775-783, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  15. A new approach to cost effective projects: High performance project teams

    International Nuclear Information System (INIS)

    Chambers, N.C.

    1994-01-01

    In a low oil price environment in which environmental conditions are more challenging, reservoir characteristics less favourable and political risk increasing, successful projects are all the more essential. The present paper deals with the visionary process of establishing high performance project teams. According to the author, such project teams embody a dynamic recognition of holism. Holism is achieved as an output from the process of establishing the drivers and enablers for success on a project. These take shape during the unfolding of the operator's development plans and contracting strategy. The paper discusses the main drivers of project teams, comprising purpose and performance goals, selection, common approach, commitment and accountability, and financial alignment.

  16. A new approach to cost effective projects: High performance project teams

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, N.C. [Brown and Root Energy Services (United Kingdom)]

    1994-12-31

    In a low oil price environment in which environmental conditions are more challenging, reservoir characteristics less favourable and political risk increasing, successful projects are all the more essential. The present paper deals with the visionary process of establishing high performance project teams. According to the author, such project teams embody a dynamic recognition of holism. Holism is achieved as an output from the process of establishing the drivers and enablers for success on a project. These take shape during the unfolding of the operator's development plans and contracting strategy. The paper discusses the main drivers of project teams, comprising purpose and performance goals, selection, common approach, commitment and accountability, and financial alignment.

  17. Methodical Approaches to Communicative Providing of Retailer Branding

    Directory of Open Access Journals (Sweden)

    Andrey Kataev

    2017-07-01

    Full Text Available The thesis is devoted to developing methodical approaches to the communicative support of branding for retail trade enterprises. The article considers the features of brand perception by retail consumers and clarifies the specifics of customers' views of stores for the procedures that accompany brand management. It is argued that, besides the traditional communication mix, the most important tool of communicative influence on buyers is the store itself as a place for comfortable shopping. The shop should have a stimulating effect on all five human senses, including sight, smell, hearing, touch and taste, which helps maximize consumer engagement in the buying process.

  18. Distributed and parallel approach for handling and processing huge datasets

    Science.gov (United States)

    Konopko, Joanna

    2015-12-01

    Big Data refers to dynamic, large and disparate volumes of data coming from many different sources (tools, machines, sensors, mobile devices) uncorrelated with each other. It requires new, innovative and scalable technology to collect, host and analytically process the vast amount of data. A proper architecture is needed for a system that processes such huge data sets. In this paper, distributed and parallel system architectures are compared using the example of the MapReduce (MR) Hadoop platform and a parallel database platform (DBMS). The paper also analyzes the problem of extracting valuable information from petabytes of data. Both paradigms, MapReduce and parallel DBMS, are described and compared. A hybrid architecture approach is also proposed, which could be used to solve the analyzed problem of storing and processing Big Data.
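The MapReduce paradigm being compared here can be sketched in a few lines of plain Python: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. The classic word-count example (the framework names are illustrative, not Hadoop's API):

```python
from collections import defaultdict

def map_phase(docs, mapper):
    """Apply the user-defined mapper to every input record."""
    pairs = []
    for doc in docs:
        pairs.extend(mapper(doc))
    return pairs

def shuffle(pairs):
    """Group emitted (key, value) pairs by key."""
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(groups, reducer):
    """Apply the user-defined reducer to each key's list of values."""
    return {k: reducer(k, vs) for k, vs in groups.items()}

wc_map = lambda doc: [(w, 1) for w in doc.split()]
wc_reduce = lambda k, vs: sum(vs)
counts = reduce_phase(shuffle(map_phase(["big data", "big deal"], wc_map)),
                      wc_reduce)
# counts == {"big": 2, "data": 1, "deal": 1}
```

In Hadoop the same three stages run distributed across nodes with the shuffle performed over the network, which is where its scalability and its overheads relative to a parallel DBMS both arise.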

  19. Examining the Performance of Statistical Downscaling Methods: Toward Matching Applications to Data Products

    Science.gov (United States)

    Dixon, K. W.; Lanzante, J. R.; Adams-Smith, D.

    2017-12-01

    Several challenges exist when seeking to use future climate model projections in a climate impacts study. A not uncommon approach is to utilize climate projection data sets derived from more than one future emissions scenario and from multiple global climate models (GCMs). The range of future climate responses represented in the set is sometimes taken to be indicative of levels of uncertainty in the projections. Yet GCM outputs are deemed to be unsuitable for direct use in many climate impacts applications. GCM grids typically are viewed as being too coarse. Additionally, regional or local-scale biases in a GCM's simulation of the contemporary climate that may not be problematic from a global climate modeling perspective may be unacceptably large for a climate impacts application. Statistical downscaling (SD) of climate projections - a type of post-processing that uses observations to inform the refinement of GCM projections - is often used in an attempt to account for GCM biases and to provide additional spatial detail. "What downscaled climate projection is the best one to use" is a frequently asked question, but one that is not always easy to answer, as it can be dependent on stakeholder needs and expectations. Here we present results from a perfect model experimental design illustrating how SD performance can vary not only by SD method, but also by location, season, climate variable of interest, amount of projected climate change, SD configuration choices, and whether one is interested in central tendencies or the tails of the distribution. Awareness of these factors can be helpful when seeking to determine the suitability of downscaled climate projections for specific climate impacts applications. It also points to the potential value of considering more than one SD data product in a study, so as to acknowledge uncertainties associated with the strengths and weaknesses of different downscaling methods.
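One widely used family of SD techniques is empirical quantile mapping: a model value is located within the model's historical distribution and replaced by the observed value at the same quantile, removing systematic bias. A minimal sketch (real SD methods evaluated in such studies are considerably more elaborate, handling extrapolation, trends and seasonality):

```python
from bisect import bisect_left

def quantile_map(value, model_hist, obs_hist):
    """Empirical quantile mapping bias correction.

    Find the value's empirical quantile in the model's historical sample
    and return the observed historical value at that quantile.
    """
    m = sorted(model_hist)
    o = sorted(obs_hist)
    q = bisect_left(m, value) / len(m)          # empirical quantile in [0, 1]
    idx = min(int(q * len(o)), len(o) - 1)      # clamp to the last quantile
    return o[idx]

# Toy example: model runs 2 units too cold relative to observations.
model_hist = list(range(10))          # 0..9
obs_hist = [x + 2 for x in range(10)] # 2..11
corrected = quantile_map(5, model_hist, obs_hist)
```

Applying the same mapping to future model output is where method choices start to matter, which is exactly the kind of sensitivity the perfect-model experiments above are designed to expose.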

  20. Budgetary Approach to Project Management by Percentage of Completion Method

    Directory of Open Access Journals (Sweden)

    Leszek Borowiec

    2011-07-01

    Full Text Available An efficient and effective project management process is made possible by the use of project management methods and techniques. The aim of this paper is to present the problems of project management using the Percentage of Completion (POC) method. The research material was gathered based on the experience of implementing this method at the Johnson Controls International Company. The article attempts to demonstrate the validity of the thesis that the POC project management method allows for effective implementation and monitoring of a project and is thus an effective tool in managing companies which adopt the budgetary approach. The study presents the planning process for the basic parameters affecting the effectiveness of the project (such as costs, revenue and margin) and characterizes the primary measurements used to evaluate it. The theme is illustrated by numerous examples showing the essence of the problems raised, and the results are presented using descriptive, graphical and tabular methods.
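In its common cost-to-cost form, the Percentage of Completion method recognizes cumulative revenue in proportion to costs incurred; each period's recognized revenue is the change in that cumulative figure. A minimal sketch with hypothetical project figures (not the company's data):

```python
def poc_revenue(contract_price, total_cost_est, cost_to_date, recognized_so_far):
    """Percentage-of-completion (cost-to-cost basis).

    Cumulative revenue = contract price * (cost to date / estimated total cost);
    return the revenue to recognize this period.
    """
    pct_complete = cost_to_date / total_cost_est
    cumulative = contract_price * pct_complete
    return cumulative - recognized_so_far

# Hypothetical project: 1.0 M contract price, 0.8 M estimated total cost.
period1 = poc_revenue(1_000_000, 800_000, 200_000, 0)        # 25% complete
period2 = poc_revenue(1_000_000, 800_000, 600_000, period1)  # 75% complete
```

The period margin follows directly as recognized revenue minus the period's costs, which is what makes the method usable for the budget-based monitoring the paper describes.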

  1. Towards a Generic Framework for the Performance Evaluation of Manufacturing Strategy: An Innovative Approach

    Directory of Open Access Journals (Sweden)

    Tigist Fetene Adane

    2018-03-01

    Full Text Available To be competitive in a manufacturing environment by providing optimal performance in terms of cost-effectiveness and swiftness of system changes, there is a need for flexible production systems based on a well-defined strategy. Companies are steadily looking for methodology to evaluate, improve and update the performance of manufacturing systems for processing operations. Implementation of an adequate strategy for these systems’ flexibility requires a deep understanding of the intricate interactions between the machining process parameters and the manufacturing system’s operational parameters. This paper proposes a framework/generic model for one of the most common metal cutting operations—the boring process of an engine block machining system. A system dynamics modelling approach is presented for modelling the structure of machining system parameters of the boring process, key performance parameters and their intrinsic relationships. The model is based on a case study performed in a company manufacturing engine blocks for heavy vehicles. The approach could allow for performance evaluation of an engine block manufacturing system condition. The presented model enables a basis for other similar processes and industries producing discrete parts.

  2. On summary measure analysis of linear trend repeated measures data: performance comparison with two competing methods.

    Science.gov (United States)

    Vossoughi, Mehrdad; Ayatollahi, S M T; Towhidi, Mina; Ketabchi, Farzaneh

    2012-03-22

    The summary measure approach (SMA) is sometimes the only applicable tool for the analysis of repeated measurements in medical research, especially when the number of measurements is relatively large. This study aimed to describe techniques based on summary measures for the analysis of linear trend repeated measures data and then to compare the performance of SMA, the linear mixed model (LMM), and the unstructured multivariate approach (UMA). Practical guidelines based on the least squares regression slope and mean of response over time for each subject were provided to test time, group, and interaction effects. Through Monte Carlo simulation studies, the efficacy of SMA vs. LMM and traditional UMA, under different types of covariance structures, was illustrated. All the methods were also employed to analyze two real data examples. Based on the simulation and example results, it was found that the SMA completely dominated the traditional UMA and performed convincingly close to the best-fitting LMM in testing all the effects. However, the LMM was often not robust and led to non-sensible results when the covariance structure for errors was misspecified. The results emphasized discarding the UMA, which often yielded extremely conservative inferences for such data. It was shown that the summary measure is a simple, safe and powerful approach in which the loss of efficiency compared to the best-fitting LMM was generally negligible. The SMA is recommended as the first choice to reliably analyze linear trend data with a moderate to large number of measurements and/or small to moderate sample sizes.
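For linear trend data, the summary measure approach reduces each subject's repeated measurements to a single ordinary least squares slope, then compares the groups of slopes with a standard two-sample t statistic. A minimal sketch of that two-step recipe:

```python
def ols_slope(times, values):
    """Least squares slope of one subject's response over time."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    num = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

def two_sample_t(x, y):
    """Pooled-variance two-sample t statistic on per-subject summary measures."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    sx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    sy = sum((v - my) ** 2 for v in y) / (ny - 1)
    sp = ((nx - 1) * sx + (ny - 1) * sy) / (nx + ny - 2)  # pooled variance
    return (mx - my) / (sp * (1 / nx + 1 / ny)) ** 0.5

# Hypothetical per-subject slopes: treated group rises, control is flat.
treated_slopes = [2.1, 1.9, 2.0]
control_slopes = [0.1, -0.1, 0.0]
t_stat = two_sample_t(treated_slopes, control_slopes)
```

Testing the group-by-time interaction this way needs no covariance-structure assumptions across time points, which is the robustness advantage over a misspecified LMM noted in the abstract.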

  3. Performance evaluation of websites using entropy and grey relational analysis methods: The case of airline companies

    Directory of Open Access Journals (Sweden)

    Kemal Vatansever

    2017-07-01

    Full Text Available The revolutionary alterations and conversions occurring in information and communication technologies have triggered an increase in electronic commerce applications. Airline tickets are one of the most popular items purchased on the internet. Airline websites have become a big distribution channel for the companies to sustain their competitiveness. At this moment, competition is increasing as airlines try to acquire and retain customers in the airline industry. To acquire and retain customers in such a highly competitive market, it is important for airlines to understand their relative levels of quality in terms of critical elements affecting their competitive advantages. In this study, an integrated two-stage multi-criteria decision-making approach was used for the measurement of the performance of airline websites, combining the Entropy Weight Method and the Grey Relational Analysis approach. The performance of 11 airline companies’ websites operating in Turkey was evaluated in terms of seven criteria. The website quality data were collected from the airlines’ websites in more than 30 trials on various occasions over different periods of time, from 1 December 2016 to 31 December 2016. The weights of the attributes were calculated by the Entropy Weight Method, and the alternatives were evaluated and ranked using the Grey Relational Analysis method.
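The two stages can be sketched as follows: entropy weights are derived from the dispersion of each criterion (criteria that discriminate more between alternatives get more weight), and grey relational grades then rank alternatives by closeness to an ideal reference series. The decision matrix below is hypothetical, not the study's airline data:

```python
import math

def entropy_weights(matrix):
    """Objective criterion weights from Shannon entropy of the decision matrix."""
    m = len(matrix)
    k = 1.0 / math.log(m)
    d = []
    for col in zip(*matrix):
        s = sum(col)
        p = [x / s for x in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        d.append(1.0 - e)  # degree of diversification
    total = sum(d)
    return [di / total for di in d]

def grey_relational_grades(matrix, weights, rho=0.5):
    """Weighted grey relational grade of each alternative vs. the ideal series."""
    # Normalize each (benefit) criterion to [0, 1]; reference series is all ones.
    norm_cols = []
    for col in zip(*matrix):
        lo, hi = min(col), max(col)
        norm_cols.append([(x - lo) / (hi - lo) if hi > lo else 1.0 for x in col])
    rows = list(zip(*norm_cols))
    deltas = [[1.0 - v for v in row] for row in rows]  # distance to reference
    dmax = max(max(r) for r in deltas)
    dmin = min(min(r) for r in deltas)
    grades = []
    for row in deltas:
        coeffs = [(dmin + rho * dmax) / (dv + rho * dmax) for dv in row]
        grades.append(sum(w * c for w, c in zip(weights, coeffs)))
    return grades

# Hypothetical websites scored on two benefit criteria (e.g. speed, usability).
matrix = [[0.9, 0.8], [0.5, 0.9], [0.2, 0.3]]
w = entropy_weights(matrix)
grades = grey_relational_grades(matrix, w)
```

The alternative with the highest grade is ranked first; with seven criteria and eleven websites the computation is identical, only larger.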

  4. Development of methods of key performance indicators formation for corporate planning

    International Nuclear Information System (INIS)

    Chebotarev, A.N.

    2011-01-01

    A theoretical proposition, a model of enterprise performance management, and a concept of balanced key performance indicators as a method of controlling enterprise strategy have been systematized and presented. An algorithm that increases the efficiency of forming action plans has been developed and implemented. In particular, a set of criteria for selecting the events and parameters necessary for the formation of an action plan has been created. A method of controlling business processes has also been developed, allowing experts to establish the relationship between business-process performance indicators and the enterprise's key indicators

  5. Cognitive performance modeling based on general systems performance theory.

    Science.gov (United States)

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  6. Evaluating health worker performance in Benin using the simulated client method with real children

    Directory of Open Access Journals (Sweden)

    Rowe Alexander K

    2012-10-01

    Full Text Available Abstract Background The simulated client (SC) method for evaluating health worker performance utilizes surveyors who pose as patients to make surreptitious observations during consultations. Compared to conspicuous observation (CO) by surveyors, which is commonly done in developing countries, SC data better reflect usual health worker practices. This information is important because CO can cause performance to be better than usual. Despite this advantage of SCs, the method’s full potential has not been realized for evaluating performance for pediatric illnesses because real children have not been utilized as SCs. Previous SC studies used scenarios of ill children that were not actually brought to health workers. During a trial that evaluated a quality improvement intervention in Benin (the Integrated Management of Childhood Illness [IMCI] strategy), we conducted an SC survey with adult caretakers as surveyors and real children to evaluate the feasibility of this approach and used the results to assess the validity of CO. Methods We conducted an SC survey and a CO survey (one right after the other) of health workers in the same 55 health facilities. A detailed description of the SC survey process was produced. Results of the two surveys were compared for 27 performance indicators using logistic regression modeling. Results SC and CO surveyors observed 54 and 185 consultations, respectively. No serious problems occurred during the SC survey. Performance levels measured by CO were moderately higher than those measured by SCs (median CO – SC difference = 16.4 percentage-points). Survey differences were sometimes much greater for IMCI-trained health workers (median difference = 29.7 percentage-points) than for workers without IMCI training (median difference = 3.1 percentage-points). Conclusion SC surveys can be done safely with real children if appropriate precautions are taken. CO can introduce moderately large positive biases, and these biases might

  7. Dimensionless Numerical Approaches for the Performance Prediction of Marine Waterjet Propulsion Units

    Directory of Open Access Journals (Sweden)

    Marco Altosole

    2012-01-01

    Full Text Available One of the key issues at the early design stage of a high-speed craft is the selection and performance prediction of the propulsion system, because at this stage only limited information about the vessel is available. The objective of this work is to provide the designer of a waterjet-propelled craft with a simple and reliable calculation tool, able to predict the waterjet working points in design and off-design conditions and allowing several propulsive options to be investigated during the ship design process. In the paper, two original dimensionless numerical procedures are presented: one referred to jet units for naval applications and the other more suitable for planing boats. The first procedure is based on a generalized performance map for mixed-flow pumps, derived from the analysis of several waterjet pumps by applying the similitude principles of hydraulic machines. The second approach, validated by comparisons with current waterjet installations, is based on a complete physical approach, from which a set of non-dimensional waterjet characteristics has been drawn by the authors. The application examples presented show the validity and degree of accuracy of the proposed methodologies for the performance evaluation of waterjet propulsion systems.
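    The similitude principles invoked for the generalized pump map can be illustrated with the classical affinity laws. The sketch below is not the paper's procedure, only the standard scaling relations on which such dimensionless maps rest for geometrically similar pumps; the numbers in the example are invented.

```python
def scale_operating_point(Q1, H1, P1, n1, D1, n2, D2):
    """Scale a pump operating point (flow Q, head H, power P) from
    speed/diameter (n1, D1) to (n2, D2) with the classical affinity laws:
        Q ~ n * D**3,   H ~ n**2 * D**2,   P ~ n**3 * D**5
    Valid for geometrically similar machines at similar efficiency."""
    r_n, r_D = n2 / n1, D2 / D1
    return (Q1 * r_n * r_D**3,
            H1 * r_n**2 * r_D**2,
            P1 * r_n**3 * r_D**5)

# same impeller (D unchanged) run 20% faster: illustrative numbers only
Q2, H2, P2 = scale_operating_point(0.5, 40.0, 250.0, 1500.0, 0.3, 1800.0, 0.3)
```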

  8. Electrostatic Discharge Current Linear Approach and Circuit Design Method

    Directory of Open Access Journals (Sweden)

    Pavlos K. Katsivelis

    2010-11-01

    Full Text Available The Electrostatic Discharge phenomenon is a great threat to all electronic devices and ICs. An electric charge passing rapidly from a charged body to another can seriously harm the latter. However, a linear mathematical approach that would make it possible to design a circuit capable of producing such a sophisticated current waveform has been lacking. The commonly accepted Electrostatic Discharge current waveform is the one set by IEC 61000-4-2. However, the over-simplified circuit included in the same standard is incapable of producing such a waveform. Treating the Electrostatic Discharge current waveform of IEC 61000-4-2 as a reference, an approximation method based on Prony’s method is developed and applied in order to obtain a linear system’s response. Considering a known input, a method to design a circuit able to generate this ESD current waveform is presented. The circuit synthesis assumes ideal active elements. A simulation is carried out using the PSpice software.
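    Prony's method, on which the paper's approximation is based, fits a sum of exponentials to uniformly sampled data. Below is a minimal sketch of the classic algorithm (not the authors' implementation, and exercised on an invented two-exponential test signal rather than the IEC 61000-4-2 waveform).

```python
import numpy as np

def prony(x, p):
    """Classic Prony fit of p exponentials to uniformly sampled data:
    x[n] ~ sum_k c_k * z_k**n.  Returns the poles z and amplitudes c."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    # 1) linear prediction: x[n] = -(a1*x[n-1] + ... + ap*x[n-p]) for n >= p
    A = np.column_stack([x[p - 1 - k : N - 1 - k] for k in range(p)])
    a, *_ = np.linalg.lstsq(A, -x[p:], rcond=None)
    # 2) the poles are the roots of the prediction polynomial
    z = np.roots(np.concatenate(([1.0], a)))
    # 3) amplitudes by least squares on the Vandermonde system
    V = z[None, :] ** np.arange(N)[:, None]
    c, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
    return z, c

# noiseless test signal built from two decaying exponentials
n = np.arange(60)
x = 0.95**n - 0.7**n
z, c = prony(x, 2)
```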

  9. An Integrated Approach Using Chaotic Map & Sample Value Difference Method for Electrocardiogram Steganography and OFDM Based Secured Patient Information Transmission.

    Science.gov (United States)

    Pandey, Anukul; Saini, Barjinder Singh; Singh, Butta; Sood, Neetu

    2017-10-18

    This paper presents a patient confidential-data hiding scheme in the electrocardiogram (ECG) signal and its subsequent wireless transmission. The patient's confidential data are embedded in the ECG (called the stego-ECG) using a chaotic map and the sample value difference approach. The sample value difference approach effectually hides the patient's confidential data in ECG sample pairs at predefined locations, which the chaotic map generates through the use of selective control parameters. Subsequently, the wireless transmission of the stego-ECG is analyzed using an Orthogonal Frequency Division Multiplexing (OFDM) system in a Rayleigh fading scenario for telemedicine applications. Evaluation of the proposed method on all 48 records of the MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of the cover ECG. The imperceptibility of the secret data in the stego-ECG is evident through statistical and clinical performance measures. The statistical measures comprise the Percentage Root-mean-square Difference (PRD), Peak Signal to Noise Ratio (PSNR), and Kullback-Leibler Divergence (KL-Div), while the clinical metrics include the wavelet Energy Based Diagnostic Distortion (WEDD) and Wavelet based Weighted PRD (WWPRD). Various channel Signal-to-Noise Ratio scenarios are simulated for the wireless communication of the stego-ECG in the OFDM system. Over all 48 records of the MIT-BIH arrhythmia database, the proposed method resulted on average in PRD = 0.26, PSNR = 55.49, KL-Div = 3.34 × 10^-6, WEDD = 0.02, and WWPRD = 0.10 with a secret data size of 21 Kb. Further, a comparative analysis of the proposed method and recent existing works was performed. The results clearly demonstrated the superiority of the proposed method.
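    The distortion metrics named above have simple closed forms. Below is a minimal sketch of PRD and PSNR using their standard definitions (not code from the paper; the exact peak convention used in PSNR can vary between papers, and the toy signals are invented).

```python
import numpy as np

def prd(cover, stego):
    """Percentage Root-mean-square Difference between cover and stego ECG."""
    cover, stego = np.asarray(cover, float), np.asarray(stego, float)
    return 100.0 * np.sqrt(((cover - stego)**2).sum() / (cover**2).sum())

def psnr(cover, stego):
    """Peak Signal to Noise Ratio in dB, using the cover's absolute peak."""
    cover, stego = np.asarray(cover, float), np.asarray(stego, float)
    mse = ((cover - stego)**2).mean()
    return 10.0 * np.log10(np.max(np.abs(cover))**2 / mse)

x = np.array([1.0, 2.0, 3.0, 4.0])       # toy "cover" samples
y = x + 0.1                              # toy "stego" samples
```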

  10. A new ART iterative method and a comparison of performance among various ART methods

    International Nuclear Information System (INIS)

    Tan, Yufeng; Sato, Shunsuke

    1993-01-01

    Many algebraic reconstruction technique (ART) image reconstruction algorithms, for instance the simultaneous iterative reconstruction technique (SIRT), the relaxation method, and multiplicative ART (MART), have been proposed and their convergence properties studied. SIRT and the under-relaxed relaxation method converge to the least-squares solution, but their convergence is very slow. The Kaczmarz method converges very quickly, but the reconstructed images contain a lot of noise. Comparative studies of these algorithms have been carried out by Gilbert and others, but they are not adequate. In this paper, we (1) propose a new method, a modified Kaczmarz method, and prove its convergence property, and (2) study the performance of seven algorithms, including the one proposed here, by computer simulation for three kinds of typical phantoms. The method proposed here does not give the least-squares solution, but the root mean square errors of its reconstructed images decrease very quickly after a few iterations. The results show that the method proposed here gives a better reconstructed image. (author)
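    For reference, the basic Kaczmarz iteration that the proposed method modifies cyclically projects the current estimate onto the hyperplane of each equation; a relaxation factor below one gives the under-relaxed variant mentioned above. A minimal sketch (illustrative only, not the authors' modified method):

```python
import numpy as np

def kaczmarz(A, b, n_sweeps=100, relax=1.0, x0=None):
    """Kaczmarz ART for A x = b: project the estimate onto each row's
    hyperplane in turn.  relax < 1 gives the under-relaxed variant."""
    A, b = np.asarray(A, dtype=float), np.asarray(b, dtype=float)
    x = np.zeros(A.shape[1]) if x0 is None else np.asarray(x0, dtype=float)
    row_norms = np.einsum('ij,ij->i', A, A)            # squared row norms
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            if row_norms[i] > 0.0:
                x = x + relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# tiny consistent system with known solution (1, -2)
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = A @ np.array([1.0, -2.0])
x = kaczmarz(A, b, n_sweeps=200)
```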

  11. Financial Performance of Pension Companies Operating in Turkey with Topsis Analysis Method

    OpenAIRE

    Gulsun Isseveroglu; Ozan Sezer

    2015-01-01

    In this study, the financial performance of sixteen pension and life-pension companies was analyzed by the TOPSIS method using their financial tables. First, financial ratios, which are among the important indicators of the financial strength of companies, were determined and calculated for each company separately. The calculated ratios were then converted into a single performance score for each company using the TOPSIS method. Companies were sorted according to their calculated performance s...
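    The TOPSIS computation summarized above follows a fixed recipe: vector-normalize the ratio matrix, weight it, and score each company by its relative closeness to the ideal solution. A minimal sketch with invented ratios and weights (not the study's data):

```python
import numpy as np

def topsis(X, w, benefit):
    """TOPSIS closeness scores.  X: alternatives x criteria matrix,
    w: criterion weights summing to 1, benefit: True where larger is better."""
    X = np.asarray(X, dtype=float)
    V = X / np.sqrt((X**2).sum(axis=0)) * w            # weighted normalized
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal)**2).sum(axis=1))      # distance to ideal
    d_neg = np.sqrt(((V - worst)**2).sum(axis=1))      # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                     # closeness to ideal

# invented ratios: ROE (benefit), liquidity (benefit), leverage (cost)
X = np.array([[0.12, 1.8, 0.60],
              [0.08, 2.1, 0.45],
              [0.15, 2.5, 0.40]])
scores = topsis(X, np.array([0.4, 0.3, 0.3]), np.array([True, True, False]))
ranking = np.argsort(-scores)                          # best company first
```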

  12. Performance evaluation of the spectral centroid downshift method for attenuation estimation.

    Science.gov (United States)

    Samimi, Kayvan; Varghese, Tomy

    2015-05-01

    Estimation of frequency-dependent ultrasonic attenuation is an important aspect of tissue characterization. Along with other acoustic parameters studied in quantitative ultrasound, the attenuation coefficient can be used to differentiate normal and pathological tissue. The spectral centroid downshift (CDS) method is one of the most common frequency-domain approaches applied to this problem. In this study, a statistical analysis of this method's performance was carried out based on a parametric model of the signal power spectrum in the presence of electronic noise. The parametric model used for the power spectrum of the received RF data assumes a Gaussian spectral profile for the transmit pulse, and incorporates the effects of attenuation, windowing, and electronic noise. Spectral moments were calculated and used to estimate second-order centroid statistics. A theoretical expression for the variance of a maximum likelihood estimator of the attenuation coefficient was derived in terms of the centroid statistics and other model parameters, such as transmit pulse center frequency and bandwidth, RF data window length, SNR, and number of regression points. Theoretically predicted estimation variances were compared with experimentally estimated variances on RF data sets from both computer-simulated and physical tissue-mimicking phantoms. Scan parameter ranges for this study were electronic SNR from 10 to 70 dB, transmit pulse standard deviation from 0.5 to 4.1 MHz, transmit pulse center frequency from 2 to 8 MHz, and data window length from 3 to 17 mm. Acceptable agreement was observed between theoretical predictions and experimentally estimated values, with differences smaller than 0.05 dB/cm/MHz across the parameter ranges investigated. This model helps predict the best attenuation estimation variance achievable with the CDS method in terms of said scan parameters.
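    The CDS estimator itself reduces to two steps: compute the spectral centroid of successive depth windows, then regress centroid against depth. The sketch below shows only that centroid-tracking part and is illustrative, not the paper's code; under the Gaussian-spectrum assumption the attenuation coefficient is proportional to the negated slope divided by the spectral variance, and the synthetic sweep merely stands in for attenuated RF data.

```python
import numpy as np

def centroid_downshift_slope(rf, fs, win, hop):
    """Spectral centroid of successive windows of an RF line, plus the
    least-squares slope of centroid vs. window-center time (Hz per second)."""
    cents, pos = [], []
    for s in range(0, len(rf) - win + 1, hop):
        seg = rf[s:s + win] * np.hanning(win)
        P = np.abs(np.fft.rfft(seg))**2                # window power spectrum
        f = np.fft.rfftfreq(win, d=1.0 / fs)
        cents.append((f * P).sum() / P.sum())          # spectral centroid
        pos.append((s + win / 2) / fs)                 # window center, seconds
    slope = np.polyfit(pos, cents, 1)[0]
    return slope, np.asarray(cents)

# synthetic stand-in: a sweep whose dominant frequency falls with depth,
# mimicking the centroid downshift caused by frequency-dependent attenuation
fs = 20e6
t = np.arange(4000) / fs
k = (6e6 - 4e6) / t[-1]                                # 6 MHz down to 4 MHz
rf = np.cos(2 * np.pi * (6e6 * t - 0.5 * k * t**2))
slope, cents = centroid_downshift_slope(rf, fs, win=512, hop=256)
```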

  13. French radioactive wastes performance assessment and the natural analogues approach: an overview

    International Nuclear Information System (INIS)

    Escalier des Orres, P.

    1988-10-01

    One of the main difficulties in Radioactive Waste Performance Assessment calculations lies in the scales of time and space underlying these calculations: mechanisms and parameters can be directly affected by time or space dependency. The ''natural analogues'' approach has evident advantages, at least qualitatively, in illuminating these aspects. It may also provide confidence in our ability to model partial or overall natural systems. The following paper outlines the use of the ''natural analogues'' methodology in the French Radioactive Wastes Performance Assessment in the field of waste disposal.

  14. Performance of local information-based link prediction: a sampling perspective

    Science.gov (United States)

    Zhao, Jichang; Feng, Xu; Dong, Li; Liang, Xiao; Xu, Ke

    2012-08-01

    Link prediction is pervasively employed to uncover missing links in snapshots of real-world networks, which are usually obtained through different kinds of sampling methods. In the previous literature, in order to evaluate prediction performance, the known edges in the sampled snapshot are divided into a training set and a probe set randomly, without considering the underlying sampling approach. However, different sampling methods might lead to different missing links, especially for biased ones. For this reason, random partition-based evaluation of performance is no longer convincing if we take the sampling method into account. In this paper, we re-evaluate the performance of local information-based link predictions through a division of the training set and the probe set governed by the sampling method. Interestingly, we find that each prediction approach performs unevenly across different sampling methods. Moreover, most of these predictions perform weakly when the sampling method is biased, which indicates that the performance of these methods might have been overestimated in prior work.
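    For concreteness, the local information indices evaluated in such studies score a node pair using only its immediate neighborhood. Below is a minimal sketch of two standard indices, common neighbors (CN) and resource allocation (RA); it is illustrative and does not reproduce the paper's sampling-aware evaluation protocol.

```python
import numpy as np

def local_similarity(adj, method="cn"):
    """Pairwise similarity scores from local information only.
    adj: symmetric 0/1 adjacency matrix.
      'cn' : number of common neighbors
      'ra' : resource allocation index, sum of 1/degree over common neighbors
    """
    A = np.asarray(adj, dtype=float)
    if method == "cn":
        S = A @ A
    elif method == "ra":
        deg = A.sum(axis=0)
        inv = np.divide(1.0, deg, out=np.zeros_like(deg), where=deg > 0)
        S = A @ np.diag(inv) @ A
    else:
        raise ValueError(method)
    np.fill_diagonal(S, 0.0)                 # self-similarity is not a link
    return S

# path graph 0-1-2: nodes 0 and 2 share the single neighbor 1
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
S_cn = local_similarity(A, "cn")
S_ra = local_similarity(A, "ra")
```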

  15. Support and performance improvement for primary health care workers in low- and middle-income countries: a scoping review of intervention design and methods

    Science.gov (United States)

    Mabey, David C.; Chaudhri, Simran; Brown Epstein, Helen-Ann; Lawn, Stephen D.

    2017-01-01

    Abstract Primary health care workers (HCWs) in low- and middle-income settings (LMIC) often work in challenging conditions in remote, rural areas, in isolation from the rest of the health system and particularly specialist care. Much attention has been given to implementation of interventions to support quality and performance improvement for workers in such settings. However, little is known about the design of such initiatives and which approaches predominate, let alone those that are most effective. We aimed to develop a broad understanding of what distinguishes different approaches to primary HCW support and performance improvement, and to clarify the existing evidence as well as gaps in evidence, in order to inform decision-making and the design of programs intended to support and improve the performance of health workers in these settings. We systematically searched the literature for articles addressing this topic, and undertook a comparative review to document the principal approaches to performance and quality improvement for primary HCWs in LMIC settings. We identified 40 eligible papers reporting on interventions that we categorized into five different approaches: (1) supervision and supportive supervision; (2) mentoring; (3) tools and aids; (4) quality improvement methods; and (5) coaching. The variety of study designs and quality/performance indicators precluded a formal quantitative data synthesis. The most extensive literature was on supervision, but there was little clarity on what defines the most effective approach to the supervision activities themselves, let alone the design and implementation of supervision programs. The mentoring literature was limited, and largely focused on clinical skills building and educational strategies. Further research on how best to incorporate mentorship into pre-service clinical training, while maintaining its function within the routine health system, is needed. 
There is insufficient evidence to draw conclusions about coaching

  16. See me! A Discussion on the Quality in Performing Arts for Children Based on a Performative Approach

    Directory of Open Access Journals (Sweden)

    Lisa Nagel

    2013-12-01

    Full Text Available In this article, the writer discusses and analyses what happens to our evaluation of quality in performing arts for children when we move from the notion of art as an object to art as an event. Erika Fischer-Lichte's theory of the so-called performative turn in the arts, and more specifically the term the feedback loop, constitutes the article's theoretical backdrop. Two audience-related episodes, respectively the dance performance BZz BZz-DADA dA bee by ICB Productions (3-6 year olds) and the theatre performance Thought Lab by Cirka Teater (for 6 year olds and above), serve as starting points for the theoretical discussion. By adopting Siemke Böhnisch's performative approach to performance analysis, focusing on the terms henvendthet (directedness, the actors' and spectators' mutual turning to the other) and kontakt (connection) in relation to the audience, the writer is able to show a dissonance (and its reverse) between the performers and the audience in the two respective performances. The term dissonance describes moments of unintended breaks in communication, moments of which the performers are most likely unaware. These moments, however, become apparent when the audience's reactions are included in the analysis. The author concludes that by adopting a performative perspective, we become almost obliged to consider the child audience as qualified judges of quality, as opposed to allowing ourselves to dismiss their interactions as either noise or enthusiasm. Such a perspective is important not only for how we see and evaluate performing arts for children, but also for how artists must think when producing performances for this audience.

  17. A new multivariate empirical mode decomposition method for improving the performance of SSVEP-based brain-computer interface

    Science.gov (United States)

    Chen, Yi-Feng; Atal, Kiran; Xie, Sheng-Quan; Liu, Quan

    2017-08-01

    Objective. Accurate and efficient detection of steady-state visual evoked potentials (SSVEP) in the electroencephalogram (EEG) is essential for related brain-computer interface (BCI) applications. Approach. Although canonical correlation analysis (CCA) has been applied extensively and successfully to SSVEP recognition, the spontaneous EEG activities and artifacts that often occur during data recording can deteriorate recognition performance. It is therefore worthwhile to extract a few frequency sub-bands of interest to avoid or reduce the influence of unrelated brain activity and artifacts. This paper presents an improved method to detect the frequency component associated with SSVEP using multivariate empirical mode decomposition (MEMD) and CCA (MEMD-CCA). EEG signals from nine healthy volunteers were recorded to evaluate the performance of the proposed method for SSVEP recognition. Main results. We compared our method with CCA and the temporally local multivariate synchronization index (TMSI). The results suggest that MEMD-CCA achieved significantly higher accuracy than standard CCA and TMSI. It gave improvements of 1.34%, 3.11%, 3.33%, 10.45%, 15.78%, 18.45%, 15.00% and 14.22% on average over CCA at time windows from 0.5 s to 5 s, and of 0.55%, 1.56%, 7.78%, 14.67%, 13.67%, 7.33% and 7.78% over TMSI from 0.75 s to 5 s. The method also outperformed filter-based decomposition (FB), empirical mode decomposition (EMD) and wavelet decomposition (WT) based CCA for SSVEP recognition. Significance. The results demonstrate the ability of the proposed MEMD-CCA to improve the performance of SSVEP-based BCI.
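    The standard CCA baseline that MEMD-CCA builds on correlates the multichannel EEG segment with sine/cosine reference templates at each candidate stimulation frequency and picks the frequency with the largest canonical correlation. A minimal sketch of that baseline only (with invented synthetic two-channel data; the authors' MEMD stage is not reproduced):

```python
import numpy as np

def max_canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    def orth(M):
        M = M - M.mean(axis=0)               # center, then orthonormalize
        Q, _ = np.linalg.qr(M)
        return Q
    return np.linalg.svd(orth(X).T @ orth(Y), compute_uv=False)[0]

def ssvep_cca(eeg, fs, candidates, n_harmonics=2):
    """Standard CCA detection: score each candidate frequency by the canonical
    correlation between the EEG segment and its sine/cosine references."""
    t = np.arange(eeg.shape[0]) / fs
    scores = []
    for f in candidates:
        Y = np.column_stack([fn(2 * np.pi * h * f * t)
                             for h in range(1, n_harmonics + 1)
                             for fn in (np.sin, np.cos)])
        scores.append(max_canonical_corr(eeg, Y))
    return candidates[int(np.argmax(scores))], scores

# synthetic 2-channel EEG with a 10 Hz SSVEP component plus noise
rng = np.random.default_rng(1)
fs = 250.0
t = np.arange(int(fs * 2.0)) / fs
eeg = np.column_stack([np.sin(2 * np.pi * 10.0 * t),
                       np.cos(2 * np.pi * 10.0 * t)])
eeg = eeg + 0.5 * rng.normal(size=eeg.shape)
detected, scores = ssvep_cca(eeg, fs, [8.0, 10.0, 12.0])
```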

  18. Study on behaviors and performances of universal N-glycopeptide enrichment methods.

    Science.gov (United States)

    Xue, Yu; Xie, Juanjuan; Fang, Pan; Yao, Jun; Yan, Guoquan; Shen, Huali; Yang, Pengyuan

    2018-04-16

    Glycosylation is a crucial process in protein biosynthesis. However, the analysis of glycopeptides by MS remains challenging due to the microheterogeneity and macroheterogeneity of glycoproteins. Selective enrichment of glycopeptides from complex samples prior to MS analysis is essential for successful glycoproteome research. In this work, we systematically investigated the behaviors and performances of the boronic acid chemistry, ZIC-HILIC, and PGC glycopeptide enrichment methods to promote understanding of these methods. We also optimized the boronic acid chemistry and ZIC-HILIC enrichment methods and applied them to enrich glycopeptides from mouse liver. The intact N-glycopeptides were interpreted using the in-house analysis software pGlyco 2.0. We found that boronic acid chemistry in this study preferentially captured glycopeptides with high-mannose glycans, that ZIC-HILIC enriched the most N-glycopeptides without showing significant preference during enrichment, and that PGC was not suitable for separating glycopeptides with long amino acid sequences. This detailed study of the behaviors and performances of the boronic acid chemistry, ZIC-HILIC, and PGC enrichment methods provides a better understanding of enrichment methods for further glycoproteomics research.

  19. Students' Attitudes toward Statistics across the Disciplines: A Mixed-Methods Approach

    Science.gov (United States)

    Griffith, James D.; Adams, Lea T.; Gu, Lucy L.; Hart, Christian L.; Nichols-Whitehead, Penney

    2012-01-01

    Students' attitudes toward statistics were investigated using a mixed-methods approach including a discovery-oriented qualitative methodology among 684 undergraduate students across business, criminal justice, and psychology majors where at least one course in statistics was required. Students were asked about their attitudes toward statistics and…

  20. What Math Matters? Types of Mathematics Knowledge and Relationships to Methods Course Performance

    Science.gov (United States)

    Kajander, Ann; Holm, Jennifer

    2016-01-01

    This study investigated the effect of a departmental focus on enhanced mathematics knowledge for teaching on overall performance in upper elementary mathematics methods courses. The study examined the effect of performance on a new course in mathematics for teaching on performance at the methods course level. In addition, the effect of performance…

  1. The typological approach in child and family psychology: a review of theory, methods, and research.

    Science.gov (United States)

    Mandara, Jelani

    2003-06-01

    The purpose of this paper was to review the theoretical underpinnings, major concepts, and methods of the typological approach. It was argued that the typological approach offers a systematic, empirically rigorous and reliable way to synthesize the nomothetic variable-centered approach with the idiographic case-centered approach. Recent advances in cluster analysis validation make it a promising method for uncovering natural typologies. This paper also reviewed findings from personality and family studies that have revealed 3 prototypical personalities and parenting styles: Adjusted/Authoritative, Overcontrolled/Authoritarian, and Undercontrolled/Permissive. These prototypes are theorized to be synonymous with attractor basins in psychological state space. The connection between family types and personality structure as well as future directions of typological research were also discussed.

  2. Intercomparison of the GOS approach, superposition T-matrix method, and laboratory measurements for black carbon optical properties during aging

    International Nuclear Information System (INIS)

    He, Cenlin; Takano, Yoshi; Liou, Kuo-Nan; Yang, Ping; Li, Qinbin; Mackowski, Daniel W.

    2016-01-01

    We perform a comprehensive intercomparison of the geometric-optics surface-wave (GOS) approach, the superposition T-matrix method, and laboratory measurements for optical properties of fresh and coated/aged black carbon (BC) particles with complex structures. GOS and T-matrix calculations capture the measured optical (i.e., extinction, absorption, and scattering) cross sections of fresh BC aggregates, with 5–20% differences depending on particle size. We find that the T-matrix results tend to be lower than the measurements, due to uncertainty in theoretical approximations of realistic BC structures, particle property measurements, and numerical computations in the method. On the contrary, the GOS results are higher than the measurements (hence the T-matrix results) for BC radii <100 nm, with good agreement between the two methods for radii >100 nm. We find small deviations (≤10%) in asymmetry factors computed from the two methods for most BC coating structures and sizes, but several complex structures have 10–30% differences. This study provides the foundation for downstream application of the GOS approach in radiative transfer and climate studies. - Highlights: • The GOS and T-matrix methods capture laboratory measurements of BC optical properties. • The GOS results are consistent with the T-matrix results for BC optical properties. • BC optical properties vary remarkably with coating structures and sizes during aging.

  3. Incorporating Wiki Technology in a Traditional Biostatistics Course: Effects on University Students’ Collaborative Learning, Approaches to Learning and Course Performance

    Directory of Open Access Journals (Sweden)

    Shirley S.M. Fong

    2017-08-01

    Full Text Available Aim/Purpose: To investigate the effectiveness of incorporating wiki technology in an undergraduate biostatistics course for improving university students' collaborative learning, approaches to learning, and course performance. Methodology: During a three-year longitudinal study, twenty-one and twenty-four undergraduate students were recruited by convenience sampling and assigned to a wiki group (2014-2015) and a control group (2013-2014 and 2015-2016), respectively. The students in the wiki group attended face-to-face lectures and used a wiki (PBworks) weekly for online group discussion, and the students in the control group had no access to the wiki and interacted face-to-face only. The students' collaborative learning, approaches to learning, and course performance were evaluated using the Group Process Questionnaire (GPQ), the Revised Study Process Questionnaire (R-SPQ-2F), and course results, respectively, after testing. Findings: Multivariate analysis of variance results revealed that the R-SPQ-2F surface approach score and surface motive and strategy subscores were lower in the wiki group than in the control group (p < 0.05). The GPQ individual accountability and equal opportunity scores (components of collaboration) were higher in the wiki group than in the control group (p < 0.001). No significant between-group differences were found in any of the other outcome variables (i.e., overall course result, R-SPQ-2F deep approach score and subscores, GPQ positive interdependence score, social skills score, and composite score). Looking at the Wiki Questionnaire results, the subscale and composite scores we obtained were 31.5% to 37.7% lower than the norm. The wiki was used at a frequency of about 0.7 times per week per student. Recommendations for Practitioners: Using wiki technology in conjunction with the traditional face-to-face teaching method in a biostatistics course can enhance some aspects of undergraduate students' collaborative learning

  4. Development and validation of a method for the determination of regulated fragrance allergens by High-Performance Liquid Chromatography and Parallel Factor Analysis 2.

    Science.gov (United States)

    Pérez-Outeiral, Jessica; Elcoroaristizabal, Saioa; Amigo, Jose Manuel; Vidal, Maider

    2017-12-01

    This work presents the development and validation of a multivariate method for the quantitation of 6 potentially allergenic substances (PAS) related to fragrances by ultrasound-assisted emulsification microextraction coupled with HPLC-DAD and PARAFAC2, in the presence of 18 other PAS. The objective is the extension of a previously proposed univariate method so as to determine the 24 PAS currently considered allergens. The suitability of the multivariate approach for the qualitative and quantitative analysis of the analytes is discussed through datasets of increasing complexity, comprising the assessment and validation of the method's performance. PARAFAC2 was shown to adequately model the data in the face of different instrumental and chemical issues, such as co-eluting profiles, overlapping spectra, unknown interfering compounds, retention time shifts, and baseline drifts. Satisfactory quality parameters of the model performance were obtained (R² ≥ 0.94), as well as meaningful chromatographic and spectral profiles (r ≥ 0.97). Moreover, low prediction errors for external validation standards (below 15% in most cases) and acceptable quantification errors in real spiked samples (recoveries from 82 to 119%) confirmed the suitability of PARAFAC2 for the resolution and quantification of the PAS. The combination of the previously proposed univariate approach, for the well-resolved peaks, with the developed multivariate method allows the determination of the 24 regulated PAS.

  5. On dynamical systems approaches and methods in f(R) cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Alho, Artur [Center for Mathematical Analysis, Geometry and Dynamical Systems, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisboa (Portugal); Carloni, Sante [Centro Multidisciplinar de Astrofisica – CENTRA, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisboa (Portugal); Uggla, Claes, E-mail: aalho@math.ist.utl.pt, E-mail: sante.carloni@tecnico.ulisboa.pt, E-mail: claes.uggla@kau.se [Department of Physics, Karlstad University, S-65188 Karlstad (Sweden)

    2016-08-01

    We discuss dynamical systems approaches and methods applied to flat Robertson-Walker models in f(R)-gravity. We argue that a complete description of the solution space of a model requires a global state space analysis that motivates globally covering state space adapted variables. This is shown explicitly by an illustrative example, f(R) = R + αR², α > 0, for which we introduce new regular dynamical systems on global compactly extended state spaces for the Jordan and Einstein frames. This example also allows us to illustrate several local and global dynamical systems techniques involving, e.g., blow ups of nilpotent fixed points, center manifold analysis, averaging, and use of monotone functions. As a result of applying dynamical systems methods to globally state space adapted dynamical systems formulations, we obtain pictures of the entire solution spaces in both the Jordan and the Einstein frames. This shows, e.g., that due to the domain of the conformal transformation between the Jordan and Einstein frames, not all the solutions in the Jordan frame are completely contained in the Einstein frame. We also make comparisons with previous dynamical systems approaches to f(R) cosmology and discuss their advantages and disadvantages.

  6. Method of transient identification based on a possibilistic approach, optimized by genetic algorithm

    International Nuclear Information System (INIS)

    Almeida, Jose Carlos Soares de

    2001-02-01

    This work develops a method for transient identification based on a possibilistic approach, optimized by a genetic algorithm that selects the number of centroids of the classes representing the transients. The basic idea of the proposed method is to optimize the partition of the search space, generating subsets within the classes of a partition, defined as subclasses, whose centroids are able to distinguish the classes with the maximum number of correct classifications. Interpreting the subclasses as fuzzy sets, the possibilistic approach provides a heuristic for establishing influence zones around the centroids, allowing the method to return a 'don't know' answer for unknown transients, that is, those outside the training set. (author)
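    The centroid/influence-zone idea above can be sketched in a few lines. This is an illustrative reconstruction, not the author's implementation: the centroids, class labels, and influence-zone radii below are invented for demonstration, whereas in the paper they would be produced by the genetic algorithm.

```python
import numpy as np

def classify(x, centroids, labels, radius):
    """Assign x to the class of its nearest subclass centroid, or
    return "don't know" when x lies outside every influence zone."""
    d = np.linalg.norm(centroids - x, axis=1)
    i = int(np.argmin(d))
    if d[i] > radius[i]:
        return "don't know"
    return labels[i]

# Two classes ("A", "B"), each represented by two subclass centroids.
centroids = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 5.0], [6.0, 5.0]])
labels = ["A", "A", "B", "B"]
radius = np.full(4, 1.5)  # influence-zone radii (assumed uniform here)

print(classify(np.array([0.5, 0.2]), centroids, labels, radius))  # A
print(classify(np.array([9.0, 9.0]), centroids, labels, radius))  # don't know
```

    In the optimized method, the GA would adjust the number and position of subclass centroids so that this rule maximizes correct classifications on the training transients.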

  7. A Composite Model for Employees' Performance Appraisal and Improvement

    Science.gov (United States)

    Manoharan, T. R.; Muralidharan, C.; Deshmukh, S. G.

    2012-01-01

    Purpose: The purpose of this paper is to develop an innovative method of performance appraisal that will be useful for designing a structured training programme. Design/methodology/approach: Employees' performance appraisals are conducted using new approaches, namely data envelopment analysis and an integrated fuzzy model. Interpretive structural…

  8. Freight performance measures : approach analysis.

    Science.gov (United States)

    2010-05-01

    This report reviews the existing state of the art and also the state of the practice of freight performance measurement. Most performance measures at the state level have aimed at evaluating highway or transit infrastructure performance with an empha...

  9. Comparison of marine spatial planning methods in Madagascar demonstrates value of alternative approaches.

    Directory of Open Access Journals (Sweden)

    Thomas F Allnutt

    Full Text Available The Government of Madagascar plans to increase marine protected area coverage by over one million hectares. To assist this process, we compare four methods for marine spatial planning of Madagascar's west coast. Input data for each method were drawn from the same variables: fishing pressure, exposure to climate change, and biodiversity (habitats, species distributions, biological richness, and biodiversity value). The first method compares visual color classifications of primary variables, the second uses binary combinations of these variables to produce a categorical classification of management actions, the third is a target-based optimization using Marxan, and the fourth is conservation ranking with Zonation. We present results from each method, and compare the latter three approaches for spatial coverage, biodiversity representation, fishing cost, and persistence probability. All results included large areas in the northern, central, and southern parts of western Madagascar. Achieving 30% representation targets with Marxan required twice the fish catch loss of the categorical method. The categorical classification and Zonation do not consider targets for conservation features. However, when we reduced Marxan targets to 16.3%, matching the representation level of the "strict protection" class of the categorical result, the methods showed similar catch losses. The management category portfolio has complete coverage, and presents several management recommendations including strict protection. Zonation produces rapid conservation rankings across large, diverse datasets. Marxan is useful for identifying strict protected areas that meet representation targets, and minimize exposure probabilities for conservation features at low economic cost. We show that methods based on Zonation and a simple combination of variables can produce results comparable to Marxan for species representation and catch losses, demonstrating the value of comparing alternative

  10. Assessing the performance of dispersionless and dispersion-accounting methods: helium interaction with cluster models of the TiO2(110) surface.

    Science.gov (United States)

    de Lara-Castells, María Pilar; Stoll, Hermann; Mitrushchenkov, Alexander O

    2014-08-21

    As a prototypical dispersion-dominated physisorption problem, we analyze here the performance of dispersionless and dispersion-accounting methodologies on the helium interaction with cluster models of the TiO2(110) surface. A special focus has been given to the dispersionless density functional dlDF and the dlDF+Das construction for the total interaction energy (K. Pernal, R. Podeswa, K. Patkowski, and K. Szalewicz, Phys. Rev. Lett. 2009, 109, 263201), where Das is an effective interatomic pairwise functional form for the dispersion. Likewise, the performance of the symmetry-adapted perturbation theory (SAPT) method is evaluated, where the interacting monomers are described by density functional theory (DFT) with the dlDF, PBE, and PBE0 functionals. Our benchmarks include CCSD(T)-F12b calculations and comparative analysis of the nuclear bound states supported by the He-cluster potentials. Moreover, intra- and intermonomer correlation contributions to the physisorption interaction are analyzed through the method of increments (H. Stoll, J. Chem. Phys. 1992, 97, 8449) at the CCSD(T) level of theory. This method is further applied in conjunction with a partitioning of the Hartree-Fock interaction energy to estimate individual interaction energy components, comparing them with those obtained using the different SAPT(DFT) approaches. The cluster size evolution of dispersionless and dispersion-accounting energy components is then discussed, revealing the reduced role of the dispersionless interaction and intramonomer correlation when the extended nature of the surface is better accounted for. By contrast, both post-Hartree-Fock and SAPT(DFT) results clearly demonstrate the high transferability of the effective pairwise dispersion interaction regardless of the cluster model. Our contribution also illustrates how the method of increments can be used as a valuable tool not only to achieve the accuracy of CCSD(T) calculations using large cluster models but also to

  11. A different approach to estimate nonlinear regression model using numerical methods

    Science.gov (United States)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This research paper concerns computational methods, namely the Gauss-Newton method and gradient algorithm methods (the Newton-Raphson method, the steepest descent or steepest ascent algorithm, the method of scoring, and the method of quadratic hill-climbing), based on numerical analysis, for estimating the parameters of a nonlinear regression model in a very different way. Principles of matrix calculus are used to discuss the gradient-algorithm methods. Yonathan Bard [1] discussed a comparison of gradient methods for the solution of nonlinear parameter estimation problems; this article, however, takes an analytical approach to the gradient algorithm methods in a different way. This paper describes a new iterative technique, namely a Gauss-Newton method that differs from the iterative technique proposed by Gordon K. Smyth [2]. Hans Georg Bock et al. [10] proposed numerical methods for parameter estimation in DAEs (differential algebraic equations). Isabel Reis Dos Santos et al. [11] introduced a weighted least squares procedure for estimating the unknown parameters of a nonlinear regression metamodel. For large-scale nonsmooth convex minimization, the Hager and Zhang (HZ) conjugate gradient method and the modified HZ (MHZ) method were presented by Gonglin Yuan et al. [12].
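    The Gauss-Newton iteration mentioned above can be illustrated directly: linearize the residual and solve the normal equations for each update. This is a generic textbook sketch, not the authors' algorithm; the exponential model and starting values are invented for demonstration.

```python
import numpy as np

def gauss_newton(f, jac, y, beta0, iters=50):
    """Gauss-Newton for nonlinear least squares: repeatedly linearize
    the residual r = y - f(beta) and solve the normal equations
    (J^T J) step = J^T r for the parameter update."""
    beta = np.asarray(beta0, dtype=float)
    for _ in range(iters):
        r = y - f(beta)
        J = jac(beta)
        beta = beta + np.linalg.solve(J.T @ J, J.T @ r)
    return beta

# Fit the exponential model y = a * exp(b * x) to noise-free data.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * x)

f = lambda b: b[0] * np.exp(b[1] * x)
jac = lambda b: np.column_stack([np.exp(b[1] * x),              # df/da
                                 b[0] * x * np.exp(b[1] * x)])  # df/db

beta = gauss_newton(f, jac, y, beta0=[1.5, 1.0])
print(beta)  # converges to approximately [2.0, 1.5]
```

    In practice a damping or line-search step is added when J^T J is ill-conditioned, which is what distinguishes many of the gradient variants the paper surveys.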

  12. A geologic approach to field methods in fluvial geomorphology

    Science.gov (United States)

    Fitzpatrick, Faith A.; Thornbush, Mary J; Allen, Casey D; Fitzpatrick, Faith A.

    2014-01-01

    A geologic approach to field methods in fluvial geomorphology is useful for understanding causes and consequences of past, present, and possible future perturbations in river behavior and floodplain dynamics. Field methods include characterizing river planform and morphology changes and floodplain sedimentary sequences over long periods of time along a longitudinal river continuum. Techniques include topographic and bathymetric surveying of fluvial landforms in valley bottoms and describing floodplain sedimentary sequences through coring, trenching, and examining pits and exposures. Historical sediment budgets that include floodplain sedimentary records can characterize past and present sources and sinks of sediment along a longitudinal river continuum. Describing paleochannels and floodplain vertical accretion deposits, estimating long-term sedimentation rates, and constructing historical sediment budgets can assist in management of aquatic resources, habitat, sedimentation, and flooding issues.

  13. Used Nuclear Fuel Loading and Structural Performance Under Normal Conditions of Transport- Demonstration of Approach and Results on Used Fuel Performance Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Adkins, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Geelhood, Ken [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Koeppel, Brian [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bignell, John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flores, Gregg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wang, Jy-An [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sanborn, Scott [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Spears, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States); Klymyshyn, Nick [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-09-30

    This document addresses Oak Ridge National Laboratory milestone M2FT-13OR0822015, Demonstration of Approach and Results on Used Nuclear Fuel Performance Characterization. This report provides results of the initial demonstration of the modeling capability developed to perform preliminary deterministic evaluations of moderate-to-high-burnup used nuclear fuel (UNF) mechanical performance under normal conditions of storage (NCS) and normal conditions of transport (NCT). This report also provides results from the sensitivity studies that have been performed. Finally, discussion of the long-term goals and objectives of this initiative is provided.

  14. Description of JNC's analytical method and its performance for FBR cores

    International Nuclear Information System (INIS)

    Ishikawa, M.

    2000-01-01

    The description of JNC's analytical method and its performance for FBR cores includes: an outline of JNC's analytical system compared with ERANOS; a standard database for FBR nuclear design in JNC; the JUPITER critical experiment; details of the analytical method and its effects on JUPITER; performance of the JNC analytical system (effective multiplication factor k_eff, control rod worth, and sodium void reactivity); and design accuracy of a 600 MWe-class FBR core. JNC developed a consistent analytical system for FBR core evaluation, based on the JENDL library, the f-table method, and three-dimensional diffusion/transport theory, which includes comprehensive sensitivity tools to improve the prediction accuracy of core parameters. The JNC system was verified by analysis of the JUPITER critical experiment and other facilities. Its performance can be judged quite satisfactory for FBR core design work, though there is room for further improvement, such as more detailed treatment of cross-section resonance regions.

  15. An exploratory survey of methods used to develop measures of performance

    Science.gov (United States)

    Hamner, Kenneth L.; Lafleur, Charles A.

    1993-09-01

    Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine that the OMB Generic Method was most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommended improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metric development method. Those components were incorporated into a proposed metric development method that was based on the OMB Generic Method and should be more likely to produce high-quality metrics that result in continuous process improvement.

  16. The Case for Medieval Drama in the Classroom: An Approach through Performance.

    Science.gov (United States)

    Lieblein, Leanore; Pare, Anthony

    1983-01-01

    Argues that medieval drama in performance suggests a number of important issues about the nature of literature, particularly about the way narrative and dramatic art can express the life of a community. Presents a series of exercises that start with familiar, nonthreatening situations in order to approach the richness of medieval plays and the…

  17. A Facile Approach to Evaluate Thermal Insulation Performance of Paper Cups

    Directory of Open Access Journals (Sweden)

    Yudi Kuang

    2015-01-01

    Full Text Available Paper cups are ubiquitous in daily life for serving water, soup, coffee, tea, and milk due to their convenience, biodegradability, recyclability, and sustainability. The thermal insulation performance of paper cups is significant because they are used to serve hot food or drinks. Using an effective thermal conductivity to accurately evaluate the thermal insulation performance of paper cups is complex due to their complicated components and multilayer structure. Moreover, an effective thermal conductivity is unsuitable for evaluating the thermal insulation performance of paper cups under fluctuating temperatures. In this work, we propose a facile approach to precisely analyze the thermal insulation performance of paper cups in a particular range of temperature by using an evaluation model based on MISO (Multiple-Input Single-Output) technical theory, which includes a characterization parameter (temperature factor) and a measurement apparatus. A series of experiments was conducted according to this evaluation model, and the results show that the model enables accurate characterization of the thermal insulation performance of paper cups and provides an efficient theoretical basis for selecting paper materials for paper cups.

  18. Performance of the Lot Quality Assurance Sampling Method Compared to Surveillance for Identifying Inadequately-performing Areas in Matlab, Bangladesh

    OpenAIRE

    Bhuiya, Abbas; Hanifi, S.M.A.; Roy, Nikhil; Streatfield, P. Kim

    2007-01-01

    This paper compared the performance of the lot quality assurance sampling (LQAS) method in identifying inadequately-performing health work-areas with that of using health and demographic surveillance system (HDSS) data and examined the feasibility of applying the method by field-level programme supervisors. The study was carried out in Matlab, the field site of ICDDR,B, where a HDSS has been in place for over 30 years. The LQAS method was applied in 57 work-areas of community health workers i...
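    The LQAS decision rule underlying the study is simple to state: sample n individuals from a work-area and classify the area as adequately performing only if at least d of them show the desired outcome, with d chosen from binomial tail probabilities. A generic sketch follows; the n = 19 lot size and 50%/80% coverage thresholds are conventional LQAS illustration values, not figures taken from the Matlab study.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def decision_threshold(n, p_low, p_high, alpha=0.10):
    """Smallest pass mark d such that a truly low-coverage area
    (coverage p_low) is classified adequate with probability <= alpha.
    Also reports the risk of failing a truly high-coverage area."""
    for d in range(n + 1):
        pass_if_low = 1.0 - binom_cdf(d - 1, n, p_low)   # P(X >= d | p_low)
        if pass_if_low <= alpha:
            fail_if_high = binom_cdf(d - 1, n, p_high)   # P(X < d | p_high)
            return d, pass_if_low, fail_if_high
    return None

# Conventional LQAS lot: n = 19, distinguishing 50% from 80% coverage.
d, err_low, err_high = decision_threshold(19, 0.5, 0.8)
print(d)  # 13: pass an area only if >= 13 of 19 sampled people are covered
```

    The appeal for field supervisors, as the paper examines, is that once d is fixed the rule requires only counting successes in a small sample.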

  19. Total error components - isolation of laboratory variation from method performance

    International Nuclear Information System (INIS)

    Bottrell, D.; Bleyler, R.; Fisk, J.; Hiatt, M.

    1992-01-01

    The consideration of total error across sampling and analytical components of environmental measurements is relatively recent. The U.S. Environmental Protection Agency (EPA), through the Contract Laboratory Program (CLP), provides complete analyses and documented reports on approximately 70,000 samples per year. The quality assurance (QA) functions of the CLP procedures provide an ideal database, the CLP Automated Results Data Base (CARD), with which to evaluate program performance relative to quality control (QC) criteria and to evaluate the analysis of blind samples. Repetitive analyses of blind samples within each participating laboratory provide a mechanism to separate laboratory and method performance. Isolation of error sources is necessary to identify effective options, to establish performance expectations, and to improve procedures. In addition, optimized method performance is necessary to identify significant effects that result from the selection among alternative procedures in the data collection process (e.g., sampling device, storage container, mode of sample transit, etc.). This information is necessary to evaluate data quality, to understand overall quality, and to provide appropriate, cost-effective information required to support a specific decision
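    Separating laboratory performance from method performance via repeated blind samples amounts to a variance-components decomposition. A minimal one-way random-effects sketch is shown below; the lab count, replicate count, and variances are made-up illustration values, not CLP/CARD data.

```python
import numpy as np

def variance_components(results):
    """One-way random-effects decomposition. results[i, j] is replicate j
    of the same blind sample at laboratory i. Returns the within-lab
    (method repeatability) variance and the between-lab variance."""
    k, n = results.shape
    lab_means = results.mean(axis=1)
    grand = results.mean()
    ms_within = ((results - lab_means[:, None]) ** 2).sum() / (k * (n - 1))
    ms_between = n * ((lab_means - grand) ** 2).sum() / (k - 1)
    var_between = max((ms_between - ms_within) / n, 0.0)
    return ms_within, var_between

# 8 labs, 5 blind replicates each; true between-lab variance 4, within 1.
rng = np.random.default_rng(1)
lab_bias = rng.normal(0.0, 2.0, size=8)
data = 50.0 + lab_bias[:, None] + rng.normal(0.0, 1.0, size=(8, 5))
within, between = variance_components(data)
print(within, between)
```

    The within-lab estimate characterizes the method itself, while the between-lab component isolates laboratory-specific bias, which is the separation the CLP blind-sample program makes possible.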

  20. Resampling Approach for Determination of the Method for Reference Interval Calculation in Clinical Laboratory Practice

    Science.gov (United States)

    Pavlov, Igor Y.; Wilson, Andrew R.; Delgado, Julio C.

    2010-01-01

    Reference intervals (RI) play a key role in clinical interpretation of laboratory test results. Numerous articles are devoted to analyzing and discussing various methods of RI determination. The two most widely used approaches are the parametric method, which assumes data normality, and a nonparametric, rank-based procedure. The decision about which method to use is usually made arbitrarily. The goal of this study was to demonstrate that using a resampling approach for the comparison of RI determination techniques could help researchers select the right procedure. Three methods of RI calculation—parametric, transformed parametric, and quantile-based bootstrapping—were applied to multiple random samples drawn from 81 values of complement factor B observations and from a computer-simulated normally distributed population. It was shown that differences in RI between legitimate methods could be 20% or more. The transformed parametric method was found to be the best method for the calculation of RI of non-normally distributed factor B estimations, producing an unbiased RI and the lowest confidence limits and interquartile ranges. For a simulated Gaussian population, parametric calculations, as expected, were the best; quantile-based bootstrapping produced biased results at low sample sizes, and the transformed parametric method generated heavily biased RI. The resampling approach could help compare different RI calculation methods. An algorithm showing a resampling procedure for choosing the appropriate method for RI calculations is included. PMID:20554803
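    Two of the RI procedures compared above, the parametric method and quantile-based bootstrapping, can be sketched in a few lines. This is a hedged illustration: the sample size, distribution, and bootstrap settings are invented, not the complement factor B data from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def parametric_ri(x):
    """Central 95% reference interval assuming normality: mean +/- 1.96 SD."""
    m, s = x.mean(), x.std(ddof=1)
    return m - 1.96 * s, m + 1.96 * s

def bootstrap_ri(x, n_boot=2000):
    """Quantile-based bootstrap RI: resample with replacement and average
    the 2.5th and 97.5th percentiles across bootstrap replicates."""
    lo = np.empty(n_boot)
    hi = np.empty(n_boot)
    for b in range(n_boot):
        s = rng.choice(x, size=x.size, replace=True)
        lo[b] = np.percentile(s, 2.5)
        hi[b] = np.percentile(s, 97.5)
    return lo.mean(), hi.mean()

x = rng.normal(100.0, 10.0, size=200)  # simulated Gaussian analyte values
print(parametric_ri(x))   # roughly (80, 120)
print(bootstrap_ri(x))    # similar for this well-behaved sample
```

    For skewed data the two answers diverge, which is exactly the situation where the study's resampling comparison helps choose the appropriate method.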