WorldWideScience

Sample records for processing methodologies result

  1. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    Science.gov (United States)

    Mokrova, Nataliya V.

    2018-03-01

The methodology of system analysis allows us to derive a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time for producing the maximum amount of the target product is confirmed. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal hardening mode. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.
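
    As a rough illustration of how such a system of differential equations might be integrated numerically, the sketch below couples a hypothetical reagent-to-product conversion with an exponential quench of the jet temperature; the species, rate constants and quench law are invented placeholders, not the model from the paper.

```python
# Illustrative sketch only: a hypothetical quench-kinetics system integrated
# numerically, in the spirit of the reagent-concentration / temperature model
# described above. Rate constants, species and the cooling law are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, k0, Ea, R, tau_q):
    c_a, c_b, T = y                      # reagent, target product, jet temperature [K]
    k = k0 * np.exp(-Ea / (R * T))       # Arrhenius rate constant
    r = k * c_a                          # conversion of reagent into product
    dT = -(T - 300.0) / tau_q            # exponential quench towards ambient
    return [-r, r, dT]

sol = solve_ivp(rhs, (0.0, 1e-3), [1.0, 0.0, 5000.0],
                args=(1e6, 8.0e4, 8.314, 1e-4), method="LSODA")
print("final product concentration:", sol.y[1, -1])
```

    Varying the assumed quench time constant in such a sketch is one way to see why the quenching rate controls the final amount of product.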

  2. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. Firstly, the gap in contemporary Business Process Modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified aspects of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems that were encountered during the project. Based on these problems, together with the results of the methodology evaluation, the future development needed for the methodology is outlined.

  3. A Design Methodology for Medical Processes

    Science.gov (United States)

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

Background: Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of the patient’s compliance to treatment. Also, the multiple actors involved in a patient’s care need clear and transparent communication to ensure care coordination. Objectives: In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. Methods: The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results: The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests, which was also implemented. Conclusions: Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently of the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  4. A methodology to describe process control requirements

    International Nuclear Information System (INIS)

    Carcagno, R.; Ganni, V.

    1994-01-01

This paper presents a methodology to describe process control requirements for helium refrigeration plants. The SSC requires a greater level of automation for its refrigeration plants than is common in the cryogenics industry, and traditional methods (e.g., written descriptions) used to describe process control requirements are not sufficient. The methodology presented in this paper employs tabular and graphic representations in addition to written descriptions. The resulting document constitutes a tool for efficient communication among the different people involved in the design, development, operation, and maintenance of the control system. The methodology is not limited to helium refrigeration plants, and can be applied to any process with similar requirements. The paper includes examples.

  5. A Design Methodology for Medical Processes.

    Science.gov (United States)

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of the patient's compliance to treatment. Also, the multiple actors involved in a patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break down complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests, which was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently of the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.

  6. Phenomena based Methodology for Process Synthesis incorporating Process Intensification

    DEFF Research Database (Denmark)

    Lutze, Philip; Babi, Deenesh Kavi; Woodley, John

    2013-01-01

Process intensification (PI) has the potential to improve existing as well as conceptual processes, in order to achieve a more sustainable production. PI can be achieved at different levels, that is, the unit operations, functional and/or phenomena level. The highest impact is expected by looking at processes at the lowest level of aggregation, which is the phenomena level. In this paper, a phenomena based synthesis/design methodology incorporating process intensification is presented. Using this methodology, a systematic identification of necessary and desirable (integrated) phenomena as well as generation and screening of phenomena based flowsheet options are presented using a decomposition based solution approach. The developed methodology as well as necessary tools and supporting methods are highlighted through a case study involving the production of isopropyl-acetate.

  7. A methodology for development of biocatalytic processes

    DEFF Research Database (Denmark)

    Lima Ramos, Joana

…). These process metrics can often be attained by improvements in the reaction chemistry, the biocatalyst, and/or by process engineering, which often requires a complex process development strategy. Interestingly, this complexity, which arises from the need for integration of biological and process technologies … and their relationship with the overall process is not clear. The work described in this thesis presents a methodological approach for early stage development of biocatalytic processes, understanding and dealing with the reaction, biocatalyst and process constraints. When applied, this methodology has a decisive role … are available. The first case study presents a rational approach for defining a development strategy for multi-enzymatic processes. The proposed methodology requires a profound and structured knowledge of the multi-enzyme systems, integrating chemistry, biological and process engineering. In order to suggest …

  8. Evaluation methodology based on physical security assessment results: a utility theory approach

    International Nuclear Information System (INIS)

    Bennett, H.A.; Olascoaga, M.T.

    1978-03-01

This report describes an evaluation methodology which aggregates physical security assessment results for nuclear facilities into an overall measure of adequacy. This methodology utilizes utility theory and conforms to a hierarchical structure developed by the NRC. Implementation of the methodology is illustrated by several examples. Recommendations for improvements in the evaluation process are given.
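
    The report's own hierarchy, weights and utility values are not reproduced here; as a rough illustration of the weighted-utility aggregation idea, the sketch below rolls hypothetical leaf utilities up a two-level hierarchy into a single adequacy score.

```python
# Minimal sketch of hierarchical, weighted utility aggregation of assessment
# results into one adequacy measure. Categories, weights and raw utilities are
# hypothetical placeholders, not those of the NRC hierarchy in the report.
def aggregate(node):
    """Recursively roll leaf utilities (0..1) up a weighted hierarchy."""
    if "utility" in node:                       # leaf: already a utility value
        return node["utility"]
    total = sum(w for w, _ in node["children"])
    return sum(w * aggregate(child) for w, child in node["children"]) / total

physical_security = {
    "children": [
        (0.4, {"children": [(0.7, {"utility": 0.8}),    # detection
                            (0.3, {"utility": 0.6})]}),  # alarm assessment
        (0.3, {"utility": 0.9}),                         # delay barriers
        (0.3, {"utility": 0.5}),                         # response force
    ]
}
print(f"overall adequacy: {aggregate(physical_security):.2f}")
```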

  9. Resection methodology for PSP data processing: Recent ...

    Indian Academy of Sciences (India)


PSP data processing, which primarily involves image alignment and image analysis, is a crucial element in obtaining accurate PSP results. There are two broad approaches to image alignment: the algebraic transformation technique, often called image-warping technique, and resection methodology, which uses ...

  10. WORK ALLOCATION IN COMPLEX PRODUCTION PROCESSES: A METHODOLOGY FOR DECISION SUPPORT

    OpenAIRE

    de Mello, Adriana Marotti; School of Economics, Business and Accounting at the University of São Paulo; Marx, Roberto; Polytechnic School, University of São Paulo; Zilbovicius, Mauro; Polytechnic School – University of São Paulo

    2013-01-01

This article presents the development of a Methodology of Decision Support for Work Allocation in complex production processes. It is known that this decision is frequently taken empirically and that the methodologies available to support it are few and restricted in terms of their conceptual basis. The study of Times and Motion is one of these methodologies, but its applicability is restricted in cases of more complex production processes. The method presented here was developed as a result of...

  11. Digital Methodology to implement the ECOUTER engagement process.

    Science.gov (United States)

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

ECOUTER (Employing COnceptual schema for policy and Translation E in Research – French for 'to listen') is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes.

  12. SHIPBUILDING PRODUCTION PROCESS DESIGN METHODOLOGY USING COMPUTER SIMULATION

    OpenAIRE

    Marko Hadjina; Nikša Fafandjel; Tin Matulja

    2015-01-01

In this research, a shipbuilding production process design methodology using computer simulation is suggested. The suggested methodology is expected to provide a better and more efficient tool for the design of complex shipbuilding production processes. In the first part of this research, existing practice for production process design in shipbuilding is discussed, and its shortcomings and problems are emphasized. In continuation, the discrete event simulation modelling method, as the basis of the sugge...

  13. Digital processing methodology applied to exploring of radiological images

    International Nuclear Information System (INIS)

    Oliveira, Cristiane de Queiroz

    2004-01-01

In this work, digital image processing is applied as an automatic computational method aimed at the exploration of radiological images. An automatic routine, based on segmentation and post-processing techniques, was developed for radiological images acquired from an arrangement consisting of an X-ray tube, a target and filter of molybdenum of 0.4 mm and 0.03 mm, respectively, and a CCD detector. The efficiency of the developed methodology is demonstrated through a case study in which internal injuries in mangoes are automatically detected and monitored. This methodology is a possible tool to be introduced in the post-harvest process in packing houses. A dichotomic test was applied to evaluate the efficiency of the method. The results show 87.7% correct diagnoses and 12.3% incorrect diagnoses, with a sensitivity of 93% and a specificity of 80%. (author)
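
    For readers unfamiliar with the diagnostic indices quoted above, the sketch below shows how sensitivity, specificity and overall success rate are computed from a confusion matrix; the counts are invented for illustration and are not the study's data.

```python
# Sketch of how the diagnostic indices quoted above are defined; the confusion-
# matrix counts below are made up for illustration and are not the study's data.
def diagnostic_indices(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

se, sp, acc = diagnostic_indices(tp=93, fn=7, tn=80, fp=20)
print(f"sensitivity={se:.0%}  specificity={sp:.0%}  accuracy={acc:.1%}")
```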

  14. Methodology of evaluation of value created in the productive processes

    OpenAIRE

    M.T. Roszak

    2008-01-01

Purpose: The purpose of this paper was to present the methodology of analysis of the productive processes with application of value analysis and multi-criterion analysis, which allow evaluation of the technology and organization of the productive processes. Design/methodology/approach: The methodology of evaluation of the productive processes presented in the paper is based on analysis of activities in the productive processes and their characteristics with reference to value created in the productive chain. Findings...

  15. Methodology of Fault Diagnosis in Ductile Iron Melting Process

    Directory of Open Access Journals (Sweden)

    Perzyk M.

    2016-12-01

Statistical Process Control (SPC), based on Shewhart's type control charts, is widely used in contemporary manufacturing industry, including many foundries. The main steps include process monitoring, detection of out-of-control signals, and identification and removal of their causes. Finding the root causes of process faults is often a difficult task and can be supported by various tools, including data-driven mathematical models. In the present paper a novel approach to statistical control of the ductile iron melting process is proposed. It is aimed at the development of methodologies suitable for effectively finding the causes of out-of-control signals in the process outputs, defined as ultimate tensile strength (Rm) and elongation (A5), based mainly on the chemical composition of the alloy. The methodologies are tested and presented using several real foundry data sets. First, correlations between standard abnormal output patterns (i.e. out-of-control signals) and corresponding input patterns are found, based on the detection of similar patterns and similar shapes of the run charts of the chemical element contents. It was found that in a significant number of cases there was no clear indication of a correlation, which can be attributed either to the complex, simultaneous action of several chemical elements or to causes related to other process variables, including melting, inoculation, spheroidization and pouring parameters as well as human errors. A conception of a methodology based on simulation of the process using advanced input-output regression modelling is presented. The preliminary tests have shown that it can be a useful tool in process control and is worth further development. The results obtained in the present study may not only be applied to the ductile iron process but can also be utilized in statistical quality control of a wide range of different discrete processes.
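
    As a rough illustration of the Shewhart-chart machinery this work builds on (not the authors' own charts or foundry data), the sketch below derives X-bar control limits from hypothetical subgroup measurements of an output such as Rm and flags out-of-control subgroups.

```python
# Minimal Shewhart X-bar chart sketch: control limits from subgroup means and
# ranges, and flagging of out-of-control points. The measurements are invented
# placeholders for an output such as Rm; A2 = 0.577 is the constant for n = 5.
import numpy as np

subgroups = np.array([[512, 520, 508, 515, 511],   # hypothetical Rm values, MPa
                      [518, 522, 525, 519, 521],
                      [495, 490, 502, 498, 493],
                      [516, 514, 517, 513, 515]])
xbar = subgroups.mean(axis=1)             # subgroup means
rbar = np.ptp(subgroups, axis=1).mean()   # mean subgroup range
center = xbar.mean()
ucl, lcl = center + 0.577 * rbar, center - 0.577 * rbar

for i, m in enumerate(xbar):
    flag = "out of control" if (m > ucl or m < lcl) else "ok"
    print(f"subgroup {i}: mean={m:.1f}  limits=[{lcl:.1f}, {ucl:.1f}]  {flag}")
```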

  16. A prioritization methodology to strategic planning process

    International Nuclear Information System (INIS)

    Rondinelli Junior, Francisco; Cherif, Hadj Slimane

    2009-01-01

In the process of formulating a Strategic Plan, there is always a step that deals with choices among different options and strategies. To do that, a prioritization methodology has to be applied in order to address the highest-priority needs identified along the analysis and evaluation of problems. To assign priorities within a set of needs/problems of a strategic nature, identified within various areas of activity or different sectors, a methodology is proposed that envisages the use of specific attributes for which a graded scale of values is established for each need/problem, which, at the end of the process, allows a quantitative comparison among them. The methodology presented in this paper was developed following an approach that has been used in many areas over the last 20 years by various public and private institutions, and also by international organizations involved in promotion and development work. (author)

  17. A methodology for quantitatively managing the bug fixing process using Mahalanobis Taguchi system

    Directory of Open Access Journals (Sweden)

    Boby John

    2015-12-01

Controlling the bug fixing process during the system testing phase of the software development life cycle is very important for fixing all the detected bugs within the scheduled time. The presence of open bugs often delays the release of the software or results in releasing the software with compromised functionalities. This can lead to customer dissatisfaction, cost overrun and eventually the loss of market share. In this paper, the authors propose a methodology to quantitatively manage the bug fixing process during system testing. The proposed methodology identifies the critical milestones in the system testing phase which differentiate the successful projects from the unsuccessful ones using the Mahalanobis Taguchi system. Then a model is developed to predict whether a project is successful or not, with the bug fix progress at critical milestones as control factors. Finally, the model is used to control the bug fixing process. It is found that the performance of the proposed methodology using the Mahalanobis Taguchi system is superior to that of models developed using other multi-dimensional pattern recognition techniques. The proposed methodology also reduces the number of control points, providing managers with more options and flexibility to utilize the bug fixing resources across the system testing phase. Moreover, the methodology allows managers to carry out mid-course corrections to bring the bug fixing process back on track so that all the detected bugs can be fixed on time. The methodology is validated with eight new projects and the results are very encouraging.
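
    The sketch below illustrates the Mahalanobis-distance computation at the core of the Mahalanobis Taguchi system, applied to a hypothetical project's bug-fix progress at three assumed milestones; the reference data, milestones and decision threshold are invented and do not come from the paper.

```python
# Sketch of the Mahalanobis-distance step underlying the Mahalanobis Taguchi
# system: distance of a project's bug-fix progress at selected milestones from a
# reference group of successful projects. Data, milestones and the threshold are
# invented for illustration only.
import numpy as np

# rows = successful projects, columns = fraction of bugs fixed at 3 milestones
reference = np.array([[0.35, 0.72, 0.95],
                      [0.42, 0.68, 0.97],
                      [0.30, 0.75, 0.90],
                      [0.38, 0.65, 0.96],
                      [0.33, 0.71, 0.92]])
mean = reference.mean(axis=0)
std = reference.std(axis=0, ddof=1)
corr_inv = np.linalg.inv(np.corrcoef(reference, rowvar=False))

def mahalanobis(x):
    z = (x - mean) / std                       # standardise against the reference group
    return float(z @ corr_inv @ z) / z.size    # scaled squared distance, MTS style

new_project = np.array([0.20, 0.55, 0.90])
md = mahalanobis(new_project)
print(f"MD = {md:.2f} ->", "bug fixing off track" if md > 2.0 else "on track")
```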

  18. Soft Systems Methodology Embedded in Organizational Knowledge-Creating Process

    OpenAIRE

    Yoshida, Taketoshi

    2005-01-01

We clarify the role of tacit knowing in the soft systems methodology. For this purpose, we investigate the basic structure of its seven-stage model, while embedding the soft systems methodology in the organizational knowledge-creating process. This leads to the introduction of concept creation into the methodology. This changes the basic shape of the soft systems methodology from a learning cycle to an organizational knowledge-creating spiral, where concept creation is the key point.

  19. An optimization methodology for identifying robust process integration investments under uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden); Patriksson, Michael [Department of Mathematical Sciences, Chalmers University of Technology and Department of Mathematical Sciences, University of Gothenburg, SE-412 96 Goeteborg (Sweden)

    2009-02-15

    Uncertainties in future energy prices and policies strongly affect decisions on investments in process integration measures in industry. In this paper, we present a five-step methodology for the identification of robust investment alternatives incorporating explicitly such uncertainties in the optimization model. Methods for optimization under uncertainty (or, stochastic programming) are thus combined with a deep understanding of process integration and process technology in order to achieve a framework for decision-making concerning the investment planning of process integration measures under uncertainty. The proposed methodology enables the optimization of investments in energy efficiency with respect to their net present value or an environmental objective. In particular, as a result of the optimization approach, complex investment alternatives, allowing for combinations of energy efficiency measures, can be analyzed. Uncertainties as well as time-dependent parameters, such as energy prices and policies, are modelled using a scenario-based approach, enabling the identification of robust investment solutions. The methodology is primarily an aid for decision-makers in industry, but it will also provide insight for policy-makers into how uncertainties regarding future price levels and policy instruments affect the decisions on investments in energy efficiency measures. (author)

  20. An optimization methodology for identifying robust process integration investments under uncertainty

    International Nuclear Information System (INIS)

    Svensson, Elin; Berntsson, Thore; Stroemberg, Ann-Brith; Patriksson, Michael

    2009-01-01

    Uncertainties in future energy prices and policies strongly affect decisions on investments in process integration measures in industry. In this paper, we present a five-step methodology for the identification of robust investment alternatives incorporating explicitly such uncertainties in the optimization model. Methods for optimization under uncertainty (or, stochastic programming) are thus combined with a deep understanding of process integration and process technology in order to achieve a framework for decision-making concerning the investment planning of process integration measures under uncertainty. The proposed methodology enables the optimization of investments in energy efficiency with respect to their net present value or an environmental objective. In particular, as a result of the optimization approach, complex investment alternatives, allowing for combinations of energy efficiency measures, can be analyzed. Uncertainties as well as time-dependent parameters, such as energy prices and policies, are modelled using a scenario-based approach, enabling the identification of robust investment solutions. The methodology is primarily an aid for decision-makers in industry, but it will also provide insight for policy-makers into how uncertainties regarding future price levels and policy instruments affect the decisions on investments in energy efficiency measures. (author)
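
    As a rough illustration of the scenario-based evaluation idea (not the paper's stochastic programming model), the sketch below compares two hypothetical investment alternatives on expected and worst-case net present value across assumed energy-price scenarios.

```python
# Tiny sketch of the scenario idea: each investment alternative is evaluated
# under several energy-price scenarios and compared on expected and worst-case
# net present value. Alternatives, prices, probabilities and the simple annual
# cash-flow model are invented placeholders, not the paper's optimization model.
def npv(investment, annual_saving_mwh, price, years=10, rate=0.08):
    cash = sum(annual_saving_mwh * price / (1 + rate) ** t for t in range(1, years + 1))
    return cash - investment

scenarios = [(0.3, 40.0), (0.5, 60.0), (0.2, 90.0)]        # (probability, EUR/MWh)
alternatives = {"heat exchanger retrofit": (0.4e6, 1500.0),      # (investment, MWh saved/yr)
                "full heat-pump integration": (1.8e6, 4500.0)}

for name, (inv, saving) in alternatives.items():
    npvs = [npv(inv, saving, p) for _, p in scenarios]
    expected = sum(prob * v for (prob, _), v in zip(scenarios, npvs))
    print(f"{name}: expected NPV = {expected/1e6:.2f} MEUR, worst case = {min(npvs)/1e6:.2f} MEUR")
```

    In this toy comparison the larger investment has the higher upside but the worse worst case, which is exactly the kind of trade-off a robustness-oriented scenario analysis is meant to expose.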

  1. Methodology for the evaluation process in the directors' preparation in education.

    Directory of Open Access Journals (Sweden)

    Humberto Clemente Calderón Echevarría

    2014-03-01

The presented work proposes a methodology oriented to the evaluation of the process of directors' preparation, which may contribute to the improvement of the program. It explains the need for the evaluation, the activity as such, the indicators to evaluate, the methods and techniques to be used, and the steps in which these have to be carried out. Until now, no methodology has existed to evaluate the process of directors' preparation in the educational sector. The development of this methodology has as a background the results obtained by means of different investigations made in the Provincial Post Office and later applied in the Provincial Department of Education. Nowadays it is being perfected in the Pedagogical University "Capitán Silverio Blanco Núñez", hence the opportunities for its employment in similar processes of other entities. The proposed methodology conceives the evaluation of the process of directors' preparation as flowing in a cyclical, continuous, flexible and interactive manner, away from the traditional linear, rigid and schematic formula. From the above idea, four stages can be identified, with the relevant procedures, for the evaluation of the process of directors' preparation in education.

  2. Department of Energy's process waste assessment graded approach methodology

    International Nuclear Information System (INIS)

    Pemberton, S.E.

    1994-03-01

As the initial phase of the formal waste minimization program, the Department of Energy requires assessments of all its waste-generating operations. These assessments, called process waste assessments (PWAs), are a tool that helps achieve the pollution prevention goals. The DOE complex comprises numerous sites located in many different states. The facilities as a whole represent a tremendous diversity of technologies, processes, and activities. Due to this diversity, there are also a wide variety and number of waste streams generated. Many of these waste streams are small, intermittent, and not of consistent composition. The PWA graded approach methodology addresses these complexities and recognizes that processes vary in the quantity of pollution they generate, as well as in the perceived risk and associated hazards. Therefore, the graded approach was developed to provide a cost-effective and flexible methodology which allows individual sites to prioritize their local concerns and align their efforts with the resources allocated. This presentation will describe a project sponsored by the DOE Office of Environmental Restoration and Waste Management, Waste Minimization Division, which developed a graded approach methodology for use throughout the DOE. This methodology was initiated in FY93 through a combined effort of the following DOE/Defense Program sites: Kansas City Plant, Lawrence Livermore National Laboratory, Los Alamos National Laboratory, Sandia National Laboratories. This presentation will describe the process waste assessment tool, benefits achieved through the completion of PWAs, DOE's graded approach methodology, and an update on the project's current status.

  3. A Generic Methodology for Superstructure Optimization of Different Processing Networks

    DEFF Research Database (Denmark)

    Bertran, Maria-Ona; Frauzem, Rebecca; Zhang, Lei

    2016-01-01

    In this paper, we propose a generic computer-aided methodology for synthesis of different processing networks using superstructure optimization. The methodology can handle different network optimization problems of various application fields. It integrates databases with a common data architecture......, a generic model to represent the processing steps, and appropriate optimization tools. A special software interface has been created to automate the steps in the methodology workflow, allow the transfer of data between tools and obtain the mathematical representation of the problem as required...

  4. Intersystem LOCA risk assessment: methodology and results

    International Nuclear Information System (INIS)

    Galyean, W.J.; Kelly, D.L.; Schroeder, J.A.; Auflick, L.J.; Blackman, H.S.; Gertman, D.I.; Hanley, L.N.

    1994-01-01

The United States Nuclear Regulatory Commission is sponsoring a research program to develop an improved understanding of the human factors, hardware and accident consequence issues that dominate the risk from an intersystem loss-of-coolant accident (ISLOCA) at a nuclear power plant. To accomplish the goals of this program, a methodology has been developed for estimating ISLOCA core damage frequency and risk. The steps in this methodology are briefly described, along with the results obtained from an application of the methodology at three pressurized water reactors. Also included are the results of a screening study of boiling water reactors. (orig.)

  5. Methodological Strategies for Studying the Process of Learning, Memory and Visual Literacy.

    Science.gov (United States)

    Randhawa, Bikkar S.; Hunt, Dennis

    An attempt is made to discuss current models of information processing, learning, and development, thereby suggesting adequate methodological strategies for research in visual literacy. It is maintained that development is a cumulative process of learning, and that learning and memory are the result of new knowledge, sensations, etc. over a short…

  6. The evaluation framework for business process management methodologies

    Directory of Open Access Journals (Sweden)

    Sebastian Lahajnar

    2016-06-01

In the intense competition of the global market, organisations seek to take advantage of all their internal and external potentials, advantages, and resources. It has been found that, in addition to competitive products and services, a good business also requires effective management of business processes, which is the discipline of business process management (BPM). The introduction of BPM in the organisation requires a thoughtful selection of an appropriate methodological approach, since the latter will formalize activities, products, applications and other efforts of the organisation in this field. Despite the many technology-driven solutions of software companies, recommendations of consulting companies, techniques, good practices and tools, the decision on which methodology to choose is anything but simple. The aim of this article is to simplify the adoption of such decisions by building a framework for the evaluation of BPM methodologies according to a qualitative multi-attribute decision-making method. The framework defines a hierarchical decision-making model, formalizes the decision-making process and thus contributes significantly to an independent, credible final decision that is the most appropriate for a specific organisation.

  7. Navigating the Process of Ethical Approval: A methodological note

    Directory of Open Access Journals (Sweden)

Eileen Carey, RNID, BSc. (Hons), MSc.

    2010-12-01

Classic grounded theory (CGT) methodology is a general methodology whereby the researcher aims to develop an emergent conceptual theory from empirical data collected by the researcher during the research study. Gaining ethical approval from relevant ethics committees to access such data is the starting point for processing a CGT study. The adoption of the Universal Declaration on Bioethics and Human Rights (UNESCO, 2005) is an indication of global consensus on the importance of research ethics. There is, however, a wide variation of health research systems across countries and disciplines (Hearnshaw 2004). Institutional Research Boards (IRB) or Research Ethics Committees (REC) have been established in many countries to regulate ethical research, ensuring that researchers agree to, and adhere to, specific ethical and methodological conditions prior to ethical approval being granted. Interestingly, both the processes and outcomes through which the methodological aspects pertinent to CGT studies are agreed between the researcher and the ethics committee remain largely ambiguous and vague. Therefore, meeting the requirements for ethical approval from ethics committees, while enlisting the CGT methodology as the chosen research approach, can be daunting for novice researchers embarking upon their first CGT study.

  8. Methodological Approach for Modeling of Multienzyme in-pot Processes

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia; Roman Martinez, Alicia; Sin, Gürkan

    2011-01-01

    This paper presents a methodological approach for modeling multi-enzyme in-pot processes. The methodology is exemplified stepwise through the bi-enzymatic production of N-acetyl-D-neuraminic acid (Neu5Ac) from N-acetyl-D-glucosamine (GlcNAc). In this case study, sensitivity analysis is also used ...

  9. [Improving inpatient pharmacotherapeutic process by Lean Six Sigma methodology].

    Science.gov (United States)

    Font Noguera, I; Fernández Megía, M J; Ferrer Riquelme, A J; Balasch I Parisi, S; Edo Solsona, M D; Poveda Andres, J L

    2013-01-01

Lean Six Sigma methodology has been used to improve care processes, eliminate waste, reduce costs, and increase patient satisfaction. The objective was to analyse the results obtained with Lean Six Sigma methodology in the diagnosis and improvement of the inpatient pharmacotherapy process during structural and organisational changes in a tertiary hospital. The setting was a 1,000-bed tertiary hospital, and the design a prospective observational study. The define, measure, analyse, improve and control (DMAIC) phases were deployed from March to September 2011. An initial Project Charter was updated as results were obtained. The study included 131 patients with treatments prescribed within 24 h after admission and with 4 drugs. The measurements were safety indicators (medication errors) and efficiency indicators (complaints and time delays). The proportion of patients with a medication error was reduced from 61.0% (25/41 patients) to 55.7% (39/70 patients) in four months. The percentage of errors (relative to the opportunities for error) decreased in the different phases of the process: prescription, from 5.1% (19/372 opportunities) to 3.3% (19/572 opportunities); preparation, from 2.7% (14/525 opportunities) to 1.3% (11/847 opportunities); and administration, from 4.9% (16/329 opportunities) to 3.0% (13/433 opportunities). Nursing complaints decreased from 10.0% (2119/21038 patients) to 5.7% (1779/31097 patients). The estimated economic impact was 76,800 euros saved. An improvement in the pharmacotherapeutic process and a positive economic impact were observed, as well as enhanced patient safety and efficiency of the organization. Standardisation and professional training are future Lean Six Sigma candidate projects. Copyright © 2012 SECA. Published by Elsevier Espana. All rights reserved.

  10. A Methodology for Optimization in Multistage Industrial Processes: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Piotr Jarosz

    2015-01-01

The paper introduces a methodology for optimization in multistage industrial processes with multiple quality criteria. Two ways of formulating the optimization problem and four different approaches to solving the problem are considered. The proposed methodologies were tested first on a virtual process described by benchmark functions and then applied in the optimization of a multistage lead refining process.

  11. Applying Statistical Process Quality Control Methodology to Educational Settings.

    Science.gov (United States)

    Blumberg, Carol Joyce

A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X̄ (mean), R (range), X (individual observations), MR (moving…

  12. High risk process control system assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

Santos, Venetia [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), RJ (Brazil); Zamberlan, Maria Cristina [National Institute of Technology (INT), Rio de Janeiro, RJ (Brazil). Human Reliability and Ergonomics Research Group for the Oil, Gas and Energy Sector

    2009-07-01

The evolution of ergonomics methodology has become necessary due to the dynamics imposed by the work environment, the increased need for human cooperation, and the high interaction between various sections within a company. Over the last 25 years, based on studies made in high-risk process control, we have developed a methodology to evaluate these situations that focuses on the assessment of activities and human cooperation, the assessment of context, the assessment of the impact of the work of other sectors on the final activity of the operator, as well as the modeling of existing risks. (author)

  13. An Innovative Synthesis Methodology for Process Intensification

    DEFF Research Database (Denmark)

    Lutze, Philip

    to improve a process. However, to date only a limited number have achieved implementation in industry, such as reactive distillation, dividing wall columns and reverse flow reactors. A reason for this is that the identification of the best PI option is neither simple nor systematic. That is to decide where......‐based solution approach. Starting from an analysis of existing processes, the methodology generates a set of PI process options. Subsequently, the initial search space is reduced through an ordered sequence of steps. As the search space decreases, more process details are added, increasing the complexity...

  14. Intelligent systems/software engineering methodology - A process to manage cost and risk

    Science.gov (United States)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  15. THE ASSESSMENT METHODOLOGY PDCA/PDSA – A METHODOLOGY FOR COORDINATING THE EFFORTS TO IMPROVE THE ORGANIZATIONAL PROCESSES TO ACHIEVE EXCELLENCE

    Directory of Open Access Journals (Sweden)

    Cristina Raluca POPESCU

    2015-07-01

In the paper “The Assessment Methodology PDCA/PDSA – A Methodology for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence” the authors present the basic features of the assessment methodology PDCA/PDSA, which is designed to coordinate the efforts to improve the organizational processes in order to achieve excellence. In the first part of the paper (the introduction), the authors present the general background concerning the performance of management business processes and the importance of achieving excellence and, furthermore, of correctly assessing/evaluating it. In the second part of the paper (the assessment methodology PDCA/PDSA as a methodology for coordinating the efforts to improve the organizational processes to achieve excellence), the authors describe the characteristics of the assessment methodology PDCA/PDSA from a theoretical point of view. In the current state of the global economy, global performance includes economic, social and environmental issues, while effectiveness and efficiency acquire new dimensions, both quantitative and qualitative. Performance needs to adopt a more holistic view of the interdependence of internal and external parameters, quantitative and qualitative, technical and human, physical and financial, thus leading to what we call today overall performance.

  16. MEDIATIC NARRATIVES AND IDENTIFICATION PROCESSES. A THEORETICAL AND METHODOLOGICAL APPROACH

    Directory of Open Access Journals (Sweden)

    Salomé Sola Morales

    2013-04-01

This theoretical and argumentative article lays the conceptual and methodological basis for the study of the link between identity, media narratives and the identification processes undertaken by individuals and groups. Thus, the formation of national, professional, religious or gender identifications is here proposed as the result of the dialectic between the 'media narrative identity', which the media produce and convey, and the identification processes that individuals and groups perform. Furthermore, we propose the use of the biographical method as a form of empirical approach to this psycho-social phenomenon.

  17. New methodological perspectives in pedagogical approach to teaching and learning processes in school education

    Directory of Open Access Journals (Sweden)

    Lorena Pérez Piña

    2016-03-01

For years, methodology has been approached through a focus on teaching styles and the pedagogy of situations, which was a breakthrough, but the need for individualization has been drifting towards new methodological and teaching perspectives, such as cooperative games, favourable environments, or communication processes (López, 2012), which show a new way of understanding methodology and teaching styles in Education: the point is not to understand these issues from the point of view of the teacher, but to see how, by attending to the cognitive and learning processes of students, we can better optimize our intervention. We have analyzed the latest publications of the last three years on methodology and teaching styles, using various databases. The results of this search have been quite limited, which may suggest that there have not been really relevant changes in the design of such education.

  18. Integrating rock mechanics issues with repository design through design process principles and methodology

    International Nuclear Information System (INIS)

    Bieniawski, Z.T.

    1996-01-01

A good designer needs not only knowledge for designing (technical know-how that is used to generate alternative design solutions) but also must have knowledge about designing (appropriate principles and systematic methodology to follow). Concepts such as "design for manufacture" or "concurrent engineering" are widely used in industry. In the field of rock engineering, only limited attention has been paid to the design process, because the design of structures in rock masses presents unique challenges to designers as a result of the uncertainties inherent in the characterization of geologic media. However, a stage has now been reached where we are able to sufficiently characterize rock masses for engineering purposes and identify the rock mechanics issues involved, but we are still lacking engineering design principles and a methodology to maximize our design performance. This paper discusses the principles and methodology of the engineering design process, directed at integrating site characterization activities with the design, construction and performance of an underground repository. Using the latest information from the Yucca Mountain Project on geology, rock mechanics and starter tunnel design, the current lack of integration is pointed out, and it is shown how rock mechanics issues can be effectively interwoven with repository design through a systematic design process methodology leading to improved repository performance. In essence, the design process is seen as the use of design principles within an integrating design methodology, leading to innovative problem solving. In particular, a new concept of "Design for Constructibility and Performance" is introduced. This is discussed with respect to ten rock mechanics issues identified for repository design and performance.

  19. The Behavior of Procurement Process as Described by Using System Dynamics Methodology

    OpenAIRE

    Mohd Yusoff, Mohd Izhan

    2018-01-01

    System dynamics methodology has been used in many fields of study which include supply chain, project management and performance, and procurement process. The said methodology enables the researchers to identify and study the impact of the variables or factors on the outcome of the model they developed. In this paper, we showed the use of system dynamics methodology in studying the behavior of procurement process that is totally different from those mentioned in previous studies. By using a t...

  20. METHODOLOGY FRAMEWORK FOR PROCESS INTEGRATION AND SERVICE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Darko Galinec

    2007-06-01

The history of information systems development was driven by the automation of business system functions and by mergers and acquisitions - the integration of business subjects into a whole. Modern business requires the integration of business processes through their dynamics, and thus enterprise application integration (EAI) as well. In this connection it is necessary to find ways and means of application integration and interaction in a consistent and reliable way. The real-time enterprise (RTE) monitors, captures and analyzes root causes and overt events that are critical to its success the instant those events occur [6]. EAI is determined by business needs and business requirements. It must be based on a business process repository and models, a business integration methodology (BIM) and information flow as well. Decisions concerning technology must be in the function of successful application integration. In this paper, an EAI methodological framework and the technological concepts for its achievement are introduced.

  1. Testing Methodology in the Student Learning Process

    Science.gov (United States)

    Gorbunova, Tatiana N.

    2017-01-01

The subject of the research is to build methodologies to evaluate student knowledge by testing. The author points to the importance of feedback about the level of mastery in the learning process. Testing is considered as a tool. The object of the study is to create test system models for defence practice problems. Special attention is paid…

  2. Computer Aided Methodology for Simultaneous Synthesis, Design & Analysis of Chemical Products-Processes

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2006-01-01

A new combined methodology for computer aided molecular design and process flowsheet design is presented. The methodology is based on the group contribution approach for prediction of molecular properties and design of molecules. Using the same principles, process groups have been developed together with their corresponding flowsheet property models. To represent the process flowsheets in the same way as molecules, a unique but simple notation system has been developed. The methodology has been converted into a prototype software, which has been tested with several case studies covering a wide range of problems. In this paper, only the computer aided flowsheet design related features are presented.

  3. The Assessment Methodology RADAR – A Theoretical Approach of a Methodology for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence

    Directory of Open Access Journals (Sweden)

    Cristina Raluca Popescu

    2015-05-01

In the paper “The Assessment Methodology RADAR – A Theoretical Approach of a Methodology for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence” the authors present the basic features of the assessment methodology RADAR that is designed to coordinate the efforts to improve the organizational processes in order to achieve excellence.

  4. Formation of a Methodological Approach to Evaluating the State of Management of Enterprise Flow Processes

    Directory of Open Access Journals (Sweden)

    Dzobko Iryna P.

    2016-02-01

The formation of a methodological approach to evaluating the state of management of enterprise flow processes has been considered. Proceeding from the theoretical propositions on the organization of management of enterprise flow processes developed and presented in the literature, the hypothesis of the study is the correlation of quantitative and qualitative evaluations of management effectiveness and the formation of an integral index on their basis. The article presents the stages of implementation of a methodological approach to evaluating the state of management of enterprise flow processes, which implies indicating the components, their characteristics and the methods of research. The composition of indicators, on the basis of which it is possible to evaluate the effectiveness of management of enterprise flow processes, has been determined. Grouping of such indicators based on the flow nature of enterprise processes has been performed. The grouping of indicators is justified by a pairwise determination of canonical correlations between the selected groups (the obtained high correlation coefficients confirmed the author's systematization of indicators). It is shown that the specificity of the formation of a methodological approach to evaluating the state of management of enterprise flow processes requires expansion in the direction of aggregation of the results and determination of factors that influence the effectiveness of flow process management. The article carries out such aggregation using factor analysis. The distribution of a set of objects into different classes according to the results of the cluster analysis has been presented. To obtain an integral estimation of the effectiveness of flow process management, the taxonomic index of a multidimensional object has been built. A peculiarity of the formed methodological approach to evaluating the state of management of enterprise flow processes is in the matrix correlation of integral indicators calculated on

  5. The trials methodological research agenda: results from a priority setting exercise

    Science.gov (United States)

    2014-01-01

    Background Research into the methods used in the design, conduct, analysis, and reporting of clinical trials is essential to ensure that effective methods are available and that clinical decisions made using results from trials are based on the best available evidence, which is reliable and robust. Methods An on-line Delphi survey of 48 UK Clinical Research Collaboration registered Clinical Trials Units (CTUs) was undertaken. During round one, CTU Directors were asked to identify important topics that require methodological research. During round two, their opinion about the level of importance of each topic was recorded, and during round three, they were asked to review the group’s average opinion and revise their previous opinion if appropriate. Direct reminders were sent to maximise the number of responses at each round. Results are summarised using descriptive methods. Results Forty one (85%) CTU Directors responded to at least one round of the Delphi process: 25 (52%) responded in round one, 32 (67%) responded in round two, 24 (50%) responded in round three. There were only 12 (25%) who responded to all three rounds and 18 (38%) who responded to both rounds two and three. Consensus was achieved amongst CTU Directors that the top three priorities for trials methodological research were ‘Research into methods to boost recruitment in trials’ (considered the highest priority), ‘Methods to minimise attrition’ and ‘Choosing appropriate outcomes to measure’. Fifty other topics were included in the list of priorities and consensus was reached that two topics, ‘Radiotherapy study designs’ and ‘Low carbon trials’, were not priorities. Conclusions This priority setting exercise has identified the research topics felt to be most important to the key stakeholder group of Directors of UKCRC registered CTUs. The use of robust methodology to identify these priorities will help ensure that this work informs the trials methodological research agenda, with

  6. THE QUALITY IMPROVEMENT OF PRIMER PACKAGING PROCESS USING SIX SIGMA METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Prima Ditahardiyani

    2008-01-01

The implementation of Six Sigma has become a common theme in many organizations. This paper presents the Six Sigma methodology and its implementation in the primer packaging process of a Cranberry drink. The DMAIC (Define, Measure, Analyze, Improve and Control) approach is used to analyze and to improve the primer packaging process, which had high variability and defective output. After the improvement, the results showed an increased sigma level. However, the increase is not significant and world-standard quality has not yet been achieved. Therefore, the implementation of Six Sigma in the primer packaging process of the Cranberry drink still has room for further research.
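
    For orientation, the sketch below shows how a process sigma level is commonly derived from an observed defect rate using the conventional 1.5-sigma long-term shift; the defect counts are invented and are not the packaging-line data reported in the paper.

```python
# Sketch of how a sigma level is commonly computed from an observed defect rate,
# using the conventional 1.5-sigma long-term shift. The counts below are invented
# and do not come from the packaging line studied in the paper.
from scipy.stats import norm

def sigma_level(defects, opportunities, shift=1.5):
    dpmo = defects / opportunities * 1_000_000
    return norm.ppf(1 - dpmo / 1_000_000) + shift, dpmo

before = sigma_level(defects=180, opportunities=10_000)
after = sigma_level(defects=95, opportunities=10_000)
print(f"before: {before[0]:.2f} sigma ({before[1]:.0f} DPMO)")
print(f"after:  {after[0]:.2f} sigma ({after[1]:.0f} DPMO)")
```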

  7. THE ASSESSMENT METHODOLOGIES PTELR, ADRI AND CAE – THREE METHODOLOGIES FOR COORDINATING THE EFFORTS TO IMPROVE THE ORGANIZATIONAL PROCESSES TO ACHIEVE EXCELLENCE

    Directory of Open Access Journals (Sweden)

    Cristina Raluca POPESCU

    2015-07-01

In the paper “The Assessment Methodologies PTELR, ADRI and CAE – Three Methodologies for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence” the authors present the basic features of the assessment methodologies PTELR, ADRI and CAE that are designed to coordinate the efforts to improve the organizational processes in order to achieve excellence. In the first part of the paper (the introduction), the authors present the general background concerning the performance of management business processes and the importance of achieving excellence and, furthermore, of correctly assessing/evaluating it. Aspects such as quality, quality control, quality assurance, performance and excellence are brought into discussion in the context generated by globalization, new technologies and new business models. Moreover, aspects regarding the methods employed to ensure quality, maintain it and continuously improve it, as well as total quality management, are also main pillars of the current research. In the body of the paper (the assessment methodologies PTELR, ADRI and CAE as methodologies for coordinating the efforts to improve the organizational processes to achieve excellence), the authors describe the characteristics of the assessment methodologies PTELR, ADRI and CAE from a theoretical point of view.

  8. Integration of an iterative methodology for exergoeconomic improvement of thermal systems with a process simulator

    International Nuclear Information System (INIS)

    Vieira, Leonardo S.; Donatelli, Joao L.; Cruz, Manuel E.

    2004-01-01

In this paper, we present the development and automated implementation of an iterative methodology for exergoeconomic improvement of thermal systems integrated with a process simulator, so as to be applicable to real, complex plants. The methodology combines recent available exergoeconomic techniques with new qualitative and quantitative criteria for the following tasks: (i) identification of decision variables that affect system total cost and exergetic efficiency; (ii) hierarchical classification of components; (iii) identification of predominant terms in the component total cost; and (iv) choice of main decision variables in the iterative process. To show the strengths and potential advantages of the proposed methodology, it is here applied to the benchmark CGAM cogeneration system. The results obtained are presented and discussed in detail and are compared to those reached using a mathematical optimization procedure.

  9. UHF Signal Processing and Pattern Recognition of Partial Discharge in Gas-Insulated Switchgear Using Chromatic Methodology.

    Science.gov (United States)

    Wang, Xiaohua; Li, Xi; Rong, Mingzhe; Xie, Dingli; Ding, Dan; Wang, Zhixiang

    2017-01-18

The ultra-high frequency (UHF) method is widely used in insulation condition assessment. However, UHF signal processing algorithms are complicated and the size of the result is large, which hinders extracting features and recognizing partial discharge (PD) patterns. This article investigated the chromatic methodology, which is novel in PD detection. The principles of chromatic methodologies in color science are introduced. Chromatic processing represents UHF signals sparsely. The UHF signals obtained from PD experiments were processed using the chromatic methodology and characterized by three parameters in chromatic space (H, L, and S, representing dominant wavelength, signal strength, and saturation, respectively). The features of the UHF signals were studied hierarchically. The results showed that the chromatic parameters were consistent with conventional frequency domain parameters. The global chromatic parameters can be used to distinguish UHF signals acquired by different sensors, and they reveal the propagation properties of the UHF signal in the L-shaped gas-insulated switchgear (GIS). Finally, typical PD defect patterns were recognized by using novel chromatic parameters in an actual GIS tank, and good recognition performance was achieved.
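
    The sketch below illustrates the chromatic idea under stated assumptions: three overlapping Gaussian processors span the signal spectrum, and their outputs R, G, B are mapped to H, L and S with an HSI-style transform. The filter placement, the synthetic spectrum and the exact transform are illustrative assumptions, not the paper's settings.

```python
# Illustrative chromatic processing sketch: three overlapping Gaussian processors
# (R, G, B) applied over a spectrum, then mapped to H (dominant wavelength),
# L (strength) and S (saturation) with an HSI-style transform. Filter centres,
# width and the test spectrum are assumptions made for this example.
import numpy as np

def chromatic_hls(freq, spectrum, centres=(0.5e9, 1.0e9, 1.5e9), width=0.4e9):
    df = freq[1] - freq[0]
    r, g, b = (np.sum(spectrum * np.exp(-0.5 * ((freq - c) / width) ** 2)) * df
               for c in centres)
    total = r + g + b
    L = total / 3.0                                    # nominal signal strength
    S = 1.0 - 3.0 * min(r, g, b) / total               # saturation
    theta = np.degrees(np.arccos(0.5 * ((r - g) + (r - b)) /
                                 np.sqrt((r - g) ** 2 + (r - b) * (g - b) + 1e-30)))
    H = theta if b <= g else 360.0 - theta             # hue / dominant-wavelength angle
    return H, L, S

freq = np.linspace(0.0, 2.0e9, 2001)                      # assumed 0-2 GHz UHF band
spectrum = np.exp(-0.5 * ((freq - 1.2e9) / 0.1e9) ** 2)   # synthetic PD spectrum
print("H=%.1f deg  L=%.3g  S=%.2f" % chromatic_hls(freq, spectrum))
```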

  10. THE ASSESSMENT METHODOLOGIES PTELR, ADRI AND CAE – THREE METHODOLOGIES FOR COORDINATING THE EFFORTS TO IMPROVE THE ORGANIZATIONAL PROCESSES TO ACHIEVE EXCELLENCE

    OpenAIRE

    Cristina Raluca POPESCU; Gheorghe N. POPESCU

    2015-01-01

    In the paper “The Assessment Methodologies PTELR, ADRI and CAE – Three Methodologies for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence” the authors present the basic features of the assessment methodologies PTELR, ADRI and CAE that are designed to coordinate the efforts to improve the organizational processes in order to achieve excellence. In the first part of the paper (the introduction of the paper), the authors present the general background concer...

  11. An integrated methodology for characterizing flow and transport processes in fractured rock

    International Nuclear Information System (INIS)

    Wu, Yu-Shu

    2007-01-01

    To investigate the coupled processes involved in fluid and heat flow and chemical transport in the highly heterogeneous, unsaturated-zone (UZ) fractured rock of Yucca Mountain, we present an integrated modeling methodology. This approach integrates a wide variety of moisture, pneumatic, thermal, and geochemical isotopic field data into a comprehensive three-dimensional numerical model for modeling analyses. The results of field applications of the methodology show that moisture data, such as water potential and liquid saturation, are not sufficient to determine in situ percolation flux, whereas temperature and geochemical isotopic data provide better constraints to net infiltration rates and flow patterns. In addition, pneumatic data are found to be extremely valuable in estimating large-scale fracture permeability. The integration of hydrologic, pneumatic, temperature, and geochemical data into modeling analyses is thereby demonstrated to provide a practical modeling approach for characterizing flow and transport processes in complex fractured formations

  12. Energy Demand Modeling Methodology of Key State Transitions of Turning Processes

    Directory of Open Access Journals (Sweden)

    Shun Jia

    2017-04-01

    Full Text Available Energy demand modeling of machining processes is the foundation of energy optimization. The energy demand of machining state transitions is an integral part of the energy requirements of the machining process. However, research focusing on the energy modeling of state transitions is scarce. To fill this gap, an energy demand modeling methodology for key state transitions of the turning process is proposed. The establishment of an energy demand model of state transitions could improve the accuracy of the energy model of the machining process, which also provides an accurate model and reliable data for energy optimization of the machining process. Finally, case studies were conducted on a CK6153i CNC lathe, with the results demonstrating that the predictive accuracy of the proposed method is generally above 90% for the state transition cases.
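
    To make the idea of a state-transition energy term concrete, the following minimal sketch estimates the energy of a single transition (for example a spindle start-up) by summing sampled power over the transition window; the sampling interval and logged values are hypothetical, and the paper's actual transition models are not reproduced here.

        def transition_energy(power_samples_w, sample_interval_s):
            # Rectangle-rule approximation of the energy (J) consumed during
            # one machine state transition, from sampled power readings (W).
            return sum(power_samples_w) * sample_interval_s

        # Hypothetical power log taken every 0.1 s while the spindle spins up
        energy_joules = transition_energy([120.0, 850.0, 1400.0, 900.0, 310.0], 0.1)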

  13. Digital Methodology to implement the ECOUTER engagement process [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Rebecca C. Wilson

    2017-01-01

    Full Text Available ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) – French for ‘to listen’ – is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes.

  14. An elastic-plastic fracture mechanics based methodology to characterize cracking behavior and its application to environmental assisted processes

    International Nuclear Information System (INIS)

    Alvarez, J.A.; Gutierrez-Solana, F.

    1999-01-01

    Cracking processes suffered by new structural and piping steels when used in petroleum or other energy installations have demonstrated the need for a cracking resistance characterization methodology. This methodology, valid for both elastic and elastoplastic regimes, should be able to define crack propagation kinetics as a function of their controlling local parameters. This work summarizes an experimental and analytical methodology that has been shown to be suitable for characterizing cracking processes using compact tensile specimens, especially subcritical environmentally assisted ones, such as those induced by hydrogen in microalloyed steels. The applied and validated methodology has been shown to offer quantitative results of cracking behavior and to correlate these with the existing fracture micromechanisms. (orig.)

  15. A SystemC-Based Design Methodology for Digital Signal Processing Systems

    Directory of Open Access Journals (Sweden)

    Christian Haubelt

    2007-03-01

    Full Text Available Digital signal processing algorithms are of great importance in many embedded systems. Due to their complexity and to the restrictions imposed on the implementations, new design methodologies are needed. In this paper, we present a SystemC-based solution supporting automatic design space exploration, automatic performance evaluation, as well as automatic system generation for mixed hardware/software solutions mapped onto FPGA-based platforms. Our proposed hardware/software codesign approach is based on a SystemC-based library called SysteMoC that permits the expression of different models of computation well known in the domain of digital signal processing. It combines the advantages of executability and analyzability of many important models of computation that can be expressed in SysteMoC. We will use the example of an MPEG-4 decoder throughout this paper to introduce our novel methodology. Results from a five-dimensional design space exploration and from automatically mapping parts of the MPEG-4 decoder onto a Xilinx FPGA platform will demonstrate the effectiveness of our approach.

  16. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    International Nuclear Information System (INIS)

    Barabas, Roberta de C.; Sabundjian, Gaiane

    2015-01-01

    When compared to other energy sources such as fossil fuels, coal, oil, and gas, nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme, consequently improving public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. Results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this has been shown to be an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  17. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    Energy Technology Data Exchange (ETDEWEB)

    Barabas, Roberta de C.; Sabundjian, Gaiane, E-mail: robertabarabas@usp.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    When compared to other energy sources such as fossil fuels, coal, oil, and gas, nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme, consequently improving public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. Results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this has been shown to be an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  18. A generic methodology for the design of sustainable carbon dioxide utilization processes using superstructure optimization

    DEFF Research Database (Denmark)

    Frauzem, Rebecca; Gani, Rafiqul

    , including as an extractive agent or raw material. Chemical conversion, an important element of utilization, involves the use of carbon dioxide as a reactant in the production of chemical compounds [2]. However, for feasible implementation, a systematic methodology is needed for the design of the utilization......, especially chemical conversion, processes. To achieve this, a generic methodology has been developed, which adopts a three-stage approach consisting in (i) process synthesis, (ii) process design, and (iii) innovative and sustainable design [3]. This methodology, with the individual steps and associated...... methods and tools, has been developed and applied to carbon dioxide utilization networks. This work will focus on the first stage, process synthesis, of this three-stage methodology; process synthesis is important in determining the appropriate processing route to produce products from a selection...

  19. Systematic screening methodology and energy efficient design of ionic liquid-based separation processes

    DEFF Research Database (Denmark)

    Kulajanpeng, Kusuma; Suriyapraphadilok, Uthaiporn; Gani, Rafiqul

    2016-01-01

    in size of the target solute was investigated using the same separation process and IL entrainer to obtain the same product purity. The proposed methodology has been evaluated through a case study of binary alcoholic aqueous azeotropic separation: water+ethanol and water+isopropanol.......A systematic methodology for the screening of ionic liquids (ILs) as entrainers and for the design of ILs-based separation processes in various homogeneous binary azeotropic mixtures has been developed. The methodology focuses on the homogeneous binary aqueous azeotropic systems (for example, water...

  20. Systematic substrate adoption methodology (SAM) for future flexible, generic pharmaceutical production processes

    DEFF Research Database (Denmark)

    Singh, Ravendra; Godfrey, Andy; Gregertsen, Björn

    2013-01-01

    (APIs) for early delivery campaigns. Of these candidates only a few will be successful such that further development is required to scale-up the process. Systematic computer-aided methods and tools are required for faster manufacturing of these API candidates. In this work, a substrate adoption...... methodology (SAM) for a series of substrates with similar molecular functionality has been developed. The objective is to achieve “flexible, fast and future” pharmaceutical production processes by adapting a generic modular process template. Application of the methodology is illustrated through a case study...

  1. Theory and methodology of social, political and economic processes risks determining in different countries of the world

    Directory of Open Access Journals (Sweden)

    Yashina Nadezhda, I.

    2015-06-01

    Full Text Available The study deals with problems of the theory and methodology of the risks of social, political and economic processes in different countries, using relative indicators of the level of socio-economic development as well as the size and condition of public debt. The methodology developed by the authors for determining the risks of social, political and economic processes of public policy around the world revealed a close relationship between the socio-economic situation of countries and their public debt. Within the framework of this methodology, two groups of factors characterizing the socio-political and economic processes in a country are developed. Each indicator is then processed using expert procedures. Maximum statutory values for tentatively referenced countries with effective and ineffective government policies are identified. Then standardization takes place, with the specification and definition of integral (generalized) indexes of socio-political and economic processes in the country. After that, the countries are ranked by the aggregated standardized ratio, taking into account the significance of the developed indicators. The final phase of implementing the methodology is identifying the risks of social, political and economic processes of public policy around the world: the ranking of countries by a ratio of stability in public policy (stability of economic and socio-political processes in the country). As a result of implementing the methodology, the following conclusion was reached: what really makes a difference is not the amount of a country's debt, but how effectively the country manages this debt and whether it has a goal to improve social and economic indicators. Practical testing of the methodology has proven that the studied indicators fully characterize the development of the countries and their political, social and economic situation on the world stage.
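
    The standardize-aggregate-rank step described above can be summarised in a few lines: the sketch below min-max standardizes each indicator across countries, combines the indicators into a weighted integral index, and ranks the countries. The indicator names, weights, and scaling rule are illustrative assumptions, not the authors' actual data.

        def rank_countries(data, weights):
            # data: {country: {indicator: raw value}}, weights: {indicator: significance}
            names = list(weights)
            lo = {k: min(row[k] for row in data.values()) for k in names}
            hi = {k: max(row[k] for row in data.values()) for k in names}
            scores = {}
            for country, row in data.items():
                # Min-max standardization followed by a weighted sum (integral index)
                z = {k: (row[k] - lo[k]) / ((hi[k] - lo[k]) or 1.0) for k in names}
                scores[country] = sum(weights[k] * z[k] for k in names)
            return sorted(scores.items(), key=lambda item: item[1], reverse=True)

        # Hypothetical example with two indicators of equal significance
        ranking = rank_countries(
            {"A": {"gdp_growth": 2.1, "debt_management": 0.8},
             "B": {"gdp_growth": 0.4, "debt_management": 0.5}},
            {"gdp_growth": 0.5, "debt_management": 0.5})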

  2. Methodology to analysis of aging processes of containment spray system

    International Nuclear Information System (INIS)

    Borges, D. da Silva; Lava, D.D.; Moreira, M. de L.; Ferreira Guimarães, A.C.; Fernandes da Silva, L.

    2015-01-01

    This paper presents a contribution to the study of the aging process of components in commercial Pressurized Water Reactor (PWR) plants. The motivation for this work emerged from the current nuclear perspective: numerous nuclear power plants worldwide have reached an advanced operating time. This situation requires a process to ensure the reliability of the operating systems of these plants; consequently, methodologies capable of estimating the failure probability of components and systems are necessary. In addition to the safety factors involved, such methodologies can be used to search for ways to ensure the extension of the life cycle of nuclear plants, which will inevitably undergo the decommissioning process after the operating time of 40 years. This process negatively affects power generation, besides demanding an enormous investment. Thus, this paper aims to present modeling techniques and sensitivity analysis which, together, can generate an estimate of how components that are more sensitive to the aging process will behave during the normal operation cycle of a nuclear power plant. (authors)

  3. Digital methodology to implement the ECOUTER engagement process [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Rebecca C. Wilson

    2016-06-01

    Full Text Available ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) – French for ‘to listen’ – is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes.

  4. Process monitoring for intelligent manufacturing processes - Methodology and application to Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas

    Process monitoring provides important information on the product, process and manufacturing system during part manufacturing. Such information can be used for process optimization and detection of undesired processing conditions to initiate timely actions for avoidance of defects, thereby improving...... quality assurance. This thesis is aimed at a systematic development of process monitoring solutions, constituting a key element of intelligent manufacturing systems towards zero defect manufacturing. A methodological approach of general applicability is presented in this concern. The approach consists...... of six consecutive steps for identification of product Vital Quality Characteristics (VQCs) and Key Process Variables (KPVs), selection and characterization of sensors, optimization of sensors placement, validation of the monitoring solutions, definition of the reference manufacturing performance

  5. A dual response surface optimization methodology for achieving uniform coating thickness in powder coating process

    Directory of Open Access Journals (Sweden)

    Boby John

    2015-09-01

    Full Text Available The powder coating is an economical, technologically superior and environmentally friendly painting technique compared with other conventional painting methods. However, large variation in coating thickness can reduce the attractiveness of powder coated products. The coating thickness variation can also adversely affect the surface appearance and corrosion resistivity of the product. This can eventually lead to customer dissatisfaction and loss of market share. In this paper, the author discusses a dual response surface optimization methodology to minimize the thickness variation around the target value for powder coated industrial enclosures. The industrial enclosures are cabinets used for mounting electrical and electronic equipment. The proposed methodology consists of establishing the relationship between the coating thickness and the powder coating process parameters and developing models for the mean and variance of coating thickness. The powder coating process is then optimized by minimizing the standard deviation of coating thickness subject to the constraint that the thickness mean be very close to the target. The study resulted in achieving a coating thickness mean of 80.0199 microns for industrial enclosures, which is very close to the target value of 80 microns. A comparison of the results of the proposed approach with those of existing methodologies showed that the suggested method is equally good or even better than the existing methodologies. The result of the study is also validated with a new batch of industrial enclosures.
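
    A dual response surface optimization of this kind can be expressed directly as a constrained minimization: minimize the fitted standard-deviation surface subject to the fitted mean surface hitting the 80-micron target. The sketch below uses SciPy with made-up second-order models for two coded factors; the coefficients and factor names are hypothetical and do not reproduce the study's fitted models.

        import numpy as np
        from scipy.optimize import minimize

        def mean_model(x):        # hypothetical fitted mean-thickness surface (microns)
            x1, x2 = x
            return 78.0 + 2.5 * x1 - 1.2 * x2 + 0.8 * x1 * x2

        def sd_model(x):          # hypothetical fitted standard-deviation surface (microns)
            x1, x2 = x
            return 4.0 - 0.9 * x1 + 0.6 * x2 + 0.5 * x1 ** 2 + 0.4 * x2 ** 2

        TARGET = 80.0
        result = minimize(
            sd_model, x0=np.zeros(2), method="SLSQP",
            bounds=[(-1.0, 1.0)] * 2,                                  # coded factor range
            constraints=[{"type": "eq", "fun": lambda x: mean_model(x) - TARGET}])
        # result.x holds the factor settings; mean_model(result.x) should sit near 80 microns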

  6. Comparison of risk assessment methodologies for nuclear power and nuclear fuels processing plants

    International Nuclear Information System (INIS)

    Durant, W.S.; Walker, D.H.

    1986-08-01

    The utilization of nuclear fission for the generation of electric power or other purposes has as its by-product radioactive fission products. These radioactive fission products represent a potential hazard different in nature from that associated with other process operations or other methods of electrical power generation. As a result, the electrical power stations and the facilities designed to process the irradiated fuel to recover the still useful fuel and the products of the irradiation are designed with multiple physical barriers to contain the radioactive fission products in the event that an accident were to occur. In recent years, a disciplined approach has evolved for developing detailed models of a facility and its processes. These models can be used to assess the response of the facility to upset or accident events. The approach is based on an ordered application of available data employing fault tree/event tree methodologies. Data and/or engineering judgment are applied in a probabilistic framework, so the approach has been called Probabilistic Risk Assessment (PRA). The approach has been applied to nuclear electric generating facilities and to nuclear fuel processing facilities to assess the potential for release of fission product and transuranium element radionuclides (the hazard) and the resulting risks. The application of the methodology to the electrical generating facilities and to the fuel processing facilities has evolved somewhat differently because of differences in the facilities, availability of failure rate data, and expected outputs. This paper summarizes the two approaches and the differences in them, and compares the risk results from the existing studies

  7. On process optimization considering LCA methodology.

    Science.gov (United States)

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to research the state of the art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept to define the system boundaries is the most used approach in practice, instead of the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is linearly expressed by the characterization factors; then, synergic effects of the contaminants are neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most used approach for dealing with this kind of problem, where the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing costs of environmental externalities. Finally, a trend to deal with multi-period scenarios in integrated LCA-optimization frameworks can be distinguished, providing more accurate results upon data availability. Copyright © 2011 Elsevier Ltd. All rights reserved.
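
    For readers unfamiliar with the ε-constraint method mentioned above, the sketch below traces a small Pareto set for a two-objective (cost versus LCA impact) problem by minimizing cost while bounding the impact at successive ε levels. The single design variable and both objective functions are invented for illustration; they are not taken from any of the reviewed studies.

        import numpy as np
        from scipy.optimize import minimize

        cost = lambda x: 10.0 / (0.1 + x[0]) + 50.0 * x[0]     # hypothetical economic objective
        impact = lambda x: 8.0 - 6.0 * x[0]                    # hypothetical LCA impact indicator

        pareto = []
        for eps in np.linspace(impact([1.0]), impact([0.0]), 6):   # sweep the impact bound
            res = minimize(cost, x0=[0.5], method="SLSQP", bounds=[(0.0, 1.0)],
                           constraints=[{"type": "ineq",
                                         "fun": lambda x, e=eps: e - impact(x)}])
            pareto.append((float(res.x[0]), float(cost(res.x)), float(impact(res.x))))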

  8. Methodological treatment to the process of appreciation of the local architecture

    Directory of Open Access Journals (Sweden)

    Eida Aguiar Hernández

    2015-03-01

    Full Text Available The article examines different methodological conceptions of appreciation, particularly in architecture. Methodological treatment is given to the process of appreciating this manifestation in the town as testimony of a certain time of the society; for that reason, the appreciation of the most authentic values in the local art is essential.

  9. Global health trials methodological research agenda:results from a priority setting exercise

    OpenAIRE

    Blazeby, Jane; Nasser, Mona; Soares-Weiser, Karla; Sydes, Matthew R.; Zhang, Junhua; Williamson, Paula R

    2018-01-01

    Background: Methodological research into the design, conduct, analysis and reporting of trials is essential to optimise the process. UK specialists in the field have established a set of top priorities in aid of this research. These priorities, however, may not reflect the needs of similar research in low- to middle-income countries (LMICs) with different healthcare provision, resources and research infrastructure. The aim of the study was to identify the top priorities for methodological ...

  10. Application of SADT and ARIS methodologies for modeling and management of business processes of information systems

    Directory of Open Access Journals (Sweden)

    O. V. Fedorova

    2018-01-01

    Full Text Available The article is devoted to the application of the SADT and ARIS methodologies for modeling and managing the business processes of information systems. The relevance of this article is beyond doubt, because the design of the architecture of information systems, based on a thorough system analysis of the subject area, is of paramount importance for the development of information systems in general. The authors conducted a serious analysis of the application of the SADT and ARIS methodologies for modeling and managing business processes of information systems. The analysis was carried out both in terms of modeling business processes (notation and application of the CASE tool) and in terms of business process management. The first point of view reflects the interaction of the business analyst and the programmer in the development of the information system. The second point of view is the interaction of the business analyst and the customer. The SADT methodology is the basis of many modern methodologies for modeling business processes. Using the methodologies of the IDEF family, it is possible to efficiently display and analyze the activity models of a wide range of complex information systems in various aspects. The CASE tool ARIS is a suite of tools for analyzing and modeling an organization's activities. The methodical basis of ARIS is a set of different modeling methods that reflect different views on the system under study. The authors' conclusions are fully justified. The results of the work can be useful for specialists in the field of modeling the business processes of information systems. In addition, the article can serve as guidance when working on the constituent elements of curricula for students specializing in information and management fields, providing an update of the content and structure of disciplines on modeling the architecture of information systems and organization management using models.

  11. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    Full Text Available The aim of this article is to present a methodological model for analyzing students' group interaction to improve their essays in online learning environments based on asynchronous and written communication. In these environments, teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills this gap, contributing to the analysis of how feedback processes are implemented while students discuss collaboratively in the specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship between written and asynchronous group interaction, students' activity, and changes incorporated into the final text. The proposed model includes the following dimensions: (1) student participation, (2) nature of student learning, and (3) quality of student learning. The main contribution of this article is to present the methodological model and also to ascertain the model's operability regarding how students incorporate such feedback into their essays.

  12. Do Italian Companies Manage Work-Related Stress Effectively? A Process Evaluation in Implementing the INAIL Methodology.

    Science.gov (United States)

    Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta; Iavicoli, Sergio

    2015-01-01

    Studies on Intervention Process Evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may have a decisive role in implementing it, by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation of interventions to assess and manage risks related to work-related stress using a methodological path offered by INAIL. The final sample is composed of 124 companies participating in an interview on aspects relating to each phase of the INAIL methodological path put in place to implement the intervention. The INAIL methodology has been found useful in the process of assessing and managing the risks related to work-related stress. Some factors related to the process (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) showed a role in the significant differences that emerged in the levels of risk, particularly in relation to findings from the preliminary assessment. The main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress.

  13. Do Italian Companies Manage Work-Related Stress Effectively? A Process Evaluation in Implementing the INAIL Methodology

    Science.gov (United States)

    Di Tecco, Cristina; Ronchetti, Matteo; Ghelli, Monica; Russo, Simone; Persechino, Benedetta

    2015-01-01

    Studies on Intervention Process Evaluation are attracting growing attention in the literature on interventions linked to stress and the wellbeing of workers. There is evidence that some elements relating to the process and content of an intervention may have a decisive role in implementing it, by facilitating or hindering the effectiveness of the results. This study aimed to provide a process evaluation of interventions to assess and manage risks related to work-related stress using a methodological path offered by INAIL. The final sample is composed of 124 companies participating in an interview on aspects relating to each phase of the INAIL methodological path put in place to implement the intervention. The INAIL methodology has been found useful in the process of assessing and managing the risks related to work-related stress. Some factors related to the process (e.g., implementation of a preliminary phase, workers' involvement, and use of external consultants) showed a role in the significant differences that emerged in the levels of risk, particularly in relation to findings from the preliminary assessment. The main findings provide information on the key aspects of process and content that are useful in implementing an intervention for assessing and managing risks related to work-related stress. PMID:26504788

  14. General methodology for exergy balance in ProSimPlus® process simulator

    International Nuclear Information System (INIS)

    Ghannadzadeh, Ali; Thery-Hetreux, Raphaële; Baudouin, Olivier; Baudet, Philippe; Floquet, Pascal; Joulia, Xavier

    2012-01-01

    This paper presents a general methodology for exergy balance in chemical and thermal processes integrated in ProSimPlus® as a well-adapted process simulator for energy efficiency analysis. In this work, as well as using the general expressions for heat and work streams, the whole exergy balance is presented within only one software tool in order to fully automate exergy analysis. In addition, after the exergy balance, the essential elements for exergy analysis, such as the sources of irreversibility, are presented to help the user make modifications to either the process or the utility system. The applicability of the proposed methodology in ProSimPlus® is shown through a simple scheme of a Natural Gas Liquids (NGL) recovery process and its steam utility system. The methodology does not only provide the user with the necessary exergetic criteria to pinpoint the sources of exergy losses, it also helps the user to find ways to reduce them. These features of the proposed exergy calculator make it preferable for implementation in ProSimPlus® to define the most realistic and profitable retrofit projects on existing chemical and thermal plants. -- Highlights: ► A set of new expressions for calculation of the exergy of material streams is developed. ► A general methodology for exergy balance in ProSimPlus® is presented. ► A panel of solutions based on exergy analysis is provided to help the user make modifications to process flowsheets. ► Exergy efficiency is chosen as a variable in a bi-criteria optimization.
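
    As a minimal numerical illustration of the kind of stream exergy such a calculator evaluates, the sketch below applies the standard physical-exergy expression ex_ph = (h - h0) - T0·(s - s0); the stream and dead-state property values are hypothetical placeholders for data a simulator such as ProSimPlus® would supply, and chemical exergy is omitted.

        def physical_exergy(h, s, h0, s0, t0=298.15):
            # Specific physical exergy (kJ/kg) relative to the dead state (h0, s0) at T0 (K)
            return (h - h0) - t0 * (s - s0)

        # Hypothetical superheated-steam stream: h = 3230 kJ/kg, s = 6.92 kJ/(kg·K);
        # dead state taken as liquid water at 25 °C, 1 atm: h0 = 104.9, s0 = 0.367
        ex_stream = physical_exergy(3230.0, 6.92, 104.9, 0.367)   # about 1171 kJ/kg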

  15. Methodology for systematic analysis and improvement of manufacturing unit process life-cycle inventory (UPLCI)—CO2PE! initiative (cooperative effort on process emissions in manufacturing). Part 1: Methodology description

    DEFF Research Database (Denmark)

    Kellens, Karel; Dewulf, Wim; Overcash, Michael

    2012-01-01

    the provision of high-quality data for LCA studies of products using these unit process datasets for the manufacturing processes, as well as the in-depth analysis of individual manufacturing unit processes. In addition, the accruing availability of data for a range of similar machines (same process, different......This report proposes a life-cycle analysis (LCA)-oriented methodology for systematic inventory analysis of the use phase of manufacturing unit processes providing unit process datasets to be used in life-cycle inventory (LCI) databases and libraries. The methodology has been developed...... and resource efficiency improvements of the manufacturing unit process. To ensure optimal reproducibility and applicability, documentation guidelines for data and metadata are included in both approaches. Guidance on definition of functional unit and reference flow as well as on determination of system

  16. A novel methodology for in-process monitoring of flow forming

    Science.gov (United States)

    Appleby, Andrew; Conway, Alastair; Ion, William

    2017-10-01

    Flow forming (FF) is an incremental cold working process with near-net-shape forming capability. Failures by fracture due to high deformation can be unexpected and sometimes catastrophic, causing tool damage. If process failures can be identified in real time, an automatic cut-out could prevent costly tool damage. Sound and vibration monitoring is well established and commercially viable in the machining sector to detect current and incipient process failures, but not for FF. A broad-frequency microphone was used to record the sound signature of the manufacturing cycle for a series of FF parts. Parts were flow formed using single and multiple passes, and flaws were introduced into some of the parts to simulate the presence of spontaneously initiated cracks. The results show that this methodology is capable of identifying both introduced defects and spontaneous failures during flow forming. Further investigation is needed to categorise and identify different modes of failure and identify further potential applications in rotary forming.
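
    One simple way to turn such a sound signature into an automatic cut-out signal is to track the RMS level of the microphone signal in short windows and trip when it departs sharply from its running baseline. The sketch below is a generic anomaly flag of that kind; the window length, threshold, and use of NumPy are assumptions, not the detection scheme developed in the cited work.

        import numpy as np

        def flag_acoustic_events(samples, fs, window_s=0.05, n_sigma=5.0):
            # Split the microphone signal into short windows and compute RMS per window
            win = max(1, int(window_s * fs))
            n = len(samples) // win
            frames = np.reshape(np.asarray(samples[:n * win], dtype=float), (n, win))
            rms = np.sqrt(np.mean(frames ** 2, axis=1))
            # Flag windows whose RMS rises far above the baseline level
            baseline, spread = np.median(rms), float(np.std(rms)) + 1e-12
            return np.nonzero(rms > baseline + n_sigma * spread)[0] * window_s  # event times (s)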

  17. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanopa, Amornchai; Gani, Rafiqul

    2013-01-01

    A systematic design methodology is developed for producing multiple main products plus side products starting with one or more bio-based renewable source. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data........ Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol productions....

  18. Diode laser absorption spectroscopy for process control: Sensor system design methodology

    International Nuclear Information System (INIS)

    Berzins, L.V.; Anklam, T.M.; Chambers, F.; Galanti, S.; Haynam, C.A.; Worden, E.F.

    1995-03-01

    A laser absorption spectroscopy (LAS) system has been developed at Lawrence Livermore National Laboratory (LLNL) for process control. LAS has proven itself to be an accurate and reliable method to monitor both density and composition. In this paper the important features and components of an industrial LAS diagnostic are described. Application of this approach to vaporization processes requires careful selection of the species and transitions to be monitored. The relative vapor pressure, hyperfine structure, isotopic frequency shifts, and electronic temperature all affect the selection of a particular transition. In this paper we describe the methodology for choosing the optimal transition or transitions. Coevaporation of a titanium-niobium alloy is used to illustrate the methodology. In a related paper, T.M. Anklam et al. describe the application of this diagnostic to monitoring and controlling composition in a physical vapor deposition process of industrial interest

  19. A systematic methodology for the design of continuous active pharmaceutical ingredient production processes

    DEFF Research Database (Denmark)

    Cervera Padrell, Albert Emili; Gani, Rafiqul; Kiil, Søren

    2011-01-01

    Continuous pharmaceutical manufacturing (CPM) has emerged as a powerful technology to obtain higher reaction yields and improved separation efficiencies, potentially leading to simplified process flowsheets, reduced total costs, lower environmental impacts, and safer and more flexible production...... and representation, as well as on how to employ this knowledge for process (re-)design. The aim of this paper is to introduce a methodology that systematically identifies already existing PSE methods and tools which can assist in the design of CPM processes. This methodology has been applied to a process...... for the production of an API developed by H. Lundbeck A/S, demonstrating the mentioned potential benefits that CPM can offer....

  20. Quick Green Scan: A Methodology for Improving Green Performance in Terms of Manufacturing Processes

    Directory of Open Access Journals (Sweden)

    Aldona Kluczek

    2017-01-01

    Full Text Available The heating sector has begun implementing technologies and practices to tackle the environmental and socio-economic problems caused by its production processes. The purpose of this paper is to develop a methodology, the “Quick-Green-Scan”, that caters to decision-makers' need for a quick assessment aimed at improving green manufacturing performance in companies that produce heating devices. The study uses a structured approach that integrates Life Cycle Assessment-based indicators, a framework and linguistic scales (fuzzy numbers) to evaluate the extent of greening of the enterprise. The evaluation criteria and indicators are closely related to the current state of technology, which can be improved. The proposed methodology has been created to answer the question of whether a company acts on the opportunity to be green and whether these actions are contributing towards greening, maintaining the status quo or moving away from a green outcome. Results show that applying the proposed improvements in processes helps move the facility towards being a green enterprise. Moreover, the methodology, being particularly quick and simple, is a practical tool for benchmarking, not only in the heating industry, but it also proves useful in providing comparisons for facility performance in other manufacturing sectors.

  1. Study of a methodology of identifying important research problems by the PIRT process

    International Nuclear Information System (INIS)

    Aoki, Takayuki; Takagi, Toshiyuki; Urayama, Ryoichi; Komura, Ichiro; Furukawa, Takashi; Yusa, Noritaka

    2014-01-01

    In this paper, we propose a new methodology of identifying important research problems to be solved to improve the performance of some specific scientific technologies by the phenomena identification and ranking table (PIRT) process which has been used as a methodology for demonstrating the validity of the best estimate simulation codes in US Nuclear Regulatory Commission (USNRC) licensing of nuclear power plants. The new methodology makes it possible to identify important factors affecting the performance of the technologies from the viewpoint of the figure of merit and problems associated with them while it keeps the fundamental concepts of the original PIRT process. Also in this paper, we demonstrate the effectiveness of the new methodology by applying it to a task of extracting research problems for improving an inspection accuracy of ultrasonic testing or eddy current testing in the inspection of objects having cracks due to fatigue or stress corrosion cracking. (author)

  2. Application of 'Process management' methodology in providing financial services of PE 'Post Serbia'

    Directory of Open Access Journals (Sweden)

    Kujačić Momčilo D.

    2014-01-01

    Full Text Available The paper describes the application of the 'Process management' methodology in providing financial services at the post office counter hall. An overview of the methodology, one of the most commonly used qualitative methodologies, is given, and Process Management techniques are described that can better meet user needs and market demands, as well as more effectively withstand the current competition in the postal service market. One of the main problems pointed out is the long waiting time in the counter hall when financial services are provided, which leads to the formation of queue lines and thus to customer dissatisfaction. Accordingly, the paper points out steps that should be taken when providing financial services in a postal network unit, optimizing the time users wait in line and increasing the satisfaction of all participants in that process.

  3. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanop, Amornchai; Gani, Rafiqul

    2012-01-01

    A systematic design methodology is developed for producing two main products plus side products starting with one or more bio-based renewable source. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. The ....... Economic analysis and net present value are determined to find the best economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol productions....

  4. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    Science.gov (United States)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  5. Using Lean Six Sigma Methodology to Improve a Mass Immunizations Process at the United States Naval Academy.

    Science.gov (United States)

    Ha, Chrysanthy; McCoy, Donald A; Taylor, Christopher B; Kirk, Kayla D; Fry, Robert S; Modi, Jitendrakumar R

    2016-06-01

    Lean Six Sigma (LSS) is a process improvement methodology developed in the manufacturing industry to increase process efficiency while maintaining product quality. The efficacy of LSS application to the health care setting has not been adequately studied. This article presents a quality improvement project at the U.S. Naval Academy that uses LSS to improve the mass immunizations process for Midshipmen during in-processing. The process was standardized to give all vaccinations at one station instead of giving a different vaccination at each station. After project implementation, the average immunizations lead time decreased by 79% and staffing decreased by 10%. The process was shown to be in control with a capability index of 1.18 and performance index of 1.10, resulting in a defect rate of 0.04%. This project demonstrates that the LSS methodology can be applied successfully to the health care setting to make sustainable process improvements if used correctly and completely. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
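
    The capability figures quoted above can be related to an expected defect rate under a normal-distribution assumption: the out-of-spec fraction at the nearer specification limit is roughly Φ(-3·Cpk). The sketch below shows that calculation; for an index of about 1.10 it gives on the order of 0.05%, the same order of magnitude as the 0.04% reported, although the exact correspondence depends on assumptions not stated in the abstract.

        from scipy.stats import norm

        def defect_fraction(capability_index):
            # Expected out-of-spec fraction at the nearer limit, assuming normality
            return norm.cdf(-3.0 * capability_index)

        print(f"{defect_fraction(1.10):.4%}")   # roughly 0.05%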

  6. Agile Methodologies and Software Process Improvement Maturity Models, Current State of Practice in Small and Medium Enterprises

    OpenAIRE

    Koutsoumpos, Vasileios; Marinelarena, Iker

    2013-01-01

    Abstract—Background: Software Process Improvement (SPI) maturity models have been developed to assist organizations to enhance software quality. Agile methodologies are used to ensure productivity and quality of a software product. Amongst others, they are applied in Small and Medium-sized Enterprises (SMEs). However, little is known about the combination of Agile methodologies and SPI maturity models regarding SMEs and the results that could emerge, as all the current SPI models are address...

  7. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective

  8. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

    Full Text Available In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  9. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.
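
    The conventional AHP ingredient of the approach summarised above, deriving priority weights from a pairwise comparison matrix, can be sketched in a few lines using the geometric-mean method. The comparison values below are hypothetical, and the Dempster-Shafer evidence layer that distinguishes the cited method is not reproduced here.

        import numpy as np

        def ahp_weights(pairwise):
            # Row geometric-mean method for AHP priority weights
            a = np.asarray(pairwise, dtype=float)
            gm = np.prod(a, axis=1) ** (1.0 / a.shape[1])
            return gm / gm.sum()

        # Hypothetical comparison of three dependence-influencing factors
        weights = ahp_weights([[1.0, 3.0, 5.0],
                               [1.0 / 3.0, 1.0, 2.0],
                               [1.0 / 5.0, 0.5, 1.0]])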

  10. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level this scaling determines the effects various processes have on a state variable, and it ranks the processes according to their importance by the magnitude of the fractional change they cause in that state variable. At the system level the scaling determines the governing processes and corresponding components, ranking these in the order of importance according to their effect on the fractional change of system-wide state variables. The scaling methodology reveals on all levels the fractional change of state variables and is therefore called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, which is the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate over the content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering of component effect metrics provides the hierarchy of processes in a component, then in all components and in the system. FSM separates quantitatively dominant from minor processes and components and
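
    Following the definitions above, the effect metric of each process in a component can be computed and ranked with a few lines of code; the process names, transport rates, coolant inventory, and characteristic time below are hypothetical placeholders rather than data from the cited work.

        def rank_processes(transport_rates, content, t_char):
            # omega = transport rate / conserved content; Omega = omega * characteristic time
            metrics = {name: (rate / content) * t_char
                       for name, rate in transport_rates.items()}
            return sorted(metrics.items(), key=lambda item: abs(item[1]), reverse=True)

        # Hypothetical component: 500 kg coolant inventory, 20 s characteristic time,
        # transport rates in kg/s (negative = loss from the component)
        hierarchy = rank_processes({"break flow": -25.0, "ECC injection": 10.0,
                                    "condensation": 3.0}, content=500.0, t_char=20.0)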

  11. Design Methodology of Process Layout considering Various Equipment Types for Large scale Pyro processing Facility

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang; Lee, Hyo Jik

    2016-01-01

    At present, each item of process equipment required for integrated processing is being examined, based on experience acquired during the Pyroprocess Integrated Inactive Demonstration Facility (PRIDE) project, and considering the requirements and desired performance enhancement of KAPF as a new facility beyond PRIDE. Essentially, KAPF will be required to handle hazardous materials such as spent nuclear fuel, which must be processed in an isolated and shielded area separate from the operator location. Moreover, an inert-gas atmosphere must be maintained, because of the radiation and deliquescence of the materials. KAPF must also achieve the goal of significantly increased yearly production beyond that of the previous facility; therefore, several parts of the production line must be automated. This article presents the method considered for the conceptual design of both the production line and the overall layout of the KAPF process equipment. This study has proposed a design methodology that can be utilized as a preliminary step for the design of a hot-cell-type, large-scale facility, in which the various types of processing equipment operated by the remote handling system are integrated. The proposed methodology applies to part of the overall design procedure and contains various weaknesses. However, if the designer is required to maximize the efficiency of the installed material-handling system while considering operation restrictions and maintenance conditions, this kind of design process can accommodate the essential components that must be employed simultaneously in a general hot-cell system

  12. Contextual System of Symbol Structural Recognition based on an Object-Process Methodology

    OpenAIRE

    Delalandre, Mathieu

    2005-01-01

    We present in this paper a symbol recognition system for the graphic documents. This one is based on a contextual approach for symbol structural recognition exploiting an Object-Process Methodology. It uses a processing library composed of structural recognition processings and contextual evaluation processings. These processings allow our system to deal with the multi-representation of symbols. The different processings are controlled, in an automatic way, by an inference engine during the r...

  13. Process-oriented Design Methodology for the (Inter-) Organizational Intellectual Capital Management

    OpenAIRE

    Galeitzke, Mila; Oertwig, Nicole; Orth, Ronald; Kohl, Holger

    2016-01-01

    The development of a process-oriented design methodology for the visualization of intellectual capital in organisational business processes is described in this contribution. A tangible and intangible resource-oriented taxonomy in an integrated enterprise modelling environment is established. The comprehensive assessment, allocation and referencing of intellectual capital (human, structural and relational capital) counters the underutilization of available intellectual capital and allows for ...

  14. Google chemtrails: a methodology to analyze topic representation in search engine results

    OpenAIRE

    Ballatore, Andrea

    2015-01-01

    Search engine results influence the visibility of different viewpoints in political, cultural, and scientific debates. Treating search engines as editorial products with intrinsic biases can help understand the structure of information flows in new media. This paper outlines an empirical methodology to analyze the representation of topics in search engines, reducing the spatial and temporal biases in the results. As a case study, the methodology is applied to 15 popular conspiracy theories, e...

  15. Process synthesis for natural products from plants based on PAT methodology

    DEFF Research Database (Denmark)

    Malwade, Chandrakant Ramkrishna; Qu, Haiyan; Rong, Ben-Guang

    2017-01-01

    ... generates different process flowsheet alternatives consisting of multiple separation techniques. Decision making is supported by heuristics as well as basic process information already available from previous studies. In addition, the process analytical technology (PAT) framework, a part of the Quality by Design (QbD) approach, has been included at various steps to obtain molecular-level information of process streams and thereby support rational decision making. The formulated methodology has been used to isolate and purify artemisinin, an antimalarial drug, from dried leaves of the plant Artemisia...

  16. ANALYSIS OF EFFECTIVENESS OF METHODOLOGICAL SYSTEM FOR PROBABILITY AND STOCHASTIC PROCESSES COMPUTER-BASED LEARNING FOR PRE-SERVICE ENGINEERS

    Directory of Open Access Journals (Sweden)

    E. Chumak

    2015-04-01

    Full Text Available The author substantiates that only methodological training systems for mathematical disciplines with implementation of information and communication technologies (ICT) can meet the requirements of the modern educational paradigm and make it possible to increase educational efficiency. Due to this fact, the necessity of developing the methodology of theory of probability and stochastic processes computer-based learning for pre-service engineers is underlined in the paper. The results of the experimental study for analysis of the efficiency of the methodological system of theory of probability and stochastic processes computer-based learning for pre-service engineers are shown. The analysis includes three main stages: ascertaining, searching and forming. The key criteria of the efficiency of the designed methodological system are the level of probabilistic and stochastic skills of students and their learning motivation. The effect of implementing the methodological system of probability theory and stochastic processes computer-based learning on the level of students’ IT literacy is shown in the paper. The expansion of the range of purposes for which students apply ICT is described by the author. The level of formation of students’ learning motivation at the ascertaining and forming stages of the experiment is analyzed. The level of intrinsic learning motivation for pre-service engineers is defined at these stages of the experiment. For this purpose, the methodology of testing the students’ learning motivation in the chosen specialty is presented in the paper. The increase in intrinsic learning motivation of the experimental group students (E group) as compared with the control group students (C group) is demonstrated.

  17. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    Science.gov (United States)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient alternative of the fuel system using mathematical models, and the set of performance criteria have been developed at the main stages of the study. Results from the introduction of specific engineering solutions for developing in-house energy supply sources at RH processing facilities are provided.

  18. Methodology of life cycle cost with risk expenditure for offshore process at conceptual design stage

    International Nuclear Information System (INIS)

    Nam, Kiil; Chang, Daejun; Chang, Kwangpil; Rhee, Taejin; Lee, In-Beum

    2011-01-01

    This study proposed a new LCC (life cycle cost) methodology with the risk expenditure taken into account for comparative evaluation of offshore process options at their conceptual design stage. The risk expenditure consisted of the failure risk expenditure and the accident risk expenditure. The former accounted for the production loss and the maintenance expense due to equipment failures while the latter reflected the asset damage and the fatality worth caused by disastrous accidents such as fire and explosion. It was demonstrated that the new LCC methodology was capable of playing the role of a process selection basis in choosing the best of the liquefaction process options including the power generation systems for a floating LNG (Liquefied natural gas) production facility. Without the risk expenditure, a simple economic comparison apparently favored the mixed refrigerant cycle which had the better efficiency. The new methodology with the risk expenditure, however, indicated that the nitrogen expansion cycle driven by steam turbines should be the optimum choice, mainly due to its better availability and safety. -- Highlights: → The study presented the methodology of the LCC with the risk expenditure for the conceptual design of offshore processes. → The proposed methodology demonstrated the applicability of the liquefaction unit with the power generation system of LNG FPSO. → Without the risk expenditure, a simple economic comparison apparently favored the mixed refrigerant cycle which had the better efficiency. → The new methodology indicated that the nitrogen expansion cycle driven by steam turbines is the optimum choice due to its better availability and safety.
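
    The comparison logic described in this abstract can be sketched as a simple sum of capital, operating and risk expenditures. The Python example below is a hypothetical illustration only; the option names echo the abstract, but every cost figure and the undiscounted-cost simplification are assumptions, not values from the study.

```python
# Hypothetical comparison of two process options using an LCC with risk expenditure:
# LCC = CAPEX + lifetime * (OPEX + failure risk expenditure + accident risk expenditure).
# All figures are illustrative and not taken from the cited study.

options = {
    # option: (CAPEX, annual OPEX, annual failure risk, annual accident risk) in M$
    "mixed refrigerant cycle":        (900.0, 60.0, 25.0, 12.0),
    "N2 expansion + steam turbines":  (950.0, 70.0, 10.0,  4.0),
}

lifetime_years = 20  # assumed facility lifetime

def lcc(capex, opex, failure_risk, accident_risk, years):
    """Undiscounted life cycle cost with risk expenditure (simplified sketch)."""
    return capex + years * (opex + failure_risk + accident_risk)

for name, (capex, opex, fr, ar) in options.items():
    print(f"{name:32s} LCC = {lcc(capex, opex, fr, ar, lifetime_years):8.1f} M$")
```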

  19. Selection methodology for LWR safety programs and proposals. Volume 2. Methodology application

    International Nuclear Information System (INIS)

    Ritzman, R.L.; Husseiny, A.A.

    1980-08-01

    The results of work done to update and apply a methodology for selecting (prioritizing) LWR safety technology R and D programs are described. The methodology is based on multiattribute utility (MAU) theory. Application of the methodology to rank-order a group of specific R and D programs included development of a complete set of attribute utility functions, specification of individual attribute scaling constants, and refinement and use of an interactive computer program (MAUP) to process decision-maker inputs and generate overall (multiattribute) program utility values. The output results from several decision-makers are examined for consistency and conclusions and recommendations regarding general use of the methodology are presented. 3 figures, 18 tables
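
    A minimal sketch of the additive multiattribute utility ranking summarized above; the attribute names, scaling constants and program scores are hypothetical stand-ins for the decision-maker inputs such a study elicits.

```python
# Sketch of multiattribute utility (MAU) ranking of R&D programs.
# Attribute names, scaling constants and scores are hypothetical; a real
# application would elicit them from decision-makers, as the study describes.

attributes = ["safety benefit", "cost", "schedule", "technical risk"]
scaling_constants = [0.4, 0.25, 0.2, 0.15]  # sum to 1 in this additive sketch

# single-attribute utilities u_i(x) in [0, 1] for each candidate program
programs = {
    "program A": [0.9, 0.4, 0.6, 0.7],
    "program B": [0.6, 0.8, 0.7, 0.5],
    "program C": [0.7, 0.6, 0.9, 0.8],
}

def multiattribute_utility(utilities, weights):
    """Additive MAU form: U = sum_i k_i * u_i."""
    return sum(k * u for k, u in zip(weights, utilities))

ranked = sorted(programs.items(),
                key=lambda kv: -multiattribute_utility(kv[1], scaling_constants))
for name, utils in ranked:
    print(f"{name}: U = {multiattribute_utility(utils, scaling_constants):.3f}")
```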

  20. A Model-Based Methodology for Simultaneous Design and Control of a Bioethanol Production Process

    DEFF Research Database (Denmark)

    Alvarado-Morales, Merlin; Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan

    2010-01-01

    The PGC methodology is used to generate more efficient separation designs in terms of energy consumption by targeting the separation task at the largest DF. Both methodologies are highlighted through the application of two case studies, a bioethanol production process and a succinic acid production...

  1. Thresholding: A Pixel-Level Image Processing Methodology Preprocessing Technique for an OCR System for the Brahmi Script

    Directory of Open Access Journals (Sweden)

    H. K. Anasuya Devi

    2006-12-01

    Full Text Available In this paper we study the methodology employed for preprocessing the archaeological images. We present the various algorithms used in the low-level processing stage of image analysis for Optical Character Recognition System for Brahmi Script. The image preprocessing technique covered in this paper is thresholding. We also try to analyze the results obtained by the pixel-level processing algorithms.
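
    A minimal sketch of the kind of global thresholding used as a pixel-level preprocessing step; the threshold value and the toy image patch below are illustrative, not taken from the paper.

```python
import numpy as np

# Global-thresholding sketch: binarize a grayscale image so that dark "ink"
# pixels become foreground. Threshold and toy data are illustrative only.

def threshold(image: np.ndarray, t: int) -> np.ndarray:
    """Return a binary image: 1 where intensity < t (ink), 0 otherwise (background)."""
    return (image < t).astype(np.uint8)

# toy 4x4 grayscale patch standing in for a scanned inscription
img = np.array([[200, 190,  60,  55],
                [195,  70,  65, 180],
                [ 80,  75, 185, 200],
                [ 60, 190, 195, 205]], dtype=np.uint8)

binary = threshold(img, t=128)
print(binary)
```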

  2. Evaluation of Student Care Process in Urban and Rural Health Care Centers and Health House in Tabriz Using Tracer Methodology

    Directory of Open Access Journals (Sweden)

    Neda Kabiri

    2015-08-01

    Full Text Available Background and Objectives: Tracer methodology is a novel evaluation method whose purpose is to provide an accurate assessment of systems and processes for the delivery of care, treatment, and services at a health care organization. This study aimed to assess the student care process in Tabriz using tracer methodology. Material and Methods: This cross-sectional study was conducted in autumn 1391. The study population consisted of all the students who were covered by the Tabriz health care center, and the study sample included an urban health care center, a rural health care center, a health house, and two schools in urban and rural areas, which were selected by a simple sampling method. Also, all the complicated and problematic processes were chosen to be assessed. Data were collected by interviewing, observing, and surveying documents and were compared with current standards. Results: The results of this study showed that the percentage of points each target group gained from the tracer evaluation of the student care process was 77% in the health house, 90% in the rural health care center and 83% in the urban health care center. Findings indicated that documentation was the main weak point. Conclusion: According to the results of this study, the student care process is sufficient, although there are some deficiencies in the caring process that may be improved through appropriate strategies. Furthermore, tracer methodology seems to be a proper method to evaluate various levels of the health care system.

  3. Successful Technology Commercialization – Yes or No? Improving the Odds. The Quick Look Methodology and Process

    OpenAIRE

    Pletcher, Gary; Zehner II, William Bradley

    2017-01-01

    This article explores the relationships which transform new scientific knowledge into new commercial products, services, and ventures to create wealth. The major technology and marketing commercialization dilemmas are defined and addressed. The Quicklook methodology and related processes to quickly assess the commercial viability and potential of a scientific research project are explained. Using the Quicklook methodology and process early in the research and development process i...

  4. A methodology to simulate the cutting process for a nuclear dismantling simulation based on a digital manufacturing platform

    International Nuclear Information System (INIS)

    Hyun, Dongjun; Kim, Ikjune; Lee, Jonghwan; Kim, Geun-Ho; Jeong, Kwan-Seong; Choi, Byung Seon; Moon, Jeikwon

    2017-01-01

    Highlights: • Goal is to provide existing tech. with cutting function handling dismantling process. • Proposed tech. can handle various cutting situations in the dismantlement activities. • Proposed tech. can be implemented in existing graphical process simulation software. • Simulation results have demonstrated that the proposed technology achieves its goal. • Proposed tech. enlarges application of graphic simulation into dismantlement activity. - Abstract: This study proposes a methodology to simulate the cutting process in a digital manufacturing platform for the flexible planning of nuclear facility decommissioning. During the planning phase of decommissioning, visualization and verification using process simulation can be powerful tools for the flexible planning of the dismantling process of highly radioactive, large and complex nuclear facilities. However, existing research and commercial solutions are not sufficient for such a situation because complete segmented digital models for the dismantling objects such as the reactor vessel, internal assembly, and closure head must be prepared before the process simulation. The preparation work has significantly impeded the broad application of process simulation due to the complexity and workload. The methodology of process simulation proposed in this paper can flexibly handle various dismantling processes including repetitive object cuttings over heavy and complex structures using a digital manufacturing platform. The proposed methodology, which is applied to dismantling scenarios of a Korean nuclear power plant in this paper, is expected to reduce the complexity and workload of nuclear dismantling simulations.

  5. Optimization of vibratory welding process parameters using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Pravin Kumar; Kumar, S. Deepak; Patel, D.; Prasad, S. B. [National Institute of Technology Jamshedpur, Jharkhand (India)

    2017-05-15

    The current investigation was carried out to study the effect of a vibratory welding technique on the mechanical properties of 6 mm thick butt-welded mild steel plates. A new vibratory welding concept has been designed and developed which is capable of transferring vibrations, with a resonance frequency of 300 Hz, into the molten weld pool before it solidifies during the shielded metal arc welding (SMAW) process. The important process parameters of the vibratory welding technique, namely welding current, welding speed and frequency of the vibrations induced in the molten weld pool, were optimized using Taguchi analysis and response surface methodology (RSM). The effect of the process parameters on tensile strength and hardness was evaluated using optimization techniques. Applying RSM, the effect of the vibratory welding parameters on tensile strength and hardness was obtained through two separate regression equations. Results showed that the most influential factor for the desired tensile strength and hardness is frequency at its resonance value, i.e. 300 Hz. The micro-hardness and microstructures of the vibratory welded joints were studied in detail and compared with those of conventional SMAW joints. A comparatively uniform and fine grain structure was found in the vibratory welded joints.
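
    To illustrate the response-surface step, the sketch below fits a second-order model to synthetic data with numpy least squares; the factor ranges and the synthetic response are assumptions, not the regression equations reported in the study.

```python
import numpy as np

# Fit a second-order response surface (as used in RSM) relating welding parameters
# to a response such as tensile strength. The data are synthetic; the cited study
# derived its own regression equations from experiments.

rng = np.random.default_rng(0)
# factors (assumed ranges): welding current [A], welding speed [mm/min], vibration frequency [Hz]
X = rng.uniform([90, 100, 100], [130, 200, 300], size=(25, 3))

def design_matrix(X):
    """Columns: 1, x_i, x_i^2, and pairwise interactions x_i*x_j."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

# synthetic tensile-strength response that improves toward the upper frequency level
y = 400 + 0.3*X[:, 2] - 0.0005*(X[:, 2] - 300)**2 + rng.normal(0, 5, len(X))

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
print("fitted coefficients:", np.round(beta, 4))
```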

  6. BUSINESS PROCESSES TRANSFORMATION IN THE METHODOLOGY OF MULTILEVEL FINANCIAL MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Andrey G. Mikheev

    2016-01-01

    Full Text Available The article discusses the application of the process approach to financial management. The multilevel financial management methodology is described. It is based on the delegation of financial management functions to downstream divisions of the organization, process automation, the transfer of financial resources between units at different levels of the hierarchical structure of the credit institution, the implementation of a financial management mechanism based on the process approach, the execution of business processes in a computer environment, the use of strategic management of the organization by changing coefficients that serve as parameters of the decentralized control mechanism, the construction of «fast» financial indicators that take into account the terms of financial resource transfer transactions, and the effective transformation of business processes. The article focuses on credit institution business management through the application of process transformation to funds transfer business processes.

  7. PENGGUNAAN RESPONSE SURFACE METHODOLOGY UNTUK OPTIMASI PROSES DEKAFEINASI MENGGUNAKAN KITOSAN DARI KULIT UDANG [The Use of Response Surface Methodology in Decaffeination Process with Chitosan

    Directory of Open Access Journals (Sweden)

    Suhardi 1

    2002-04-01

    Full Text Available The objective of the present study was to determine the optimum condition of the decaffeination process with chitosan in a model system using Response Surface Methodology. A 1000 ppm caffeine solution was mixed with chitosan at varied concentrations, temperatures and process times. After filtration, the caffeine in the filtrate was determined. The lower the caffeine in the filtrate, the more effective the decaffeination process. Results of the experiment showed that among chitosan concentrations of 50, 60, 70, 80, 90, and 100 mg per 100 ml caffeine solution, the concentration of 70 mg was the most effective. Among the temperatures applied of 28, 40, 60, 80, 90, and 100 °C, the most effective was 90 °C. And among the process times of 15, 30, 60, and 90 minutes, 15 minutes was the most effective. Results of optimization using RSM showed that the optimum conditions of the decaffeination process were a chitosan concentration of 69.52 mg, a temperature of 89.71 °C, and a process time of 14.88 minutes. Under these conditions the process removed 79.56% of the caffeine from the model system.

  8. RESEARCH METHODOLOGY FUNDAMENTALS OF THE UKRAINIAN PROCESSING AND MANUFACTURING ENTERPRISES ECONOMIC POTENTIAL

    Directory of Open Access Journals (Sweden)

    Yurii Gudz

    2016-11-01

    Full Text Available The purpose of the paper is to find the most appropriate ways of simulating the business activities of manufacturing and processing agricultural enterprises operating in the corrupt Ukrainian environment, and to overcome fundamental methodological contradictions in order to produce more accurate results of economic potential assessment despite the defects inherent in the current industry sector. The methodology includes publication research, interviews and practical comparison of published statistical data with real production volumes, returns and other indicators, in order to estimate the actual potential of the target enterprises. The paper reviews the classical analytical methods, showing the pros and cons of their application in a highly corrupt environment with a strong trend of data falsification. Results of the survey show the basic economic methods applicable to the research of processing and manufacturing enterprises operating in the field of agriculture. The authors’ experience highlights the urgent need for a new methodology, since much of the abstract research carried out by the majority of scientists shows contradictions when applied to a real industry segment or even a single enterprise. Corruption affecting the general statistical data misrepresents the facts; therefore current (classic) methods are not able to show real economic trends in the industrial segment. Thus the authors insist on the significance of correcting for corruption distortion: for example, to identify the actual macro- and microeconomic indicators, indexes and ratios, a staged research system of multidimensional comparative analysis is used to rank enterprises and find the appropriate position for each, and since the constantly growing shadow sector of the Ukrainian economy cannot be ignored, the economic potential assessment of the target enterprise is performed with the identification of the shadow sector with

  9. 2015 Plan. Project 1: methodology and planning process of the Brazilian electric sector expansion

    International Nuclear Information System (INIS)

    1993-10-01

    The planning process of the Brazilian electric sector expansion, its normative aspects, instruments, main agents and planning cycles are described. The methodology of expansion planning is shown, with the interactions of the several study areas, the electric power market and the computer models used. Forecasts of the methodology's evolution are also presented. (C.G.C.)

  10. Process design for isolation of soybean oil bodies by applying the product-driven process synthesis methodology

    NARCIS (Netherlands)

    Zderic, A.; Taraksci, T.; Hooshyar, N.; Zondervan, E.; Meuldijk, J.

    2014-01-01

    The present work describes the product driven process synthesis (PDPS) methodology for the conceptual design of extraction of intact oil bodies from soybeans. First, in this approach consumer needs are taken into account and based on these needs application of the final product (oil bodies) is

  11. The economics of climate change mitigation in developing countries - methodological and empirical results

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.

    1997-12-01

    This thesis presents a methodological and empirical discussion of the costs associated with implementing greenhouse gas reduction strategies in developing countries. It presents a methodological framework for national costing studies and evaluates a number of associated valuation methods. The methodological framework has been applied in several developing countries as part of a UNEP project in which the author has participated, and reference is made to the results of these country studies. Some of the theoretical issues associated with the determination of the costs of emission reductions are discussed with reference to a number of World Bank and UN guidelines for project analysis in developing countries. The use of several accounting prices is recommended for mitigation projects, with a distinction being made between internationally and domestically traded goods. The consequences of using different accounting prices are discussed with respect to the methodology applied in the UNEP country studies. In conclusion the thesis reviews the results of some of the most important international studies of greenhouse gas emissions in developing countries. The review, which encompasses a total of 27 country studies, was undertaken by the author for the Intergovernmental Panel of Climate Change, the IPCC. Its conclusion is that the UNEP methodological framework and associated country study results are consistent with the recommendations and conclusions of the IPCC. (EG) 23 refs.

  12. The economics of climate change mitigation in developing countries -methodological and empirical results

    International Nuclear Information System (INIS)

    Halsnaes, K.

    1997-12-01

    This thesis presents a methodological and empirical discussion of the costs associated with implementing greenhouse gas reduction strategies in developing countries. It presents a methodological framework for national costing studies and evaluates a number of associated valuation methods. The methodological framework has been applied in several developing countries as part of a UNEP project in which the author has participated, and reference is made to the results of these country studies. Some of the theoretical issues associated with the determination of the costs of emission reductions are discussed with reference to a number of World Bank and UN guidelines for project analysis in developing countries. The use of several accounting prices is recommended for mitigation projects, with a distinction being made between internationally and domestically traded goods. The consequences of using different accounting prices are discussed with respect to the methodology applied in the UNEP country studies. In conclusion the thesis reviews the results of some of the most important international studies of greenhouse gas emissions in developing countries. The review, which encompasses a total of 27 country studies, was undertaken by the author for the Intergovernmental Panel of Climate Change, the IPCC. Its conclusion is that the UNEP methodological framework and associated country study results are consistent with the recommendations and conclusions of the IPCC. (EG) 23 refs

  13. Methodology of the Integrated Analysis of Company's Financial Status and Its Performance Results

    OpenAIRE

    Mackevičius, Jonas; Valkauskas, Romualdas

    2010-01-01

    Information about a company's financial status and its performance results is very important for the objective evaluation of the company's position in the market and its competitive possibilities in the future. Such information is provided in the financial statements. It is important to apply and investigate this information properly. A methodology for the integrated analysis of a company's financial status and performance results is recommended in this article. This methodology consists of these three elements...

  14. A Comparative Analysis of Two Software Development Methodologies: Rational Unified Process and Extreme Programming

    Directory of Open Access Journals (Sweden)

    Marcelo Rafael Borth

    2014-01-01

    Full Text Available Software development methodologies were created to meet the great market demand for innovation, productivity, quality and performance. With the use of a methodology, it is possible to reduce the cost, the risk, the development time, and even increase the quality of the final product. This article compares two of these development methodologies: the Rational Unified Process and the Extreme Programming. The comparison shows the main differences and similarities between the two approaches, and highlights and comments some of their predominant features.

  15. Application of Decomposition Methodology to Solve Integrated Process Design and Controller Design Problems for Reactor-Separator-Recycle System

    DEFF Research Database (Denmark)

    Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan; Gani, Rafiqul

    2010-01-01

    This paper presents the integrated process design and controller design (IPDC) for a reactor-separator-recycle (RSR) system and evaluates a decomposition methodology to solve the IPDC problem. Accordingly, the IPDC problem is solved by decomposing it into four hierarchical stages: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection and verification. The methodology makes use of thermodynamic-process insights and the reverse design approach to arrive at the final process-controller design decisions. The developed methodology is illustrated through the design of a RSR system involving consecutive reactions, A → B → C, and is shown to provide effective solutions that satisfy design, control and cost criteria. The advantage of the proposed methodology is that it is systematic, makes use of thermodynamic-process knowledge and provides valuable insights...

  16. Systematic methodology for estimating direct capital costs for blanket tritium processing systems

    International Nuclear Information System (INIS)

    Finn, P.A.

    1985-01-01

    This paper describes the methodology developed for estimating the relative capital costs of blanket processing systems. The capital costs of the nine blanket concepts selected in the Blanket Comparison and Selection Study are presented and compared

  17. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    Science.gov (United States)

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395
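
    A minimal sketch of how indoor-location events could be turned into per-patient traces and a directly-follows view, in the spirit of the process-mining approach described above; the zones, patient IDs and timestamps are invented.

```python
from collections import Counter

# Turn a toy indoor-location log into one trace per patient and count
# directly-follows relations between zones. All data below are invented.

location_log = [
    # (patient_id, timestamp [min], zone)
    ("p1", 0, "reception"), ("p1", 15, "preparation"), ("p1", 40, "operating room"),
    ("p1", 160, "recovery"),
    ("p2", 5, "reception"), ("p2", 30, "preparation"), ("p2", 55, "operating room"),
    ("p2", 170, "recovery"),
]

# group by patient and order by time to obtain one trace per patient
traces = {}
for pid, t, zone in sorted(location_log, key=lambda e: (e[0], e[1])):
    traces.setdefault(pid, []).append(zone)

# count directly-follows relations across all traces
directly_follows = Counter(
    (a, b) for trace in traces.values() for a, b in zip(trace, trace[1:])
)
for (a, b), n in directly_follows.items():
    print(f"{a} -> {b}: {n}")
```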

  18. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    Science.gov (United States)

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-11-30

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can take time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015.

  19. A new general methodology for incorporating physico-chemical transformations into multi-phase wastewater treatment process models.

    Science.gov (United States)

    Lizarralde, I; Fernández-Arévalo, T; Brouckaert, C; Vanrolleghem, P; Ikumi, D S; Ekama, G A; Ayesa, E; Grau, P

    2015-05-01

    This paper introduces a new general methodology for incorporating physico-chemical and chemical transformations into multi-phase wastewater treatment process models in a systematic and rigorous way under a Plant-Wide modelling (PWM) framework. The methodology presented in this paper requires the selection of the relevant biochemical, chemical and physico-chemical transformations taking place and the definition of the mass transport for the co-existing phases. As an example a mathematical model has been constructed to describe a system for biological COD, nitrogen and phosphorus removal, liquid-gas transfer, precipitation processes, and chemical reactions. The capability of the model has been tested by comparing simulated and experimental results for a nutrient removal system with sludge digestion. Finally, a scenario analysis has been undertaken to show the potential of the obtained mathematical model to study phosphorus recovery. Copyright © 2015 Elsevier Ltd. All rights reserved.
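
    As an illustration of one physico-chemical transformation of the kind such a framework incorporates, the sketch below integrates a simple liquid-gas transfer term dC/dt = kLa(C_sat − C) with explicit Euler; all parameter values are assumed and the model is far simpler than the plant-wide framework described in the paper.

```python
# Minimal liquid-gas transfer sketch: dC/dt = kLa * (C_sat - C), integrated with
# explicit Euler. Parameter values are illustrative only.

kLa = 4.0              # volumetric mass-transfer coefficient [1/h] (assumed)
c_sat = 8.0            # saturation concentration [g/m3] (assumed)
c = 0.5                # initial dissolved concentration [g/m3] (assumed)
dt, t_end = 0.01, 2.0  # time step and horizon [h]

for _ in range(int(t_end / dt)):
    c += dt * kLa * (c_sat - c)  # explicit Euler step

print(f"dissolved concentration after {t_end} h: {c:.2f} g/m3")
```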

  20. Intellectual technologies in the problems of thermal power engineering control: formalization of fuzzy information processing results using the artificial intelligence methodology

    Science.gov (United States)

    Krokhin, G.; Pestunov, A.

    2017-11-01

    The operation of power stations in variable modes and the related changes of their technical state have made it urgent to create models for decision-making and state recognition based on diagnostics using fuzzy logic to identify their state and to manage recovery processes. There is no unified methodological approach for obtaining the relevant information in the case of fuzziness and inhomogeneity of the raw information about the equipment state. The existing methods for extracting knowledge are usually unable to provide the correspondence between the aggregates' model parameters and the actual object state. The switchover of power engineering from preventive repair to repair implemented according to the actual technical state has increased the responsibility of those who estimate the volume and the duration of the work. It may lead to inadequacy of the diagnostics and the decision-making models if the corresponding methodological preparations do not take fuzziness into account, because the state information is of this nature. In this paper, we introduce a new model which formalizes the equipment state using not only exact information, but fuzzy information as well. This model is more adequate to the actual state than traditional analogs, and may be used in order to increase the efficiency and the service period of the power installations.

  1. Process improvement methodologies uncover unexpected gaps in stroke care.

    Science.gov (United States)

    Kuner, Anthony D; Schemmel, Andrew J; Pooler, B Dustin; Yu, John-Paul J

    2018-01-01

    Background The diagnosis and treatment of acute stroke requires timed and coordinated effort across multiple clinical teams. Purpose To analyze the frequency and temporal distribution of emergent stroke evaluations (ESEs) to identify potential contributory workflow factors that may delay the initiation and subsequent evaluation of emergency department stroke patients. Material and Methods A total of 719 sentinel ESEs with concurrent neuroimaging were identified over a 22-month retrospective time period. Frequency data were tabulated and odds ratios calculated. Results Of all ESEs, 5% occur between 01:00 and 07:00. ESEs were most frequent during the late morning and early afternoon hours (10:00-14:00). Unexpectedly, there was a statistically significant decline in the frequency of ESEs that occur at the 14:00 time point. Conclusion Temporal analysis of ESEs in the emergency department allowed us to identify an unexpected decrease in ESEs and through process improvement methodologies (Lean and Six Sigma) and identify potential workflow elements contributing to this observation.

  2. Experimental validation on the effect of material geometries and processing methodology of Polyoxymethylene (POM)

    Science.gov (United States)

    Hafizzal, Y.; Nurulhuda, A.; Izman, S.; Khadir, AZA

    2017-08-01

    POM-copolymer bond breaking leads to changes that depend on the processing methodology and material geometries. This paper presents the often-overlooked effects of different geometries and processing methodologies on material integrity. Thermo-analytical methods were used to examine thermomechanical degradation, while thermogravimetric analysis (TGA) was used to judge the thermal stability of each sample from its major decomposition temperature. A differential scanning calorimetry (DSC) investigation was performed to identify the thermal behaviour and thermal properties of the materials. The results show that plastic gear geometries injection molded on a higher-tonnage machine are more thermally stable than resin geometries. Plastic gear geometries injected on a low-tonnage machine showed major decomposition temperatures at 313.61 °C, 305.76 °C and 307.91 °C, while those from the higher-tonnage processing method fully decomposed at 890 °C, significantly higher than the low-tonnage condition and the resin geometry specimens at 398 °C. The chemical compositions of plastic gear geometries injection molded at higher and lower tonnage are compared based on their moisture and volatile organic compound (VOC) content, polymeric material content and the absence of filler. Results of higher moisture and volatile organic compound (VOC) content are reported in resin geometries (0.120%) compared to the higher-tonnage injection plastic gear geometries (1.264%). The higher-tonnage injection plastic gear geometries are less sensitive to thermo-mechanical degradation owing to polymer chain length and molecular weight, which affect material properties such as tensile strength, flexural strength, fatigue strength and creep resistance.

  3. Currently available methodologies for the processing of intravascular ultrasound and optical coherence tomography images.

    Science.gov (United States)

    Athanasiou, Lambros; Sakellarios, Antonis I; Bourantas, Christos V; Tsirka, Georgia; Siogkas, Panagiotis; Exarchos, Themis P; Naka, Katerina K; Michalis, Lampros K; Fotiadis, Dimitrios I

    2014-07-01

    Optical coherence tomography and intravascular ultrasound are the most widely used methodologies in clinical practice as they provide high resolution cross-sectional images that allow comprehensive visualization of the lumen and plaque morphology. Several methods have been developed in recent years to process the output of these imaging modalities, which allow fast, reliable and reproducible detection of the luminal borders and characterization of plaque composition. These methods have proven useful in the study of the atherosclerotic process as they have facilitated analysis of a vast amount of data. This review presents currently available intravascular ultrasound and optical coherence tomography processing methodologies for segmenting and characterizing the plaque area, highlighting their advantages and disadvantages, and discusses the future trends in intravascular imaging.

  4. Optimization of CO2 Laser Cutting Process using Taguchi and Dual Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    M. Madić

    2014-09-01

    Full Text Available Selection of optimal cutting parameter settings for obtaining high cut quality in the CO2 laser cutting process is of great importance. Among various analytical and experimental optimization methods, the application of Taguchi and response surface methodology is one of the most commonly used for laser cutting process optimization. Although the concept of dual response surface methodology for process optimization has been used with success, to date no experimental study has been reported in the field of laser cutting. In this paper an approach for optimization of the CO2 laser cutting process using Taguchi and dual response surface methodology is presented. The goal was to determine near-optimal laser cutting parameter values in order to ensure robust conditions for minimization of average surface roughness. To obtain an experimental database for the development of response surface models, Taguchi’s L25 orthogonal array was used as the experimental plan. Three cutting parameters, the cutting speed (3, 4, 5, 6, 7 m/min), the laser power (0.7, 0.9, 1.1, 1.3, 1.5 kW), and the assist gas pressure (3, 4, 5, 6, 7 bar), were used in the experiment. To obtain near-optimal cutting parameter settings, a multi-stage Monte Carlo simulation procedure was performed on the developed response surface models.
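
    The final optimization step described above can be sketched as Monte Carlo sampling over a fitted response surface; the roughness-model coefficients below are invented, and only the parameter ranges follow the abstract.

```python
import numpy as np

# Monte Carlo search over a hypothetical second-order response surface for
# average surface roughness Ra. The model coefficients are invented; only the
# sampling ranges for speed, power and pressure follow the abstract.

rng = np.random.default_rng(1)

def roughness_model(speed, power, pressure):
    """Hypothetical response surface for Ra [um] (illustrative coefficients)."""
    return (4.0
            - 0.6 * speed + 0.05 * speed**2
            + 0.9 * power
            - 0.08 * pressure
            - 0.05 * speed * power)

# sample uniformly within the experimental ranges used in the study
speed = rng.uniform(3, 7, 100_000)      # m/min
power = rng.uniform(0.7, 1.5, 100_000)  # kW
pressure = rng.uniform(3, 7, 100_000)   # bar

ra = roughness_model(speed, power, pressure)
best = np.argmin(ra)
print(f"near-optimal settings: v={speed[best]:.2f} m/min, "
      f"P={power[best]:.2f} kW, p={pressure[best]:.2f} bar, Ra={ra[best]:.3f}")
```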

  5. An image-processing methodology for extracting bloodstain pattern features.

    Science.gov (United States)

    Arthur, Ravishka M; Humburg, Philomena J; Hoogenboom, Jerry; Baiker, Martin; Taylor, Michael C; de Bruin, Karla G

    2017-08-01

    There is a growing trend in forensic science to develop methods to make forensic pattern comparison tasks more objective. This has generally involved the application of suitable image-processing methods to provide numerical data for identification or comparison. This paper outlines a unique image-processing methodology that can be utilised by analysts to generate reliable pattern data that will assist them in forming objective conclusions about a pattern. A range of features were defined and extracted from a laboratory-generated impact spatter pattern. These features were based in part on bloodstain properties commonly used in the analysis of spatter bloodstain patterns. The values of these features were consistent with properties reported qualitatively for such patterns. The image-processing method developed shows considerable promise as a way to establish measurable discriminating pattern criteria that are lacking in current bloodstain pattern taxonomies. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. The Methodological Dynamism of Grounded Theory

    Directory of Open Access Journals (Sweden)

    Nicholas Ralph

    2015-11-01

    Full Text Available Variations in grounded theory (GT interpretation are the subject of ongoing debate. Divergences of opinion, genres, approaches, methodologies, and methods exist, resulting in disagreement on what GT methodology is and how it comes to be. From the postpositivism of Glaser and Strauss, to the symbolic interactionist roots of Strauss and Corbin, through to the constructivism of Charmaz, the field of GT methodology is distinctive in the sense that those using it offer new ontological, epistemological, and methodological perspectives at specific moments in time. We explore the unusual dynamism attached to GT’s underpinnings. Our view is that through a process of symbolic interactionism, in which generations of researchers interact with their context, moments are formed and philosophical perspectives are interpreted in a manner congruent with GT’s essential methods. We call this methodological dynamism, a process characterized by contextual awareness and moment formation, contemporaneous translation, generational methodology, and methodological consumerism.

  7. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty-A pulp mill example

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden)], E-mail: elin.svensson@chalmers.se; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden)

    2009-03-15

    This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO2 emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, (doi:10.1016/j.enpol.2008.10.023)] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO2 emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives.

  8. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty. A pulp mill example

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden)

    2009-03-15

    This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO2 emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, doi:10.1016/j.enpol.2008.10.023] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO2 emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives. (author)

  9. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty-A pulp mill example

    International Nuclear Information System (INIS)

    Svensson, Elin; Berntsson, Thore; Stroemberg, Ann-Brith

    2009-01-01

    This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO2 emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, (doi:10.1016/j.enpol.2008.10.023)] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO2 emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives

  10. Application of a systematic methodology for sustainable carbon dioxide utilization process design

    DEFF Research Database (Denmark)

    Plaza, Cristina Calvera; Frauzem, Rebecca; Gani, Rafiqul

    than carbon capture and storage. To achieve this a methodology is developed to design sustainable carbon dioxide utilization processes. First, the information on the possible utilization alternatives is collected, including the economic potential of the process and the carbon dioxide emissions...... emission are desired in order to reduce the carbon dioxide emissions. Using this estimated preliminary evaluation, the top processes, with the most negative carbon dioxide emission are investigated by rigorous detailed simulation to evaluate the net carbon dioxide emissions. Once the base case design...

  11. Theoretical and methodological foundation of the process of students’ physical training of higher educational institutions

    Directory of Open Access Journals (Sweden)

    Pilipej L. P.

    2013-02-01

    Full Text Available The efficiency of the existing system of physical education in the higher educational institutions of Ukraine is considered. Data on the unsatisfactory level of physical preparedness of university entrants and graduating students of higher institutions are presented. The shortcomings of building the physical education process on the basis of a normative-command approach are shown. The absence of programs that take into account motivation and the conditions of activity of higher institutions, and the inconsistency with the requirements of integration into the Bologna Process, are shown. The analysis of publications is presented in accordance with the modern scientific paradigm of constructing the system of physical education of students on the basis of the methodology of synergetics. Data from a questionnaire survey are used in the research. Correlation connections between elements of physical education systems which influence the efficiency of the process are presented. The basic requirements for constructing the process of physical education of students of higher educational institutions are set out.

  12. Application of Six Sigma Using DMAIC Methodology in the Process of Product Quality Control in Metallurgical Operation

    Directory of Open Access Journals (Sweden)

    Girmanová Lenka

    2017-12-01

    Full Text Available The Six Sigma DMAIC can be considered a guide for problem solving and product or process improvement. The majority of companies start to implement Six Sigma using the DMAIC methodology. The paper deals with the application of Six Sigma using the DMAIC methodology in the process of product quality control. The case study is oriented on the field of metallurgical operations. The goal of the Six Sigma project was to ensure the required metallurgical product quality and to avoid an increase in internal costs associated with poor product quality. In this case study, a variety of tools and techniques was used, such as flow charts, histograms, Pareto diagrams, analysis of FMEA (Failure Mode and Effect Analysis) data, cause-and-effect diagrams and logical analysis. The Sigma level has improved by approximately 13%. The achieved improvements have helped to reduce the quantity of defective products and the processing costs (technology for re-adjusting). Benefits resulting from the DMAIC implementation can be divided into three levels: the qualitative, economic and safety level.
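
    Sigma levels of the kind reported above are conventionally computed from defect rates with the usual 1.5-sigma shift; the sketch below shows that calculation for two hypothetical DPMO values, which are not figures from the study.

```python
from statistics import NormalDist

# Conventional short-term sigma-level calculation used in Six Sigma reporting:
# sigma = z(yield) + 1.5 shift. The DPMO values below are hypothetical.

def sigma_level(defects_per_million: float) -> float:
    yield_fraction = 1.0 - defects_per_million / 1e6
    return NormalDist().inv_cdf(yield_fraction) + 1.5

for dpmo in (66_800, 22_700):  # before / after improvement (illustrative)
    print(f"DPMO = {dpmo:6d} -> sigma level = {sigma_level(dpmo):.2f}")
```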

  13. IMSF: Infinite Methodology Set Framework

    Science.gov (United States)

    Ota, Martin; Jelínek, Ivan

    Software development is usually an integration task in an enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is a lack of resources, a popular result being outsourcing, ‘body shopping’, and indirectly team and team-member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face the problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, it causes problems that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.

  14. Optimization of Electrochemical Treatment Process Conditions for Distillery Effluent Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    P. Arulmathi

    2015-01-01

    Full Text Available The distillery industry is recognized as one of the most polluting industries in India with a large amount of annual effluent production. In this present study, the optimization of electrochemical treatment process variables is reported to treat the color and COD of distillery spent wash using Ti/Pt as an anode in a batch mode. Process variables such as pH, current density, electrolysis time, and electrolyte dose were selected as operation variables, and chemical oxygen demand (COD) and color removal efficiency were considered as response variables for optimization using response surface methodology. Indirect electrochemical-oxidation process variables were optimized using a Box-Behnken response surface design (BBD). The results showed that the electrochemical treatment process effectively removed the COD (89.5%) and color (95.1%) of the distillery industry spent wash under the optimum conditions: pH of 4.12, current density of 25.02 mA/cm2, electrolysis time of 103.27 min, and electrolyte (NaCl) concentration of 1.67 g/L, respectively.

  15. A methodological framework to support the initiation, design and institutionalization of participatory modeling processes in water resources management

    Science.gov (United States)

    Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan

    2018-01-01

    Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.

  16. Digital processing methodology applied to exploring of radiological images; Metodologia de processamento digital aplicada a exploracao de imagens radiologicas

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Cristiane de Queiroz

    2004-07-01

    In this work, digital image processing is applied as an automatic computational method aimed at exploring radiological images. An automatic routine was developed, based on segmentation and post-processing techniques, for radiological images acquired from an arrangement consisting of an X-ray tube, a molybdenum target and filter of 0.4 mm and 0.03 mm, respectively, and a CCD detector. The efficiency of the developed methodology is shown in this work through a case study in which internal injuries in mangoes are automatically detected and monitored. This methodology is a possible tool to be introduced in the post-harvest process in packing houses. A dichotomic test was applied to evaluate the efficiency of the method. The results show a success rate of 87.7% for correct diagnosis and a 12.3% failure rate, with a sensitivity of 93% and a specificity of 80%. (author)
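
    The dichotomic evaluation quoted above (sensitivity 93%, specificity 80%) can be reproduced from a confusion matrix as in the sketch below; the raw counts are hypothetical, chosen only so that they yield those rates.

```python
# Sensitivity and specificity from a confusion matrix. The counts are invented
# to illustrate the calculation; the study reports 93% sensitivity and 80% specificity.

tp, fn = 93, 7    # injured fruit correctly / incorrectly classified (hypothetical counts)
tn, fp = 80, 20   # sound fruit correctly / incorrectly classified (hypothetical counts)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / (tp + fn + tn + fp)

print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}, "
      f"accuracy = {accuracy:.1%}")
```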

  17. Computationally based methodology for reengineering the high-level waste planning process at SRS

    International Nuclear Information System (INIS)

    Paul, P.K.; Gregory, M.V.; Wells, M.N.

    1997-01-01

    The Savannah River Site (SRS) has started processing its legacy of 34 million gallons of high-level radioactive waste into its final disposable form. The SRS high-level waste (HLW) complex consists of 51 waste storage tanks, 3 evaporators, 6 waste treatment operations, and 2 waste disposal facilities. It is estimated that processing wastes to clean up all tanks will take 30+ yr of operation. Integrating all the highly interactive facility operations through the entire life cycle in an optimal fashion-while meeting all the budgetary, regulatory, and operational constraints and priorities-is a complex and challenging planning task. The waste complex operating plan for the entire time span is periodically published as an SRS report. A computationally based integrated methodology has been developed that has streamlined the planning process while showing how to run the operations at economically and operationally optimal conditions. The integrated computational model replaced a host of disconnected spreadsheet calculations and the analysts' trial-and-error solutions using various scenario choices. This paper presents the important features of the integrated computational methodology and highlights the parameters that are core components of the planning process

  18. Application of Six Sigma methodology to a diagnostic imaging process.

    Science.gov (United States)

    Taner, Mehmet Tolga; Sezen, Bulent; Atwat, Kamal M

    2012-01-01

This paper aims to apply the Six Sigma methodology to improve workflow by eliminating the causes of failure in the medical imaging department of a private Turkish hospital. The define, measure, analyse, improve and control (DMAIC) improvement cycle, workflow charts, fishbone diagrams and Pareto charts were employed, together with rigorous data collection in the department. The identification of root causes of repeat sessions and delays was followed by failure mode and effect analysis, hazard analysis and decision tree analysis. The most frequent causes of failure were malfunction of the RIS/PACS system and improper positioning of patients. Following extensive training of professionals, the sigma level was increased from 3.5 to 4.2. The data were collected over only four months. Six Sigma's data measurement and process improvement methodology is an impetus for health care organisations to rethink their workflow and reduce malpractice. It involves measuring, recording and reporting data on a regular basis, which enables the administration to monitor workflow continuously. The improvements in the workflow under study, made by determining the failures and potential risks associated with radiologic care, will have a positive impact on society in terms of patient safety. Having eliminated repeat examinations, the risk of exposure to additional radiation was also minimised. This paper supports the need to apply Six Sigma and presents an evaluation of the process in an imaging department.
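
    For readers unfamiliar with how a sigma level such as the 3.5-to-4.2 improvement is expressed, the sketch below shows the conventional conversion from an observed defect rate (in defects per million opportunities, DPMO) to a sigma level. It assumes the customary 1.5-sigma shift used in Six Sigma tables and uses hypothetical counts; it is not taken from the paper.

```python
# Minimal sketch (assumption: the conventional 1.5-sigma shift used in Six
# Sigma tables) of converting an observed defect rate into a "sigma level".
from scipy.stats import norm

def sigma_level(defects, opportunities):
    """Long-term defect rate -> short-term sigma level (1.5-sigma shift convention)."""
    dpmo = defects / opportunities * 1_000_000
    yield_fraction = 1.0 - dpmo / 1_000_000
    return norm.ppf(yield_fraction) + 1.5

# Hypothetical example: 80 repeat or delayed sessions out of 6,000 imaging studies.
print(f"DPMO-based sigma level: {sigma_level(80, 6000):.2f}")
```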

  19. METHODOLOGY OF AN ASSESSMENT OF RESULTS OF INTELLECTUAL ACTIVITY

    Directory of Open Access Journals (Sweden)

    Larisa I. Egorova

    2013-01-01

Full Text Available The methodological bases of an assessment of the results of research, development and technological works and intangible assets are given in the article. A special attention is paid to the problems of formation of fair value of such assets for clients of financial accounting. The authors compare the methods of cost measurement of accounting entities regulated by Russian Accounting Standards (RAS) and International Financial Reporting Standards (IFRS). Peculiarities of detection and acknowledgement of impairment of intangible assets (loss of value) are considered.

  20. Substrates adoption methodology (SAM) to achieve “Fast, Flexible, Future (F3)” pharmaceutical production processes

    DEFF Research Database (Denmark)

    Singh, Ravendra; Rozada-Sanchez, Raquel; Wrate, Tim

There is a significant cost associated with process development of a portfolio of pharmaceutical products, few of which will reach the market. Continuous processing will increase the “chemical space”, which can increase development efficiency. For example, one particularly attractive option...... is to develop manufacturing processes based on modular continuous systems; a flexible generic continuous modular plant which can be adapted for different substrates. In the work reported here, a substrates adoption methodology (SAM) has been developed. The proposed SAM identifies the necessary changes...... within the template. In this way the substrates adoption methodology helps to achieve “fast, flexible, future (F3)” pharmaceutical production processes by adapting a recently designed generic modular process-plant. The supporting tools for the substrate adoption are: (1) an ontological knowledge......

  1. Introduction and comparison of new EBSD post-processing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Stuart I.; Nowell, Matthew M.; Lindeman, Scott P. [EDAX, 392 East 12300 South, Suite H, Draper, UT 84020 (United States); Camus, Patrick P. [EDAX, 91 McKee Drive, Mahwah, NJ 07430 (United States); De Graef, Marc [Carnegie Mellon University, Department of Material Science and Engineering, 5000 Forbes Avenue, Pittsburgh, PA 15213 (United States); Jackson, Michael A. [BlueQuartz Software, 400 S. Pioneer Blvd, Springboro, OH 45066 (United States)

    2015-12-15

Electron Backscatter Diffraction (EBSD) provides a useful means for characterizing microstructure. However, it can be difficult to obtain indexable diffraction patterns from some samples. This can lead to noisy maps reconstructed from the scan data. Various post-processing methodologies have been developed to improve the scan data, generally based on correlating non-indexed or mis-indexed points with the orientations obtained at neighboring points in the scan grid. Two new approaches are introduced: (1) a re-scanning approach using local pattern averaging and (2) using the multiple solutions obtained by the triplet indexing method. These methodologies are applied to samples with noise introduced into the patterns artificially and by the operational settings of the EBSD camera. They are also applied to a heavily deformed and a fine-grained sample. In all cases, both techniques provide an improvement in the resulting scan data, the local pattern averaging providing the most improvement of the two. However, the local pattern averaging is most helpful when the noise in the patterns is due to the camera operating conditions as opposed to inherent challenges in the sample itself. A byproduct of this study was insight into the validity of various indexing success rate metrics. A metric given by the fraction of points with CI values greater than some tolerance value (0.1 in this case) was confirmed to provide an accurate assessment of the indexing success rate. - Highlights: • Re-indexing of saved EBSD patterns after neighbor pattern averaging can provide significant improvements in the indexing success rate, particularly with noisy patterns. • Neighbor pattern averaging is most effective on patterns where noise is introduced by the camera operating conditions as opposed to inherent challenges presented by the sample itself. • Confidence Index based metrics are confirmed to generally provide accurate estimates of the indexing success rate albeit increasingly
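
    The CI-based indexing success metric mentioned above is simple to state in code. The sketch below is a minimal illustration that assumes a synthetic confidence-index map rather than real EBSD scan data.

```python
# Sketch of the confidence-index (CI) based indexing-success metric described
# above: the fraction of scan points whose CI exceeds a tolerance (0.1 here).
# The CI values are synthetic; real values would come from the EBSD scan file.
import numpy as np

def indexing_success_rate(ci_values, tolerance=0.1):
    ci = np.asarray(ci_values, dtype=float)
    return np.count_nonzero(ci > tolerance) / ci.size

rng = np.random.default_rng(0)
ci_map = rng.beta(2.0, 1.0, size=(200, 200))   # stand-in for a 200x200 scan grid
print(f"estimated indexing success rate: {indexing_success_rate(ci_map):.1%}")
```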

  2. [Qualitative research methodology in health care].

    Science.gov (United States)

    Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara

    2017-03-01

Health care research requires different methodological approaches, both qualitative and quantitative analyses, to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are that the object of study is constituted by perceptions, emotions and beliefs; purposive (non-random) sampling; a circular process of knowledge construction; and methodological rigor throughout the research process, from the quality of the design to the consistency of the results. The objective of this work is to contribute to the methodological knowledge about qualitative research in health services, based on the implementation of the study, “The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals”. The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspective of users and the health team. This study allowed the design of a transition services model from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.

  3. Development of Six Sigma methodology for CNC milling process improvements

    Science.gov (United States)

    Ismail, M. N.; Rose, A. N. M.; Mohammed, N. Z.; Rashid, M. F. F. Ab

    2017-10-01

Quality and productivity have been identified as playing an important role in any organization, especially in manufacturing sectors, where they drive the profit that leads to the success of a company. This paper reports a work improvement project at Kolej Kemahiran Tinggi MARA Kuantan. It involves problem identification in the production of the “Khufi” product and the proposal of an effective framework to improve the current situation. Based on observation and data collection on the work-in-progress (WIP) product, the major problem identified relates to the function of the product: the parts cannot be assembled properly because the product dimensions are out of specification. Six Sigma was used as the methodology to study and improve the problems identified. Six Sigma is a highly statistical and data-driven approach to solving complex business problems. It uses a methodical five-phase approach (define, measure, analyse, improve and control, DMAIC) to help understand the process and the variables that affect it so that the process can be optimized. Finally, the root cause of the “Khufi” production problem was identified, a solution was implemented, and the resulting product successfully met the fitting specification.

  4. Econometric Methodology of Monopolization Process Evaluation

    Directory of Open Access Journals (Sweden)

    Dmitrijs Skoruks

    2014-06-01

Full Text Available The research “Econometric Methodology of Monopolization Process Evaluation” gives a perspective description of the nature of the monopolization process, its sources of occurrence, its development procedure and its internal conjuncture specifics, and provides an example of the application of modern econometric methods within a unified framework of market competition analysis, for the purpose of conducting a quantitative competition evaluation at the industry level for practical use in both the private and public sectors. The main question of the research is the definition and quantitative analysis of monopolization effects in modern globalized markets, while constructing an empirical model of the econometric analysis based on international historical experience of monopoly formations, with the goal of introducing a further development scheme for the use of both econometric and statistical instruments in line with the forecasting and business research needs of enterprises and the regulatory functions of the public sector. The research uses a wide variety of monopolization evaluation ratios and their econometric updates for the companies involved in the study in order to detect and quantitatively measure their market monopolizing potential, based on their acquired market positions, turnover shares and competition policies.
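
    The abstract does not list the specific monopolization ratios used, so the sketch below illustrates the idea with the standard Herfindahl-Hirschman Index (HHI), a widely used concentration measure built from turnover or market shares; the shares are hypothetical.

```python
# Illustration only: the abstract does not list its specific ratios, so this
# sketch uses the standard Herfindahl-Hirschman Index (HHI), a common
# concentration measure built from turnover/market shares.
def hhi(market_shares_percent):
    """HHI on the 0-10,000 scale; values above ~2,500 are usually read as highly concentrated."""
    return sum(s ** 2 for s in market_shares_percent)

shares = [40, 25, 20, 10, 5]          # hypothetical turnover shares, in percent
print(f"HHI = {hhi(shares):.0f}")     # 40^2 + 25^2 + 20^2 + 10^2 + 5^2 = 2750
```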

  5. Digital Methodology to implement the ECOUTER engagement process

    OpenAIRE

    Wilson, Rebecca C.; Butters, Oliver W.; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J.

    2017-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research) - French for 'to listen' - is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagem...

  6. Self-propelled prospection methodology: first results obtained in the northeast basin (Uruguay)

    International Nuclear Information System (INIS)

    Goso, H; Spoturno, J; Peciozzi, F.

    2008-01-01

This report describes the uranium prospection methodology applied in Uruguay and the first results obtained in the northeast basin. The preliminary radiometric survey is carried out in four work phases: material preparation, radiometric measurement, statistical analysis, and anomaly revision.

  7. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

    The availability of reliable tools and associated methodology able to accurately predict the LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After having outlined the AREVA realistic methodology, this paper will be focused on the GALILEO code benchmarking process, on its extended experimental database and on the GALILEO model uncertainties assessment. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This GALILEO model uncertainties processing is of the utmost importance for accurate fuel design margin evaluation as illustrated on some application examples. With the submittal of Topical Report GALILEO to the U.S. NRC in 2013, GALILEO and its methodology are on the way to be industrially used in a wide range of irradiation conditions. (authors)
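
    A minimal sketch of the Monte Carlo idea described above is given below. It is not the GALILEO code: the response function and the input distributions are placeholders, intended only to show how random sampling of uncertain model parameters propagates into an output distribution from which a design margin can be read.

```python
# Generic sketch (not the GALILEO code) of Monte Carlo propagation of model
# uncertainties: random-sample the uncertain inputs, run the model for each
# sample, and read a design margin from the output distribution.
import numpy as np

def fuel_response(power_kw_m, conductivity_factor, gap_factor):
    """Placeholder response; a real code would compute e.g. peak fuel temperature."""
    return 400.0 + 30.0 * power_kw_m / conductivity_factor + 50.0 * gap_factor

rng = np.random.default_rng(42)
n = 10_000
power = rng.normal(25.0, 1.0, n)      # linear heat rate, kW/m (assumed distribution)
cond = rng.normal(1.0, 0.05, n)       # conductivity model uncertainty factor (assumed)
gap = rng.normal(1.0, 0.10, n)        # gap conductance model uncertainty factor (assumed)

temps = fuel_response(power, cond, gap)
print(f"95th-percentile response: {np.percentile(temps, 95):.1f} (model units)")
```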

  8. Study of a methodology of identifying important research problems by the PIRT process

    International Nuclear Information System (INIS)

    Aoki, Takayuki; Takagi, Toshiyuki; Urayama, Ryoichi; Komura, Ichiro; Furukawa, Takashi; Yusa, Noritaka

    2013-01-01

    In this paper, we propose a new methodology of identifying important research problems to be solved to improve the performance of some specific scientific technologies by the phenomena identification and ranking table (PIRT) process, which has been used as a methodology for demonstrating the validity of the best estimate simulation codes in USNRC licensing of nuclear power plants. It keeps the fundamental concepts of the original PIRT process but makes it possible to identify important factors affecting the performance of the technologies from the viewpoint of the figure of merit and problems associated with them, which need to be solved to improve the performance. Also in this paper, we demonstrate the effectiveness of the developed method by showing a specific example of the application to physical events or phenomena in objects having fatigue or SCC crack(s) under ultrasonic testing and eddy current testing. (author)

  9. Participatory evaluation of community actions as a learning methodology for personal and community empowerment: case studies and empowerment processes

    Directory of Open Access Journals (Sweden)

    Xavier Úcar Martínez

    2014-06-01

Full Text Available Introduction: Participatory evaluation (PE) is a hybrid methodology that can be used simultaneously to investigate and act in groups and communities. It can generate new knowledge about reality, but it also allows changes in the participants and their sociocultural context. This research project, developed over three years, aims to find out whether PE processes are useful and appropriate for evaluating community actions and for generating learning that contributes to the empowerment of the people who develop them. Method: The methodological structure of the research designs Participatory Evaluation processes that are applied in three selected community cases over one year. The steering groups in each case evaluated four dimensions of Community Development Plans: context, evolution, performance and results, using different techniques and group dynamics. Throughout this process, participants identify the knowledge acquired, and this is linked to indicators of empowerment using questionnaires, content analysis and semi-structured interviews. Results: The development of the PE process in the three analysed cases confirmed that PE is a useful strategy to assess the participatory community actions of a territory, to report them to the people of the community, and to make shared decisions about initiatives to improve community actions. The results obtained also verify that, throughout the PE, the participants learned. Conclusions: The involvement of community members in the evaluation makes it more useful, fairer and more valid, and a fourth positive consequence of PE is empowerment. From the process and the results of these cases of Participatory Evaluation, we consider that community PE is social transformation.

  10. On-line process failure diagnosis: The necessity and a comparative review of the methodologies

    International Nuclear Information System (INIS)

    Kim, I.S.

    1991-01-01

Three basic approaches to process failure management are defined and discussed to elucidate the role of diagnosis in the operation of nuclear power plants. The rationale for the necessity of diagnosis is given from various perspectives. A comparative review of some representative diagnostic methodologies is presented and their shortcomings are discussed. Based on the insights from the review, the desirable characteristics of advanced diagnostic methodologies are derived from the viewpoints of failure detection, diagnosis, and correction. 11 refs

  11. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  12. WE-H-BRC-04: Implement Lean Methodology to Make Our Current Process of CT Simulation to Treatment More Efficient

    Energy Technology Data Exchange (ETDEWEB)

    Boddu, S; Morrow, A; Krishnamurthy, N; McVicker, A; Deb, N; Rangaraj, D [Scott & White Hospital, Temple, TX (United States)

    2016-06-15

Purpose: Our goal is to implement lean methodology to make our current process of CT simulation to treatment more efficient. Methods: In this study, we implemented lean methodology and tools and employed flowcharts in Excel for process mapping. We formed a group of physicians, physicists, dosimetrists, therapists and a clinical physics assistant and huddled bi-weekly to map current value streams. We performed GEMBA walks and observed current processes from scheduling patient CT Simulations to treatment plan approval. From this, the entire workflow was categorized into processes, sub-processes, and tasks. For each process we gathered data on touch time, first time quality, undesirable effects (UDEs), and wait-times from relevant members of each task. UDEs were binned per frequency of their occurrence. We huddled to map future state and to find solutions to high frequency UDEs. We implemented visual controls, hard stops, and documented issues found during chart checks prior to treatment plan approval. Results: We have identified approximately 64 UDEs in our current workflow that could cause delays, re-work, compromise the quality and safety of patient treatments, or cause wait times between 1 and 6 days. While some UDEs are unavoidable, such as re-planning due to patient weight loss, eliminating avoidable UDEs is our goal. In 2015, we found 399 issues with patient treatment plans, of which 261, 95 and 43 were low, medium and high severity, respectively. We also mapped patient-specific QA processes for IMRT/Rapid Arc and SRS/SBRT, involving 10 and 18 steps, respectively. From these, 13 UDEs were found and 5 were addressed that solved 20% of issues. Conclusion: We have successfully implemented lean methodology and tools. We are further mapping treatment site specific workflows to identify bottlenecks, potential breakdowns and personnel allocation and employ tools like failure mode effects analysis to mitigate risk factors to make this process efficient.
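
    The "binned per frequency" step lends itself to a short illustration. The sketch below uses a hypothetical UDE log (not data from the study) to count occurrences per undesirable effect and list them Pareto-style.

```python
# Sketch of the "bin UDEs by frequency" step: count occurrences per undesirable
# effect and list them Pareto-style.  The UDE log below is hypothetical.
from collections import Counter

ude_log = [
    "missing physician directive", "late contour approval", "RIS/PACS outage",
    "late contour approval", "missing physician directive", "re-plan: weight loss",
    "late contour approval", "missing physician directive",
]
counts = Counter(ude_log)
total = sum(counts.values())
cumulative = 0
for ude, n in counts.most_common():
    cumulative += n
    print(f"{ude:32s} {n:3d}  cumulative {cumulative / total:5.1%}")
```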

  13. WE-H-BRC-04: Implement Lean Methodology to Make Our Current Process of CT Simulation to Treatment More Efficient

    International Nuclear Information System (INIS)

    Boddu, S; Morrow, A; Krishnamurthy, N; McVicker, A; Deb, N; Rangaraj, D

    2016-01-01

Purpose: Our goal is to implement lean methodology to make our current process of CT simulation to treatment more efficient. Methods: In this study, we implemented lean methodology and tools and employed flowcharts in Excel for process mapping. We formed a group of physicians, physicists, dosimetrists, therapists and a clinical physics assistant and huddled bi-weekly to map current value streams. We performed GEMBA walks and observed current processes from scheduling patient CT Simulations to treatment plan approval. From this, the entire workflow was categorized into processes, sub-processes, and tasks. For each process we gathered data on touch time, first time quality, undesirable effects (UDEs), and wait-times from relevant members of each task. UDEs were binned per frequency of their occurrence. We huddled to map future state and to find solutions to high frequency UDEs. We implemented visual controls, hard stops, and documented issues found during chart checks prior to treatment plan approval. Results: We have identified approximately 64 UDEs in our current workflow that could cause delays, re-work, compromise the quality and safety of patient treatments, or cause wait times between 1 and 6 days. While some UDEs are unavoidable, such as re-planning due to patient weight loss, eliminating avoidable UDEs is our goal. In 2015, we found 399 issues with patient treatment plans, of which 261, 95 and 43 were low, medium and high severity, respectively. We also mapped patient-specific QA processes for IMRT/Rapid Arc and SRS/SBRT, involving 10 and 18 steps, respectively. From these, 13 UDEs were found and 5 were addressed that solved 20% of issues. Conclusion: We have successfully implemented lean methodology and tools. We are further mapping treatment site specific workflows to identify bottlenecks, potential breakdowns and personnel allocation and employ tools like failure mode effects analysis to mitigate risk factors to make this process efficient.

  14. Unattended Monitoring System Design Methodology

    International Nuclear Information System (INIS)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-01-01

A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analysis then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations.

  15. Hybrid response surface methodology-artificial neural network optimization of drying process of banana slices in a forced convective dryer.

    Science.gov (United States)

    Taheri-Garavand, Amin; Karimi, Fatemeh; Karimi, Mahmoud; Lotfi, Valiullah; Khoobbakht, Golmohammad

    2018-06-01

The aim of the study is to fit predictive models using response surface methodology and an artificial neural network, and to optimize a hot-air drying process of banana slices for maximum acceptability using the desirability function methodology. The drying air temperature, air velocity, and drying time were chosen as independent factors, and moisture content, drying rate, energy efficiency, and exergy efficiency were the dependent variables or responses in the drying process. A rotatable central composite design was used as an adequate method to develop models for the responses in the response surface methodology. Moreover, isoresponse contour plots were useful for predicting the results while performing only a limited set of experiments. The optimum operating conditions obtained from the artificial neural network models were moisture content 0.14 g/g, drying rate 1.03 g water/g h, energy efficiency 0.61, and exergy efficiency 0.91, when the air temperature, air velocity, and drying time values were equal to -0.42 (74.2 ℃), 1.00 (1.50 m/s), and -0.17 (2.50 h) in coded units, respectively.
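
    The core of a rotatable central composite design analysis is the least-squares fit of a second-order polynomial in the coded factors. The sketch below illustrates that step with synthetic data; it does not reproduce the study's design points or responses.

```python
# Sketch of the core RSM step: fit a second-order model
#   y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)
# to coded factors by least squares.  Data are synthetic, not the study's.
import numpy as np

def quadratic_design_matrix(x1, x2, x3):
    return np.column_stack([
        np.ones_like(x1), x1, x2, x3,
        x1**2, x2**2, x3**2,
        x1*x2, x1*x3, x2*x3,
    ])

rng = np.random.default_rng(1)
x = rng.uniform(-1.68, 1.68, size=(20, 3))          # coded temperature, velocity, time
y = 0.9 - 0.1*x[:, 0] + 0.05*x[:, 2] - 0.03*x[:, 0]**2 + rng.normal(0, 0.01, 20)

X = quadratic_design_matrix(x[:, 0], x[:, 1], x[:, 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))
```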

  16. The Methodology of the Process of Formation of Innovation Management of Enterprises’ Development

    Directory of Open Access Journals (Sweden)

    Prokhorova Viktoriia V.

    2017-12-01

Full Text Available The article aims to form a methodology for the process of innovation management of enterprise development in modern conditions. The formation of the essence of the methodology is studied, and the stages of development of the methods and means of scientific cognition are analyzed. The basic components of forming a methodology for innovation management of enterprise development are defined: methods, types, principles, components, and the systematized aggregate. The relations between empirical and theoretical methods of scientific cognition are considered and defined. It is determined that the growth in the volume and scope of scientific views, together with the deepening of scientific knowledge of the laws and regularities governing the real natural and social world, leads to an objective desire of scientists to analyze the methods and means by which modern innovative knowledge and views in the enterprise management system can be acquired and formed.

  17. School Psychology as a Relational Enterprise: The Role and Process of Qualitative Methodology

    Science.gov (United States)

    Newman, Daniel S.; Clare, Mary M.

    2016-01-01

    The purpose of this article is to explore the application of qualitative research to establishing a more complete understanding of relational processes inherent in school psychology practice. We identify the building blocks of rigorous qualitative research design through a conceptual overview of qualitative paradigms, methodologies, methods (i.e.,…

  18. Analysis of parameter and interaction between parameter of the microwave assisted transesterification process of coconut oil using response surface methodology

    Science.gov (United States)

    Hidayanti, Nur; Suryanto, A.; Qadariyah, L.; Prihatini, P.; Mahfud, Mahfud

    2015-12-01

A simple batch process was designed for the transesterification of coconut oil to alkyl esters using a microwave-assisted method. The product with an alkyl ester yield above 93.225% is regarded as the biodiesel fuel. Response surface methodology was used to design the experiment and obtain the maximum possible yield of biodiesel in the microwave-assisted reaction of coconut oil with KOH as the catalyst. The results showed that the reaction time and the concentration of the KOH catalyst have significant effects on the yield of alkyl ester. Based on the response surface methodology, the selected operating conditions for the transesterification process were a reaction time of 150 seconds and a KOH catalyst concentration of 0.25% w/w. The largest predicted and experimental yields of alkyl esters (biodiesel) under the optimal conditions were 101.385% and 93.225%, respectively. Our findings confirm the successful development of a process for the transesterification reaction of coconut oil by microwave-assisted heating, which is effective and time-saving for alkyl ester production.

  19. PRINCIPLES OF RE-ENGINEERING METHODOLOGY FOR TECHNOLOGICAL PROCESS IN PROCESSING OF RAW MATERIAL COMPONENTS WHILE PRODUCING CEMENT AND SILICATE PRODUCTS

    Directory of Open Access Journals (Sweden)

    I. A. Busel

    2014-01-01

Full Text Available The grinding process is characterized by high energy consumption and low productivity. At present, the efficiency of the ball mills used for grinding is rather low: only 3-6 % of the supplied power is used for material grinding, while the rest of the energy is dissipated in the form of heat, vibration and noise. Reducing energy consumption is therefore of great importance. Improving the efficiency and quality of the technological process of grinding raw material components in the production of construction materials is considered one of the priority targets of power and resource saving in the construction industry, with the purpose of reducing energy consumption for grinding. At operating enterprises, it is reasonable to improve grinding efficiency by modernizing the equipment and the existing technological, management and other processes related to the grinding of mineral raw materials. In order to reduce grinding power consumption, it is necessary to carry out a complex re-engineering of the technological grinding process for various materials, based on the use of new modifications of grinding bodies, physical and chemical grinding aids, modern information technologies and industrial automation equipment. The application of modern information technologies and industrial automation equipment makes it possible to run the grinding process at the maximum achievable productivity for the existing capacity, owing to automatic control and the consideration of continuous changes in technological parameters. In addition, such an approach makes it possible to control processes in real time through immediate adjustment of the operational modes of the technological equipment. The paper considers an approach to the development of a re-engineering methodology for the technological process of grinding raw material components in the production of construction materials. The present state of the technological grinding process is presented. The paper points out the

  20. Formulation and development of a methodology for selecting desulfurization processes, applicable to diluted sulfurous emissions from copper. Preparation of the engineering for a draft project using electron beam process, selected with this methodology

    International Nuclear Information System (INIS)

    Aros M, Patricia.

    1997-01-01

A comparative study of clean desulfurization technologies was prepared. Sulfur abatement from SO2 gas streams was analyzed for 21 processes grouped into 8 different types. Since there is a large number of potentially applicable processes, this thesis presents a process selection methodology based on a series of technical/economic analyses, which produces a ranking by scores. Visual Basic 3.0 was used to develop the program, which can be installed on any computer running Windows 95. Based on these results, the Chilean Nuclear Energy Commission decided to present a draft project for electron beam technology. The full design and calculation for the humidifying and cooling tower was prepared, and the remaining equipment was sized in order to estimate probable costs. The pre-feasibility evaluation determined that the process would generate profits when the selling price of ammonium sulfate - a byproduct of the process that is used as fertilizer - is above US$ 110/ton. The process cost is heavily influenced by the capital cost of storage facilities, since a long-term supply of the ammonia reagent is needed. This product is imported in Chile and is currently an expensive reagent. (author). 33 app., 7 tabs
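
    The "ranking by scores" step can be illustrated with a small weighted-score calculation. The original tool was written in Visual Basic; the sketch below is a Python illustration with hypothetical criteria, weights and scores, not the thesis's actual scoring scheme.

```python
# Hedged sketch of a weighted-score ranking of candidate desulfurization
# processes (criteria, weights and scores here are hypothetical).
CRITERIA_WEIGHTS = {"SO2 removal": 0.35, "capital cost": 0.25,
                    "operating cost": 0.25, "byproduct value": 0.15}

candidates = {
    "electron beam":     {"SO2 removal": 8, "capital cost": 5, "operating cost": 6, "byproduct value": 9},
    "limestone slurry":  {"SO2 removal": 9, "capital cost": 4, "operating cost": 5, "byproduct value": 3},
    "ammonia scrubbing": {"SO2 removal": 8, "capital cost": 6, "operating cost": 4, "byproduct value": 7},
}

def weighted_score(scores):
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

# Rank the candidate processes from highest to lowest weighted score.
for name, scores in sorted(candidates.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name:18s} score = {weighted_score(scores):.2f}")
```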

  1. A Comparison of the Safety Analysis Process and the Generation IV Proliferation Resistance/Physical Protection Assessment Methodology

    International Nuclear Information System (INIS)

    T. A. Bjornard; M. D. Zentner

    2006-01-01

    The Generation IV International Forum (GIF) is a vehicle for the cooperative international development of future nuclear energy systems. The Generation IV program has established primary objectives in the areas of sustainability, economics, safety and reliability, and Proliferation Resistance and Physical Protection (PR and PP). In order to help meet the latter objective a program was launched in December 2002 to develop a rigorous means to assess nuclear energy systems with respect to PR and PP. The study of Physical Protection of a facility is a relatively well established methodology, but an approach to evaluate the Proliferation Resistance of a nuclear fuel cycle is not. This paper will examine the Proliferation Resistance (PR) evaluation methodology being developed by the PR group, which is largely a new approach and compare it to generally accepted nuclear facility safety evaluation methodologies. Safety evaluation methods have been the subjects of decades of development and use. Further, safety design and analysis is fairly broadly understood, as well as being the subject of federally mandated procedures and requirements. It is therefore extremely instructive to compare and contrast the proposed new PR evaluation methodology process with that used in safety analysis. By so doing, instructive and useful conclusions can be derived from the comparison that will help to strengthen the PR methodological approach as it is developed further. From the comparison made in this paper it is evident that there are very strong parallels between the two processes. Most importantly, it is clear that the proliferation resistance aspects of nuclear energy systems are best considered beginning at the very outset of the design process. Only in this way can the designer identify and cost effectively incorporate intrinsic features that might be difficult to implement at some later stage. Also, just like safety, the process to implement proliferation resistance should be a dynamic

  2. Biodiesel production from crude cottonseed oil: an optimization process using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Xiaohu; Wang, Xi; Chen, Feng

    2011-07-01

As the depletion of fossil resources continues, the demand for environmentally friendly sources of energy such as biodiesel is increasing. Biodiesel is the fatty acid methyl ester (FAME) resulting from an esterification reaction. The use of cottonseed oil to produce biodiesel has been investigated in recent years, but it is difficult to find the optimal conditions for this process since multiple factors are involved. The aim of this study was to optimize the transesterification of cottonseed oil with methanol to produce biodiesel. Response surface methodology (RSM), an experimental method for seeking the optimal conditions of a multivariable system, was used for the optimization, and reverse-phase HPLC was used to analyze the conversion of triglycerides into biodiesel. RSM was successfully applied and the optimal conditions were found, with a 97% yield.

  3. APPLICATION OF AN EXPERIMENTAL METHODOLOGY IN THE OPTIMIZATION OF A TUNGSTEN CONCENTRATION PROCESS BY MICROEMULSIONS

    Directory of Open Access Journals (Sweden)

    A.C.S. RAMOS

    1997-06-01

Full Text Available Abstract - In this work, we applied an experimental planning methodology in order to correlate the required amounts with the description of a tungsten extraction process by microemulsions. The result is a mathematical model built using the Scheffé network method, in which the mixture concentration values are represented inside an equilateral triangle. The tungsten concentration process occurs in two stages: extraction and reextraction. The extraction stage was characterized by monitoring the phase relative volume (Vr), the extraction percentage (%E) and the tungsten concentration in the microemulsion phase (Ctme). The reextraction stage was characterized by monitoring the reextraction percentage (%Re) and the tungsten concentration in the aqueous phase (Ctaq). Finally, we obtained equations that relate the extraction/reextraction properties to the composition of specific points inside the extraction region, within the error limits specified for the acceptance of each parameter. The results were evaluated through the construction of isoresponse diagrams and of correlation graphs between experimental values and those obtained from the equations.
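
    A Scheffé-type mixture model is conventionally fitted as a polynomial in the component fractions, which sum to one. The sketch below is an illustration with synthetic compositions and responses, not the paper's data.

```python
# Sketch of fitting a Scheffé quadratic mixture model,
#   y = b1*x1 + b2*x2 + b3*x3 + b12*x1*x2 + b13*x1*x3 + b23*x2*x3,
# to mixture points (x1 + x2 + x3 = 1) inside the triangular design region.
# Compositions and responses are synthetic, not the paper's data.
import numpy as np

X_mix = np.array([
    [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
    [0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5],
    [1/3, 1/3, 1/3],
])
y = np.array([62.0, 75.0, 40.0, 80.0, 55.0, 60.0, 68.0])   # e.g. %E at each point (synthetic)

def scheffe_matrix(x):
    x1, x2, x3 = x[:, 0], x[:, 1], x[:, 2]
    return np.column_stack([x1, x2, x3, x1*x2, x1*x3, x2*x3])

coef, *_ = np.linalg.lstsq(scheffe_matrix(X_mix), y, rcond=None)
print("Scheffé coefficients b1,b2,b3,b12,b13,b23:", np.round(coef, 1))
```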

  4. PM2 : a Process Mining Project Methodology

    NARCIS (Netherlands)

    Eck, van M.L.; Lu, X.; Leemans, S.J.J.; Aalst, van der W.M.P.; Zdravkovic, J.; Kirikova, M.; Johannesson, P.

    2015-01-01

    Process mining aims to transform event data recorded in information systems into knowledge of an organisation’s business processes. The results of process mining analysis can be used to improve process performance or compliance to rules and regulations. However, applying process mining in practice

  5. Experiments with a methodology to model the role of R and D expenditures in energy technology learning processes; first results

    International Nuclear Information System (INIS)

    Miketa, Asami; Schrattenholzer, Leo

    2004-01-01

This paper presents the results of using a stylized optimization model of the global electricity supply system to analyze the optimal research and development (R and D) support for an energy technology. The model takes into account the dynamics of technological progress as described by a so-called two-factor learning curve (2FLC). The two factors are cumulative experience ('learning by doing') and accumulated knowledge ('learning by searching'); the formulation is a straightforward expansion of conventional one-factor learning curves, in which only cumulative experience is included as a factor, which aggregates the effects of accumulated knowledge and cumulative experience, among others. The responsiveness of technological progress to the two factors is quantified using learning parameters, which are estimated using empirical data. Sensitivities of the model results to the parameters are also tested. The model results also address the effect of competition between technologies and of CO2 constraints. The results are mainly methodological; one of the most interesting is that, at least up to a point, competition between technologies - in terms of both market share and R and D support - need not lead to 'lock-in' or 'crowding-out'.
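
    The two-factor learning curve is conventionally written in multiplicative power-law form, with one exponent for cumulative experience and one for the knowledge stock. The sketch below evaluates that form under assumed parameter values; the numbers are not estimates from the paper.

```python
# Two-factor learning curve in its usual multiplicative form,
#   C = C0 * (X / X0)^(-b) * (K / K0)^(-c),
# with learning-by-doing exponent b and learning-by-searching exponent c.
# Parameter values below are assumptions for illustration, not estimates
# from the paper.
def two_factor_cost(c0, x, x0, k, k0, b, c):
    return c0 * (x / x0) ** (-b) * (k / k0) ** (-c)

# Assumed: 10x growth in cumulative capacity and 4x growth in the knowledge
# stock, with ~10% cost reduction per doubling for each factor (b = c ~= 0.152).
b = c = 0.152
cost = two_factor_cost(c0=1000.0, x=10.0, x0=1.0, k=4.0, k0=1.0, b=b, c=c)
print(f"specific cost falls from 1000 to {cost:.0f} per unit")
```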

  6. Experiments with a methodology to model the role of R and D expenditures in energy technology learning processes: first results

    International Nuclear Information System (INIS)

    Miketa, A.; Schrattenholzer, L.

    2004-01-01

This paper presents the results of using a stylized optimization model of the global electricity supply system to analyze the optimal research and development (R and D) support for an energy technology. The model takes into account the dynamics of technological progress as described by a so-called two-factor learning curve (2FLC). The two factors are cumulative experience ('learning by doing') and accumulated knowledge ('learning by searching'); the formulation is a straightforward expansion of conventional one-factor learning curves, in which only cumulative experience is included as a factor, which aggregates the effects of accumulated knowledge and cumulative experience, among others. The responsiveness of technological progress to the two factors is quantified using learning parameters, which are estimated using empirical data. Sensitivities of the model results to the parameters are also tested. The model results also address the effect of competition between technologies and of CO2 constraints. The results are mainly methodological; one of the most interesting is that, at least up to a point, competition between technologies - in terms of both market share and R and D support - need not lead to 'lock-in' or 'crowding-out'. (author)

  7. Integrated vehicle-based safety systems light-vehicle field operational test, methodology and results report.

    Science.gov (United States)

    2010-12-01

    "This document presents the methodology and results from the light-vehicle field operational test conducted as part of the Integrated Vehicle-Based Safety Systems program. These findings are the result of analyses performed by the University of Michi...

  8. A semi-quantitative reasoning methodology for filtering and ranking HAZOP results in HAZOPExpert

    International Nuclear Information System (INIS)

    Vaidhyanathan, Ramesh; Venkatasubramanian, Venkat

    1996-01-01

Hazard and Operability (HAZOP) analysis is the most widely used and is recognized as the preferred Process Hazards Analysis (PHA) approach in the chemical process industry. Recently, a digraph-model-based framework and an expert system called HAZOPExpert were developed for automating this analysis. Upon testing the performance of the system on various industrial case studies, HAZOPExpert was found to successfully mimic the human expert's reasoning and identify the hazards. But, with the increasing complexity of the processes, the HAZOPExpert system generated a large number of consequences compared to those identified by a team of experts. This is mainly due to the strict qualitative reasoning approach implemented in the HAZOPExpert system. In order to filter and rank the consequences generated by the HAZOPExpert system, a semi-quantitative reasoning methodology is proposed using additional quantitative knowledge in the form of design and operating specifications of the process units, and process material property values. This filtering approach combines the qualitative digraph-based HAZOP models and the quantitative knowledge to eliminate the unrealizable consequences. A significant reduction in the number of consequences was obtained using this approach on an ethylene process plant HAZOP case study
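
    The filtering idea can be shown with a toy example: a qualitatively generated consequence is retained only if the unit's quantitative design and operating data make it physically realizable. The sketch below is illustrative and is not HAZOPExpert itself; the units and pressures are hypothetical.

```python
# Toy illustration (not HAZOPExpert itself) of semi-quantitative filtering:
# a qualitatively derived consequence is kept only if the unit's design and
# operating data make it physically realizable.
consequences = [
    {"unit": "reactor R-101", "deviation": "high pressure", "max_achievable_bar": 12.0},
    {"unit": "column C-201", "deviation": "high pressure", "max_achievable_bar": 4.5},
]
design_pressure_bar = {"reactor R-101": 10.0, "column C-201": 8.0}

realizable = [c for c in consequences
              if c["max_achievable_bar"] > design_pressure_bar[c["unit"]]]
for c in realizable:
    print(f"keep: {c['unit']} - {c['deviation']}")   # only R-101 survives the filter
```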

  9. Challenges and Opportunities for Harmonizing Research Methodology

    DEFF Research Database (Denmark)

    van Hees, V. T.; Thaler-Kall, K.; Wolf, K. H.

    2016-01-01

    Objectives: Raw accelerometry is increasingly being used in physical activity research, but diversity in sensor design, attachment and signal processing challenges the comparability of research results. Therefore, efforts are needed to harmonize the methodology. In this article we reflect on how...... increased methodological harmonization may be achieved. Methods: The authors of this work convened for a two-day workshop (March 2014) themed on methodological harmonization of raw accelerometry. The discussions at the workshop were used as a basis for this review. Results: Key stakeholders were identified...... as manufacturers, method developers, method users (application), publishers, and funders. To facilitate methodological harmonization in raw accelerometry the following action points were proposed: i) Manufacturers are encouraged to provide a detailed specification of their sensors, ii) Each fundamental step...

  10. Teaching methodology of the diagnosing process on the example of the fire alarm system

    Directory of Open Access Journals (Sweden)

    Paś Jacek

    2017-03-01

Full Text Available The article presents a method of teaching the process of diagnosing the technical and functional condition of the fire alarm system (SSP). The fire alarm system's laboratory model is a representation of a real fire alarm system. The lecturer has the opportunity to introduce several different, independent faults. The aim of the laboratory exercise is to familiarize students with the methodology and structure of the fire alarm system diagnosing process.

  11. PETA: Methodology of Information Systems Security Penetration Testing

    Directory of Open Access Journals (Sweden)

    Tomáš Klíma

    2016-12-01

Full Text Available Current methodologies for information systems penetration testing focus mainly on a high-level and technical description of the testing process. Unfortunately, there is no methodology focused primarily on the management of these tests. This often results in a situation in which the tests are poorly planned and managed and the vulnerabilities found are remediated unsystematically. The goal of this article is to present a new methodology called PETA, which is focused mainly on the management of penetration tests. The development of this methodology was based on a comparative analysis of current methodologies. The new methodology incorporates current best practices of IT governance and project management represented by COBIT and PRINCE2 principles. The presented methodology has been quantitatively evaluated.

  12. Safety on Judo Children: Methodology and Results

    OpenAIRE

    Sacripanti, Attilio; De Blasis, Tania

    2017-01-01

Many doctors, although they have no firsthand experience of judo, describe it as a sport unsuitable for children. Theoretically speaking, the falls produced by judo throwing techniques could be potentially dangerous, especially for kids, if poorly managed. Many studies have focused on the traumas or injuries that take place in judo, both during training and in competition. The goal of this research is to define and apply a scientific methodology to evaluate the hazard in falls by judo throws for children...

  13. SR-Site groundwater flow modelling methodology, setup and results

    International Nuclear Information System (INIS)

    Selroos, Jan-Olof; Follin, Sven

    2010-12-01

    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed; the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in a SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report

  14. SR-Site groundwater flow modelling methodology, setup and results

    Energy Technology Data Exchange (ETDEWEB)

    Selroos, Jan-Olof (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Follin, Sven (SF GeoLogic AB, Taeby (Sweden))

    2010-12-15

    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed; the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in a SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report.

  15. Analytical methodology for optimization of waste management scenarios in nuclear installation decommissioning process - 16148

    International Nuclear Information System (INIS)

    Zachar, Matej; Necas, Vladimir; Daniska, Vladimir; Rehak, Ivan; Vasko, Marek

    2009-01-01

    The nuclear installation decommissioning process is characterized by production of large amount of various radioactive and non-radioactive waste that has to be managed, taking into account its physical, chemical, toxic and radiological properties. Waste management is considered to be one of the key issues within the frame of the decommissioning process. During the decommissioning planning period, the scenarios covering possible routes of materials release into the environment and radioactive waste disposal, should be discussed and evaluated. Unconditional and conditional release to the environment, long-term storage at the nuclear site, near surface or deep geological disposal and relevant material management techniques for achieving the final status should be taken into account in the analysed scenarios. At the level of the final decommissioning plan, it is desirable to have the waste management scenario optimized for local specific facility conditions taking into account a national decommissioning background. The analytical methodology for the evaluation of decommissioning waste management scenarios, presented in the paper, is based on the materials and radioactivity flow modelling, which starts from waste generation activities like pre-dismantling decontamination, selected methods of dismantling, waste treatment and conditioning, up to materials release or conditioned radioactive waste disposal. The necessary input data for scenarios, e.g. nuclear installation inventory database (physical and radiological data), waste processing technologies parameters or material release and waste disposal limits, have to be considered. The analytical methodology principles are implemented into the standardised decommissioning parameters calculation code OMEGA, developed in the DECOM company. In the paper the examples of the methodology implementation for the scenarios optimization are presented and discussed. (authors)

  16. A symbolic methodology to improve disassembly process design.

    Science.gov (United States)

    Rios, Pedro; Blyler, Leslie; Tieman, Lisa; Stuart, Julie Ann; Grant, Ed

    2003-12-01

Millions of end-of-life electronic components are retired annually due to the proliferation of new models and their rapid obsolescence. The recovery of resources such as plastics from these goods requires their disassembly. The time required for each disassembly and its associated cost is defined by the operator's familiarity with the product design and its complexity. Since model proliferation serves to complicate an operator's learning curve, it is worthwhile to investigate the benefits to be gained in a disassembly operator's preplanning process. Effective disassembly process design demands the application of green engineering principles, such as those developed by Anastas and Zimmerman (Environ. Sci. Technol. 2003, 37, 94A-101A), which include regard for product complexity, structural commonality, separation energy, material value, and waste prevention. This paper introduces the concept of design symbols to help the operator more efficiently survey product complexity with respect to the location and number of fasteners to remove a structure that is common to all electronics: the housing. With a sample of 71 different computers, printers, and monitors, we demonstrate that appropriate symbols reduce the total disassembly planning time by 13.2 min. Such an improvement could well make efficient the separation of plastic that would otherwise be destined for waste-to-energy or landfill. The symbolic methodology presented may also improve Design for Recycling and Design for Maintenance and Support.

  17. Sewage treatment processes: The methodology for the resort communities; Tecnologias de Depuracion: la metodologia de seleccion para poblaciones turisticas

    Energy Technology Data Exchange (ETDEWEB)

    Nieves de la Vega, G.; Kovacs, Z. [AQUA/PLAN, S.A. (Spain)

    1995-06-01

    The selection of adequate sewage treatment processes for resort communities has to be based upon a detailed knowledge of the characteristics of sewerage discharges. In order to define a methodology, the most representative variables such as climatology, seasonal variation, required treatment efficiency, sewage characteristics and availability of land, are identified. A wide range of available treatment processes is defined and the relationship between variables and priority criteria is analysed. Finally, a decision-diagram allowing the selection of the most adequate treatment process in each particular case is presented. The methodology is applied to mountain resort communities. (Author)

  18. A Methodology for Virtual Enterprise Management – Results from IMS 95001/Esprit 26509 Globeman21 project

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan; Pedersen, Jens Dahl

    2000-01-01

This paper presents results of the recently concluded IMS Globeman21 (Global Manufacturing in the 21st Century, IMS 95001/ESPRIT 26509) project. The results are the Virtual Enterprise Concept, a Virtual Enterprise Framework based upon GERAM (ISO/DIS 15704) and a Methodology for Virtual Enterprise...... management and engineering which is related to the Virtual Enterprise Framework (VEF). Applying the VEF as a basis structure for the Methodology provides a systematic approach for the preparation and engineering of virtual enterprises. The Virtual Enterprise Framework opens up the prospect of combining...... experiences in the area of virtual enterprise into an integrated whole, enabling researchers or practitioners to focus on subsets of the challenge related to the realisation of the virtual enterprise potentials. In this paper the identification, concept and requirements phases of the methodology are briefly...

  19. Historical survey of the qualifying process of Furnas calculus methodology in the areas of rods, neutronics, thermohydraulic accidents and transients

    International Nuclear Information System (INIS)

    Conti, C.F.S.; Silva Galetti, M.R. da.

    1990-02-01

As Furnas intends to assume in the future the responsibility for performing the Safety Analyses associated with Reload and Operation questions for Angra 1, the need arose to have its methodology qualified by CNEN. The Methodology Qualification Process is based on guidelines proposed by CNEN in NT-DR-No. 02/87, where it is divided into four steps. This Technical Note aims to present the follow-up of the FURNAS Methodology Qualification Process and to bring it up to date in the areas of Core Physics (Neutronics), Core Thermal-Hydraulics, Fuel Rod Behaviour, Transient and Large Break Loss of Coolant Accident Analyses (LBLOCA). (author)

  20. Photothermal heating as a methodology for post processing of polymeric nanofibers

    Science.gov (United States)

    Gorga, Russell; Clarke, Laura; Bochinski, Jason; Viswanath, Vidya; Maity, Somsubhra; Dong, Ju; Firestone, Gabriel

    2015-03-01

    Metal nanoparticles embedded within polymeric systems can be made to act as localized heat sources thereby aiding in-situ polymer processing. This is made possible by the surface plasmon resonance (SPR) mediated photothermal effect of metal (in this case gold) nanoparticles, wherein incident light absorbed by the nanoparticle generates a non-equilibrium electron distribution which subsequently transfers this energy into the surrounding medium, resulting in a temperature increase in the immediate region around the particle. Here we demonstrate this effect in polymer nanocomposite systems, specifically electrospun polyethylene oxide nanofibrous mats, which have been annealed at temperatures above the glass transition. A non-contact temperature measurement technique utilizing embedded fluorophores (perylene) has been used to monitor the average temperature within samples. The effect of annealing methods (conventional and photothermal) and annealing conditions (temperature and time) on the fiber morphology, overall crystallinity, and mechanical properties is discussed. This methodology is further utilized in core-sheath nanofibers to crosslink the core material, which is a pre-cured epoxy thermoset. NSF Grant CMMI-1069108.
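
    For orientation, the steady-state temperature rise around a single, uniformly heated nanoparticle in a homogeneous medium is commonly estimated as ΔT(r) = P_abs / (4πκr) for r at or beyond the particle radius. The sketch below evaluates this textbook estimate with assumed numbers; it is not the paper's fluorescence-based measurement.

```python
# Textbook estimate (not the paper's fluorescence-based measurement) of the
# steady-state temperature rise around a single heated nanoparticle in a
# homogeneous medium:  dT(r) = P_abs / (4 * pi * kappa * r)  for r >= particle radius.
import math

def delta_T(p_abs_watt, kappa_w_m_k, r_m):
    return p_abs_watt / (4.0 * math.pi * kappa_w_m_k * r_m)

# Assumed numbers: 10 nW absorbed by a 20 nm diameter gold particle embedded
# in a polymer with thermal conductivity ~0.2 W/(m K).
radius = 10e-9
print(f"rise at the particle surface: {delta_T(10e-9, 0.2, radius):.2f} K")
print(f"rise 100 nm away:             {delta_T(10e-9, 0.2, 100e-9):.3f} K")
```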

  1. Process and results of the development of an ICNP® Catalogue for Cancer Pain

    Directory of Open Access Journals (Sweden)

    Marisaulina Wanderley Abrantes de Carvalho

    2013-10-01

    This was a methodological study conducted to describe the process and results of the development of an International Classification for Nursing Practice (ICNP®) Catalogue for Cancer Pain. According to the International Council of Nurses (ICN), this catalogue contains a subset of nursing diagnoses, outcomes, and interventions to document the implementation of the nursing process in cancer patients. This catalogue was developed in several steps according to the guidelines recommended by the ICN. As a result, 68 statements on nursing diagnoses/outcomes were obtained, which were classified according to the theoretical model for nursing care related to cancer pain into physical (28), psychological (29), and sociocultural and spiritual (11) aspects. A total of 116 corresponding nursing interventions were obtained. The proposed ICNP® Catalogue for Cancer Pain aims to provide safe and systematic orientation to nurses who work in this field, thus improving the quality of patient care and facilitating the performance of the nursing process.

  2. Methodology for optimization of process integration schemes in a biorefinery under uncertainty

    International Nuclear Information System (INIS)

    González-Cortés, Meilyn; Martínez-Martínez, Yenisleidys; Albernas-Carvajal, Yailet; Pedraza-Garciga, Julio; Morales-Zamora, Marlen (Departamento de Ingeniería Química, Facultad de Química y Farmacia, Universidad Central Marta Abreu de las Villas, Cuba)

    2017-01-01

    Uncertainty has a great impact on investment decisions, on the operability of plants and on the feasibility of integration opportunities in chemical processes. This paper presents the steps for optimizing process investment in process integration under conditions of uncertainty. It shows the potential of sugarcane biomass for integration with several plants in a biorefinery scheme for obtaining chemical products and thermal and electric energy. Among the factories with potential for this integration are pulp and paper and sugar factories and other derivative processes. These factories share common resources and also produce a variety of products that can be exchanged between them, so that products generated in one of them can serve as raw material in another plant. The methodology developed guides the obtaining of feasible investment projects under uncertainty. The objective function considered was the maximization of the net present value across the different scenarios generated from the integration scheme. (author)
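
    As a rough illustration of the kind of scenario-based screening the abstract describes (not the authors' actual model), the sketch below ranks hypothetical integration schemes by expected net present value over a set of uncertainty scenarios; the scheme names, cash flows, scenario probabilities and discount rate are all invented for the example.

    import numpy as np

    def npv(cash_flows, rate):
        """Net present value of a cash-flow series (year 0 first)."""
        years = np.arange(len(cash_flows))
        return float(np.sum(cash_flows / (1.0 + rate) ** years))

    # Hypothetical integration schemes: investment (year 0) plus annual net revenues
    # under three uncertainty scenarios (e.g. low / base / high product prices).
    schemes = {
        "sugar+pulp": [[-80, 18, 18, 18, 18, 18], [-80, 25, 25, 25, 25, 25], [-80, 32, 32, 32, 32, 32]],
        "sugar+cogeneration": [[-60, 14, 14, 14, 14, 14], [-60, 19, 19, 19, 19, 19], [-60, 24, 24, 24, 24, 24]],
    }
    scenario_prob = np.array([0.3, 0.5, 0.2])   # assumed scenario probabilities
    discount_rate = 0.10

    # Expected NPV of each scheme over the scenario set
    expected_npv = {
        name: float(np.dot(scenario_prob,
                           [npv(np.array(cf, dtype=float), discount_rate) for cf in flows]))
        for name, flows in schemes.items()
    }
    best = max(expected_npv, key=expected_npv.get)
    print(expected_npv, "-> best scheme:", best)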

  3. Optimization of process variables on flexural properties of epoxy/organo-montmorillonite nanocomposite by response surface methodology

    Directory of Open Access Journals (Sweden)

    2008-01-01

    This study investigated the preparation of epoxy/organo-montmorillonite (OMMT) nanocomposites and the optimization of their flexural properties. An in-situ polymerization method was used to prepare the epoxy/OMMT nanocomposites: the diglycidyl ether of bisphenol A (DGEBA) and the curing agent were mixed first, followed by the addition of OMMT. Computer-aided statistical design of experiments (Response Surface Methodology, RSM) was used to investigate the effect of the process variables on the flexural properties of epoxy/4 wt% OMMT nanocomposites. The speed of the mechanical stirrer, post-curing time and post-curing temperature were chosen as process variables in the experimental design. Results showed that the speed of the mechanical stirrer, post-curing time and post-curing temperature all influence the flexural modulus and flexural yield stress of the epoxy/4 wt% OMMT nanocomposites. The optimization identified six combinations of operating variables in the design of experiments (DOE) that attain the greatest overall desirability.
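
    The optimization is reported in terms of "overall desirability". As a hedged sketch of how a Derringer-Suich-type overall desirability can be combined for two larger-the-better responses (flexural modulus and flexural yield stress), the snippet below uses invented response values and limits; it illustrates the idea only and is not the study's actual model.

    import numpy as np

    def d_larger_is_better(y, y_min, y_max, weight=1.0):
        """Derringer-Suich desirability for a larger-the-better response."""
        d = (y - y_min) / (y_max - y_min)
        return np.clip(d, 0.0, 1.0) ** weight

    # Hypothetical predicted responses at three candidate settings
    # (stirrer speed / post-cure time / post-cure temperature).
    candidates = {
        "400 rpm / 2 h / 120 C": {"modulus_GPa": 3.1, "yield_MPa": 105.0},
        "600 rpm / 3 h / 140 C": {"modulus_GPa": 3.4, "yield_MPa": 112.0},
        "800 rpm / 2 h / 160 C": {"modulus_GPa": 3.3, "yield_MPa": 118.0},
    }

    overall = {}
    for name, resp in candidates.items():
        d1 = d_larger_is_better(resp["modulus_GPa"], 2.8, 3.6)
        d2 = d_larger_is_better(resp["yield_MPa"], 95.0, 125.0)
        overall[name] = float(np.sqrt(d1 * d2))   # geometric mean = overall desirability

    print(max(overall, key=overall.get), overall)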

  4. Lean methodology for performance improvement in the trauma discharge process.

    Science.gov (United States)

    O'Mara, Michael Shaymus; Ramaniuk, Aliaksandr; Graymire, Vickie; Rozzell, Monica; Martin, Stacey

    2014-07-01

    High-volume, complex services such as trauma and acute care surgery are at risk for inefficiency. Lean process improvement can reduce health care waste. Lean allows a structured look at processes not easily amenable to analysis. We applied lean methodology to the current state of communication and discharge planning on an urban trauma service, identifying areas for improvement. A lean process mapping event was held. The process map was used to identify areas for immediate analysis and intervention, defining metrics for the stakeholders. After intervention, new performance was assessed by direct data evaluation. The process was completed with an analysis of effect, and plans were made for addressing future focus areas. The primary area of concern identified was interservice communication. Changes centering on a standardized morning report structure reduced the number of unanswered consult questions from 67% to 34% (p = 0.0021). Physical therapy rework was reduced from 35% to 19% (p = 0.016). Patients admitted to units not designated to the trauma service had significantly longer stays (1.6 times longer), and miscommunication exists around patient education at discharge. Lean process improvement is a viable means of health care analysis. When applied to a trauma service with 4,000 admissions annually, lean identifies areas ripe for improvement. Our inefficiencies surrounded communication and patient localization. Strategies arising from the input of all stakeholders led to real solutions for communication through a face-to-face morning report and identified areas for ongoing improvement. This focuses resource use and identifies areas for improvement of throughput in care delivery.

  5. Bridging the gap between neurocognitive processing theory and performance validity assessment among the cognitively impaired: a review and methodological approach.

    Science.gov (United States)

    Leighton, Angela; Weinborn, Michael; Maybery, Murray

    2014-10-01

    Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.

  6. Optimization of Tape Winding Process Parameters to Enhance the Performance of Solid Rocket Nozzle Throat Back Up Liners using Taguchi's Robust Design Methodology

    Science.gov (United States)

    Nath, Nayani Kishore

    2017-08-01

    Throat back-up liners are used to protect the nozzle structural members from the severe thermal environment in solid rocket nozzles. The throat back-up liner is made from E-glass phenolic prepregs by a tape winding process. The objective of this work is to demonstrate the optimization of the tape winding process parameters to achieve better insulative resistance using Taguchi's robust design methodology. In this method, four control factors (machine speed, roller pressure, tape tension and tape temperature) were investigated for the tape winding process. The presented work also studied the cogency and acceptability of Taguchi's methodology in the manufacturing of throat back-up liners. The quality characteristic identified was the back wall temperature. Experiments were carried out using an L9 (3^4) orthogonal array with three levels of the four control factors. The test results were analyzed using the smaller-the-better criterion for the signal-to-noise ratio in order to optimize the process. The experimental results were analyzed, confirmed and successfully used to achieve the minimum back wall temperature of the throat back-up liners. The enhancement in performance of the throat back-up liners was observed by carrying out oxy-acetylene tests. The influence of the back wall temperature on the performance of the throat back-up liners was verified by a ground firing test.
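
    The smaller-the-better signal-to-noise ratio mentioned above has a standard form, S/N = -10 log10(mean(y^2)). A minimal sketch follows, computing it for hypothetical replicated back-wall temperatures of nine L9 runs and picking the most robust run; the numbers are invented, not the study's data, and the subsequent factor-level main-effect analysis is omitted.

    import numpy as np

    def sn_smaller_is_better(y):
        """Taguchi signal-to-noise ratio for a smaller-the-better characteristic."""
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(y ** 2))

    # Hypothetical back-wall temperatures (deg C) from replicated runs of an L9 array;
    # each row is one of the nine factor-level combinations.
    l9_results = [
        [118, 122], [109, 113], [101, 104],
        [114, 117], [ 98, 102], [107, 110],
        [103, 106], [111, 115], [ 95,  99],
    ]
    sn = [sn_smaller_is_better(run) for run in l9_results]
    best_run = int(np.argmax(sn))   # highest S/N = most robust (lowest) temperature
    print([round(v, 2) for v in sn], "-> best L9 run index:", best_run)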

  7. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    Science.gov (United States)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  8. Application of the RES methodology for identifying features, events and processes (FEPs) for near-field analysis of copper-steel canister

    International Nuclear Information System (INIS)

    Vieno, T.; Hautojaervi, A.; Raiko, H.; Ahonen, L.; Salo, J.P.

    1994-12-01

    Rock Engineering Systems (RES) is an approach to discover the important characteristics and interactions of a complex problem. Recently RES has been applied to identify features, events and processes (FEPs) for performance analysis of nuclear waste repositories. The RES methodology was applied to identify FEPs for the near-field analysis of the copper-steel canister for spent fuel disposal. The aims of the exercise were to learn and test the RES methodology and, secondly, to find out how much the results differ when RES is applied by two different groups on the same problem. A similar exercise was previously carried out by a SKB group. A total of 90 potentially significant FEPs were identified. The exercise showed that the RES methodology is a practicable tool to get a comprehensive and transparent picture of a complex problem. The approach is easy to learn and use. It reveals the important characteristics and interactions and organizes them in a format easy to understand. (9 refs., 5 figs., 3 tabs.)

  9. APPLICATION OF THE CP METHODOLOGY IN REDUCTION OF WASTE IN THE PROCESSING OF TOBACCO COMPANIES

    Directory of Open Access Journals (Sweden)

    André Luiz Emmel Silva

    2015-01-01

    The production, marketing and processing of tobacco form the economic base of the municipalities of Vale do Rio Pardo/RS. Although tobacco is the raw material for various products, production in this region is intended almost exclusively for cigarettes. Dominated by a few large multinationals, this market moves imposing financial values, and tobacco accounts for much of the production cost. This paper therefore seeks to demonstrate the efficiency of applying the Cleaner Production (CP) methodology to reduce tobacco waste within tobacco processing and cigarette manufacturing companies. The analysis was conducted as a case study, with visits to learn the production process, identification of the points of waste, measurements, and the development of a set of measures to minimize these losses. The Cleaner Production method was chosen because it is a relatively new concept that has shown good results in the companies where it has been applied. Through the measurements, the main points of loss were identified, an analysis was performed applying the CP concepts, and a set of measures was proposed to reduce losses. As a result, a reduction of 83% in the rate of tobacco waste in the production process was achieved. It was concluded that CP, within the tobacco processing industry, was efficient, impacting directly on production costs, rationalizing the use of raw materials and reducing the total volume of waste generated.

  10. Methodological aspect of research of the process of socialization in media-cultural space of information society

    Directory of Open Access Journals (Sweden)

    N. Y. Hirlina

    2016-03-01

    An interdisciplinary methodology integrated within social-philosophical discourse, grounded in classical philosophical methodology for the analysis of socio-cultural phenomena, enables a holistic understanding of the studied phenomenon. From a methodological point of view it is important to establish a social-philosophical understanding of the impact of the media-cultural space on the personality under conditions of a dynamically changing socio-cultural environment. An important social-philosophical methodological guideline is the thesis of the constant presence of the human being in the space of media culture, since the human being is a social being and the information society cannot exist without media culture as its attribute. The philosophical core of the study of the spiritual culture of youth is humanism in its broadest sense, that is, understanding the studied phenomenon primarily as a multi-dimensional cultivation of human values. Subordinating the determinant factors of the media-cultural spiritual space to materialism is possible only under the dominance of humanistic values. With all the variety of approaches to understanding the spiritual dimension of the relationship of the individual with the socio-cultural environment, the common dominant philosophical guideline is the recognition of the spiritual and cultural autonomy of the person. Globalization and its associated civilizational processes are seen as external in relation to the social person, while the internal spiritual content is constituted by cultural processes. An anthropologically oriented cultural space of socialization is based on interpersonal cultural interaction that produces a unique and distinctive personality.

  11. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States of America federal legislation has required ozone depleting chemicals (class 1 & 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach for prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used in order to determine numerical values which would correspond to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
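
    The workbook's actual matrices are not reproduced here, but the general QFD-style scoring it describes, multiplying each concern's importance by how strongly a process relates to it and summing, can be sketched as below; the concerns, processes and ratings are hypothetical.

    import numpy as np

    # Importance of each concern (1-10), as a stakeholder panel might assign them.
    concerns = {"ozone depletion": 10, "worker exposure": 8, "cost of change": 5, "schedule risk": 4}

    # How strongly each chemical/process relates to each concern (0, 1, 3, 9 QFD scale),
    # listed in the same order as the concerns above.
    relationships = {
        "vapor degreasing (CFC-113)": [9, 3, 3, 1],
        "hand-wipe cleaning (TCA)":   [9, 9, 1, 1],
        "conformal coating solvent":  [3, 3, 3, 3],
    }

    weights = np.array(list(concerns.values()), dtype=float)
    scores = {proc: float(np.dot(weights, rel)) for proc, rel in relationships.items()}

    # Higher score = more urgent candidate for replacement research.
    for proc, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{proc:30s} {score:6.1f}")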

  12. Development of methodology for the forecast of microbiological processes under transition to industrial cultivation

    International Nuclear Information System (INIS)

    Lepeshkin, G.; Bugreev, V.

    1996-01-01

    Proposals for possible cooperation with Western partners: to obtain a scale-up method for transferring microorganism cultivation from laboratory to industrial conditions, based on the parameters of the spatially heterogeneous hydrodynamic situation in bioreactors. The problem is that the hydrodynamic, biological and mass-exchange processes are so complicated that it is impossible to calculate the design elements and operating regimes of fermenters that provide the optimum environment for the vital functions of the microorganisms. To solve this problem it is required to: investigate the different aspects of the physiology of cultures producing Biologically Active Substances (hereinafter BAS); investigate the interrelation between stirring and biological transformation in microorganism cells; analyze and identify the main tendencies required to control BAS biosynthesis and to reproduce biosynthesis results when the cultivation scale changes; analyze the technical properties of the reactor and reveal the spatially heterogeneous hydrodynamic situation at different bioreactor scales; investigate the kinetic energy field of the medium at different bioreactor scales; obtain the criterial dependencies estimating the irregularity of stirring intensity; and prepare the methodological foundations for forecasting microbiological processes to be introduced into the industrial biosynthesis environment. Expected results: to detect comparable regimes of bioreactor operation in order to achieve an equal production range and realize the scale-up method

  13. Landscape Forsmark - data, methodology and results for SR-Site

    Energy Technology Data Exchange (ETDEWEB)

    Lindborg, Tobias (ed.) [Svensk Kaernbraenslehantering AB (Sweden)]

    2010-12-15

    This report presents an integrated description of the landscape at the Forsmark site during the succession from present conditions to the far future. It was produced as a part of the biosphere modelling within the SR-Site safety assessment. The report gives a description of input data, methodology and resulting models used to support the current understanding of the landscape used in SR-Site. It is intended to describe the properties and conditions at the site and to give information essential for demonstrating understanding. The report relies heavily on a number of discipline-specific background reports concerning details of the data analyses and modelling. Long-term landscape development in the Forsmark area is dependent on two main and partly interdependent factors, i.e. climate variations and shoreline displacement. These two factors in combination strongly affect a number of processes, which in turn determine the development of ecosystems. Some examples of such processes are erosion and sedimentation, groundwater recharge and discharge, soil formation, primary production and decomposition of organic matter. The biosphere at the site during the next 1,000 years is assumed to be quite similar to the present situation. The most important changes are the natural infilling of lakes and a slight withdrawal of the sea with its effects on the near-shore areas and the shallow coastal basins. The climate during the rest of the temperate period may vary considerably, with both warmer and colder periods. The main effect of temperature changes will be on the vegetation period. Changed temperatures may give rise to drier or wetter climate and to changed snow cover and frost characteristics, and this can in turn affect the dominant vegetation and mire build-up. The description of the Forsmark ecosystem succession during a glacial cycle is one of the main features of the SR-Site biosphere modelling. The future areas potentially affected by deep groundwater discharge are

  14. Landscape Forsmark - data, methodology and results for SR-Site

    International Nuclear Information System (INIS)

    Lindborg, Tobias

    2010-12-01

    This report presents an integrated description of the landscape at the Forsmark site during the succession from present conditions to the far future. It was produced as a part of the biosphere modelling within the SR-Site safety assessment. The report gives a description of input data, methodology and resulting models used to support the current understanding of the landscape used in SR-Site. It is intended to describe the properties and conditions at the site and to give information essential for demonstrating understanding. The report relies heavily on a number of discipline-specific background reports concerning details of the data analyses and modelling. Long-term landscape development in the Forsmark area is dependent on two main and partly interdependent factors, i.e. climate variations and shoreline displacement. These two factors in combination strongly affect a number of processes, which in turn determine the development of ecosystems. Some examples of such processes are erosion and sedimentation, groundwater recharge and discharge, soil formation, primary production and decomposition of organic matter. The biosphere at the site during the next 1,000 years is assumed to be quite similar to the present situation. The most important changes are the natural infilling of lakes and a slight withdrawal of the sea with its effects on the near-shore areas and the shallow coastal basins. The climate during the rest of the temperate period may vary considerably, with both warmer and colder periods. The main effect of temperature changes will be on the vegetation period. Changed temperatures may give rise to drier or wetter climate and to changed snow cover and frost characteristics, and this can in turn affect the dominant vegetation and mire build-up. The description of the Forsmark ecosystem succession during a glacial cycle is one of the main features of the SR-Site biosphere modelling. The future areas potentially affected by deep groundwater discharge are

  15. Information technology security system engineering methodology

    Science.gov (United States)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  16. Attribute Based Selection of Thermoplastic Resin for Vacuum Infusion Process: A Decision Making Methodology

    DEFF Research Database (Denmark)

    Raghavalu Thirumalai, Durai Prabhakaran; Lystrup, Aage; Løgstrup Andersen, Tom

    2012-01-01

    The composite industry looks toward a new material system (resins) based on thermoplastic polymers for the vacuum infusion process, similar to the infusion process using thermosetting polymers. A large number of thermoplastics are available in the market with a variety of properties suitable...... be beneficial. In this paper, the authors introduce a new decision making tool for resin selection based on significant attributes. This article provides a broad overview of suitable thermoplastic material systems for the vacuum infusion process available in today’s market. An illustrative example—resin selection...... for vacuum infusion of a wind turbine blade—is shown to demonstrate the intricacies involved in the proposed methodology for resin selection....

  17. An Evaluation Methodology Development and Application Process for Severe Accident Safety Issue Resolution

    Directory of Open Access Journals (Sweden)

    Robert P. Martin

    2012-01-01

    A general evaluation methodology development and application process (EMDAP) paradigm is described for the resolution of severe accident safety issues. For the broader objective of complete and comprehensive design validation, severe accident safety issues are resolved by demonstrating comprehensive severe-accident-related engineering through applicable testing programs, process studies demonstrating certain deterministic elements, probabilistic risk assessment, and severe accident management guidelines. The basic framework described in this paper extends the top-down, bottom-up strategy described in U.S. Nuclear Regulatory Commission Regulatory Guide 1.203 to severe accident evaluations addressing U.S. NRC expectations for plant design certification applications.

  18. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  19. Selecting Health Care Improvement Projects: A Methodology Integrating Cause-and-Effect Diagram and Analytical Hierarchy Process.

    Science.gov (United States)

    Testik, Özlem Müge; Shaygan, Amir; Dasdemir, Erdi; Soydan, Guray

    It is often vital to identify, prioritize, and select quality improvement projects in a hospital. Yet, a methodology, which utilizes experts' opinions with different points of view, is needed for better decision making. The proposed methodology utilizes the cause-and-effect diagram to identify improvement projects and construct a project hierarchy for a problem. The right improvement projects are then prioritized and selected using a weighting scheme of analytical hierarchy process by aggregating experts' opinions. An approach for collecting data from experts and a graphical display for summarizing the obtained information are also provided. The methodology is implemented for improving a hospital appointment system. The top-ranked 2 major project categories for improvements were identified to be system- and accessibility-related causes (45%) and capacity-related causes (28%), respectively. For each of the major project category, subprojects were then ranked for selecting the improvement needs. The methodology is useful in cases where an aggregate decision based on experts' opinions is expected. Some suggestions for practical implementations are provided.
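
    The AHP weighting step referred to above is commonly implemented as the principal eigenvector of a pairwise comparison matrix, together with a consistency check. A minimal sketch follows; the criteria and judgments are hypothetical and do not reproduce the paper's matrices.

    import numpy as np

    # Hypothetical pairwise comparisons (Saaty 1-9 scale) among three project categories,
    # e.g. system/accessibility, capacity, staffing.
    A = np.array([
        [1.0,   2.0,   5.0],
        [1/2.0, 1.0,   3.0],
        [1/5.0, 1/3.0, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                      # priority weights

    # Consistency ratio (random index for n = 3 is 0.58)
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)
    cr = ci / 0.58
    print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))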

  20. Methodology for processing pressure traces used as inputs for combustion analyses in diesel engines

    International Nuclear Information System (INIS)

    Rašić, Davor; Vihar, Rok; Baškovič, Urban Žvar; Katrašnik, Tomaž

    2017-01-01

    This study proposes a novel methodology for designing an optimum equiripple finite impulse response (FIR) filter for processing in-cylinder pressure traces of a diesel internal combustion engine, which serve as inputs for high-precision combustion analyses. The proposed automated workflow is based on an innovative approach of determining the transition band frequencies and optimum filter order. The methodology is based on discrete Fourier transform analysis, which is the first step to estimate the location of the pass-band and stop-band frequencies. The second step uses short-time Fourier transform analysis to refine the estimated aforementioned frequencies. These pass-band and stop-band frequencies are further used to determine the most appropriate FIR filter order. The most widely used existing methods for estimating the FIR filter order are not effective in suppressing the oscillations in the rate-of-heat-release (ROHR) trace, thus hindering the accuracy of combustion analyses. To address this problem, an innovative method for determining the order of an FIR filter is proposed in this study. This method is based on the minimization of the integral of normalized signal-to-noise differences between the stop-band frequency and the Nyquist frequency. Developed filters were validated using spectral analysis and calculation of the ROHR. The validation results showed that the filters designed using the proposed innovative method were superior compared with those using the existing methods for all analyzed cases.

    Highlights: • Pressure traces of a diesel engine were processed by finite impulse response (FIR) filters with different orders • Transition band frequencies were determined with an innovative method based on discrete Fourier transform and short-time Fourier transform • Spectral analyses showed deficiencies of existing methods in determining the FIR filter order • A new method of determining the FIR filter order for processing pressure traces was
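
    The paper's order-selection rule is not reproduced here, but the basic building block, an equiripple (Parks-McClellan) low-pass FIR filter applied with zero-phase filtering, can be sketched with SciPy as below; the sampling rate, band edges, filter order and synthetic pressure trace are all assumed values chosen only for illustration.

    import numpy as np
    from scipy import signal

    fs = 90_000.0                         # sampling rate of the pressure trace [Hz] (assumed)
    f_pass, f_stop = 3_000.0, 4_000.0     # transition band, e.g. found from DFT/STFT inspection

    numtaps = 301                         # filter order would normally come from an order-selection rule
    taps = signal.remez(numtaps,
                        bands=[0, f_pass, f_stop, fs / 2],
                        desired=[1.0, 0.0],   # low-pass: pass-band gain 1, stop-band gain 0
                        fs=fs)

    # Synthetic pressure trace: slow combustion signal plus high-frequency ringing/noise.
    t = np.arange(0, 0.05, 1 / fs)
    p = 40 + 25 * np.exp(-((t - 0.02) / 0.004) ** 2) + 0.8 * np.sin(2 * np.pi * 12_000 * t)

    p_filtered = signal.filtfilt(taps, [1.0], p)   # zero-phase filtering avoids phase lag in the ROHR
    print(p_filtered.shape)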

  1. Biodiesel Production from Non-Edible Beauty Leaf (Calophyllum inophyllum) Oil: Process Optimization Using Response Surface Methodology (RSM)

    Directory of Open Access Journals (Sweden)

    Mohammad I. Jahirul

    2014-08-01

    In recent years, the beauty leaf plant (Calophyllum inophyllum) has been considered a potential 2nd generation biodiesel source due to its high seed oil content, high fruit production rate, simple cultivation and ability to grow in a wide range of climate conditions. However, due to the high free fatty acid (FFA) content of this oil, the potential of this biodiesel feedstock is still unrealized, and little research has been undertaken on it. In this study, transesterification of beauty leaf oil to produce biodiesel was investigated. A two-step biodiesel conversion method consisting of acid-catalysed pre-esterification and alkali-catalysed transesterification was utilized. The three main factors that drive the biodiesel (fatty acid methyl ester, FAME) conversion from vegetable oil (triglycerides) were studied using response surface methodology (RSM) based on a Box-Behnken experimental design. The factors considered in this study were catalyst concentration, methanol to oil molar ratio and reaction temperature. Linear and full quadratic regression models were developed to predict FFA and FAME concentration and to optimize the reaction conditions. The significance of these factors and their interactions in both stages was determined using analysis of variance (ANOVA). The reaction conditions giving the largest reduction in FFA concentration in the acid-catalysed pre-esterification were a 30:1 methanol to oil molar ratio, 10% (w/w) sulfuric acid catalyst loading and 75 °C reaction temperature. In the alkali-catalysed transesterification process, a 7.5:1 methanol to oil molar ratio, 1% (w/w) sodium methoxide catalyst loading and 55 °C reaction temperature were found to result in the highest FAME conversion. The good agreement between model outputs and experimental results demonstrated that this methodology may be useful for industrial process optimization for biodiesel production from beauty leaf oil, and possibly for other industrial processes as well.
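
    As a hedged illustration of the RSM step described, fitting a full quadratic model over a Box-Behnken design and locating the optimum, the sketch below uses a coded three-factor Box-Behnken design with invented FAME-conversion values; it is not the study's data or fitted model.

    import numpy as np
    from itertools import combinations

    # Coded Box-Behnken design for 3 factors (methanol:oil ratio, catalyst loading, temperature)
    bb = np.array([
        [-1, -1, 0], [ 1, -1, 0], [-1, 1, 0], [ 1, 1, 0],
        [-1, 0, -1], [ 1, 0, -1], [-1, 0, 1], [ 1, 0, 1],
        [ 0, -1, -1], [ 0, 1, -1], [ 0, -1, 1], [ 0, 1, 1],
        [ 0, 0, 0], [ 0, 0, 0], [ 0, 0, 0],
    ], dtype=float)

    # Hypothetical FAME conversion (%) measured at each design point
    y = np.array([78, 84, 81, 90, 75, 83, 80, 88, 76, 82, 79, 87, 93, 92, 94], dtype=float)

    def quadratic_terms(X):
        """Design matrix for a full quadratic model: 1, x_i, x_i*x_j, x_i^2."""
        cols = [np.ones(len(X))]
        cols += [X[:, i] for i in range(X.shape[1])]
        cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
        cols += [X[:, i] ** 2 for i in range(X.shape[1])]
        return np.column_stack(cols)

    beta, *_ = np.linalg.lstsq(quadratic_terms(bb), y, rcond=None)

    # Evaluate the fitted surface on a coarse grid to locate the (coded) optimum
    grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3)).reshape(3, -1).T
    pred = quadratic_terms(grid) @ beta
    print("max predicted conversion %.1f at coded setting %s" % (pred.max(), grid[pred.argmax()]))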

  2. Development methodology for the software life cycle process of the safety software

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Lee, S. S. [BNF Technology, Taejon (Korea, Republic of); Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B. [KAERI, Taejon (Korea, Republic of)

    2002-05-01

    A methodology for developing software life cycle processes (SLCP) is proposed in order to develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS) successfully. The software life cycle model selected is a hybrid model mixing waterfall, prototyping and spiral models, and it is composed of two stages: the development of the ESF-CCS prototype and the development of the ESF-CCS itself. To produce the software life cycle (SLC) for the Development of the Digital Reactor Safety System, the activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available OPAs (Organizational Process Assets) are applied to the SLC activities and the known constraints are reconciled. The established SLCP describes well the software life cycle activities to be provided to the Regulatory Authority.

  3. Development methodology for the software life cycle process of the safety software

    International Nuclear Information System (INIS)

    Kim, D. H.; Lee, S. S.; Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B.

    2002-01-01

    A methodology for developing software life cycle processes (SLCP) is proposed in order to develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS) successfully. The software life cycle model selected is a hybrid model mixing waterfall, prototyping and spiral models, and it is composed of two stages: the development of the ESF-CCS prototype and the development of the ESF-CCS itself. To produce the software life cycle (SLC) for the Development of the Digital Reactor Safety System, the activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available OPAs (Organizational Process Assets) are applied to the SLC activities and the known constraints are reconciled. The established SLCP describes well the software life cycle activities to be provided to the Regulatory Authority

  4. 3. The Formation of Musical Competences: Methodological Approaches in the Process of Artistic-Aesthetic Acquisition

    Directory of Open Access Journals (Sweden)

    Crişciuc Viorica

    2016-03-01

    The article includes conceptual aspects of the formation of musical competences. It describes the realization of this process, operating with the concepts of well-known Western, Russian and local researchers. One idea characteristic of the researchers' pedagogical thinking is that, during the formation of musical competences through art, the mechanism of the acquisition process takes place. To ensure the practical realization of an integral musical education, the methodology we propose is based on research and on an imposing theoretical network of successful pedagogical practices of remarkable scholars from all over the world. The analyzed theories are a source of inspiration and constitute the theoretical universe which contributes to as truthful a musical education as possible.

  5. A Study on Uncertainty Quantification of Reflood Model using CIRCE Methodology

    International Nuclear Information System (INIS)

    Jeon, Seongsu; Hong, Soonjoon; Oh, Deogyeon; Bang, Youngseok

    2013-01-01

    The CIRCE method is intended to quantify the uncertainties of the correlations of a code, and it may replace the expert judgment generally used. In this study, an uncertainty quantification of the reflood model was performed using the CIRCE methodology. In this paper, the application process of the CIRCE methodology and the main results are briefly described. The application of CIRCE provided satisfactory results. This research is expected to be useful for improving the present audit calculation methodology, KINS-REM

  6. Methodology for fire PSA during design process

    International Nuclear Information System (INIS)

    Kollasko, Heiko; Blombach, Joerg

    2009-01-01

    Fire PSA is an essential part of a full scope level 1 PSA. Cable fires play an important role in fire PSA. Usually, cable routing is therefore modeled in detail. During the design of new nuclear power plants the information on cable routing is not yet available. However, for the use of probabilistic safety insights during the design and for licensing purposes a fire PSA may be requested. Therefore a methodology has been developed which makes use of the strictly divisional separation of redundancies in the design of modern nuclear power plants: cable routing is not needed within one division but replaced by the conservative assumption that all equipment fails due to a fire in the concerned division; critical fire areas are defined where components belonging to different divisions may be affected by a fire. For the determination of fire frequencies a component based approach is proposed. The resulting core damage frequencies due to fire are conservative. (orig.)

  7. Methodology for Determining Increases in Radionuclide Inventories for the Effluent Treatment Facility Process

    International Nuclear Information System (INIS)

    Blanchard, A.

    1998-01-01

    A study is currently underway to determine if the Effluent Treatment Facility can be downgraded from a Hazard Category 3 facility to a Radiological Facility per DOE STD-1027-92. This technical report provides a methodology to determine and monitor increases in the radionuclide inventories of the ETF process columns. It also provides guidelines to ensure that other potential increases to the ETF radionuclide inventory are evaluated as required to ensure that the ETF remains a Radiological Facility

  8. Experimental Methodology for Determining Optimum Process Parameters for Production of Hydrous Metal Oxides by Internal Gelation

    Energy Technology Data Exchange (ETDEWEB)

    Collins, J.L.

    2005-10-28

    The objective of this report is to describe a simple but very useful experimental methodology that was used to determine optimum process parameters for preparing several hydrous metal-oxide gel spheres by the internal gelation process. The method is inexpensive and very effective in collection of key gel-forming data that are needed to prepare the hydrous metal-oxide microspheres of the best quality for a number of elements.

  9. Formalizing the ISDF Software Development Methodology

    OpenAIRE

    Mihai Liviu DESPA

    2015-01-01

    The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of wha...

  10. Energy Level Composite Curves-a new graphical methodology for the integration of energy intensive processes

    International Nuclear Information System (INIS)

    Anantharaman, Rahul; Abbas, Own Syed; Gundersen, Truls

    2006-01-01

    Pinch Analysis, Exergy Analysis and Optimization have all been used independently or in combination for the energy integration of process plants. In order to address the issue of energy integration, taking into account composition and pressure effects, the concept of energy level as proposed by [X. Feng, X.X. Zhu, Combining pinch and exergy analysis for process modifications, Appl. Therm. Eng. 17 (1997) 249] has been modified and expanded in this work. We have developed a strategy for energy integration that uses process simulation tools to define the interaction between the various subsystems in the plant and a graphical technique to help the engineer interpret the results of the simulation with physical insights that point towards exploring possible integration schemes to increase energy efficiency. The proposed graphical representation of energy levels of processes is very similar to the Composite Curves of Pinch Analysis-the interpretation of the Energy Level Composite Curves reduces to the Pinch Analysis case when dealing with heat transfer. Other similarities and differences are detailed in this work. Energy integration of a methanol plant is taken as a case study to test the efficacy of this methodology. Potential integration schemes are identified that would have been difficult to visualize without the help of the new graphical representation
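
    The Energy Level Composite Curves themselves are defined in the paper and are not reproduced here; as background, the conventional hot Composite Curve of Pinch Analysis that they generalize can be built from stream data as sketched below, with hypothetical stream temperatures and heat-capacity flow rates.

    import numpy as np

    # Hypothetical hot streams: (supply T [C], target T [C], heat capacity flow rate CP [kW/K])
    hot_streams = [(250, 120, 2.0), (200, 80, 4.0), (150, 60, 1.5)]

    # Temperature intervals are bounded by every supply/target temperature
    t_bounds = sorted({t for s in hot_streams for t in s[:2]}, reverse=True)

    # Sum the CP of all streams present in each interval to get the interval enthalpy change
    points = [(t_bounds[0], 0.0)]
    h = 0.0
    for t_hi, t_lo in zip(t_bounds, t_bounds[1:]):
        cp_sum = sum(cp for ts, tt, cp in hot_streams if ts >= t_hi and tt <= t_lo)
        h += cp_sum * (t_hi - t_lo)
        points.append((t_lo, h))

    # (T, cumulative enthalpy) pairs traced from the hottest interval downwards;
    # plotted as enthalpy vs temperature this traces the hot Composite Curve.
    for temp, enthalpy in points:
        print(f"T = {temp:5.0f} C   cumulative duty = {enthalpy:7.1f} kW")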

  11. Catalytic Reforming: Methodology and Process Development for a Constant Optimisation and Performance Enhancement

    Directory of Open Access Journals (Sweden)

    Avenier Priscilla

    2016-05-01

    The catalytic reforming process has been used to produce high-octane gasoline since the 1940s. It would appear to be an old, well-established process for which nothing new could be done. This is, however, not the case, and constant improvements are proposed at IFP Energies nouvelles. With a global R&D approach using new concepts and forefront methodology, IFPEN is able to: propose a patented new reactor concept that increases capacity; ensure the efficiency and safety of the reactor's mechanical design using structural modelling; develop new catalysts that increase process performance thanks to a deep understanding of the catalytic mechanism, obtained through an experimental and innovative analytical approach (119Sn Mössbauer and X-ray absorption spectroscopies) together with Density Functional Theory (DFT) calculations; and operate efficient, reliable and adapted pilot units to validate catalyst performance.

  12. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and the software development lifecycle. The ISDF methodology was built especially for innovative software development projects. It was developed empirically, by trial and error, in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user, project owner and project manager's point of view. The main components of a software development methodology are identified. Thus a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation-oriented software development methodology is emphasized by highlighting shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  13. ARAMIS project: A comprehensive methodology for the identification of reference accident scenarios in process industries

    International Nuclear Information System (INIS)

    Delvosalle, Christian; Fievez, Cecile; Pipart, Aurore; Debray, Bruno

    2006-01-01

    In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper aims at presenting the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key-point in risk assessment and serves as basis for the whole risk quantification. The first result of the work is the building of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term 'major accidents' must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called methodology for the identification of reference accident scenarios (MIRAS) takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to identify more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called 'risk matrix', crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application on an ethylene oxide storage
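
    The ARAMIS risk matrix itself is not reproduced here; the sketch below only illustrates the generic idea of crossing frequency and consequence classes to decide which scenarios are retained as reference scenarios, with invented class boundaries and scenarios.

    # Generic frequency x consequence screening (classes and thresholds are hypothetical,
    # not the ARAMIS calibration).
    FREQ_CLASSES = [(1e-3, "frequent"), (1e-5, "possible"), (0.0, "rare")]
    CONS_CLASSES = {"minor": 0, "serious": 1, "catastrophic": 2}

    def freq_class(f_per_year):
        for threshold, label in FREQ_CLASSES:
            if f_per_year >= threshold:
                return label
        return "rare"

    def is_reference_scenario(f_per_year, consequence):
        """Retain a scenario when it is both credible enough and severe enough."""
        severe = CONS_CLASSES[consequence] >= 1
        credible = freq_class(f_per_year) in ("frequent", "possible")
        return severe and credible

    scenarios = [
        ("small flange leak, ignited", 2e-3, "minor"),
        ("pool fire after pump seal failure", 4e-5, "serious"),
        ("catastrophic vessel rupture", 1e-8, "catastrophic"),
    ]
    for name, f, c in scenarios:
        print(name, "->", "reference" if is_reference_scenario(f, c) else "screened out")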

  14. Comparing Classic and Interval Analytical Hierarchy Process Methodologies for Measuring Area-Level Deprivation to Analyze Health Inequalities.

    Science.gov (United States)

    Cabrera-Barona, Pablo; Ghorbanzadeh, Omid

    2018-01-16

    Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas.
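
    A hedged sketch of the kind of logistic generalized linear model the abstract mentions, with self-reported health regressed on deprivation and quality of life, is given below using simulated data and statsmodels; the data and coefficients are invented and the multilevel extension is omitted.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500

    # Simulated individual-level data: deprivation index and quality-of-life score
    deprivation = rng.uniform(0, 1, n)        # higher = more deprived neighbourhood
    quality_of_life = rng.normal(6, 1.5, n)   # self-reported, 0-10 scale

    # Simulated outcome: probability of reporting poor health rises with deprivation
    logit_p = -2.0 + 2.5 * deprivation - 0.3 * (quality_of_life - 6)
    poor_health = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    X = sm.add_constant(np.column_stack([deprivation, quality_of_life]))
    model = sm.GLM(poor_health, X, family=sm.families.Binomial()).fit()
    print(model.params)   # coefficients: constant, deprivation, quality of life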

  15. A new Methodology for Operations Strategy

    DEFF Research Database (Denmark)

    Koch, Christian; Rytter, Niels Gorm; Boer, Harry

    2005-01-01

    This paper proposes a new methodology for developing and implementing Operations Strategy (OS). It encompasses both content and process aspects of OS and differs thereby from many of the present OS methodologies. The paper outlines its paradigmatic foundation and presents aim, process, dimensions...

  16. APET methodology for Defense Waste Processing Facility: Mode C operation

    International Nuclear Information System (INIS)

    Taylor, R.P. Jr.; Massey, W.M.

    1995-04-01

    Safe operation of SRS facilities continues to be the highest priority of the Savannah River Site (SRS). One of these facilities, the Defense Waste Processing Facility or DWPF, is currently undergoing cold chemical runs to verify the design and construction preparatory to hot startup in 1995. The DWPF is a facility designed to convert the waste currently stored in tanks at the 200-Area tank farm into a form that is suitable for long term storage in engineered surface facilities and, ultimately, geologic isolation. As a part of the program to ensure safe operation of the DWPF, a Probabilistic Safety Assessment of the DWPF has been completed. The results of this analysis are incorporated into the Safety Analysis Report (SAR) for DWPF. The usual practice in preparation of Safety Analysis Reports is to include only a conservative analysis of certain design basis accidents. A major part of a Probabilistic Safety Assessment is the development and quantification of an Accident Progression Event Tree or APET. The APET provides a probabilistic representation of potential sequences along which an accident may progress. The methodology used to determine the risk of operation of the DWPF borrows heavily from methods applied to the Probabilistic Safety Assessment of SRS reactors and to some commercial reactors. This report describes the Accident Progression Event Tree developed for the Probabilistic Safety Assessment of the DWPF

  17. A Phenomenological Research Study on Writer's Block: Causes, Processes, and Results

    Science.gov (United States)

    Bastug, Muhammet; Ertem, Ihsan Seyit; Keskin, Hasan Kagan

    2017-01-01

    Purpose: The purpose of this paper is to investigate the causes, processes of writer's block experienced by a group of classroom teacher candidates and its impact on them. Design/methodology/approach: The phenomenological design, which is a qualitative research design, was preferred in the research since it was aimed to investigate the causes,…

  18. D-isoascorbyl palmitate: lipase-catalyzed synthesis, structural characterization and process optimization using response surface methodology.

    Science.gov (United States)

    Sun, Wen-Jing; Zhao, Hong-Xia; Cui, Feng-Jie; Li, Yun-Hong; Yu, Si-Lian; Zhou, Qiang; Qian, Jing-Ya; Dong, Ying

    2013-07-08

    Isoascorbic acid is a stereoisomer of L-ascorbic acid and is widely used as a food antioxidant. However, its highly hydrophilic behavior prevents its application in cosmetics or fats and oils-based foods. To overcome this problem, D-isoascorbyl palmitate was synthesized in the present study to improve isoascorbic acid's oil solubility, using an immobilized lipase in organic media. The structural information of the synthesized product was clarified using LC-ESI-MS, FT-IR, 1H and 13C NMR analysis, and the process parameters for a high yield of D-isoascorbyl palmitate were optimized using one-factor-at-a-time experiments and response surface methodology (RSM). The synthesized product had a purity of 95% and its structural characteristics were confirmed as isoascorbyl palmitate by LC-ESI-MS, FT-IR, 1H, and 13C NMR analysis. Results from the one-factor-at-a-time experiments indicated that the enzyme load, reaction temperature and D-isoascorbic-to-palmitic acid molar ratio had a significant effect on the D-isoascorbyl palmitate conversion rate. A conversion rate of 95.32% was obtained using response surface methodology (RSM) under the optimized conditions: enzyme load of 20% (w/w), reaction temperature of 53°C and D-isoascorbic-to-palmitic acid molar ratio of 1:4, with the other reaction parameters set as 20 mL acetone, 40 g/L molecular sieves and 200 rpm stirring for a 24-h reaction time. The findings of this study can become a reference for developing industrial processes for the preparation of isoascorbic acid ester, which might be used in food additives, cosmetic formulations and for the synthesis of other isoascorbic acid derivatives.

  19. NIF Target Assembly Metrology Methodology and Results

    Energy Technology Data Exchange (ETDEWEB)

    Alger, E. T. [General Atomics, San Diego, CA (United States); Kroll, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dzenitis, E. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Montesanti, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hughes, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Swisher, M. [IAP, Livermore, CA (United States); Taylor, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Segraves, K. [IAP, Livermore, CA (United States); Lord, D. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Castro, C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Edwards, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-01-01

    During our inertial confinement fusion (ICF) experiments at the National Ignition Facility (NIF) we require cryogenic targets at the 1-cm scale to be fabricated, assembled, and metrologized to micron-level tolerances. During assembly of these ICF targets, there are physical dimensions that must be verified; metrology is completed using optical coordinate measurement machines that provide repeatable measurements with micron precision, while also allowing in-process data collection for absolute accuracy in assembly. To date, 51 targets have been assembled and metrologized, and 34 targets have been successfully fielded on NIF relying on these metrology data. In the near future, ignition experiments on NIF will require tighter tolerances and more demanding target assembly and metrology capability. Metrology methods, calculations, and uncertainty estimates will be discussed. Target diagnostic port alignment, target position, and capsule location results will be reviewed for the 2009 Energetics Campaign. The information is presented via control charts showing the effect of process improvements that were made during target production. Certain parameters, including capsule position, met the 2009 campaign specifications but will have much tighter requirements in the future. Finally, in order to meet these new requirements, assembly process changes and metrology capability upgrades will be necessary.

  20. The Statistical point of view of Quality: the Lean Six Sigma methodology.

    Science.gov (United States)

    Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto

    2015-04-01

    Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures. Therefore, the use of this methodology in the health-care arena has focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After a review of the methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method for reducing complications during and after lobectomies. Using the Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a measurement, from a statistical point of view, of surgical quality.
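
    The "statistical point of view" of Six Sigma is usually summarized through defects per million opportunities (DPMO) and the corresponding sigma level (conventionally including a 1.5-sigma shift). A minimal sketch with invented complication counts follows; it is illustrative arithmetic only, not data from the cited clinical example.

    from statistics import NormalDist

    def sigma_level(defects, units, opportunities_per_unit, shift=1.5):
        """Convert a defect count into DPMO and the conventional (shifted) sigma level."""
        dpmo = defects / (units * opportunities_per_unit) * 1_000_000
        yield_fraction = 1 - dpmo / 1_000_000
        return dpmo, NormalDist().inv_cdf(yield_fraction) + shift

    # Hypothetical example: 18 complications over 240 lobectomies, 5 checked opportunities each
    dpmo, sigma = sigma_level(defects=18, units=240, opportunities_per_unit=5)
    print(f"DPMO = {dpmo:.0f}, sigma level = {sigma:.2f}")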

  1. A comparison of results of empirical studies of supplementary search techniques and recommendations in review methodology handbooks: a methodological review.

    Science.gov (United States)

    Cooper, Chris; Booth, Andrew; Britten, Nicky; Garside, Ruth

    2017-11-28

    The purpose and contribution of supplementary search methods in systematic reviews is increasingly acknowledged. Numerous studies have demonstrated their potential in identifying studies or study data that would have been missed by bibliographic database searching alone. What is less certain is how supplementary search methods actually work, how they are applied, and the consequent advantages, disadvantages and resource implications of each search method. The aim of this study is to compare current practice in using supplementary search methods with methodological guidance. Four methodological handbooks informing systematic review practice in the UK were read and audited to establish current methodological guidance. Studies evaluating the use of supplementary search methods were identified by searching five bibliographic databases. Studies were included if they (1) reported practical application of a supplementary search method (descriptive), (2) examined the utility of a supplementary search method (analytical), or (3) identified or explored factors that impact on the utility of a supplementary method when applied in practice. Thirty-five studies were included in this review in addition to the four methodological handbooks. Studies were published between 1989 and 2016, and dates of publication of the handbooks ranged from 1994 to 2014. Five supplementary search methods were reviewed: contacting study authors, citation chasing, handsearching, searching trial registers and web searching. There is reasonable consistency between recommended best practice (handbooks) and current practice (methodological studies) as it relates to the application of supplementary search methods. The methodological studies provide useful information on the effectiveness of the supplementary search methods, often seeking to evaluate aspects of the method to improve effectiveness or efficiency. In this way, the studies advance the understanding of the supplementary search methods. Further

  2. BPMN, Toolsets, and Methodology: A Case Study of Business Process Management in Higher Education

    Science.gov (United States)

    Barn, Balbir S.; Oussena, Samia

    This chapter describes ongoing action research which is exploring the use of BPMN and a specific toolset - Intalio Designer to capture the “as is” essential process model of part of an overarching large business process within higher education. The chapter contends that understanding the efficacy of the BPMN notation and the notational elements to use is not enough. Instead, the effectiveness of a notation is determined by the notation, the toolset that is being used, and methodological consideration. The chapter presents some of the challenges that are faced in attempting to develop computation independent models in BPMN using toolsets such as Intalio Designer™.

  3. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  4. Methodological approach to organizational performance improvement process

    OpenAIRE

    Buble, Marin; Dulčić, Želimir; Pavić, Ivan

    2017-01-01

    Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.

  5. A Generalizable Methodology for Quantifying User Satisfaction

    Science.gov (United States)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, such as loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
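
    A minimal sketch of the survival-analysis idea, assuming fabricated session durations with right-censoring, is given below: a hand-rolled Kaplan-Meier estimator of the probability that a session lasts beyond a given time.

    import numpy as np

    durations = np.array([5, 12, 12, 20, 33, 45, 45, 60, 75, 90], dtype=float)  # minutes, fabricated
    observed  = np.array([1,  1,  0,  1,  1,  1,  0,  1,  0,  1])               # 0 = censored (still in session)

    def kaplan_meier(t, d):
        """Return event times and the Kaplan-Meier survivor estimate at each of them."""
        times, surv, s = [], [], 1.0
        for u in np.unique(t[d == 1]):            # iterate over distinct event times
            at_risk = np.sum(t >= u)
            events = np.sum((t == u) & (d == 1))
            s *= 1.0 - events / at_risk
            times.append(u)
            surv.append(s)
        return np.array(times), np.array(surv)

    times, surv = kaplan_meier(durations, observed)
    for u, s in zip(times, surv):
        print(f"S({u:>4.0f} min) = {s:.2f}")      # probability a session lasts beyond u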

  6. Methodology for systematic analysis and improvement of manufacturing unit process life cycle inventory (UPLCI) CO2PE! initiative (cooperative effort on process emissions in manufacturing). Part 2: case studies

    DEFF Research Database (Denmark)

    Kellens, Karel; Dewulf, Wim; Overcash, Michael

    2012-01-01

    This report presents two case studies, one for the screening approach and one for the in-depth approach, demonstrating the application of the life cycle assessment-oriented methodology for systematic inventory analysis of the machine tool use phase of manufacturing unit processes. The screening approach relies on industrial data and engineering calculations for energy use and material loss, and is illustrated by means of a case study of a drilling process. The in-depth approach leads to more accurate LCI data as well as the identification of potential for environmental improvement based on the in-depth analysis of individual manufacturing unit processes. Together, the two case studies illustrate the applicability of the methodology.

  7. Ethanol production from sweet sorghum bagasse through process optimization using response surface methodology.

    Science.gov (United States)

    Lavudi, Saida; Oberoi, Harinder Singh; Mangamoori, Lakshmi Narasu

    2017-08-01

    In this study, comparative evaluation of acid and alkali pretreatment of sweet sorghum bagasse (SSB) was carried out for sugar production after enzymatic hydrolysis. Results indicated that enzymatic hydrolysis of alkali-pretreated SSB resulted in higher production of glucose, xylose and arabinose compared with the other alkali concentrations tested and with acid-pretreated biomass. Response Surface Methodology (RSM) was therefore used to optimize the pretreatment parameters prior to enzymatic hydrolysis in order to maximize sugar production; the independent variables investigated were alkali concentration (1.5-4%), pretreatment temperature (125-140 °C) and pretreatment time (10-30 min). Process optimization resulted in glucose and xylose concentrations of 57.24 and 10.14 g/L, respectively. Subsequently, a second-stage optimization was conducted using RSM for the enzymatic hydrolysis parameters, which included substrate concentration (10-15%), incubation time (24-60 h), incubation temperature (40-60 °C) and Celluclast concentration (10-20 IU/g-dwt). A substrate concentration of 15% (w/v), temperature of 60 °C, Celluclast concentration of 20 IU/g-dwt and incubation time of 58 h led to a glucose concentration of 68.58 g/L. Finally, simultaneous saccharification and fermentation (SSF) as well as separate hydrolysis and fermentation (SHF) were evaluated using Pichia kudriavzevii HOP-1 for ethanol production. No significant difference in ethanol concentration was found between SSF and SHF; however, ethanol productivity was higher in the case of SSF. This study has established a platform for conducting scale-up studies using the optimized process parameters.

  8. A methodology for fault diagnosis in large chemical processes and an application to a multistage flash desalination process: Part II

    International Nuclear Information System (INIS)

    Tarifa, Enrique E.; Scenna, Nicolas J.

    1998-01-01

    In Part I, an efficient method for identifying faults in large processes was presented. The whole plant is divided into sectors by using structural, functional, or causal decomposition. A signed directed graph (SDG) is the model used for each sector. The SDG represents interactions among process variables. This qualitative model is used to carry out qualitative simulation for all possible faults. The output of this step is information about the process behaviour. This information is used to build rules. When a symptom is detected in one sector, its rules are evaluated using on-line data and fuzzy logic to yield the diagnosis. In this paper the proposed methodology is applied to a multiple stage flash (MSF) desalination process. This process is composed of sequential flash chambers. It was designed for a pilot plant that produces drinkable water for a community in Argentina; that is, it is a real case. Due to the large number of variables, recycles, phase changes, etc., this process is a good challenge for the proposed diagnosis method
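
    The rule-evaluation step described above (qualitative SDG signatures matched against on-line deviations using fuzzy logic) can be sketched as follows; the variable names, fault signatures, membership functions and readings are all invented for illustration and are not taken from the MSF plant model.

    def membership(deviation, sign):
        """Degree to which a normalized deviation supports a qualitative sign (+ / 0 / -)."""
        if sign == '+':
            return max(0.0, min(1.0, deviation))
        if sign == '-':
            return max(0.0, min(1.0, -deviation))
        return max(0.0, 1.0 - abs(deviation))          # sign == '0'

    rules = {                                          # hypothetical fault signatures for one sector
        "brine pump degradation": {"flow": '-', "stage1_level": '-', "top_temp": '0'},
        "fouled heat exchanger":  {"flow": '0', "stage1_level": '0', "top_temp": '-'},
    }
    readings = {"flow": -0.7, "stage1_level": -0.4, "top_temp": 0.1}  # normalized on-line deviations

    for fault, signature in rules.items():
        credibility = min(membership(readings[v], s) for v, s in signature.items())  # fuzzy AND
        print(f"{fault}: credibility {credibility:.2f}")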

  9. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    Hennig, D.; Nechvatal, L.

    1996-09-01

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operation points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs

  10. [Optimization of the process of icariin hydrolysis to Baohuoside I by cellulase based on Plackett-Burman design combined with CCD response surface methodology].

    Science.gov (United States)

    Song, Chuan-xia; Chen, Hong-mei; Dai, Yu; Kang, Min; Hu, Jia; Deng, Yun

    2014-11-01

    To optimize the process of hydrolyzing icariin to baohuoside I by cellulase using a Plackett-Burman design combined with central composite design (CCD) response surface methodology. The main influencing factors were selected by Plackett-Burman design, and CCD response surface methodology was then used to optimize the enzymatic hydrolysis. Taking substrate concentration, buffer pH and reaction time as independent variables and the conversion rate of icariin as the dependent variable, a full quadratic response surface was fitted between the independent and dependent variables; the optimum process was analyzed visually using 3D surface plots, and verification tests and predictive analysis were carried out. The best enzymatic hydrolysis conditions were as follows: substrate concentration 8.23 mg/mL, buffer pH 5.12, reaction time 35.34 h. The optimum process for the hydrolysis of icariin to baohuoside I by cellulase was thus determined by Plackett-Burman design combined with CCD response surface methodology. The optimized enzymatic hydrolysis process is simple, convenient, accurate, reproducible and predictable.
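
    As a side note on the screening stage, an 8-run Plackett-Burman matrix of the kind used for factor screening can be generated from a standard generator row, as sketched below; this shows only the generic construction, not the study's actual design.

    import numpy as np

    generator = np.array([+1, +1, +1, -1, +1, -1, -1])           # a commonly used N = 8 generator row
    rows = [np.roll(generator, shift) for shift in range(7)]     # cyclic shifts give 7 of the 8 runs
    design = np.vstack(rows + [-np.ones(7, dtype=int)])          # final run is all minus signs

    print(design)                                                # 8 runs x up to 7 factors
    print("columns orthogonal:", bool(np.all(design.T @ design == 8 * np.eye(7))))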

  11. Developing knowledge management systems with an active expert methodology

    International Nuclear Information System (INIS)

    Sandahl, K.

    1992-01-01

    Knowledge management, understood as the ability to store, distribute and utilize human knowledge in an organization, is the subject of this dissertation. In particular we have studied the design of methods and supporting software for this process. Detailed and systematic description of the design and development processes of three case-study implementations of knowledge management software are provided. The outcome of the projects is explained in terms of an active expert development methodology, which is centered around support for a domain expert to take substantial responsibility for the design and maintenance of a knowledge management system in a given area of application. Based on the experiences from the case studies and the resulting methodology, an environment for automatically supporting knowledge management was designed in the KNOWLEDGE-LINKER research project. The vital part of this architecture is a knowledge acquisition tool, used directly by the experts in creating and maintaining a knowledge base. An elaborated version of the active expert development methodology was then formulated as the result of applying the KNOWLEDGE-LINKER approach in a fourth case study. This version of the methodology is also accounted for and evaluated together within the supporting KNOWLEDGE-LINKER architecture. (au)

  12. Value and Vision-based Methodology in Integrated Design

    DEFF Research Database (Denmark)

    Tollestrup, Christian

    The aim of this thesis is to examine how the process of value transformation from an explicit set of values to a product concept occurs within a design team context, using a vision-based concept development methodology based on the Pyramid Model (Lerdahl, 2001). The research approach chosen for this investigation is Action Research, where the researcher plays an active role in generating the data and gains a deeper understanding of the investigated phenomena; the analysis draws on empirical data from workshops where the Value and Vision-based methodology has been taught. The result is divided in three parts: the systemic unfolding of the Value and Vision-based methodology, the structured presentation of its practical implementation, and the analysis and conclusions regarding the value transformation, the observed phenomena and the learning aspects of the methodology.

  13. Methodological approach to organizational performance improvement process

    Directory of Open Access Journals (Sweden)

    Marin Buble

    2001-01-01

    Full Text Available Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.

  14. [Optimization of Formulation and Process of Paclitaxel PEGylated Liposomes by Box-Behnken Response Surface Methodology].

    Science.gov (United States)

    Shi, Ya-jun; Zhang, Xiao-feil; Guo, Qiu-ting

    2015-12-01

    To develop a procedure for preparing paclitaxel-encapsulated PEGylated liposomes. A membrane hydration followed by extraction method was used to prepare the PEGylated liposomes. The process and formulation variables were optimized by Box-Behnken design (BBD) response surface methodology (RSM): for the formulation variables, the amounts of soya phosphatidylcholine (SPC) and PEG2000-DSPE and the SPC-to-drug ratio were the independent variables and entrapment efficiency the dependent variable, while for the process variables, temperature, pressure and cycle times were the independent variables and particle size and polydispersity index the dependent variables. The optimized liposomal formulation was characterized for particle size, Zeta potential, morphology and in vitro drug release. The entrapment efficiency, particle size, polydispersity index, Zeta potential, and in vitro drug release of the PEGylated liposomes were found to be 80.3%, (97.15 ± 14.9) nm, 0.117 ± 0.019, (-30.3 ± 3.7) mV, and 37.4% in 24 h, respectively. The liposomes were found to be small, unilamellar and spherical with a smooth surface, as seen by transmission electron microscopy. The Box-Behnken response surface methodology facilitates the formulation and optimization of paclitaxel PEGylated liposomes.
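
    The Box-Behnken design itself is easy to reproduce in code; the sketch below generates the coded 3-factor design (12 edge-midpoint runs plus centre replicates) as a generic illustration, without attempting to reproduce the study's actual factor ranges.

    import numpy as np
    from itertools import combinations, product

    def box_behnken(k, center_points=3):
        """Coded Box-Behnken design for k factors: pairwise +/-1 runs plus centre replicates."""
        runs = []
        for i, j in combinations(range(k), 2):        # vary factors two at a time at +/-1
            for a, b in product((-1, 1), repeat=2):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
        runs += [[0] * k] * center_points             # centre replicates
        return np.array(runs)

    design = box_behnken(3)
    print(design.shape)        # (15, 3): 12 edge runs + 3 centre points
    print(design)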

  15. Complex methodology of the model elaboration of the quantified transnationalization process assessment

    Directory of Open Access Journals (Sweden)

    Larysa Rudenko-Sudarieva

    2009-03-01

    Full Text Available In the article the theoretical fundamentals of transnationalization and the peculiarities of its development are studied on the basis of world theory and practice; a systematic methodical approach to defining the economic category of «transnationalization» is suggested, together with the author's own definition; a complex methodology is developed for building a model of quantified transnationalization process assessment, based on a seven-milestone algorithm for the formation of key indicators; and the empirical investigations concerning the state and development of the current tendencies are systematized and synthesized, including a comparative analysis of the transnationalization level within separate TNC groups.

  16. Methodological development of the process of appreciation of photography Conceptions

    Directory of Open Access Journals (Sweden)

    Yovany Álvarez García

    2012-12-01

    Full Text Available This article discusses the different conceptions used in the methodological appreciation of photography. Since photography is one of the manifestations of the visual arts with which we interact most commonly in daily life, and can be found in books, magazines and other publications, the article discusses various methodologies for assessing the photographic image. It also addresses the classic themes of photography as well as some expressive elements.

  17. Optimisation of process parameters on thin shell part using response surface methodology (RSM)

    Science.gov (United States)

    Faiz, J. M.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.

    2017-09-01

    This study focuses on the optimisation of process parameters by simulation using Autodesk Moldflow Insight (AMI) software. The process parameters are taken as the input in order to analyse the warpage value, which is the output of this study. The significant parameters used are melt temperature, mould temperature, packing pressure, and cooling time. A plastic part made of polypropylene (PP) was selected as the study part. Optimisation of the process parameters is performed in Design Expert software with the aim of minimising the obtained warpage value. Response Surface Methodology (RSM) has been applied in this study together with Analysis of Variance (ANOVA) in order to investigate the interactions between parameters that are significant to the warpage value. The optimised warpage value can thus be obtained from the model designed using RSM, owing to its minimum error value. The study shows that the warpage value is improved by using RSM.

  18. Synthesis and Process Optimization of Electrospun PEEK-Sulfonated Nanofibers by Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Carlo Boaretti

    2015-07-01

    Full Text Available In this study electrospun nanofibers of partially sulfonated polyether ether ketone have been produced as a preliminary step for a possible development of composite proton exchange membranes for fuel cells. Response surface methodology has been employed for the modelling and optimization of the electrospinning process, using a Box-Behnken design. The investigation, based on a second order polynomial model, has been focused on the analysis of the effect of both process (voltage, tip-to-collector distance, flow rate and material (sulfonation degree variables on the mean fiber diameter. The final model has been verified by a series of statistical tests on the residuals and validated by a comparison procedure of samples at different sulfonation degrees, realized according to optimized conditions, for the production of homogeneous thin nanofibers.

  19. Application of electrochemical peroxidation (ECP) process for waste-activated sludge stabilization and system optimization using response surface methodology (RSM).

    Science.gov (United States)

    Gholikandi, Gagik Badalians; Kazemirad, Khashayar

    2018-03-01

    In this study, the performance of the electrochemical peroxidation (ECP) process for removing the volatile suspended solids (VSS) content of waste-activated sludge was evaluated. The Fe2+ ions required by the process were obtained directly from iron electrodes in the system. The performance of the ECP process was investigated in various operational conditions employing a laboratory-scale pilot setup and optimized by response surface methodology (RSM). According to the results, the ECP process showed its best performance when the pH value, current density, H2O2 concentration and the retention time were 3, 3.2 mA/cm2, 1,535 mg/L and 240 min, respectively. In these conditions, the introduced Fe2+ concentration was approximately 500 mg/L and the VSS removal efficiency about 74%. Moreover, the results of the microbial characteristics of the raw and the stabilized sludge demonstrated that the ECP process is able to remove close to 99.9% of the coliforms in the raw sludge during the stabilization process. The energy consumption evaluation showed that the required energy of the ECP reactor (about 1.8-2.5 kWh per kg VSS removed) is considerably lower than for aerobic digestion, the conventional waste-activated sludge stabilization method (about 2-3 kWh per kg VSS removed). The RSM optimization process showed that the best operational conditions of the ECP process comply with the experimental results, and the actual and the predicted results are in good conformity with each other. This feature makes it possible to predict the introduced Fe2+ concentrations into the system and the VSS removal efficiency of the process precisely.

  20. Hydrologic testing methodology and results from deep basalt boreholes

    International Nuclear Information System (INIS)

    Strait, S.R.; Spane, F.A.; Jackson, R.L.; Pidcoe, W.W.

    1982-05-01

    The objective of the hydrologic field-testing program is to provide data for characterization of the groundwater systems within the Pasco Basin that are significant to understanding waste isolation. The effort is directed toward characterizing the areal and vertical distributions of hydraulic head, hydraulic properties, and hydrochemistry. Data obtained from these studies provide input for numerical modeling of groundwater flow and solute transport. These models are then used for evaluating potential waste migration as a function of space and time. The groundwater system beneath the Hanford Site and surrounding area consists of a thick, accordantly layered sequence of basalt flows and associated sedimentary interbeds that primarily occur in the upper part of the Columbia River basalt. Permeable horizons of the sequence are associated with the interbeds and the interflow zones within the basalt. The columnar interiors of a flow act as low-permeability aquitards, separating the more-permeable interflows or interbeds. This paper discusses the hydrologic field data-gathering activities, specifically the field-testing methodology and test results from deep basalt boreholes

  1. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. Estimation of the daily global solar radiation based on the Gaussian process regression methodology in the Saharan climate

    Science.gov (United States)

    Guermoui, Mawloud; Gairaa, Kacem; Rabehi, Abdelaziz; Djafer, Djelloul; Benkaciali, Said

    2018-06-01

    Accurate estimation of solar radiation is a major concern in renewable energy applications. Over the past few years, many machine learning paradigms have been proposed in order to improve estimation performance, mostly based on artificial neural networks, fuzzy logic, support vector machines and adaptive neuro-fuzzy inference systems. The aim of this work is the prediction of the daily global solar radiation received on a horizontal surface through the Gaussian process regression (GPR) methodology. A case study of the Ghardaïa region (Algeria) has been used in order to validate the above methodology. Several combinations have been tested; it was found that the GPR model based on sunshine duration, minimum air temperature and relative humidity gives the best results in terms of mean absolute bias error (MBE), root mean square error (RMSE), relative root mean square error (rRMSE), and correlation coefficient (r). The obtained values of these indicators are 0.67 MJ/m2, 1.15 MJ/m2, 5.2%, and 98.42%, respectively.
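
    A minimal sketch of this kind of GPR model, built with scikit-learn on synthetic stand-ins for the three predictors named above (sunshine duration, minimum air temperature, relative humidity), is shown below; the data, kernel choice and error metrics are illustrative assumptions, not the study's own setup.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

    rng = np.random.default_rng(0)
    # Synthetic stand-ins: sunshine duration (h), minimum air temperature (C), relative humidity (%)
    X = rng.uniform([0.0, 0.0, 10.0], [12.0, 30.0, 90.0], size=(200, 3))
    y = 1.5 + 1.8 * X[:, 0] + 0.2 * X[:, 1] - 0.05 * X[:, 2] + rng.normal(0.0, 1.0, 200)  # MJ/m2, synthetic

    kernel = ConstantKernel() * RBF(length_scale=[1.0, 1.0, 1.0]) + WhiteKernel()
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X[:150], y[:150])

    pred = gpr.predict(X[150:])                      # hold-out predictions
    err = pred - y[150:]
    rmse = np.sqrt(np.mean(err ** 2))
    mbe = np.mean(err)
    r = np.corrcoef(pred, y[150:])[0, 1]
    print(f"RMSE = {rmse:.2f} MJ/m2, MBE = {mbe:.2f} MJ/m2, r = {r:.3f}")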

  3. How methodological issues affect the energy indicator results for different electricity generation technologies

    International Nuclear Information System (INIS)

    Modahl, Ingunn Saur; Raadal, Hanne Lerche; Gagnon, Luc; Bakken, Tor Haakon

    2013-01-01

    The aim of this paper is to improve the basis for the comparison of energy products. The paper will discuss important methodological issues with regard to various energy indicators and it will, by means of a few selected energy indicators, show examples of results for hydropower, wind power and electricity from biomass, gas and coal. Lastly it will suggest methods to achieve results which are more consistent when comparing electricity production technologies. In general, methodological issues can affect the results of life cycle assessments. In this paper, the authors have focused on the effect of system boundaries for energy indicators and found that the internal ranking of cases within one electricity generation technology is dependent on the indicator used. These variations do not, however, alter the general ranking of the major technologies studied. The authors suggest that future assessments should focus on a smaller set of indicators: the Cumulative Energy Demand (CED), which is the most “universal” indicator, Energy Payback Ratio (EPR) for assessment of upstream activities, and a suggested “Cumulative Fossil Energy Demand” (CFED) for resource depletion assessments. There is also a need for stricter standardisation and increased transparency in the assessment of energy products. - Highlights: • There is a need for stricter standardisation of energy performance assessments. • System boundaries for renewable sources should be harmonised. • One should focus on a smaller set of indicators. CED should be included
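
    To make the indicators concrete, the toy calculation below computes CED per kWh delivered, EPR and a simple "cumulative fossil energy demand" figure from invented plant-level numbers; the definitions are deliberately simplified and the values are purely illustrative, not results from the paper.

    lifetime_output_kwh = 2.0e8            # electricity delivered over the plant life (hypothetical)
    upstream_energy_mj = 1.2e8             # energy invested to build, fuel and decommission (hypothetical)
    fossil_share = 0.35                    # assumed fraction of that input from fossil sources

    ced_mj_per_kwh = upstream_energy_mj / lifetime_output_kwh          # cumulative energy demand per kWh
    epr = lifetime_output_kwh * 3.6 / upstream_energy_mj               # energy payback ratio (1 kWh = 3.6 MJ)
    cfed_mj_per_kwh = fossil_share * ced_mj_per_kwh                    # simplified "fossil" share of CED

    print(f"CED = {ced_mj_per_kwh:.2f} MJ/kWh, EPR = {epr:.1f}, CFED = {cfed_mj_per_kwh:.2f} MJ/kWh")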

  4. Methodology for setup and data processing of mobile air quality measurements to assess the spatial variability of concentrations in urban environments

    International Nuclear Information System (INIS)

    Van Poppel, Martine; Peters, Jan; Bleux, Nico

    2013-01-01

    A case study is presented to illustrate a methodology for mobile monitoring in urban environments. A dataset of UFP, PM2.5 and BC concentrations was collected. We showed that repeated mobile measurements can give insight into the spatial variability of pollutants in different micro-environments in a city. Streets of contrasting traffic intensity showed concentrations increased by a factor of 2–3 for UFP and BC, and increases were also observed for PM2.5. The first quartile (P25) of the mobile measurements at an urban background zone seems to be a good estimate of the urban background concentration. The local component of the pollutant concentrations was determined by background correction. The use of background correction reduced the number of runs needed to obtain representative results. The results presented are a first attempt to establish a methodology for setup and data processing of mobile air quality measurements to assess the spatial variability of concentrations in urban environments. -- Highlights: ► Mobile measurements are used to assess the variability of air pollutants in urban environments. ► PM2.5, BC and UFP concentrations are presented for zones with different traffic characteristics. ► A methodology for background correction based on the mobile measurements is presented. ► The background concentration is estimated as the 25th percentile of the urban background data. ► The minimum number of runs for a representative estimate is reduced after background correction. -- This paper shows that the spatial variability of air pollutants in an urban environment can be assessed by a mobile monitoring methodology including background correction
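
    The background-correction idea (the 25th percentile of the urban-background runs subtracted from street-level data) reduces to a few lines of code, sketched below with invented concentration values.

    import numpy as np

    background_zone_runs = np.array([18.2, 22.5, 19.8, 30.1, 21.0, 24.4, 20.3])  # per-run means, invented
    street_run = np.array([55.0, 48.2, 61.3, 39.7, 70.5])                        # street-level values, invented

    background = np.percentile(background_zone_runs, 25)      # P25 as the urban background estimate
    local_component = street_run - background                 # traffic-related (local) increment

    print(f"estimated background = {background:.1f}")
    print("local (background-corrected) component:", np.round(local_component, 1))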

  5. Methodological foundations of target market enterprise orientation

    OpenAIRE

    N.V. Karpenko

    2012-01-01

    In the article the author establishes the importance of maintaining a target market orientation, the content of which is based on marketing principles and envisages the interrelationship of the market segmentation and positioning processes. The proposed methodological principles for implementing segmentation are the result of the author's own research, and the positioning process is examined through a five-level system that contains three stages and two variants of organizational behavior.

  6. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  7. Impact of methodology on the results of economic evaluations of varicella vaccination programs: is it important for decision-making?

    Directory of Open Access Journals (Sweden)

    Patrícia Coelho de Soárez

    2009-01-01

    Full Text Available This study aims to review the literature on the economic evaluation of childhood varicella vaccination programs and to discuss how heterogeneity in methodological aspects and parameter estimation can affect the studies' results. After applying the inclusion criteria, 27 studies published from 1980 to 2008 were analyzed with respect to methodological differences. There was great heterogeneity in the perspective adopted, the evaluation of indirect costs, the type of model used, the modeling of the effect on herpes zoster, and the estimation of vaccine price and efficacy parameters. The factor with the greatest impact on results was the inclusion of indirect costs, followed by the perspective adopted and the vaccine price. The choice of a particular methodological aspect or parameter affected the studies' results and conclusions. It is essential that authors present these choices transparently, so that users of economic evaluations understand the implications of such choices and the direction in which they steered the results of the analysis.

  8. Using dual-task methodology to dissociate automatic from nonautomatic processes involved in artificial grammar learning.

    Science.gov (United States)

    Hendricks, Michelle A; Conway, Christopher M; Kellogg, Ronald T

    2013-09-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and intentional grammar- and fragment-based knowledge in AGL at both acquisition and at test. Both experiments used a balanced chunk strength grammar to assure an equal proportion of fragment cues (i.e., chunks) in grammatical and nongrammatical test items. In Experiment 1, participants engaged in a working memory dual-task either during acquisition, test, or both acquisition and test. The results showed that participants performing the dual-task during acquisition learned the artificial grammar as well as the single-task group, presumably by relying on automatic learning mechanisms. A working memory dual-task at test resulted in attenuated grammar performance, suggesting a role for intentional processes for the expression of grammatical learning at test. Experiment 2 explored the importance of perceptual cues by changing letters between the acquisition and test phase; unlike Experiment 1, there was no significant learning of grammatical information for participants under dual-task conditions in Experiment 2, suggesting that intentional processing is necessary for successful acquisition and expression of grammar-based knowledge under transfer conditions. In sum, it appears that some aspects of learning in AGL are indeed relatively automatic, although the expression of grammatical information and the learning of grammatical patterns when perceptual similarity is eliminated both appear to require explicit resources. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  9. Improvement in the incident reporting and investigation procedures using process excellence (DMAI2C) methodology

    International Nuclear Information System (INIS)

    Miles, Elizabeth N.

    2006-01-01

    In 1996, Health and Safety introduced to Johnson and Johnson an incident investigation process called Learning to Look©. This process provides a systematic way of analyzing work-related injuries and illness, uncovers root causes that lead to system defects, and points to viable solutions. The process involves three steps: investigation and reporting of the incident, determination of root cause, and development and implementation of a corrective action plan. The process requires the investigators to provide an initial communication for work-related serious injuries and illnesses as well as lost workday cases to Corporate Headquarters within 72 h of the incident, with a full investigative report to follow within 10 days. A full investigation requires a written report, a cause-result logic diagram (CRLD), a corrective action plan (CAP) and a report of incident costs (SafeCost), all to be filed electronically. It is incumbent on the principal investigator and his or her investigative teams to assemble the various parts of the investigation and to follow up with the relevant parties to ensure corrective actions are implemented and a full report is submitted to Corporate executives. Initial review of the system revealed that the process was not working as designed. A number of reports were late, not signed by the business leaders, and in some instances not all causes were identified. Process excellence was the approach used to study the issue. The team used the Six Sigma DMAI2C methodology to identify and implement system improvements. The project examined the breakdown of the critical aspects of the reporting and investigation process that lead to system errors. This report will discuss the study findings, recommended improvements, and methods used to monitor the new improved process

  10. Decolorization and mineralization of Diarylide Yellow 12 (PY12) by photo-Fenton process: the Response Surface Methodology as the optimization tool.

    Science.gov (United States)

    GilPavas, Edison; Dobrosz-Gómez, Izabela; Gómez-García, Miguel Ángel

    2012-01-01

    The Response Surface Methodology (RSM) was applied as a tool for the optimization of the operational conditions of the photo-degradation of highly concentrated PY12 wastewater resulting from a textile industry located in the suburbs of Medellin (Colombia). A Box-Behnken experimental design (BBD) was chosen for the purpose of response optimization. The photo-Fenton process was carried out in a laboratory-scale batch photo-reactor. A multifactorial experimental design was proposed, including the following variables: the initial dyestuff concentration, the H2O2 and Fe2+ concentrations, and the UV wavelength of the radiation. The photo-Fenton process performed at the optimized conditions resulted in ca. 100% dyestuff decolorization, 92% COD and 82% TOC degradation. A kinetic study was carried out, including the identification of some intermediate compounds generated during the oxidation process. The wastewater biodegradability reached a final BOD5/COD ratio of 0.86.

  11. Managerial Methodology in Public Institutions

    Directory of Open Access Journals (Sweden)

    Ion VERBONCU

    2010-10-01

    Full Text Available One of the most important ways of making public institutions more efficient is applying a managerial methodology, embodied in the promotion of management tools and of modern and sophisticated methodologies, as well as in the design/redesign and maintenance of the management process and its components. Their implementation bears the imprint of the constructive and functional particularities of public institutions, decentralized and devolved, and, of course, of the expertise of the managers of these organizations. Managerial methodology is addressed through three important instruments: diagnosis, management by objectives and the scoreboard. Its presence in the performance management process should be mandatory, given its favorable influence on managerial and economic performance and on the professionalism of the managers' approach.

  12. Car-borne prospecting methodology. First results from the Cuenca del Nordeste (Uruguay)

    International Nuclear Information System (INIS)

    Goso, H.; Spoturno, J.; Preciozzi, F.

    1976-01-01

    The paper reports on a prospecting procedure used in Uruguay for the selection of areas of interest and on the first results obtained in the Cuenca del Nordeste (North-East Basin). The methodology developed (preliminary radiometric investigation) consists of four successive stages: compilation of material; radiometric survey; statistical analysis; and revision of anomalies. The compilation of material has the aim of obtaining geological and cartographic data and of laying out the network of tracks of the area to be prospected. The radiometric survey provides data by means of a car-borne recording scintillometer, together with the geological information necessary for preparing the geological map. By statistical analysis of the results it is possible to define and quantify various types of anomalies in a simple manner. Use of a log-normal model yielded a highly logical and coherent approximation in the treatment of the data obtained, and a classification of the defined anomalies in order of importance. Anomaly revision is carried out on first- and second-order anomalies, and on those of the third order deemed to be significant. This methodology has been developed and is in use on sedimentary formations of the Devonian, Gondwana and Cretaceous, where various problems are presented by the overburden and the grid size ranges between 1 and 4-8 km2. In the particular case of the Cuenca del Nordeste (Gondwana), with no previous record of uranium mineralization, it was possible to select a zone of some 1000 km2 in the San Gregorio-Tres Islas formation with a view to carrying out more detailed work. (author)
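
    The log-normal treatment of the count-rate data can be illustrated as below: readings are log-transformed, a background mean and standard deviation are estimated, and anomalies are graded by their log-space deviation. The data, thresholds and order labels are illustrative assumptions, not the survey's actual criteria.

    import numpy as np

    counts = np.array([48, 52, 60, 45, 55, 70, 58, 49, 240, 63, 51, 130, 57])  # count rates, invented
    log_c = np.log(counts)
    mu, sigma = log_c.mean(), log_c.std(ddof=1)          # log-normal background parameters

    def anomaly_order(value):
        """Grade a reading by how many log-space standard deviations it sits above the mean."""
        z = (np.log(value) - mu) / sigma
        if z > 3:
            return "first order"
        if z > 2:
            return "second order"
        if z > 1:
            return "third order"
        return "background"

    for c in counts:
        print(c, anomaly_order(c))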

  13. Process optimization of microencapsulation of curcumin in γ-polyglutamic acid using response surface methodology.

    Science.gov (United States)

    Ko, Wen-Ching; Chang, Chao-Kai; Wang, Hsiu-Ju; Wang, Shian-Jen; Hsieh, Chang-Wei

    2015-04-01

    The aim of this study was to develop an optimal microencapsulation method for an oil-soluble component (curcumin) using γ-PGA. The results show that Span80 significantly enhances the encapsulation efficiency (EE) of γ-Na(+)-PGA microcapsules. Therefore, the effects of γ-Na(+)-PGA, curcumin and Span80 concentration on EE of γ-Na(+)-PGA microcapsules were studied by means of response surface methodology (RSM). It was found that the optimal microencapsulation process is achieved by using γ-Na(+)-PGA 6.05%, curcumin 15.97% and Span80 0.61% with a high EE% (74.47 ± 0.20%). Furthermore, the models explain 98% of the variability in the responses. γ-Na(+)-PGA seems to be a good carrier for the encapsulation of curcumin. In conclusion, this simple and versatile approach can potentially be applied to the microencapsulation of various oil-soluble components for food applications. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Post-Sale Customer Support Methodology in the TQM System

    Directory of Open Access Journals (Sweden)

    Dr.Sc. Elizabeta Mitreva

    2014-06-01

    Full Text Available In this paper a survey of the activities in the post-sale period of the product is made and based on the analysis of the results, a methodology that managers could use to design and implement the system of total quality management has been created. The implementation of this methodology is carried out in a simplified way and in less time, without having to study and deepen new knowledge for internal standardization, statistical process control, cost analysis and optimization of business processes The purpose of this paper is to lay a good foundation for Macedonian companies in their post-sale period activities of the product, to understand the philosophy of TQM (Total Quality Management and benefits will be achieved by implementing the system and setting strategic directions for success. These activities begin by identifying the wishes and needs of customers/users, reengineering business processes for sales support, satisfaction of employees and all stakeholders. As a result of the implementation of this methodology in practice, improved competitiveness, increased efficiency, reduction of quality costs and increased productivity are noted. The methodology proposed in this paper brings together all the activities in the spiral of quality in a company that deals with post-sales support. Due to the necessity of flow of information about quality in the entire enterprise, an information system is designed accordingly to the QC-CEPyramid model in several steps.

  15. Verification results of methodology for determining the weighted mean coolant temperature in the primary circuit hot legs of WWER-1000 reactor plants

    International Nuclear Information System (INIS)

    Saunin, Yuri V.; Dobrotvorski, Alexander N.; Semenikhin, Alexander V.; Korolev, Alexander S.

    2017-01-01

    The JSC ''Atomtechenergo'' experts have developed a new methodology for determining the weighted mean coolant temperature in the primary circuit hot legs of WWER-1000 reactor plants. The necessity for developing the new methodology was determined by the need to decrease the calculation error of the weighted mean coolant temperature in the hot legs because of the coolant temperature stratification. The methodology development was based on the findings of experimental and calculating research executed by the authors. The methodology verification was fulfilled through comparison of calculation results obtained with and without the methodology use in various operational states and modes of several WWER-1000 power units. The obtained verification results have confirmed that the use of the new methodology provides objective error decrease in determining the weighted mean coolant temperature in the primary circuit hot legs. The decrease value depends on the stratification character which is various for different objects and conditions.

  16. Verification results of methodology for determining the weighted mean coolant temperature in the primary circuit hot legs of WWER-1000 reactor plants

    Energy Technology Data Exchange (ETDEWEB)

    Saunin, Yuri V.; Dobrotvorski, Alexander N.; Semenikhin, Alexander V.; Korolev, Alexander S. [JSC ''Atomtechenergo'', Novovoronezh (Russian Federation). Novovoronezh Filial ''Novovoronezhatomtechenergo'']; Ryasny, Sergei I. [JSC ''Atomtechenergo'', Moscow (Russian Federation)]

    2017-09-15

    The JSC ''Atomtechenergo'' experts have developed a new methodology for determining the weighted mean coolant temperature in the primary circuit hot legs of WWER-1000 reactor plants. The necessity for developing the new methodology was determined by the need to decrease the calculation error of the weighted mean coolant temperature in the hot legs because of the coolant temperature stratification. The methodology development was based on the findings of experimental and calculating research executed by the authors. The methodology verification was fulfilled through comparison of calculation results obtained with and without the methodology use in various operational states and modes of several WWER-1000 power units. The obtained verification results have confirmed that the use of the new methodology provides objective error decrease in determining the weighted mean coolant temperature in the primary circuit hot legs. The decrease value depends on the stratification character which is various for different objects and conditions.

  17. An integrated methodology for process improvement and delivery system visualization at a multidisciplinary cancer center.

    Science.gov (United States)

    Singprasong, Rachanee; Eldabi, Tillal

    2013-01-01

    Multidisciplinary cancer centers require an integrated, collaborative, and streamlined workflow in order to provide high quality of patient care. Due to the complex nature of cancer care and continuing changes to treatment techniques and technologies, it is a constant struggle for centers to obtain a systemic and holistic view of the treatment workflow for improving the delivery systems. Project management techniques, a responsibility matrix and a swim-lane activity diagram representing the sequence of activities can be combined for data collection, presentation, and evaluation of patient care. This paper presents this integrated methodology, using multidisciplinary meetings and a “walking the route” approach for data collection, an integrated responsibility matrix and swim-lane activity diagram with activity times for data representation, and a 5-why and gap analysis approach for data analysis. This enables collection of the right level of detail in a shorter time frame by identifying process flaws and deficiencies, while being independent of the nature of the patient's disease or treatment techniques. A case study of a multidisciplinary regional cancer centre is used to illustrate the effectiveness of the proposed methodology and demonstrates that the methodology is simple to understand, allowing for minimal training of staff and rapid implementation. © 2011 National Association for Healthcare Quality.

  18. MAIA - Method for Architecture of Information Applied: methodological construct of information processing in complex contexts

    Directory of Open Access Journals (Sweden)

    Ismael de Moura Costa

    2017-04-01

    Full Text Available Introduction: This paper presents the evolution of MAIA, the Method for Architecture of Information Applied, its structure, the results obtained and three practical applications. Objective: To propose a methodological construct for the treatment of complex information, distinguishing information spaces and revealing the inherent configurations of those spaces. Methodology: The argument is elaborated from theoretical research of an analytical character, using distinction as a way to express concepts. Phenomenology is adopted as the philosophical position, which considers the correlation between Subject↔Object. The research also considers the notion of interpretation as an integrating element for the definition of concepts. With these postulates, the steps to transform information spaces are formulated. Results: The article shows how the method is structured to process information in its contexts, starting from a succession of evolutive cycles, divided into moments, which in turn evolve into transformation acts. Conclusions: Besides showing how the method is structured, the article presents its possible applications as a scientific method, as a configuration tool for information spaces, and as a generator of ontologies. Last but not least, it presents a brief summary of the analysis made by researchers who have already evaluated the method considering the three aspects mentioned.

  19. Methodology and main results of seismic source characterization for the PEGASOS Project, Switzerland

    International Nuclear Information System (INIS)

    Coppersmith, K. J.; Youngs, R. R.; Sprecher, Ch.

    2009-01-01

    Under the direction of the National Cooperative for the Disposal of Radioactive Waste (NAGRA), a probabilistic seismic hazard analysis was conducted for the Swiss nuclear power plant sites. The study has become known under the name 'PEGASOS Project'. This is the first of a group of papers in this volume that describes the seismic source characterization methodology and the main results of the project. A formal expert elicitation process was used, including dissemination of a comprehensive database, multiple workshops for identification and discussion of alternative models and interpretations, elicitation interviews, feedback to provide the experts with the implications of their preliminary assessments, and full documentation of the assessments. A number of innovative approaches to the seismic source characterization methodology were developed by four expert groups and implemented in the study. The identification of epistemic uncertainties and treatment using logic trees were important elements of the assessments. Relative to the assessment of the seismotectonic framework, the four expert teams identified similar main seismotectonic elements: the Rhine Graben, the Jura / Molasse regions, Helvetic and crystalline subdivisions of the Alps, and the southern Germany region. In defining seismic sources, the expert teams used a variety of approaches. These range from large regional source zones having spatially-smoothed seismicity to smaller local zones, to account for spatial variations in observed seismicity. All of the teams discussed the issue of identification of feature-specific seismic sources (i.e. individual mapped faults) as well as the potential reactivation of the boundary faults of the Permo-Carboniferous grabens. Other important seismic source definition elements are the specification of earthquake rupture dimensions and the earthquake depth distribution. Maximum earthquake magnitudes were assessed for each seismic source using approaches that consider the

  20. Design process dynamics in an experience-based context : a design methodological analysis of the Brabantia corkscrew development

    NARCIS (Netherlands)

    Vries, de M.J.

    1994-01-01

    In design methodology, the influence of various factors on design processes is studied. In this article the design of the Brabantia corkscrew is presented as a case study in which these factors are analysed. The aim of the analysis is to gain insight into the way Brabantia took these factors into

  1. Insights from implementation of a risk management methodology

    International Nuclear Information System (INIS)

    Mahn, J.A.; Germann, R.P.; Jacobs, R.R.

    1992-01-01

    In 1988, GPU Nuclear (GPUN) Corporation embarked on a research effort to identify or develop an appropriate methodology for proactively managing risks. The objective of this effort was to increase its ability to identify potential risks and to aid resource allocation decision making for risk control. Such a methodology was presented at a risk management symposium sponsored by GPUN in September of 1989. A pilot project based on this methodology has been conducted at GPUN to test and validate the elements of the methodology and to compare the results of its application with current corporate methods for guiding risk decision making. The pilot project also led to a follow-up policy-capturing study to elicit information about the various risk decision-making models of GPUN decision makers. The combination of these endeavors provided an opportunity to gain numerous insights with respect to understanding the real value of a risk management process, obtaining acceptance of and commitment to risk management and improving operational aspects of the methodology

  2. A generic statistical methodology to predict the maximum pit depth of a localized corrosion process

    International Nuclear Information System (INIS)

    Jarrah, A.; Bigerelle, M.; Guillemot, G.; Najjar, D.; Iost, A.; Nianga, J.-M.

    2011-01-01

    Highlights: → We propose a methodology to predict the maximum pit depth in a corrosion process. → The Generalized Lambda Distribution and the Computer Based Bootstrap Method are combined. → The GLD fits a large variety of distributions both in their central and tail regions. → The minimum thickness preventing perforation can be estimated with a safety margin. → Considering its applications, this new approach can help to size industrial pieces. - Abstract: This paper outlines a new methodology to accurately predict the maximum pit depth related to a localized corrosion process. It combines two statistical methods: the Generalized Lambda Distribution (GLD), to determine a model of distribution fitting the experimental frequency distribution of depths, and the Computer Based Bootstrap Method (CBBM), to generate simulated distributions equivalent to the experimental one. In comparison with conventionally established statistical methods that are restricted to the use of inferred distributions constrained by specific mathematical assumptions, the major advantage of the methodology presented in this paper is that both the GLD and the CBBM enable a statistical treatment of the experimental data without making any preconceived choice either on the unknown theoretical parent distribution of pit depth, which characterizes the global corrosion phenomenon, or on the unknown associated theoretical extreme value distribution, which characterizes the deepest pits. Considering an experimental distribution of depths of pits produced on an aluminium sample, estimations of the maximum pit depth using a GLD model are compared to similar estimations based on the usual Gumbel and Generalized Extreme Value (GEV) methods proposed in the corrosion engineering literature. The GLD approach is shown to have smaller bias and dispersion in the estimation of the maximum pit depth than the Gumbel approach, both for its realization and its mean. This leads to comparing the GLD approach to the GEV one
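
    For comparison, the conventional Gumbel treatment that the abstract benchmarks against can be sketched in a few lines with SciPy; the per-coupon maximum depths and the return-period choice below are invented for illustration and do not reproduce the paper's data or results.

    import numpy as np
    from scipy import stats

    max_depths_um = np.array([112, 98, 135, 121, 104, 142, 117, 126, 109, 131])  # deepest pit per coupon, invented
    loc, scale = stats.gumbel_r.fit(max_depths_um)          # maximum-likelihood Gumbel fit

    # Return-level style estimate: depth exceeded with probability 1/T, here T = 100
    # equivalent inspected areas (a common way to reason about perforation risk).
    T = 100
    d_max = stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
    print(f"Gumbel loc = {loc:.1f} um, scale = {scale:.1f} um, estimated max depth ~ {d_max:.0f} um")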

  3. The development of methodological tools to assess the health sector with the resulting standardized index

    Directory of Open Access Journals (Sweden)

    Hansuvarova Evgenia Adolfovna

    2016-10-01

    The proposed assessment methodology, based on a resulting standardized health index for various countries of the world, makes it possible to identify the countries implementing an effective management strategy in the health sector. The leading positions belong to the countries where state health policy has shown the greatest efficiency. This technique can be used not only to score the resulting standardized health index worldwide, but also to carry out the assessment within a particular country.

  4. Systematic methodology and property prediction of fatty systems for process design/analysis in the oil and fat industry

    DEFF Research Database (Denmark)

    Díaz Tovar, Carlos Axel; Ceriani, Roberta; Gani, Rafiqul

    2010-01-01

    A systematic model-based methodology has been developed and its application highlighted through the solvent recovery section of a soybean oil extraction process, with emphasis on the effect of design variables on the process performance. First, the most representative compounds present in the vegetable oil were defined. Basic and critical properties were then computed by means of appropriate property prediction software. Temperature-dependent properties were modeled using and extending available correlations. The process model was developed through the PRO II commercial simulator and validated.

  5. Methodological Approaches to Experimental Teaching of Mathematics to University Students

    Directory of Open Access Journals (Sweden)

    Nikolay I.

    2018-03-01

    Full Text Available Introduction: the article imparts authors’ thoughts on a new teaching methodology for mathematical education in universities. The aim of the study is to substantiate the efficiency of the comprehensive usage of mathematical electronic courses, computer tests, original textbooks and methodologies when teaching mathematics to future agrarian engineers. The authors consider this implementation a unified educational process. Materials and Methods: the synthesis of international and domestic pedagogical experience of teaching students in university and the following methods of empirical research were used: pedagogical experiment, pedagogical measurements and experimental teaching of mathematics. The authors applied the methodology of revealing interdisciplinary links on the continuum of mathematical problems using the key examples and exercises. Results: the online course “Mathematics” was designed and developed on the platform of Learning Management System Moodle. The article presents the results of test assignments assessing students’ intellectual abilities and analysis of solutions of various types of mathematical problems by students. The pedagogical experiment substantiated the integrated selection of textbooks, online course and online tests using the methodology of determination of the key examples and exercises. Discussion and Conclusions: the analysis of the experimental work suggested that the new methodology is able to have a positive effect on the learning process. The learning programme determined the problem points for each student. The findings of this study have a number of important implications for future educational practice.

  6. Process Optimization of Eco-Friendly Flame Retardant Finish for Cotton Fabric: a Response Surface Methodology Approach

    Science.gov (United States)

    Yasin, Sohail; Curti, Massimo; Behary, Nemeshwaree; Perwuelz, Anne; Giraud, Stephane; Rovero, Giorgio; Guan, Jinping; Chen, Guoqiang

    The n-methylol dimethyl phosphono propionamide (MDPA) flame retardant compounds are predominantly used for cotton fabric treatments with trimethylol melamine (TMM) to obtain better crosslinking and enhanced flame retardant properties. Nevertheless, such treatments are associated with a toxic issue of cancer-causing formaldehyde release. An eco-friendly finishing was used to obtain formaldehyde-free fixation of flame retardant to the cotton fabric. Citric acid as a crosslinking agent, along with sodium hypophosphite as a catalyst, was utilized in the treatment. The process parameters of the treatment were optimized by response surface methodology, using a Box-Behnken statistical design of experiments, for flame retardant properties together with low mechanical loss to the fabric. The effects of concentrations on the fabric’s properties (flame retardancy and mechanical properties) were evaluated. The regression equations for the prediction of concentrations and mechanical properties of the fabric were also obtained for the eco-friendly treatment. The R-squared values of all the responses were above 0.95 for the reagents used, indicating the close relationship between the values predicted by the Box-Behnken design and the actual experimental results. It was also found that the concentration parameters (crosslinking reagents and catalysts) in the treatment formulation have a prime role in the overall performance of flame retardant cotton fabrics.

  7. Automated microscopic characterization of metallic ores with image analysis: a key to improve ore processing. I: test of the methodology

    International Nuclear Information System (INIS)

    Berrezueta, E.; Castroviejo, R.

    2007-01-01

    Ore microscopy has traditionally been an important support to control ore processing, but the volume of present day processes is beyond the reach of human operators. Automation is therefore compulsory, but its development through digital image analysis, DIA, is limited by various problems, such as the similarity in reflectance values of some important ores, their anisotropism, and the performance of instruments and methods. The results presented show that automated identification and quantification by DIA are possible through multiband (RGB) determinations with a research 3CCD video camera on reflected light microscope. These results were obtained by systematic measurement of selected ores accounting for most of the industrial applications. Polarized light is avoided, so the effects of anisotropism can be neglected. Quality control at various stages and statistical analysis are important, as is the application of complementary criteria (e.g. metallogenetic). The sequential methodology is described and shown through practical examples. (Author)
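
    A minimal sketch, in Python, of the multiband (RGB) identification idea described above: each pixel is assigned to the mineral whose reference reflectance triplet is closest. The reference values and the test image are hypothetical placeholders, not the calibration of the cited work.

    import numpy as np

    references = {                     # mean (R, G, B) grey levels per mineral (assumed values)
        "pyrite":       (186, 170, 120),
        "chalcopyrite": (175, 160,  95),
        "galena":       (150, 150, 150),
        "sphalerite":   ( 90,  90,  95),
        "gangue":       ( 40,  40,  45),
    }
    names = list(references)
    ref = np.array([references[n] for n in names], dtype=float)   # shape (n_classes, 3)

    def classify(image_rgb: np.ndarray) -> np.ndarray:
        """Assign each pixel the index of the closest reference triplet (Euclidean distance)."""
        flat = image_rgb.reshape(-1, 3).astype(float)
        dists = np.linalg.norm(flat[:, None, :] - ref[None, :, :], axis=2)
        return dists.argmin(axis=1).reshape(image_rgb.shape[:2])

    # Modal (areal) quantification on a synthetic test image.
    img = np.random.default_rng(0).integers(0, 256, size=(64, 64, 3))
    labels = classify(img)
    for i, name in enumerate(names):
        print(f"{name:12s} {100.0 * np.mean(labels == i):5.1f} %")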

  8. Application of response surface methodology for optimization of natural organic matter degradation by UV/H2O2 advanced oxidation process.

    Science.gov (United States)

    Rezaee, Reza; Maleki, Afshin; Jafari, Ali; Mazloomi, Sajad; Zandsalimi, Yahya; Mahvi, Amir H

    2014-01-01

    In this research, the removal of natural organic matter from aqueous solutions using advanced oxidation processes (UV/H2O2) was evaluated. Therefore, the response surface methodology and Box-Behnken design matrix were employed to design the experiments and to determine the optimal conditions. The effects of various parameters such as initial concentration of H2O2 (100-180 mg/L), pH (3-11), time (10-30 min) and initial total organic carbon (TOC) concentration (4-10 mg/L) were studied. Analysis of variance (ANOVA) revealed a good agreement between experimental data and the proposed quadratic polynomial model (R(2) = 0.98). Experimental results showed that with increasing H2O2 concentration and time, and decreasing initial TOC concentration, the TOC removal efficiency increased. Neutral and nearly acidic pH values also improved the TOC removal. Accordingly, a TOC removal efficiency of 78.02% was obtained at the optimized values of the independent variables: H2O2 concentration (100 mg/L), pH (6.12), time (22.42 min) and initial TOC concentration (4 mg/L). Further confirmation tests under optimal conditions showed 76.50% TOC removal and confirmed that the model is in accordance with the experiments. In addition, TOC removal for natural water under the response surface methodology optimum conditions was 62.15%. This study showed that response surface methodology based on the Box-Behnken method is a useful tool for optimizing the operating parameters for TOC removal using the UV/H2O2 process.
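
    A minimal sketch, in Python, of fitting the quadratic response-surface model that underlies a Box-Behnken analysis by ordinary least squares; the coded design points and TOC removal values below are synthetic placeholders, not the study's measurements.

    import numpy as np
    from itertools import combinations

    def quadratic_design_matrix(X):
        """Columns: intercept, linear, squared and two-factor interaction terms."""
        cols = [np.ones(len(X))]
        cols += [X[:, j] for j in range(X.shape[1])]
        cols += [X[:, j] ** 2 for j in range(X.shape[1])]
        cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
        return np.column_stack(cols)

    rng = np.random.default_rng(1)
    # Coded levels (-1, 0, +1) for H2O2 dose, pH, time and initial TOC (hypothetical runs,
    # not an actual Box-Behnken layout).
    X = rng.choice([-1.0, 0.0, 1.0], size=(29, 4))
    y = 60 + 8 * X[:, 0] + 5 * X[:, 2] - 6 * X[:, 3] - 4 * X[:, 1] ** 2 + rng.normal(0, 2, 29)

    A = quadratic_design_matrix(X)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    y_hat = A @ beta
    r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"fitted coefficients: {np.round(beta, 2)}")
    print(f"R^2 = {r2:.3f}")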

  9. Evaluation methodology of a manipulator actuator for the dismantling process during nuclear decommissioning

    International Nuclear Information System (INIS)

    Park, Jongwon; Kim, Chang-Hoi; Jeong, Kyung-min; Choi, Byung-Seon; Moon, Jeikwon

    2016-01-01

    Highlights: • A methodology to evaluate actuators of a dismantling manipulator. • Evaluation criteria for choosing the most suitable actuator type. • A mathematical evaluation model for evaluation. • The evaluation method is expected to be used for determining other manipulators. - Abstract: This paper presents a methodology to evaluate actuators of a manipulator for dismantling nuclear power plants. Actuators are the most dominant components because a dismantling manipulator relies heavily on the actuator type used. To select the most suitable actuator, evaluation criteria are presented in four categories based on the nuclear dismantling environment. A mathematical model is presented and evaluation results are calculated with weights and scores for each criterion. The proposed evaluation method is expected to be used for determining other aspects of the design of dismantling manipulators.
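
    A minimal sketch, in Python, of the weight-and-score aggregation behind such an evaluation; the four criterion categories, the weights and the raw scores are illustrative assumptions, not the values used in the cited study.

    # Weighted-score evaluation of candidate actuator types (all values assumed).
    criteria_weights = {          # weights sum to 1.0
        "radiation tolerance": 0.35,
        "payload-to-mass":     0.25,
        "controllability":     0.20,
        "maintainability":     0.20,
    }

    candidates = {                # scores on a 1-10 scale per criterion (assumed)
        "electric servo": {"radiation tolerance": 6, "payload-to-mass": 7,
                           "controllability": 9, "maintainability": 8},
        "hydraulic":      {"radiation tolerance": 9, "payload-to-mass": 9,
                           "controllability": 6, "maintainability": 5},
        "pneumatic":      {"radiation tolerance": 8, "payload-to-mass": 4,
                           "controllability": 5, "maintainability": 7},
    }

    def weighted_score(scores: dict) -> float:
        return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

    ranking = sorted(candidates, key=lambda name: weighted_score(candidates[name]), reverse=True)
    for name in ranking:
        print(f"{name:15s} {weighted_score(candidates[name]):.2f}")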

  10. A cost effective waste management methodology for power reactor waste streams

    International Nuclear Information System (INIS)

    Granus, M.W.; Campbell, A.D.

    1984-01-01

    This paper describes a computer-based methodology for the selection of the processing methods (solidification/dewatering) for various power reactor radwaste streams. The purpose of this methodology is to select the method that provides the most cost-effective solution to waste management. This method takes into account the overall cost of processing, transportation and disposal. The selection matrix on which the methodology is based is made up of over ten thousand combinations of liner, cask, process, and disposal options from which the waste manager can choose. The measurement device for cost-effective waste management is the concurrent evaluation of total dollars spent. The common denominator is dollars per cubic foot of the input waste stream. Dollars per curie of the input waste stream provides for proper checks and balances. The result of this analysis can then be used to assess the total waste management cost. To this end, the methodology can then be employed to predict a given number of events (processes, transportation, and disposals) and project the annual cost of waste management. For the purposes of this paper, the authors provide examples of the application of the methodology on a typical BWR at 2, 4 and 6 years. The examples are provided in 1984 dollars. Process selection is influenced by a number of factors which must be independently evaluated for each waste stream. Final processing cost is affected by the particular process efficiency and a variety of regulatory constraints. The interface between process selection and cask selection/transportation is driven by the goal of placing the greatest amount of pre-processed waste in the package while remaining within the bounds of weight, volume, regulatory, and cask availability limitations. Disposal is the cost of burial and can be affected not only by disposal charges, but also by the availability of burial space and by the location of the disposal site in relation to the generator.
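
    A minimal sketch, in Python, of the selection-matrix idea: enumerate liner/cask/process/disposal combinations and rank them by total dollars per cubic foot of the input waste stream. All cost figures and volume factors are invented placeholders.

    from itertools import product

    input_volume_ft3 = 500.0                       # assumed annual waste-stream volume

    liners    = {"small liner": 4_000, "large liner": 6_500}        # $ per unit
    casks     = {"type A cask": 9_000, "shielded cask": 22_000}     # $ per shipment
    processes = {                                   # ($ per ft3 processed, volume factor)
        "dewatering":       (35.0, 0.80),
        "cement solidif.":  (55.0, 1.40),
    }
    disposal_cost_per_ft3 = {"site X": 60.0, "site Y": 95.0}

    best = None
    for (ln, lc), (ck, cc), (pr, (pc, vf)), (ds, dc) in product(
            liners.items(), casks.items(), processes.items(), disposal_cost_per_ft3.items()):
        disposed_volume = input_volume_ft3 * vf      # volume change due to processing
        total = lc + cc + pc * input_volume_ft3 + dc * disposed_volume
        per_ft3 = total / input_volume_ft3           # common denominator: $/ft3 of input waste
        if best is None or per_ft3 < best[0]:
            best = (per_ft3, ln, ck, pr, ds)

    print(f"cheapest option: {best[1:]} at ${best[0]:.2f} per ft3 of input waste")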

  11. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation, to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, core tenets described through a wide range of settings.

  12. Presentation of a methodology for measuring social acceptance of three hydrogen storage technologies and preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Noirot, I.; Bigay, C. N.

    2005-07-01

    Hydrogen storage is a key technology for the extensive use of H2 as an energy carrier. As none of the current technologies satisfies all of the hydrogen storage attributes required by manufacturers and end users, there are intense research efforts aiming at developing viable solutions. A broad objective of the StorHy European project is to provide technological storage solutions, which are attractive from an economical, environmental and safety point of view. A specific sub-project is dedicated to the comparison of three different potential storage technologies for transport applications (compressed gas, cryogenic liquid, solid media). This evaluation is carried out in a harmonised way, based on common tools and assessment strategies that could be useful for decision makers and stakeholders. The assessment is achieved in a 'sustainable development' spirit, taking into consideration the technical, environmental, economical, safety and social requirements. The latter ones have newly emerged in such evaluations, based on the Quality Function Deployment (QFD) approach, and require further study. Hydrogen acceptability studies have been conducted in previous projects. They have been reviewed by LBST in the AcceptH2 project 'Public acceptance of Hydrogen Transport Technologies: Analysis and comparisons of existing studies' (www.accepth2.com - August 2003). During these hydrogen acceptance surveys, mainly fuel cell bus passengers from demonstration projects around the world have been questioned. The work presented in this paper goes further in the methodology refinement as it focuses on the evaluation of hydrogen storage solutions. It proposes a methodological tool for efficient social evaluation of new technologies and associated preliminary results concerning France. In a global approach to sustainable development, the CEA has developed a new methodology to evaluate its current research projects: Multicriteria Analysis for Sustainable Industrial

  13. SNMG: a social-level norm-based methodology for macro-governing service collaboration processes

    Science.gov (United States)

    Gao, Ji; Lv, Hexin; Jin, Zhiyong; Xu, Ping

    2017-08-01

    In order to adapt to the accelerating trend towards open collaboration between enterprises, this paper proposes a Social-level Norm-based methodology for Macro-Governing service collaboration processes, called SNMG, to regulate and control the social-level visible macro-behaviors of the social individuals participating in collaborations. SNMG not only effectively removes the uncontrollability that hinders open social activities, but also enables cross-management-domain collaborations to be implemented by uniting the centralized controls of social individuals over their respective social activities. Therefore, this paper provides a new system construction mode to promote the development and large-scale deployment of service collaborations.

  14. Monitoring the International Standardization Process Theoretical Choices and Methodological Tools

    Directory of Open Access Journals (Sweden)

    Brigitte Juanals

    2012-08-01

    Full Text Available Many organizations are in charge of global security management. This paper outlines and argues for the construction of a theoretical and methodological framework in order to critically assess the new technopolitics currently being developed in the field of global security and which are materialized in standards. The main purpose is to design both a methodology and specific text mining tools to investigate these standards. These tools will be implemented in a platform designed to provide cartographic representations of standards and to assist the navigation of an end-user through a corpus of standards.

  15. Risk-Informed Assessment Methodology Development and Application

    International Nuclear Information System (INIS)

    Sung Goo Chi; Seok Jeong Park; Chul Jin Choi; Ritterbusch, S.E.; Jacob, M.C.

    2002-01-01

    Westinghouse Electric Company (WEC) has been working with Korea Power Engineering Company (KOPEC) on a US Department of Energy (DOE) sponsored Nuclear Energy Research Initiative (NERI) project through a collaborative agreement established for the domestic NERI program. The project deals with Risk-Informed Assessment (RIA) of regulatory and design requirements of future nuclear power plants. An objective of the RIA project is to develop a risk-informed design process, which focuses on identifying and incorporating advanced features into future nuclear power plants (NPPs) that would meet risk goals in a cost-effective manner. The RIA design methodology is proposed to accomplish this objective. This paper discusses the development of this methodology and demonstrates its application in the design of plant systems for future NPPs. Advanced conceptual plant systems consisting of an advanced Emergency Core Cooling System (ECCS) and Emergency Feedwater System (EFWS) for a NPP were developed and the risk-informed design process was exercised to demonstrate the viability and feasibility of the RIA design methodology. Best estimate Loss-of-Coolant Accident (LOCA) analyses were performed to validate the PSA success criteria for the NPP. The results of the analyses show that the PSA success criteria can be met using the advanced conceptual systems and that the RIA design methodology is a viable and appropriate means of designing key features of risk-significant NPP systems. (authors)

  16. Optimization of process parameters for the inactivation of Lactobacillus sporogenes in tomato paste with ultrasound and 60Co-γ irradiation using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Ye Shengying [College of Food Science, South China Agricultural University, Wushan, Guangzhou, GD 510640 (China)], E-mail: yesy@scau.edu.cn; Qiu Yuanxin; Song Xianliang; Luo Shucan [College of Food Science, South China Agricultural University, Wushan, Guangzhou, GD 510640 (China)

    2009-03-15

    The processing parameters for ultrasound and 60Co-γ irradiation were optimized for their ability to inactivate Lactobacillus sporogenes in tomato paste using a systematic experimental design based on response surface methodology. Ultrasonic power, ultrasonic processing time and irradiation dose were explored and a central composite rotation design was adopted as the experimental plan, and a least-squares regression model was obtained. The significant influential factors for the inactivation rate of L. sporogenes were obtained from the quadratic model and the t-test analyses for each process parameter. Confirmation of the experimental results indicated that the proposed model was reasonably accurate and could be used to describe the efficacy of the treatments for inactivating L. sporogenes within the limits of the factors studied. The optimized processing parameters were found to be an ultrasonic power of 120 W with a processing time of 25 min and an irradiation dose of 6.5 kGy. These were measured under the constraints of parameter limitation, based on the Monte Carlo searching method and the quadratic model of the response surface methodology, including the a/b value of the Hunter color scale of tomato paste. Nevertheless, the ultrasound treatment prior to irradiation for the inactivation of L. sporogenes in tomato paste was unsuitable for reducing the irradiation dose.
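
    A minimal sketch, in Python, of a Monte Carlo search over a fitted quadratic response model subject to a colour (Hunter a/b) constraint, in the spirit of the optimization step described above; the model coefficients, parameter ranges and constraint threshold are placeholders, not the regression reported in the study.

    import numpy as np

    rng = np.random.default_rng(7)

    def inactivation(power_w, time_min, dose_kgy):
        # hypothetical quadratic model of L. sporogenes inactivation rate (%)
        return (20 + 0.25 * power_w + 1.2 * time_min + 6.0 * dose_kgy
                - 0.0008 * power_w**2 - 0.02 * time_min**2 - 0.35 * dose_kgy**2)

    def colour_ab(dose_kgy):
        # hypothetical colour degradation: Hunter a/b ratio falls with dose
        return 2.1 - 0.05 * dose_kgy

    best = None
    for _ in range(100_000):
        p = rng.uniform(60, 140)      # ultrasonic power, W
        t = rng.uniform(5, 30)        # ultrasonic time, min
        d = rng.uniform(2, 8)         # irradiation dose, kGy
        if colour_ab(d) < 1.8:        # reject settings that degrade colour too much
            continue
        y = inactivation(p, t, d)
        if best is None or y > best[0]:
            best = (y, p, t, d)

    print(f"best predicted inactivation {best[0]:.1f}% at "
          f"{best[1]:.0f} W, {best[2]:.1f} min, {best[3]:.1f} kGy")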

  17. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  18. Optimization of a High Temperature PEMFC micro-CHP System by Formulation and Application of a Process Integration Methodology

    DEFF Research Database (Denmark)

    Arsalis, Alexandros; Nielsen, Mads Pagh; Kær, Søren Knudsen

    2013-01-01

    A 1 kWe micro combined heat and power (CHP) system based on high temperature proton exchange membrane fuel cell (PEMFC) technology is modeled and optimized by formulation and application of a process integration methodology. The system can provide heat and electricity for a single-family household...

  19. Optimization of castor seed oil extraction process using response surface methodology

    Directory of Open Access Journals (Sweden)

    J. D. Mosquera-Artamonov

    2016-09-01

    Full Text Available This work focuses on the study of the oil extraction yield from castor seed using three different seed conditions: whole, minced and bare endosperm. Taguchi design was used to determine the contribution of the following parameters: seed condition, seed load in the extractor, temperature, and pressure. It was proved that it is necessary to introduce the whole seed and that the presence of the pericarp increases the extraction yield. The contribution of the control factors has an extraction yield limit. After determining which factors contributed to the process, these were left at their optimum levels aiming to reduce the control factors to only two. The complete analysis was done using response surface methodology, giving the best parameters for temperature and pressure that allow a higher-yielding mechanical extraction. The oil extraction yield can reach up to 35% of the seed.

  20. A Data Preparation Methodology in Data Mining Applied to Mortality Population Databases.

    Science.gov (United States)

    Pérez, Joaquín; Iturbide, Emmanuel; Olivares, Víctor; Hidalgo, Miguel; Martínez, Alicia; Almanza, Nelva

    2015-11-01

    It is known that the data preparation phase is the most time-consuming phase in the data mining process, taking from 50% up to 70% of the total project time. Currently, data mining methodologies are of general purpose and one of their limitations is that they do not provide a guide about which particular tasks to carry out in a specific domain. This paper shows a new data preparation methodology oriented to the epidemiological domain in which we have identified two sets of tasks: General Data Preparation and Specific Data Preparation. For both sets, the Cross-Industry Standard Process for Data Mining (CRISP-DM) is adopted as a guideline. The main contribution of our methodology is fourteen specialized tasks concerning this domain. To validate the proposed methodology, we developed a data mining system and the entire process was applied to real mortality databases. The results were encouraging because it was observed that the use of the methodology reduced some of the time-consuming tasks and the data mining system showed findings of unknown and potentially useful patterns for the public health services in Mexico.

  1. Tracer methodology: an appropriate tool for assessing compliance with accreditation standards?

    Science.gov (United States)

    Bouchard, Chantal; Jean, Olivier

    2017-10-01

    Tracer methodology has been used by Accreditation Canada since 2008 to collect evidence on the quality and safety of care and services, and to assess compliance with accreditation standards. Given the importance of this methodology in the accreditation program, the objective of this study is to assess the quality of the methodology and identify its strengths and weaknesses. A mixed quantitative and qualitative approach was adopted to evaluate consistency, appropriateness, effectiveness and stakeholder synergy in applying the methodology. An online questionnaire was sent to 468 Accreditation Canada surveyors. According to surveyors' perceptions, tracer methodology is an effective tool for collecting useful, credible and reliable information to assess compliance with Qmentum program standards and priority processes. The results show good coherence between methodology components (appropriateness of the priority processes evaluated, activities to evaluate a tracer, etc.). The main weaknesses are the time constraints faced by surveyors and management's lack of cooperation during the evaluation of tracers. The inadequate amount of time allowed for the methodology to be applied properly raises questions about the quality of the information obtained. This study paves the way for a future, more in-depth exploration of the identified weaknesses to help the accreditation organization make more targeted improvements to the methodology. Copyright © 2016 John Wiley & Sons, Ltd.

  2. [Systemic inflammation: theoretical and methodological approaches to description of general pathological process model. Part 3. Backgroung for nonsyndromic approach].

    Science.gov (United States)

    Gusev, E Yu; Chereshnev, V A

    2013-01-01

    Theoretical and methodological approaches to the description of systemic inflammation as a general pathological process are discussed. It is shown that there is a need to integrate a wide range of research types in order to develop a model of systemic inflammation.

  3. A methodology for developing distributed programs

    NARCIS (Netherlands)

    Ramesh, S.; Mehndiratta, S.L.

    1987-01-01

    A methodology, different from the existing ones, for constructing distributed programs is presented. It is based on the well-known idea of developing distributed programs via synchronous and centralized programs. The distinguishing features of the methodology are: 1) specification include process

  4. A foundational methodology for determining system static complexity using notional lunar oxygen production processes

    Science.gov (United States)

    Long, Nicholas James

    This thesis serves to develop a preliminary foundational methodology for evaluating the static complexity of future lunar oxygen production systems when extensive information is not yet available about the various systems under consideration. Evaluating static complexity, as part of an overall system complexity analysis, is an important consideration in ultimately selecting a process to be used in a lunar base. When system complexity is higher, there is generally an overall increase in risk which could impact the safety of astronauts and the economic performance of the mission. To evaluate static complexity in lunar oxygen production, static complexity is simplified and decomposed into its essential components. First, three essential dimensions of static complexity are investigated, including interconnective complexity, strength of connections, and complexity in variety. Then a set of methods is developed upon which to separately evaluate each dimension. Q-connectivity analysis is proposed as a means to evaluate interconnective complexity and strength of connections. The law of requisite variety originating from cybernetic theory is suggested to interpret complexity in variety. Secondly, a means to aggregate the results of each analysis is proposed to create a holistic measurement of static complexity using the Single Multi-Attribute Ranking Technique (SMART). Each method of static complexity analysis, as well as the aggregation technique, is demonstrated using notional data for four lunar oxygen production processes.

  5. Load shape development for Swedish commercial and public buildings - methodologies and results

    Energy Technology Data Exchange (ETDEWEB)

    Noren, C.

    1999-06-01

    The knowledge concerning electricity consumption, and especially load demand, in Swedish commercial buildings is very limited. The current study deals with methods for electricity consumption indicator development and application of the different methodologies on measured data. Typical load shapes and consumption indicators are developed for four different types of commercial buildings: schools, hotels, grocery stores and department stores. Two different methodologies for consumption indicator development are presented and discussed. The influence on load demand from different factors such as installations, outdoor temperature and building activities is studied. It is suggested that building floor area is not an accurate determinant of building electricity consumption and it is necessary to consider other factors, such as those just mentioned, to understand commercial building electricity consumption. The application of the two methodologies on measured data shows that typical load shapes can be developed with reasonable accuracy. For most of the categories it is possible to use the typical load shapes for approximation of whole-building load shapes with error rates of about 10-25% depending on day type and building type. Comparisons of the developed load shapes with measured data show good agreement. 49 refs, 22 figs, 3 tabs

  6. Utility of radiotracer methodology in scientific research of industrial relevancy

    International Nuclear Information System (INIS)

    Kolar, Z.I.

    1990-01-01

    Utilization of radiotracer methodology in industrial research provides substantial scientific rather than directly demonstrable economic benefits. These benefits include better understanding of industrial processes and subsequently the development of new ones. Examples are given of the use of radiotracers in technological studies, and the significance of the results obtained is pointed out. Creative application of radiotracer methodology may contribute to the economic development and technological advancement of all countries, including the developing ones. (orig.)

  7. Development of the processing software package for RPV neutron fluence determination methodology

    International Nuclear Information System (INIS)

    Belousov, S.; Kirilova, K.; Ilieva, K.

    2001-01-01

    According to the INRNE methodology the neutron transport calculation is carried out in two steps. In the first step, a reactor core eigenvalue calculation is performed. This calculation is used to determine the fixed source for the next step, the calculation of neutron transport from the reactor core to the RPV. Both calculation steps are performed by state-of-the-art, tested codes. The interface software package DOSRC developed at INRNE is used as a link between these two calculations. The package transforms reactor core calculation results into neutron source input data in a format appropriate for the neutron transport codes (DORT, TORT and ASYNT) based on the discrete ordinates method. These codes are applied for calculation of the RPV neutron flux and its responses - induced activity, radiation damage, neutron fluence etc. For a more precise estimation of the neutron fluence, the INRNE methodology has been supplemented with the following improvements: - implementation of more advanced codes (PYTHIA/DERAB) for neutron-physics parameter calculations; - more detailed neutron source presentation; - verification of neutron fluence by statistically treated experimental data. (author)

  8. Vulnerability of sandy coasts to climate change and anthropic pressures: methodology and preliminary results

    Science.gov (United States)

    Idier, D.; Poumadère, M.; Vinchon, C.; Romieu, E.; Oliveros, C.

    2009-04-01

    1-INTRODUCTION Climate change is considered in the latest reports of the Intergovernmental Panel on Climate Change IPCC (2007) as unequivocal. Induced vulnerability of the system is defined as "the combination of sensitivity to climatic variations, probability of adverse effects, and adaptive capacity". Substantial methodological challenges remain, in particular estimating the risk of adverse climate change impacts and interpreting relative vulnerability across diverse situations. As stated by the IPCC, the "coastal systems should be considered vulnerable to changes in climate". In these areas, amongst the most serious impacts of sea-level rise (Nicholls, 1996) are erosion and marine inundation. Thus, the coast of metropolitan France, being composed of 31% sandy coasts, is potentially vulnerable, as it has been qualitatively assessed on the pilot coasts of Aquitaine and Languedoc-Roussillon in the RESPONSE project (Vinchon et al., 2008). Within the ANR VULSACO project (VULnerability of SAndy COast to climate change and anthropic pressure), the present day erosion tendencies as well as the potentially future erosion trends are investigated. The main objectives are to: (1) assess indicators of vulnerability to climate change for low-lying linear sandy coastal systems, from the shore to the hinterland, facing undergoing climate change and anthropic pressure until the 2030s; and (2) identify the aggravating or improving effect of human pressure on this vulnerability. This second issue is sometimes considered as a main driver of coastal risks. The methodology proposed in the project considers anthropic adaptation (or not) by putting decision makers in front of potential modifications of the physical system, to study the decision process and the choice of adaptation (or not). The coastal system is defined by its morphology, its physical characteristics and its land use. The time scales will range from short-term (days to weeks, e.g. time scale of extreme events) to

  9. Methodology of investment effectiveness evaluation in the local energy market

    Energy Technology Data Exchange (ETDEWEB)

    Kamrat, W.

    1999-07-01

    The paper presents issues of investment effectiveness evaluation in the local energy market. The research results presented in the paper mainly propose the concept of a methodology which allows the evaluation of investment processes in regional power markets at the decision-making stage. In this respect, selecting a rational investment strategy is an important stage of the entire investment process. In view of criteria of various natures, the construction of a methodology for investment effectiveness evaluation is of especially important meaning for a local decision-maker or investor. It is of particular significance to countries that are undergoing a transition from a centrally planned economy to a market economy. (orig.)

  10. Optimisation of wire-cut EDM process parameter by Grey-based response surface methodology

    Science.gov (United States)

    Kumar, Amit; Soota, Tarun; Kumar, Jitendra

    2018-03-01

    Wire electric discharge machining (WEDM) is one of the advanced machining processes. Response surface methodology coupled with the Grey relational analysis method has been proposed and used to optimise the machining parameters of WEDM. A face centred cubic design is used for conducting experiments on high speed steel (HSS) M2 grade workpiece material. The regression model of significant factors such as pulse-on time, pulse-off time, peak current, and wire feed is considered for optimising the response variables: material removal rate (MRR), surface roughness and kerf width. The optimal condition of the machining parameters was obtained using the Grey relational grade. ANOVA is applied to determine the significance of the input parameters for optimising the Grey relational grade.
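
    A minimal sketch, in Python, of how a Grey relational grade combines several WEDM responses into a single ranking figure; the response values are placeholders, not the reported experimental data.

    import numpy as np

    # rows = experimental runs; columns = MRR (larger-better), Ra and kerf (smaller-better)
    responses = np.array([
        [12.4, 2.8, 0.32],
        [15.1, 3.1, 0.35],
        [10.8, 2.2, 0.29],
        [14.0, 2.5, 0.31],
    ])
    larger_is_better = [True, False, False]

    norm = np.empty_like(responses, dtype=float)
    for j, larger in enumerate(larger_is_better):
        col = responses[:, j]
        if larger:
            norm[:, j] = (col - col.min()) / (col.max() - col.min())
        else:
            norm[:, j] = (col.max() - col) / (col.max() - col.min())

    delta = 1.0 - norm                       # deviation from the ideal (normalised) sequence
    zeta = 0.5                               # distinguishing coefficient
    grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    grade = grc.mean(axis=1)                 # Grey relational grade per run

    print("Grey relational grades:", np.round(grade, 3))
    print("best run:", int(grade.argmax()) + 1)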

  11. Screening of Dementia in Portuguese Primary Care: Methodology, Assessment Tools, and Main Results

    Directory of Open Access Journals (Sweden)

    Laetitia Teixeira

    2017-11-01

    Full Text Available The objectives of this article are as follows: (1) to describe the assessment protocol used to outline people with probable dementia in Primary Health Care; (2) to show the methodological design and procedure to obtain a representative sample of patients with probable dementia; and (3) to report the main characteristics of the sample collected in the context of the study “Characteristics and needs of people with probable dementia.” The study protocol was based on the “Community Assessment of Risk and Treatment Strategies (CARTS) Program” and is composed of a set of instruments that allow the assessment of older adults with probable dementia in several areas (health, psychological, functionality, and others). Descriptive analysis was used to characterize the final sample (n = 436). The study protocol as well as the methodological procedure to obtain the referral of research participants and data collection on the condition of people with probable dementia in Primary Health Care proved to be a valuable tool to obtain a sample of patients distributed by the full range of probable dementia in a large geographical area. Results may support the design of care pathways for older people with cognitive disorders, in order to prevent or delay impairment and/or optimize patients' quality of life.

  12. Childhood leukaemia near British nuclear installations: Methodological issues and recent results

    International Nuclear Information System (INIS)

    Bithell, J. F.; Keegan, T. J.; Kroll, M. E.; Murphy, M. F. G.; Vincent, T. J.

    2008-01-01

    In 2008, the German Childhood Cancer Registry published the results of the Kinderkrebs in der Umgebung von Kernkraftwerken (KiKK) study of childhood cancer and leukaemia around German nuclear power stations. The positive findings appeared to conflict with the results of a recent British analysis carried out by the Committee on Medical Aspects of Radiation in the Environment (COMARE), published in 2005. The present paper first describes the COMARE study, which was based on data from the National Registry of Children's Tumours (NRCT); in particular, the methodology used in this study is described. Although the results of the COMARE study were negative for childhood leukaemia, this apparent discrepancy could be accounted for by a number of differences in approach, especially those relating to the distances from the power stations and the ages of the children studied. The present study was designed to match the KiKK study as far as possible. The incidence observed (18 cases within 5 km against 14.58 expected, p = 0.21) was not significantly raised. The risk estimate for proximity in the regression fitted was actually negative, though the confidence intervals involved are so wide that the difference from that reported in the KiKK study is only marginally statistically significant (p = 0.063). (authors)
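
    A minimal sketch, in Python, of the kind of one-sided Poisson calculation behind the quoted figures (18 observed cases against 14.58 expected); it yields a p-value of about 0.22, in line with the quoted p = 0.21 (the published figure may come from a slightly different test).

    from scipy.stats import poisson

    observed, expected = 18, 14.58
    p_one_sided = poisson.sf(observed - 1, expected)   # P(X >= 18 | lambda = 14.58)
    print(f"O/E ratio   = {observed / expected:.2f}")
    print(f"one-sided p = {p_one_sided:.2f}")          # roughly 0.2, i.e. not significant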

  13. BPLOM: BPM Level-Oriented Methodology for Incremental Business Process Modeling and Code Generation on Mobile Platforms

    Directory of Open Access Journals (Sweden)

    Jaime Solis Martines

    2013-06-01

    Full Text Available The requirements engineering phase is the departure point for the development process of any kind of computer application; it determines the functionality needed in the working scenario of the program. Although this is a crucial point in application development, as incorrect requirement definition leads to costly error appearance in later stages of the development process, application domain experts' involvement remains minor. In order to correct this scenario, business process modeling notations were introduced to favor business experts' involvement in this phase, but notation complexity prevents this participation from reaching its ideal state. Hence, we promote the definition of a level-oriented business process methodology, which encourages the adaptation of the modeling notation to the modeling and technical knowledge shown by the expert. This approach reduces the complexity found by domain experts and enables them to model their processes completely with a level of technical detail directly proportional to their knowledge.

  14. EMS and process of identification and evaluation of environmental aspects: a proposal methodology

    International Nuclear Information System (INIS)

    Perotto, E.

    2006-01-01

    The Environmental Management System (EMS) is an instrument to manage the interaction between the organization and the environment. The scope of EMS is to reduce the environmental impact and to achieve improvements in overall performance. In particular, the focus point of EMS implementation is the method for identifying and assessing significant environmental aspects. The results of the literature and regulation reviews (Perotto 2006) have shown that rigorous, repeatable and transparent methodologies do not exist. This paper presents a proposed method for identifying and assessing significant environmental aspects that has all three of these important characteristics. In particular, the proposed methodology for assessing aspects is based on some criteria that are combined in a specific algorithm. It is important to specify that, to make a correct application of the method, a preliminary rigorous approach to investigating the environment and the activities of organizations is necessary.

  15. External Costs and Benefits of Energy. Methodologies, Results and Effects on Renewable Energies Competitivity

    International Nuclear Information System (INIS)

    Saez, R.; Cabal, H.; Varela, M.

    1999-01-01

    This study attempts to give a summarised vision of the concept of externality in energy production, and of the social and economic usefulness of its evaluation and its consideration as support for political decision-making in environmental regulation matters, technology selection for new plants, the establishment of priorities in energy plans, etc. The most relevant environmental externalities are described, namely the effects on health, ecosystems, materials and climate, as well as some of the socioeconomic externalities such as employment, increase of the GDP and the reduction and depletion of energy resources. Different methodologies used during the last years have been reviewed, as well as the principal results obtained in the most relevant studies accomplished internationally on this topic. Special mention is deserved by the European study 'National Implementation of the ExternE Methodology in the EU'. Results obtained are represented in Table 2 of this study. The results obtained in the evaluation of the environmental externalities of the Spanish electrical system as a function of the fuel cycle are also presented in a summarised way. In this last case the results are more approximate, since they have been obtained by extrapolation from those obtained for ten representative plants geographically distributed throughout the Peninsula. Finally, the influence that the internalization of the external costs of conventional energies can have on the competitiveness and on the market of renewable energies, which cause fewer environmental effects and therefore produce much smaller external costs, has been analysed. The mechanisms of internalization, and whether or not it is convenient to incorporate these costs into the price of energy, have also been discussed. (Author) 30 refs

  16. Challenges and Opportunities for Harmonizing Research Methodology: Raw Accelerometry.

    Science.gov (United States)

    van Hees, Vincent T; Thaler-Kall, Kathrin; Wolf, Klaus-Hendrik; Brønd, Jan C; Bonomi, Alberto; Schulze, Mareike; Vigl, Matthäus; Morseth, Bente; Hopstock, Laila Arnesdatter; Gorzelniak, Lukas; Schulz, Holger; Brage, Søren; Horsch, Alexander

    2016-12-07

    Raw accelerometry is increasingly being used in physical activity research, but diversity in sensor design, attachment and signal processing challenges the comparability of research results. Therefore, efforts are needed to harmonize the methodology. In this article we reflect on how increased methodological harmonization may be achieved. The authors of this work convened for a two-day workshop (March 2014) themed on methodological harmonization of raw accelerometry. The discussions at the workshop were used as a basis for this review. Key stakeholders were identified as manufacturers, method developers, method users (application), publishers, and funders. To facilitate methodological harmonization in raw accelerometry the following action points were proposed: i) Manufacturers are encouraged to provide a detailed specification of their sensors, ii) Each fundamental step of algorithms for processing raw accelerometer data should be documented, and ideally also motivated, to facilitate interpretation and discussion, iii) Algorithm developers and method users should be open about uncertainties in the description of data and the uncertainty of the inference itself, iv) All new algorithms which are pitched as "ready for implementation" should be shared with the community to facilitate replication and ongoing evaluation by independent groups, and v) A dynamic interaction between method stakeholders should be encouraged to facilitate a well-informed harmonization process. The workshop led to the identification of a number of opportunities for harmonizing methodological practice. The discussion as well as the practical checklists proposed in this review should provide guidance for stakeholders on how to contribute to increased harmonization.

  17. The use of fracture mechanics methodologies for NDT results evaluation and comparison

    International Nuclear Information System (INIS)

    Reale, S.

    1995-01-01

    In the general frame of analysing the interactions between the information from non-destructive evaluation (NDE) and the methodologies to assess the integrity of a defective structure (such as fracture mechanics), the aim of the paper is to analyse and compare, in terms of indices related to safety margins, NDE results from round robin testing trials to achieve assessments of capabilities and limitations. A structural integrity/fracture mechanics approach for evaluating and comparing results from non-destructive techniques is presented. Safety factors can be associated to flaws detected and characterized by inspections (estimated flaws) and to flaws actually present (reference flaws). The mismatch between safety factors associated to estimated flaws and safety factors associated to reference flaws can be used to assess capabilities and limitations of procedures and techniques in use for inspections. As an example, to show how the above procedure is applied and its potential as a method of data evaluation and comparison, the NDE results produced by the PISC (project for the inspection of steel components) activity have been considered. (orig.)

  18. Concept Generation for Design Creativity A Systematized Theory and Methodology

    CERN Document Server

    Taura, Toshiharu

    2013-01-01

    The concept generation process seems like an intuitional thought: difficult to capture and perform, although everyone is capable of it. It is not an analytical process but a synthetic process which has yet to be clarified. Furthermore, new research methods for investigating the concept generation process—a very difficult task since the concept generation process is driven by inner feelings deeply etched in the mind—are necessary to establish its theory and methodology.  Concept Generation for Design Creativity—A Systematized Theory and Methodology presents the concept generation process both theoretically and methodologically. Theoretically, the concept generation process is discussed by comparing metaphor, abduction, and General Design Theory from the perspective of similarities and dissimilarities. Property mapping, concept blending, and concept integration in thematic relation have been explained methodologically. So far, these theories and methods have been discussed independently, and the relation...

  19. Methodology for the development and the UML (unified modified language) simulation of data acquisition and data processing systems dedicated to high energy physics experiments

    International Nuclear Information System (INIS)

    Anvar, S.

    2002-09-01

    The increasing complexity of the real-time data acquisition and processing systems (TDAQ: the so called Trigger and Data AcQuisition systems) in high energy physics calls for an appropriate evolution of development tools. This work is about the interplay between in principle specifications of TDAQ systems and their actual design and realization on a concrete hardware and software platform. The basis of our work is to define a methodology for the development of TDAQ systems that meets the specific demands for the development of such systems. The result is the detailed specification of a 'methodological framework' based on the Unified Modeling Language (UML) and designed to manage a development process. The use of this UML-based methodological framework progressively leads to the setting up of a 'home-made' framework, i.e. a development tool that comprises reusable components and generic architectural elements adapted to TDAQ systems. The main parts of this dissertation are sections II to IV. Section II is devoted to the characterization and evolution of TDAQ systems. In section III, we review the main technologies that are relevant to our problematic, namely software reuse techniques such as design patterns and frameworks, especially concerning the real-time and embedded systems domain. Our original conceptual contribution is presented in section IV, where we give a detailed, formalized and example-driven specification of our development model. Our final conclusions are presented in section V, where we present the MORDICUS project devoted to a concrete realization of our UML methodological framework, and the deep affinities between our work and the emerging 'Model Driven Architecture' (MDA) paradigm developed by the Object Management Group. (author)

  20. Relating Reasoning Methodologies in Linear Logic and Process Algebra

    Directory of Open Access Journals (Sweden)

    Yuxin Deng

    2012-11-01

    Full Text Available We show that the proof-theoretic notion of logical preorder coincides with the process-theoretic notion of contextual preorder for a CCS-like calculus obtained from the formula-as-process interpretation of a fragment of linear logic. The argument makes use of other standard notions in process algebra, namely a labeled transition system and a coinductively defined simulation relation. This result establishes a connection between an approach to reason about process specifications and a method to reason about logic specifications.

  1. D4.1 Learning analytics: theoretical background, methodology and expected results

    NARCIS (Netherlands)

    Tammets, Kairit; Laanpere, Mart; Eradze, Maka; Brouns, Francis; Padrón-Nápoles, Carmen; De Rosa, Rosanna; Ferrari, Chiara

    2014-01-01

    The purpose of the EMMA project is to showcase excellence in innovative teaching methodologies and learning approaches through the large-scale piloting of MOOCs on different subjects. The main objectives related to the implementation of learning analytics in the EMMA project are to: ● develop the

  2. The Key Principles of Process Manager Motivation in Production and Administration Processes in an Industrial Enterprise

    Directory of Open Access Journals (Sweden)

    Chromjakova Felicita

    2016-03-01

    Full Text Available The basic premise of sustainable development is that companies should completely re-evaluate their enterprise work logic and process organization. Most of the necessary changes concern employee stimulation and motivation. If we are truly interested in improving business results and the effectiveness of business processes – there would be no progress otherwise – we have to strive to break down the barriers between company management (leadership) and employees in order to establish effective relationships between firms and customers. This paper presents research results on process manager activities in modern industrial enterprises, together with a proposed methodology for systematically oriented motivation of employees by the process manager, in line with the increased competitiveness of production and administration processes. It also presents an effective methodology for increasing the positive effects of well-defined employee motivation from the process manager's perspective. The core benefit of this methodology lies in the design of a systematic approach to the motivation process from the process manager's side, allowing for radical performance improvement via production and administrative processes and the increased competitiveness of enterprise processes.

  3. Application of theoretical and methodological components of nursing care

    Directory of Open Access Journals (Sweden)

    Rosa del Socorro Morales-Aguilar

    2016-12-01

    Full Text Available Introduction: the theoretical and methodological components constitute nursing's own body of knowledge; they refer to models, theories, the care process, the taxonomy of nursing diagnoses, the nursing interventions classification system, and the outcomes classification system, which ground nursing care in professional practice. Methodology: a search was performed on Google Scholar, reviewing the Scielo, Ciberindex, Index Enfermería, Dialnet, Redalyc and Medline databases, identifying 70 articles published between 2005 and 2015 and selecting 52 of them. The keywords used were: nursing care, nursing diagnosis, classification, nursing theory, in Spanish and Portuguese. Results: students in training receive knowledge of the nursing process, NANDA International, the interventions classification, nursing outcomes and the theoretical components. The theories of Dorothea Orem, Callista Roy, Nola Pender, Virginia Henderson, Florence Nightingale, and Betty Neuman are applied. The application of the nursing process is limited, and low familiarity with the international taxonomies is observed among nursing professionals in the care setting. Conclusions: the challenge for nursing is to continue consolidating its scientific knowledge and to close the gap between theory and practice.

  4. Investigation on the Effects of Process Parameters on Laser Percussion Drilling Using Finite Element Methodology; Statistical Modelling and Optimization

    Directory of Open Access Journals (Sweden)

    Mahmoud Moradi

    Full Text Available Abstract In the present research, the simulation of the Nickel-base superalloy Inconel 718 fiber-laser drilling process with the thickness of 1mm is investigated through the Finite Element Method. In order to specify the appropriate Gaussian distribution of laser beam, the results of an experimental research on glass laser drilling were simulated using three types of Gaussian distribution. The DFLUX subroutine was used to implement the laser heat sources of the models using the Fortran language. After the appropriate Gaussian distribution was chosen, the model was validated with the experimental results of the Nickel-base superalloy Inconel 718 laser drilling process. The negligible error percentage among the experimental and simulation results demonstrates the high accuracy of this model. The experiments were performed based on the Response Surface Methodology (RSM as a statistical design of experiment (DOE approach to investigate the influence of process parameters on the responses, obtaining the mathematical regressions and predicting the new results. Four parameters i.e. laser pulse frequency (150 to 550 Hz, laser power (200 to 500 watts, laser focal plane position (-0.5 to +0.5 mm and the duty cycle (30 to 70% were considered to be the input variables in 5 levels and four external parameters i.e. the hole's entrance and exit diameters, hole taper angle and the weight of mass removed from the hole, were observed to be the process output responses of this central composite design. By performing the statistical analysis, the input and output parameters were found to have a direct relation with each other. By an increase in each of the input variables, the entrance and exit hole diameters, the hole taper angel, and the weight of mass removed from the hole increase. Finally, the results of the conducted simulations and statistical analyses having been used, the laser drilling process was optimized by means of the desire ability approach. Good

  5. Virtual enterprise architecture and methodology - Initial results from the Globeman21 project

    DEFF Research Database (Denmark)

    Vesterager, Johan; Larsen, Lars Bjørn; Gobbi, Chiara

    1999-01-01

    This paper will focus on presenting the initial results from the IMS project Globeman21 regarding generic models for Extended Enterprise Management (EEM). In particular, the paper outlines a proposed architecture for the creation of virtual enterprises, industrial requirements regarding the generic models, terminology for describing extended enterprises, and initial considerations regarding a methodology for EEM. Globeman21 sees the extended enterprise as a concept covering the totality of different concepts dealing with the expansion or extension of enterprise activities. One way of realising the concept of the extended enterprise is through the creation of a virtual enterprise, based on a more or less formalised network. This approach is the basis for the development of the generic EEM model within Globeman21.

  6. Use of cesium-137 methodology in the evaluation of superficial erosive processes

    International Nuclear Information System (INIS)

    Andrello, Avacir Casanova; Appoloni, Carlos Roberto; Guimaraes, Maria de Fatima; Nascimento Filho, Virgilio Franco do

    2003-01-01

    Superficial erosion is one of the main agents of soil degradation, and estimating erosion rates for different edaphoclimatic conditions with conventional models such as USLE and RUSLE is expensive and time-consuming. The use of the anthropogenic radionuclide cesium-137 is a newer methodology that has been studied extensively, and its application to the evaluation of soil erosion has grown in countries such as the USA, UK, Australia and others. A brief overview of this methodology is presented, together with the development of the equations used to quantify erosion rates from the cesium-137 measurements. Two watersheds studied in Brazil showed that the cesium-137 methodology was practicable and consistent with field surveys for applications in erosion studies. (author)

  7. ''Training plan optimized design'' methodology application to IBERDROLA - Power generation

    International Nuclear Information System (INIS)

    Gil, S.; Mendizabal, J.L.

    1996-01-01

    The trend in both Europe and the United States towards the understanding that no training plan may be considered suitable unless backed by the results of applying the S.A.T. (Systematic Approach to Training) methodology led TECNATOM, S.A. to apply this methodology through the development of an application specific to the conditions of the Spanish working system. The requirement that the design of the training be coherent with the realities of the working environment is met by systematic application of the SAT methodology as part of the work analysis and job-based task analysis processes, which serve as a basis for the design of the training plans

  8. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    Science.gov (United States)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques have been applied to reservoirs even before their natural drive energy is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment in a candidate reservoir. However, the published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters that could be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are
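
    A minimal sketch of the proxy-model idea described above is given below. It assumes a generic feedforward network from scikit-learn in place of the authors' cascade architecture, and the reservoir properties, ranges and response are synthetic placeholders rather than data from the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical inputs that would normally come from a commercial reservoir simulator:
# [porosity, permeability (mD), oil viscosity (cP), injection rate, well spacing (m)]
X = rng.uniform([0.05, 1.0, 0.5, 100.0, 200.0],
                [0.35, 500.0, 5.0, 1000.0, 800.0], size=(500, 5))
# Hypothetical response: cumulative oil production (arbitrary units)
y = 1e4 * X[:, 0] * np.log(X[:, 1]) / X[:, 2] + 5.0 * X[:, 3] - 2.0 * X[:, 4]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Feedforward network acting as a proxy (surrogate) for the simulator
proxy = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=5000, random_state=0),
)
proxy.fit(X_train, y_train)
print("proxy R^2 on held-out simulator runs:", round(proxy.score(X_test, y_test), 3))
```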

  9. Optimization of Laser Transmission Joining Process Parameters on Joint Strength of PET and 316 L Stainless Steel Joint Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Shashi Prakash Dwivedi

    2014-01-01

    Full Text Available The objective of the present work is to study the effects of laser power, joining speed and stand-off distance on the joint strength of a PET and 316 L stainless steel joint. The process parameters were optimized using response surface methodology to achieve good joint strength. A central composite design (CCD) was used to plan the experiments, and response surface methodology (RSM) was employed to develop a mathematical model relating the laser transmission joining parameters to the desired response (joint strength). From the ANOVA (analysis of variance), it was concluded that laser power contributes the most, followed by joining speed and stand-off distance. Within the range of process parameters, the results show that joint strength increases as laser power increases and as joining speed increases. The joint strength increases with the stand-off distance until it reaches the centre value, and then starts to decrease as the stand-off distance increases beyond the centre limit. The optimum values of laser power, joining speed and stand-off distance were found to be 18 watt, 100 mm/min and 2 mm, giving the maximum joint strength (predicted: 88.48 MPa). There was approximately 3.37% error between the experimental and modelled results of joint strength.
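
    As a hedged illustration of the RSM step (not the paper's actual runs or model), the sketch below builds the coded runs of a face-centred central composite design for the three factors and fits a second-order polynomial to synthetic strength values by least squares.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)

# Coded runs of a face-centred central composite design for 3 factors:
# laser power, joining speed, stand-off distance (in coded -1..+1 units)
corners = np.array(list(product([-1.0, 1.0], repeat=3)))
axial = np.vstack([v * np.eye(3)[i] for i in range(3) for v in (-1.0, 1.0)])
center = np.zeros((3, 3))
D = np.vstack([corners, axial, center])

# Synthetic "measured" joint strengths (MPa) for each run
y = 70 + 8*D[:, 0] + 3*D[:, 1] - 4*D[:, 2]**2 + rng.normal(0, 0.5, len(D))

def model_matrix(d):
    x1, x2, x3 = d[:, 0], d[:, 1], d[:, 2]
    return np.column_stack([np.ones(len(d)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

# Second-order response surface fitted by ordinary least squares
beta, *_ = np.linalg.lstsq(model_matrix(D), y, rcond=None)
print("fitted coefficients:", np.round(beta, 2))
```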

  10. Improving process methodology for measuring plutonium burden in human urine using fission track analysis

    International Nuclear Information System (INIS)

    Krahenbuhl, M.P.; Slaughter, D.M.

    1998-01-01

    The aim of this paper is to clearly define the chemical and nuclear principles governing Fission Track Analysis (FTA) used to determine environmental levels of 239Pu in urine. The paper also addresses deficiencies in the FTA methodology and introduces improvements to make FTA a more reliable research tool. Our refined methodology, described herein, includes a chemically-induced precipitation phase, followed by anion exchange chromatography, and employs a chemical tracer, 236Pu. We have established an inverse correlation between Pu recovery and sample volume, and our data confirm that increases in sample volume do not result in higher accuracy or lower detection limits. We conclude that in subsequent studies samples should be limited to approximately two liters. The Pu detection limit for a sample of this volume is 2.8 μBq/l. (author)

  11. Assessment of public acceptability in site selection process. The methodology and the results

    International Nuclear Information System (INIS)

    Zeleznik, N.; Kralj, M.; Polic, M.; Kos, D.; Pek Drapal, D.

    2005-01-01

    The site selection process for the low and intermediate level radioactive waste (LILW) repository in Slovenia follows the mixed mode approach according to the model proposed by the IAEA. After finishing the conceptual and planning stage in 1999, and after identification of the potentially suitable areas in the area survey stage in 2001, ARAO (Agency for radwaste management) invited all municipalities to volunteer in the procedure of siting the LILW repository in the physical environment. A positive response was received from eight municipalities, although three of them later withdrew. A selection among the twelve locations in the remaining five municipalities had to be made, because the Slovenian procedure provides for only three locations to be further evaluated in the stage of identification of potentially suitable sites. A pre-feasibility study of public acceptability, together with the technical aspects (safety, technical functionality, economic, environmental and spatial aspects), was performed. The public acceptability aspect included objective and subjective evaluation criteria. The former included information obtained from demographic studies, data on the local economy, infrastructure and possible environmental problems, media analysis, and earlier public opinion polls. The latter included data obtained from topical workshops, a free phone line, telephone interviews with the general public and personal interviews with representatives of decision makers and public opinion leaders, as well as a public opinion poll in all the included communities. The evaluated municipalities were ranked according to their social suitability for the radioactive waste site. (author)

  12. Optimization of biodiesel production process for mixed Jatropha curcas–Ceiba pentandra biodiesel using response surface methodology

    International Nuclear Information System (INIS)

    Dharma, S.; Masjuki, H.H.; Ong, Hwai Chyuan; Sebayang, A.H.; Silitonga, A.S.; Kusumo, F.; Mahlia, T.M.I.

    2016-01-01

    Highlights: • Jatropha curcas and Ceiba pentandra are potential feedstocks for biodiesel. • Optimization of biodiesel production by response surface methodology. • The Jatropha curcas–Ceiba pentandra mixed biodiesel yield was 93.33%. • The properties of the mixed biodiesel fulfill the ASTM D6751 standard. - Abstract: Exploring and improving biodiesel production from non-edible vegetable oil is one of the effective ways to overcome the limited supply of traditional raw materials and their high prices. The main objective of this study is to optimize the biodiesel production process parameters (methanol-to-oil ratio, agitation speed and potassium hydroxide catalyst concentration) for a biodiesel derived from the non-edible feedstocks Jatropha curcas and Ceiba pentandra, using response surface methodology based on a Box–Behnken experimental design. Based on the results, the optimum operating parameters for transesterification of the J50C50 oil mixture at 60 °C over a period of 2 h are as follows: methanol-to-oil ratio 30%, agitation speed 1300 rpm and catalyst concentration 0.5 wt.%. These optimum operating parameters give the highest yield for the J50C50 biodiesel, with a value of 93.33%. The results show a significant improvement in the physicochemical properties of the J50C50 biodiesel after optimization, whereby the kinematic viscosity at 40 °C, density at 15 °C, calorific value, acid value and oxidation stability are 3.950 mm²/s, 831.2 kg/m³, 40.929 MJ/kg, 0.025 mg KOH/g and 10.01 h, respectively. The physicochemical properties of the optimized J50C50 biodiesel fulfill the requirements given in the ASTM D6751 and EN 14214 standards.
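
    For illustration only (the actual factor levels are not given in this record), the sketch below builds the coded 3-factor Box–Behnken design used to plan such transesterification runs and decodes it into assumed ranges for methanol-to-oil ratio, agitation speed and KOH concentration.

```python
import numpy as np

# Coded Box-Behnken design for 3 factors: 12 edge-midpoint runs + 3 centre runs
bb = np.array([[a, b, 0] for a in (-1, 1) for b in (-1, 1)] +
              [[a, 0, c] for a in (-1, 1) for c in (-1, 1)] +
              [[0, b, c] for b in (-1, 1) for c in (-1, 1)] +
              [[0, 0, 0]] * 3, dtype=float)

# Assumed (hypothetical) low/high levels for methanol-to-oil ratio (%),
# agitation speed (rpm) and KOH catalyst concentration (wt.%)
lows = np.array([15.0, 700.0, 0.5])
highs = np.array([45.0, 1900.0, 1.5])

runs = lows + (bb + 1.0) / 2.0 * (highs - lows)   # decode to natural units
for r in runs:
    print(f"MeOH:oil {r[0]:5.1f} %   speed {r[1]:6.0f} rpm   KOH {r[2]:4.2f} wt.%")
```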

  13. Solid Waste Management Planning--A Methodology

    Science.gov (United States)

    Theisen, Hilary M.; And Others

    1975-01-01

    This article presents a twofold solid waste management plan consisting of a basic design methodology and a decision-making methodology. The former provides a framework for the developing plan while the latter builds flexibility into the design so that there is a model for use during the planning process. (MA)

  14. Methodology for studying social advertising: A sociological aspect

    Directory of Open Access Journals (Sweden)

    S B Kalmykov

    2014-12-01

    Full Text Available The article describes the author's dynamic processual methodology for the sociological study of social advertising, which combines the multiversion paradigmatic approach, legitimization procedures, the methodological principles of interconnection and multilevel analysis, and the principles of sociological data formalization developed by P. Lazarsfeld. The author explains the multi-stage strategy of the methodology and the research procedures that provide new sociological knowledge about the processes of social advertising. The first stage involves analysis of social advertising as a set of institutional, communicative, socio-cultural and socio-technological processes. The second stage consists of developing the substantive aspects of social advertising dynamics and its dependence on the features of different socio-demographic groups. The third stage includes a comparative analysis of the theoretical and empirical aspects of social advertising and a subsequent assessment of its fundamental and applied capabilities. The author identifies two types of research practices, the first of which consists of three levels of complexity: the first level is to design the categories and concepts of social advertising; the second requires a higher level of generalization; the third involves justifying a universal categorization and conceptualization of social advertising for different social areas, as well as a comparative analysis of the theory of social advertising impact developed by O.O. Savel'eva against the research results, for the purpose of advancing the sociology of advertising. The article concludes with a demonstration of the universality of the proposed methodology for different spheres of social reality.

  15. Q methodology, risk training and quality management.

    Science.gov (United States)

    McKeown, M; Hinks, M; Stowell-Smith, M; Mercer, D; Forster, J

    1999-01-01

    The results of a Q methodological study of professional understandings of the notion of risk in mental health services within the UK are discussed in relation to the relevance for staff training and quality assurance. The study attempted to access the diversity of understandings of risk issues amongst a multi-professional group of staff (n = 60) attending inter-agency risk training workshops in 1998. Q methodology is presented as both an appropriate means for such inquiry and as a novel experiential technique for training purposes. A tentative argument is advanced that the qualitative accounts generated by Q research could assist in systematic reviews of quality, complementing the singularly quantitative approaches typically represented in the audit process.

  16. Application of PRINCE2 Project Management Methodology

    Directory of Open Access Journals (Sweden)

    Vaníčková Radka

    2017-09-01

    Full Text Available The paper describes the principles of setting up a project under PRINCE2 project management. The main aim of the paper is to implement the PRINCE2 methodology in an enterprise in the service industry. A partial aim is to choose a supplier for the project from among new travel guides. The result of the project activity is a sight-seeing tour/service that is more attractive to customers in the tourism industry, as well as a possible source of new job opportunities. The added value of the article is the description of applying the principles, processes and themes of PRINCE2 project management so that they might be used in the field.

  17. Revised INPRO Methodology in the Area of Proliferation Resistance

    International Nuclear Information System (INIS)

    Park, J.H.; Lee, Y.D.; Yang, M.S.; Kim, J.K.; Haas, E.; Depisch, F.

    2008-01-01

    The official INPRO User Manual in the area of proliferation resistance is being prepared for the evaluation of innovative nuclear energy systems. Proliferation resistance is one of the goals to be satisfied by future nuclear energy systems in INPRO. The features of the currently updated and released INPRO methodology are introduced in terms of basic principles, user requirements and indicators. The criteria for acceptance limits are specified. The DUPIC fuel cycle was evaluated based on the updated INPRO methodology to test the applicability of the INPRO User Manual. However, the INPRO methodology has some difficulty in quantifying multiplicity and robustness, as well as the total cost of improving proliferation resistance. Moreover, the method for integrating the evaluation results still needs to be improved.

  18. A Methodology for Integrating Maintainability Using Software Metrics

    OpenAIRE

    Lewis, John A.; Henry, Sallie M.

    1989-01-01

    Maintainability must be integrated into software early in the development process. But for practical use, the techniques used must be as unobtrusive to the existing software development process as possible. This paper defines a methodology for integrating maintainability into large-scale software and describes an experiment which implemented the methodology into a major commercial software development environment.

  19. System assessment using modular logic fault tree methodology

    International Nuclear Information System (INIS)

    Troncoso Fleitas, M.

    1996-01-01

    In the course of a Probabilistic Safety Analysis (PSA) study, a large number of fault trees are generated by different specialists. The Modular Logic Fault Tree Methodology paves the way to systematizing the procedures and unifying the criteria in the process of system modelling. An example of the application of this methodology is shown

  20. Optimisation of process parameters on thin shell part using response surface methodology (RSM) and genetic algorithm (GA)

    Science.gov (United States)

    Faiz, J. M.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    This study performs a simulation-based optimisation of injection moulding process parameters using Autodesk Moldflow Insight (AMI) software. The process parameters considered are melt temperature, mould temperature, packing pressure and cooling time, and their effect on the warpage of the part is analysed. The part selected for study is made of polypropylene (PP). The combinations of process parameters are analysed using Analysis of Variance (ANOVA), and the optimised values are obtained using Response Surface Methodology (RSM). RSM and a Genetic Algorithm (GA) are applied in the Design Expert software in order to minimise the warpage value. The outcome of this study shows that the warpage value is improved by using RSM and GA.
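
    A hedged sketch of the optimisation step is given below: a simple genetic algorithm searches melt temperature, mould temperature, packing pressure and cooling time for minimum warpage. The quadratic warpage function stands in for the RSM regression fitted in Design Expert, and its coefficients and the parameter bounds are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
lo = np.array([200.0, 30.0, 60.0, 5.0])    # assumed lower bounds (°C, °C, MPa, s)
hi = np.array([260.0, 70.0, 90.0, 20.0])   # assumed upper bounds

def warpage(x):
    # Stand-in for the fitted RSM regression; coefficients are hypothetical
    z = (x - lo) / (hi - lo)
    return 0.9 - 0.4*z[0] + 0.3*z[1] - 0.2*z[2] - 0.1*z[3] + 0.5*z[0]**2 + 0.3*z[1]*z[2]

pop = rng.uniform(lo, hi, size=(40, 4))            # initial random population
for gen in range(100):
    fitness = np.array([warpage(p) for p in pop])
    parents = pop[np.argsort(fitness)[:20]]        # selection: keep the best half
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(0, 20, 2)]
        child = np.where(rng.random(4) < 0.5, a, b)     # uniform crossover
        child += rng.normal(0, 0.02, 4) * (hi - lo)     # Gaussian mutation
        children.append(np.clip(child, lo, hi))
    pop = np.vstack([parents, children])

best = pop[np.argmin([warpage(p) for p in pop])]
print("suggested settings:", np.round(best, 1), " warpage:", round(warpage(best), 3))
```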

  1. Methodology to remediate a mixed waste site

    International Nuclear Information System (INIS)

    Berry, J.B.

    1994-08-01

    In response to the need for a comprehensive and consistent approach to the complex issue of mixed waste management, a generalized methodology for remediation of a mixed waste site has been developed. The methodology is based on requirements set forth in the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and the Resource Conservation and Recovery Act (RCRA) and incorporates ''lessons learned'' from process design, remediation methodologies, and remediation projects. The methodology is applied to the treatment of 32,000 drums of mixed waste sludge at the Oak Ridge K-25 Site. Process technology options are developed and evaluated, first with regard to meeting system requirements and then with regard to CERCLA performance criteria. The following process technology options are investigated: (1) no action, (2) separation of hazardous and radioactive species, (3) dewatering, (4) drying, and (5) solidification/stabilization. The first two options were eliminated from detailed consideration because they did not meet the system requirements. A quantitative evaluation clearly showed that, based on system constraints and project objectives, either dewatering or drying the mixed waste sludge was superior to the solidification/stabilization process option. The ultimate choice between the drying and the dewatering options will be made on the basis of a technical evaluation of the relative merits of proposals submitted by potential subcontractors

  2. Methodology to remediate a mixed waste site

    Energy Technology Data Exchange (ETDEWEB)

    Berry, J.B.

    1994-08-01

    In response to the need for a comprehensive and consistent approach to the complex issue of mixed waste management, a generalized methodology for remediation of a mixed waste site has been developed. The methodology is based on requirements set forth in the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and the Resource Conservation and Recovery Act (RCRA) and incorporates ``lessons learned`` from process design, remediation methodologies, and remediation projects. The methodology is applied to the treatment of 32,000 drums of mixed waste sludge at the Oak Ridge K-25 Site. Process technology options are developed and evaluated, first with regard to meeting system requirements and then with regard to CERCLA performance criteria. The following process technology options are investigated: (1) no action, (2) separation of hazardous and radioactive species, (3) dewatering, (4) drying, and (5) solidification/stabilization. The first two options were eliminated from detailed consideration because they did not meet the system requirements. A quantitative evaluation clearly showed that, based on system constraints and project objectives, either dewatering or drying the mixed waste sludge was superior to the solidification/stabilization process option. The ultimate choice between the drying and the dewatering options will be made on the basis of a technical evaluation of the relative merits of proposals submitted by potential subcontractors.

  3. Cost estimating for CERCLA remedial alternatives a unit cost methodology

    International Nuclear Information System (INIS)

    Brettin, R.W.; Carr, D.J.; Janke, R.J.

    1995-06-01

    The United States Environmental Protection Agency (EPA) Guidance for Conducting Remedial Investigations and Feasibility Studies Under CERCLA, Interim Final, dated October 1988 (EPA 1988) requires a detailed analysis be conducted of the most promising remedial alternatives against several evaluation criteria, including cost. To complete the detailed analysis, order-of-magnitude cost estimates (having an accuracy of +50 percent to -30 percent) must be developed for each remedial alternative. This paper presents a methodology for developing cost estimates of remedial alternatives comprised of various technology and process options with a wide range of estimated contaminated media quantities. In addition, the cost estimating methodology provides flexibility for incorporating revisions to remedial alternatives and achieves the desired range of accuracy. It is important to note that the cost estimating methodology presented here was developed as a concurrent path to the development of contaminated media quantity estimates. This methodology can be initiated before contaminated media quantities are estimated. As a result, this methodology is useful in developing cost estimates for use in screening and evaluating remedial technologies and process options. However, remedial alternative cost estimates cannot be prepared without the contaminated media quantity estimates. In the conduct of the feasibility study for Operable Unit 5 at the Fernald Environmental Management Project (FEMP), fourteen remedial alternatives were retained for detailed analysis. Each remedial alternative was composed of combinations of remedial technologies and processes which were earlier determined to be best suited for addressing the media-specific contaminants found at the FEMP site, and achieving desired remedial action objectives

  4. A methodology to determine the level of automation to improve the production process and reduce the ergonomics index

    Science.gov (United States)

    Chan-Amaya, Alejandro; Anaya-Pérez, María Elena; Benítez-Baltazar, Víctor Hugo

    2017-08-01

    Companies are constantly looking for improvements in productivity to increase their competitiveness, and the use of automation technologies has proven to be an effective tool for achieving this. Some companies, however, are not familiar with the process of acquiring automation technologies; they therefore abstain from investing and miss the opportunity to take advantage of them. The present document proposes a methodology to determine the appropriate level of automation for the production process, avoiding unnecessary automation and improving production while taking the ergonomics factor into consideration.

  5. Methodology for Automatic Ontology Generation Using Database Schema Information

    Directory of Open Access Journals (Sweden)

    JungHyen An

    2018-01-01

    Full Text Available An ontology is a modelling language that supports functions to integrate conceptually distributed domain knowledge and to infer relationships among the concepts. Ontologies are developed based on the target domain knowledge. As a result, methodologies that automatically generate an ontology from metadata characterizing the domain knowledge are becoming important. However, existing methodologies for automatically generating an ontology from metadata require the domain metadata to be provided in a predetermined template, and it is difficult to manage the data accumulated in the ontology itself when the domain OWL (Web Ontology Language) individuals increase continuously. A database schema captures features of the domain knowledge and provides structural functions to process the knowledge-based data efficiently. In this paper, we propose a methodology to automatically generate ontologies and manage OWL individuals through an interaction between the database and the ontology. We describe the automatic ontology generation process with an example schema and demonstrate the effectiveness of the automatically generated ontology by comparing it with existing ontologies using an ontology quality score.
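
    A minimal sketch of the table-to-class mapping idea is shown below, assuming rdflib and a hypothetical two-table schema. The methodology in the paper involves more mapping rules, and a practical implementation would read the schema from the database catalogue rather than a hard-coded dictionary.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import OWL, RDF, RDFS, XSD

schema = {                       # hypothetical schema: table -> list of (column, sql_type)
    "Patient": [("patient_id", "INTEGER"), ("name", "VARCHAR")],
    "Visit":   [("visit_id", "INTEGER"), ("visit_date", "DATE")],
}
SQL2XSD = {"INTEGER": XSD.integer, "VARCHAR": XSD.string, "DATE": XSD.date}

EX = Namespace("http://example.org/onto#")
g = Graph()
g.bind("ex", EX)

for table, columns in schema.items():
    cls = EX[table]
    g.add((cls, RDF.type, OWL.Class))                  # each table becomes an OWL class
    g.add((cls, RDFS.label, Literal(table)))
    for col, sqltype in columns:
        prop = EX[f"{table}_{col}"]
        g.add((prop, RDF.type, OWL.DatatypeProperty))  # each column becomes a property
        g.add((prop, RDFS.domain, cls))
        g.add((prop, RDFS.range, SQL2XSD.get(sqltype, XSD.string)))

print(g.serialize(format="turtle"))
```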

  6. Assessing health in an urban neighborhood: community process, data results and implications for practice.

    Science.gov (United States)

    Idali Torres, M

    1998-06-01

    This article examines the community process and data results of a health assessment conducted in an urban neighborhood of a mid-size city in Western Massachusetts. It describes the four-stage development process of the Health Assessment Project (HAP), a collaboration of UMASS School of Public Health faculty and students, community-based organizations and youth residents: (1) planning with a contemporary participatory approach, (2) implementing the data collection with traditional survey methodology, (3) tailoring the data analysis for presentation at a community forum and in a report, and (4) incorporating the community's reaction to the data results. In addition, it presents selected data on the health conditions of individual household members and on perceived community health concerns and resources. Salient results include high rates of chronic health conditions such as asthma and other respiratory problems among residents aged 0-18, back pain and other musculoskeletal problems among younger adults aged 19-54, and high blood pressure and other cardio-circulatory problems among adults aged 55 and older. The three most prevalent perceived community concerns are substance abuse, gangs and drug dealing. Identified community resources include (1) primary care providers, (2) sources of health information such as family/friends and Spanish-language media, and (3) venues for social activity such as churches and schools. Finally, this paper concludes by discussing implications for community health practice.

  7. Methodology for identifying boundaries of systems important to safety in CANDU nuclear power plants

    International Nuclear Information System (INIS)

    Therrien, S.; Komljenovic, D.; Therrien, P.; Ruest, C.; Prevost, P.; Vaillancourt, R.

    2007-01-01

    This paper presents a methodology developed to identify the boundaries of the systems important to safety (SIS) at the Gentilly-2 Nuclear Power Plant (NPP), Hydro-Quebec. The identification of SIS boundaries considers nuclear safety only. Components that are not identified as important to safety are systematically classified as related to safety. A global assessment process such as the WANO/INPO AP-913 'Equipment Reliability Process' will be needed to implement adequate changes in the management rules for those components. The paper presents the results of applying the methodology to Shutdown Systems 1 and 2 (SDS 1, 2) and to the Emergency Core Cooling System (ECCS). This validation process made it possible to fine-tune the methodology, to better estimate the effort required to evaluate a system, and to identify the components of these systems that are important to safety. (author)

  8. Mo(ve)ment-methodology

    DEFF Research Database (Denmark)

    Mørck, Line Lerche; Christian Celosse-Andersen, Martin

    2018-01-01

    This paper describes the theoretical basis for and development of a moment-movement research methodology, based on an integration of critical psychological practice research and critical ethnographic social practice theory. Central theoretical conceptualizations, such as human agency, life conditions and identity formation, are discussed in relation to criminological theories of gang desistance. The paper illustrates how the mo(ve)ment methodology was applied in a study of comprehensive processes of identity (re)formation and gang exit processes. This study was conducted with Martin, a former... This is a moment which captures Martin's complex and ambiguous feelings of conflictual concerns, frustration, anger, and a new feeling of insecurity in his masculinity, as well as engagement and a sense of deep meaningfulness as he becomes a more reflective academic. All these conflicting feelings also give

  9. Optimization of Process Variables for Insulation Coating of Conductive Particles by Response Surface Methodology

    International Nuclear Information System (INIS)

    Sim, Chol-Ho

    2016-01-01

    The powder core, conventionally fabricated from iron particles coated with an insulator, shows a large eddy current loss at high frequency because of its small specific resistance. To overcome the eddy current loss, the specific resistance of powder cores needs to be increased. In this study, copper oxide coating onto electrically conductive iron particles was performed using a planetary ball mill to increase the specific resistance. The coating factors were optimized by response surface methodology. The independent variables were the CuO mass fraction, mill revolution number, coating time, ball size, ball mass and sample mass. The response variable was the specific resistance. The optimization of the six factors by a fractional factorial design indicated that the CuO mass fraction, mill revolution number and coating time were the key factors. The levels of these three factors were selected by a three-factor full factorial design and the steepest ascent method. The steepest ascent method was used to approach the optimum range for maximum specific resistance. The Box-Behnken design was finally used to analyze the response surfaces of the screened factors for further optimization. The results of the Box-Behnken design showed that the CuO mass fraction and mill revolution number were the main factors affecting the efficiency of the coating process. As the CuO mass fraction increased, the specific resistance increased. In contrast, the specific resistance increased with decreasing mill revolution number. The process optimization results revealed a high agreement between the experimental and the predicted data (Adj-R² = 0.944). The optimized CuO mass fraction, mill revolution number and coating time were 0.4, 200 rpm and 15 min, respectively. The measured specific resistance of the coated pellet under the optimized conditions for maximum specific resistance was 530 kΩ·cm.
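
    As an aside on the steepest-ascent step mentioned above, the short sketch below (with hypothetical coefficients, centre point and ranges, not the study's data) shows how a fitted first-order model is turned into a sequence of trial settings along the direction of increasing specific resistance.

```python
import numpy as np

# Assumed first-order effects on specific resistance for
# (CuO mass fraction, mill revolution number, coating time)
b = np.array([4.2, -3.1, 1.0])
step = b / np.abs(b).max()               # coded step: one unit in the largest effect

base = np.array([0.25, 300.0, 10.0])     # assumed centre point (fraction, rpm, min)
half = np.array([0.10, 100.0, 5.0])      # assumed half-ranges of one coded unit

for k in range(5):                       # walk along the path of steepest ascent
    x = base + k * step * half
    print(f"step {k}: CuO {x[0]:.2f}, speed {x[1]:.0f} rpm, time {x[2]:.1f} min")
```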

  10. Optimization of Process Variables for Insulation Coating of Conductive Particles by Response Surface Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Chol-Ho [Sangji University, Wonju (Korea, Republic of)

    2016-02-15

    The powder core, conventionally fabricated from iron particles coated with an insulator, shows a large eddy current loss at high frequency because of its small specific resistance. To overcome the eddy current loss, the specific resistance of powder cores needs to be increased. In this study, copper oxide coating onto electrically conductive iron particles was performed using a planetary ball mill to increase the specific resistance. The coating factors were optimized by response surface methodology. The independent variables were the CuO mass fraction, mill revolution number, coating time, ball size, ball mass and sample mass. The response variable was the specific resistance. The optimization of the six factors by a fractional factorial design indicated that the CuO mass fraction, mill revolution number and coating time were the key factors. The levels of these three factors were selected by a three-factor full factorial design and the steepest ascent method. The steepest ascent method was used to approach the optimum range for maximum specific resistance. The Box-Behnken design was finally used to analyze the response surfaces of the screened factors for further optimization. The results of the Box-Behnken design showed that the CuO mass fraction and mill revolution number were the main factors affecting the efficiency of the coating process. As the CuO mass fraction increased, the specific resistance increased. In contrast, the specific resistance increased with decreasing mill revolution number. The process optimization results revealed a high agreement between the experimental and the predicted data (Adj-R² = 0.944). The optimized CuO mass fraction, mill revolution number and coating time were 0.4, 200 rpm and 15 min, respectively. The measured specific resistance of the coated pellet under the optimized conditions for maximum specific resistance was 530 kΩ·cm.

  11. Design methodology for wing trailing edge device mechanisms

    OpenAIRE

    Martins Pires, Rui Miguel

    2007-01-01

    Over the last few decades the design of high lift devices has become a very important part of the total aircraft design process. Reviews of the design process are performed on a regular basis, with the intent to improve and optimize the design process. This thesis describes a new and innovative methodology for the design and evaluation of mechanisms for Trailing Edge High-Lift devices. The initial research reviewed existing High-Lift device design methodologies and current f...

  12. Hybrid probabilistic and possibilistic safety assessment. Methodology and application

    International Nuclear Information System (INIS)

    Kato, Kazuyuki; Amano, Osamu; Ueda, Hiroyoshi; Ikeda, Takao; Yoshida, Hideji; Takase, Hiroyasu

    2002-01-01

    This paper presents a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques, respectively. The methodology has been applied to the safety assessment of geological disposal of high-level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews with the experts, while variability was formulated by means of probability density functions (pdfs) based on available data sets. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on expert opinion and in providing information on the dependence of the assessment results on the level of conservatism. In addition, it was shown that sensitivity analysis can identify key parameters contributing to uncertainties associated with the results of the overall assessment. The information mentioned above can be utilized to support decision-making and to guide the process of disposal system development and optimization of protection against potential exposure. (author)

  13. A Network Based Methodology to Reveal Patterns in Knowledge Transfer

    Directory of Open Access Journals (Sweden)

    Orlando López-Cruz

    2015-12-01

    Full Text Available This paper motivates, presents and demonstrates in use a methodology based on complex network analysis to support research aimed at identifying sources in the process of knowledge transfer at the inter-organizational level. The importance of this methodology is that it provides a unified model for revealing knowledge-sharing patterns and for comparing results from multiple studies on data from different periods of time and different sectors of the economy. The methodology does not address the underlying statistical processes; for those, national statistics departments (NSD) provide documents and tools on their websites. Rather, this proposal provides a guide for modelling the inferences drawn from data processing, revealing links between sources and recipients of the knowledge being transferred, where the recipient identifies the source as the main origin of new knowledge creation. Some national statistics departments set as the objective of these surveys the characterization of innovation dynamics in firms and the analysis of the use of public support instruments, and from this characterization scholars conduct different studies. Measures of the dimensions of the network composed of manufacturing firms and other organizations form the basis for inquiring into the structure that emerges when firms take ideas from other organizations to initiate innovations. These two sets of actors form a two-mode network: a link connects two nodes, one acting as the source of the idea and the second as the destination, where the sources are organizations, or events organized by organizations, that "provide" ideas to a group of firms. The resulting design satisfies the objective of being a methodological model for identifying the sources of knowledge transfer that is effectively used in innovation.
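
    A small sketch of the two-mode (bipartite) representation described above is given below, using networkx with hypothetical firm and source names; degree counts on the source side and the one-mode projection onto firms are the kinds of measures such an analysis starts from.

```python
import networkx as nx
from networkx.algorithms import bipartite

B = nx.Graph()
firms = ["FirmA", "FirmB", "FirmC"]                     # recipients of knowledge
sources = ["University1", "TradeFair", "SupplierX"]     # named idea sources
B.add_nodes_from(firms, bipartite=0)
B.add_nodes_from(sources, bipartite=1)
B.add_edges_from([("FirmA", "University1"), ("FirmA", "TradeFair"),
                  ("FirmB", "University1"), ("FirmC", "SupplierX"),
                  ("FirmC", "TradeFair")])

# Degree of each source node = number of recipient firms that point to it
print({s: B.degree(s) for s in sources})

# One-mode projection: firms become connected when they share a knowledge source
firm_net = bipartite.weighted_projected_graph(B, firms)
print(list(firm_net.edges(data=True)))
```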

  14. Finite-State Methodology in Natural Language Processing

    Directory of Open Access Journals (Sweden)

    Michal Korzycki

    2001-01-01

    Full Text Available Recent mathematical and algorithmic results in the field of finite-state technology, as well as the increase in computing power, have laid the foundations for a new approach to natural language processing. However, the task of creating an appropriate model that describes the phenomena of natural language is still to be achieved. In this paper I present some notions related to the finite-state modelling of syntax and morphology.
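
    To make the idea concrete, here is a small illustrative sketch (assumed, not taken from the paper): a finite-state acceptor for stem plus an optional suffix, built as an explicit transition table, which is the kind of machine finite-state morphology composes into full lexicons and analysers.

```python
class FSA:
    """Deterministic finite-state acceptor built from a toy lexicon."""
    def __init__(self):
        self.trans = {}          # (state, char) -> state
        self.accepting = set()
        self.next_state = 1      # state 0 is the start state

    def _extend(self, state, string):
        for ch in string:
            key = (state, ch)
            if key not in self.trans:
                self.trans[key] = self.next_state
                self.next_state += 1
            state = self.trans[key]
        return state

    def add_stem(self, stem, suffixes=("", "s", "ed", "ing")):
        end = self._extend(0, stem)
        for suffix in suffixes:
            self.accepting.add(self._extend(end, suffix))

    def accepts(self, word):
        state = 0
        for ch in word:
            state = self.trans.get((state, ch))
            if state is None:
                return False
        return state in self.accepting

fsa = FSA()
for stem in ("walk", "talk", "jump"):
    fsa.add_stem(stem)
for w in ("walked", "talking", "jumps", "jumpzz"):
    print(w, fsa.accepts(w))
```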

  15. A METHODOLOGY FOR IMPROVING PRODUCTIVITY OF THE EXISTING SHIPBUILDING PROCESS USING MODERN PRODUCTION CONCEPTs AND THE AHP METHOD

    Directory of Open Access Journals (Sweden)

    Venesa Stanić

    2017-01-01

    Full Text Available In recent years, shipyards have been facing difficulties in controlling operational costs. To maintain continual operation of all of its facilities, a shipyard must analyze ways of utilizing its present production systems for assembling interim vessel products as well as other types of industrial constructions. In the past, shipbuilding processes were continuously improved by new machines, software and organizational restructuring, but management has continued to search for a modern technological concept that would provide higher productivity, greater profit and an overall reduction in costs. In this article, the authors suggest implementing Design for Production, Design for Maintainability and Group Technology principles, using the Analytical Hierarchy Process (AHP), a multi-criteria decision-making method, as an efficient tool for maintaining international competitiveness in the modern shipbuilding industry. This novel methodology is implemented in four phases. In the first phase, an analysis of the present situation of a real shipyard is carried out by establishing the closest relations among production lines. The second phase presents an analysis of the constraints that must be evaluated when developing the design solution. The third phase involves generating a number of alternatives based on the Design for Production, Design for Maintainability and Group Technology principles. In the fourth phase, the optimal design solution is selected using the AHP method, as sketched below. The solution incorporating this modern methodology will improve productivity and profit and lead to decreasing operational costs.
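
    A hedged sketch of the AHP selection step follows: a pairwise comparison matrix for three hypothetical design alternatives is reduced to priority weights via its principal eigenvector, and a consistency ratio is checked. The judgements below are illustrative only, not the article's data.

```python
import numpy as np

# Pairwise comparison judgements for three hypothetical design alternatives
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # priority weights of the alternatives

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
cr = ci / 0.58                                 # random index for n = 3 is about 0.58
print("weights:", np.round(weights, 3), " consistency ratio:", round(cr, 3))
```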

  16. Adaptation of Agile Project Management Methodology for Project Team

    Directory of Open Access Journals (Sweden)

    Rasnacis Arturs

    2015-12-01

    Full Text Available A project management methodology that defines basic processes, tools, techniques, methods, resources and procedures used to manage a project is necessary for effective and successful IT project management. Each company needs to define its own methodology or adapt some of the existing ones. The purpose of the research is to evaluate the possibilities of adapting IT project development methodology according to the company, company employee characteristics and their mutual relations. The adaptation process will be illustrated with a case study at an IT company in Latvia where the developed methodology is based on Agile Scrum, one of the most widespread Agile methods.

  17. Optimization of an A(2)/O process for tetracycline removal via response surface methodology coupled with a Box-Behnken design.

    Science.gov (United States)

    Qi, Fang-Fang; Huang, Man-Hong; Zheng, Yu; Xu, Qi

    2015-01-01

    Response surface methodology (RSM) was used to optimize the operating conditions of an anaerobic-anoxic-oxic (A(2)/O) process by maximizing the removal efficiency of tetracycline (TC). Solid retention time (SRT), hydraulic retention time (HRT) and initial TC concentration (CTC,in) were selected as the independent variables for the Box-Behnken design. The results showed that SRT and CTC,in were more significant parameters than HRT for the TC removal efficiency. TC could be completely removed under the optimal conditions of an SRT of 15.5 days, an HRT of 9.9 h and a CTC,in of 283.3 μg/L. TC removal efficiencies of 99% and 96% were attained for synthetic and real wastewater, respectively, under the optimal conditions. This indicated that the constructed model was valid and reliable for optimizing the A(2)/O process for TC removal.

  18. METHODOLOGY TO CREATE DIGITAL AND VIRTUAL 3D ARTEFACTS IN ARCHAEOLOGY

    Directory of Open Access Journals (Sweden)

    Calin Neamtu

    2016-12-01

    Full Text Available The paper presents a methodology for creating 3D digital and virtual artefacts in the field of archaeology using CAD software solutions. The methodology includes the following steps: the digitalization process, the digital restoration, and the dissemination process within a virtual environment. The resulting 3D digital artefacts have to be created in file formats that are compatible with a large variety of operating systems and hardware configurations, such as computers, graphic tablets and smartphones. The compatibility and portability of these 3D file formats have led to a series of quality-related compromises in the 3D models in order to integrate them into a wide variety of applications running on different hardware configurations. The paper illustrates multiple virtual reality and augmented reality applications that make use of the virtual 3D artefacts generated using this methodology.

  19. A first-principles generic methodology for representing the knowledge base of a process diagnostic expert system

    International Nuclear Information System (INIS)

    Reifman, J.; Briggs, L.L.; Wei, T.Y.C.

    1990-01-01

    In this paper we present a methodology for identifying faulty component candidates of process malfunctions through basic physical principles of conservation, functional classification of components and information from the process schematics. The basic principles of macroscopic balance of mass, momentum and energy in thermal hydraulic control volumes are applied in a novel approach to incorporate deep knowledge into the knowledge base. Additional deep knowledge is incorporated through the functional classification of process components according to their influence in disturbing the macroscopic balance equations. Information from the process schematics is applied to identify the faulty component candidates after the type of imbalance in the control volumes is matched against the functional classification of the components. Except for the information from the process schematics, this approach is completely general and independent of the process under consideration. The use of basic first-principles, which are physically correct, and the process-independent architecture of the diagnosis procedure allow for the verification and validation of the system. A prototype process diagnosis expert system is developed and a test problem is presented to identify faulty component candidates in the presence of a single failure in a hypothetical balance of plant of a liquid metal nuclear reactor plant
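
    The balance-residual idea can be sketched as follows; the control-volume streams, sensor readings and comments below are hypothetical placeholders rather than the authors' knowledge base, but they show how a steady-state mass/energy imbalance is computed before the functional classification and the schematics are consulted.

```python
from dataclasses import dataclass

@dataclass
class Stream:
    mass_flow: float   # kg/s
    enthalpy: float    # kJ/kg

def balance_residuals(inlets, outlets, heat_in_kw=0.0):
    """Steady-state mass and energy residuals for one control volume."""
    mass_res = sum(s.mass_flow for s in inlets) - sum(s.mass_flow for s in outlets)
    energy_res = (sum(s.mass_flow * s.enthalpy for s in inlets) + heat_in_kw
                  - sum(s.mass_flow * s.enthalpy for s in outlets))
    return mass_res, energy_res

# Hypothetical sensor readings around a heat-exchanger control volume
inlets = [Stream(12.0, 300.0)]
outlets = [Stream(11.2, 318.0)]
m_res, e_res = balance_residuals(inlets, outlets, heat_in_kw=150.0)

# A persistent mass residual points to components able to remove inventory from
# the volume (e.g. a leaking valve), which narrows the candidate list before the
# process schematics are consulted.
print(f"mass residual {m_res:+.2f} kg/s, energy residual {e_res:+.1f} kW")
```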

  20. Development of a methodology for automated assessment of the quality of digitized images in mammography

    International Nuclear Information System (INIS)

    Santana, Priscila do Carmo

    2010-01-01

    The process of evaluating the quality of radiographic images in general, and mammography images in particular, can be made much more accurate, practical and fast with the help of computer analysis tools. The purpose of this study is to develop a computational methodology to automate the process of assessing the quality of mammography images through digital image processing (PDI) techniques, using an existing image processing environment (ImageJ). With the application of PDI techniques it was possible to extract geometric and radiometric characteristics of the evaluated images. The evaluated parameters include spatial resolution, high-contrast detail, low-contrast threshold, linear detail of low contrast, tumor masses, contrast ratio and background optical density. The results obtained by this method were compared with the results of the visual evaluations performed by the Health Surveillance of Minas Gerais. Through this comparison it was possible to demonstrate that the automated methodology is a promising alternative for reducing or eliminating the subjectivity present in the visual assessment methodology currently in use. (author)

  1. Methodological and Analytical Dilemmas in Autoethnographic Research

    Directory of Open Access Journals (Sweden)

    Elena Maydell

    2010-01-01

    Full Text Available This article presents an argument on the application of theoretical and methodological frameworks to the study of identity from an autoethnographic perspective. In order to guide the analysis process, the author employed social constructionism as the main theoretical foundation, whereas thematic analysis and positioning theory were deployed as the methodological frameworks. Further, in the process of using ethnographic methods to study the identity of Russian immigrants to New Zealand, the author found herself also needing to use autoethnography to interrogate and understand her own journey. The insider/outsider position of the author who belongs to the same minority group became the most vital tool in her identity construction. In this regard, it is impossible to engage fully with the autoethnographic research practice without understanding the impact of others on identity construction of self, and a strong theoretical and methodological scholarship can provide a valuable foundation for this process.

  2. Preliminary comparative assessment of PM10 hourly measurement results from new monitoring stations type using stochastic and exploratory methodology and models

    Science.gov (United States)

    Czechowski, Piotr Oskar; Owczarek, Tomasz; Badyda, Artur; Majewski, Grzegorz; Rogulski, Mariusz; Ogrodnik, Paweł

    2018-01-01

    The paper presents selected key issues from the preliminary stage of a proposed extended equivalence assessment of measurement results from new portable devices: the comparability of hourly PM10 concentration series with reference station measurement results using statistical methods. The article presents the technical aspects of the new portable meters. Emphasis was placed on assessing the comparability of the results using a methodology based on stochastic and exploratory methods. The concept is based on the observation that a simple comparison of the result series in the time domain is insufficient. The comparison of regularity should be done in three complementary fields of statistical modeling: time, frequency and space. The proposal is based on modelling results for five annual series of measurement results from the new mobile devices and from the WIOS (Provincial Environmental Protection Inspectorate) reference station located in the city of Nowy Sacz. The obtained results indicate both the completeness of the proposed comparison methodology and the high correspondence of the results obtained from the new devices with the reference measurements.

  3. A Comparative Analysis of Taguchi Methodology and Shainin System DoE in the Optimization of Injection Molding Process Parameters

    Science.gov (United States)

    Khavekar, Rajendra; Vasudevan, Hari, Dr.; Modi, Bhavik

    2017-08-01

    Two well-known Design of Experiments (DoE) methodologies, Taguchi Methods (TM) and the Shainin System (SS), are compared and analyzed in this study through their implementation in a plastic injection molding unit. Experiments were performed at a perfume bottle cap manufacturing company (the caps are made of acrylic material) using TM and SS to find the root cause of defects and to optimize the process parameters for minimum rejection. The experiments reduced the rejection rate from approximately 40% to 8.57% during trial runs, which is quite low and represents a successful implementation of these DoE methods. The comparison showed that both methodologies identified the same set of variables as critical for defect reduction, but with a change in their order of significance. Also, Taguchi Methods require a larger number of experiments and consume more time than the Shainin System. The Shainin System is less complicated and easy to implement, whereas Taguchi Methods are statistically more reliable for the optimization of process parameters. Finally, the experiments indicate that DoE methods are robust and reliable in implementation as organizations attempt to improve quality through optimization.
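
    As a hedged numeric illustration of one calculation a Taguchi-style analysis relies on (the defect counts, factor labels and array choice below are hypothetical, not the study's data), here is the smaller-the-better signal-to-noise ratio evaluated over an L4 orthogonal array, followed by a main-effect comparison for one factor.

```python
import numpy as np

# L4 orthogonal array: levels of three two-level factors A, B, C
L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]
defects = np.array([[9, 11], [6, 7], [14, 12], [5, 4]])   # two replicates per run

sn = -10 * np.log10((defects.astype(float) ** 2).mean(axis=1))  # smaller-the-better S/N
for run, ratio in zip(L4, sn):
    print(f"A{run[0]} B{run[1]} C{run[2]}  S/N = {ratio:6.2f} dB")

# Main effect of factor A on the S/N ratio (a higher S/N is better)
print("A at level 1:", round(sn[:2].mean(), 2), " A at level 2:", round(sn[2:].mean(), 2))
```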

  4. Methodology of impact assessment of research projects

    International Nuclear Information System (INIS)

    Rodriguez Cardona, R.; Cobas Aranda, M.

    2014-01-01

    In the context of managing the development of research projects, it is necessary to have tools to monitor and evaluate the progress and performance of the projects, as well as their results and impact on society (international agencies of the United Nations and member states, 2002, and the 2005 Paris Declaration), with the objective of ensuring their contribution to the social and economic development of countries. Many organizations, agencies and governments apply different methodologies (IDB, World Bank, UNDP, ECLAC, UNESCO, UNICEF, Canada, Japan, and others) for these purposes. In a results-based project management system, what matters is not only the process or product itself but also the result or impact of the project (whether the program/project produced the desired effects on persons, households and institutions, and whether those effects are attributable to the intervention of the program/project). The work presents a methodology that allows a qualitative and quantitative evaluation of the impact of research projects; it is the result of experience in managing international collaboration projects with the International Atomic Energy Agency (IAEA) and the Cuban nuclear programme. (author)

  5. ANL calculational methodologies for determining spent nuclear fuel source term

    International Nuclear Information System (INIS)

    McKnight, R. D.

    2000-01-01

    Over the last decade Argonne National Laboratory has developed reactor depletion methods and models to determine radionuclide inventories of irradiated EBR-II fuels. Predicted masses based on these calculational methodologies have been validated using available data from destructive measurements--first from measurements of lead EBR-II experimental test assemblies and later using data obtained from processing irradiated EBR-II fuel assemblies in the Fuel Conditioning Facility. Details of these generic methodologies are described herein. Validation results demonstrate these methods meet the FCF operations and material control and accountancy requirements

  6. The Campesino-to-Campesino agroecology movement of ANAP in Cuba: social process methodology in the construction of sustainable peasant agriculture and food sovereignty.

    Science.gov (United States)

    Rosset, Peter Michael; Sosa, Braulio Machín; Jaime, Adilén María Roque; Lozano, Dana Rocío Ávila

    2011-01-01

    Agroecology has played a key role in helping Cuba survive the crisis caused by the collapse of the socialist bloc in Europe and the tightening of the US trade embargo. Cuban peasants have been able to boost food production without scarce and expensive imported agricultural chemicals by first substituting more ecological inputs for the no longer available imports, and then by making a transition to more agroecologically integrated and diverse farming systems. This was possible not so much because appropriate alternatives were made available, but rather because of the Campesino-a-Campesino (CAC) social process methodology that the National Association of Small Farmers (ANAP) used to build a grassroots agroecology movement. This paper was produced in a 'self-study' process spearheaded by ANAP and La Via Campesina, the international agrarian movement of which ANAP is a member. In it we document and analyze the history of the Campesino-to-Campesino Agroecology Movement (MACAC), and the significantly increased contribution of peasants to national food production in Cuba that was brought about, at least in part, due to this movement. Our key findings are (i) the spread of agroecology was rapid and successful largely due to the social process methodology and social movement dynamics, (ii) farming practices evolved over time and contributed to significantly increased relative and absolute production by the peasant sector, and (iii) those practices resulted in additional benefits including resilience to climate change.

  7. Methodology of environmental risk assessment management

    Directory of Open Access Journals (Sweden)

    Saša T. Bakrač

    2012-04-01

    Full Text Available Successful protection of the environment is mostly based on a high-quality assessment of potential and present risks. Environmental risk management is a complex process which includes the identification, assessment and control of risk, that is, taking measures to reduce the risk to an acceptable level. Environmental risk management methodology: in addition to these phases of environmental risk management, appropriate measures that reduce the occurrence of risk should be implemented: normative and legal regulations (laws and regulations), appropriate organizational structures in society, and the establishment of quality monitoring of the environment. The emphasis is placed on the application of assessment methodologies (the three-model concept) as the most important aspect of successful environmental risk management. Risk assessment methodology - European concept: the first concept of ecological risk assessment methodology is based on the so-called European model-concept. In order to better understand this methodology, two concepts - hazard and risk - are introduced. The European concept of environmental risk assessment has the following phases: identification of the hazard (danger), identification of the consequences (if there is a hazard), estimation of the scale of the consequences, estimation of the probability of the consequences, and risk assessment (also called risk characterization). The European concept is often used to assess risk in the environment as a model for addressing the distribution of stressors along the source - path - receptor line. Risk assessment methodology - Canadian concept: the second concept of environmental risk assessment methodology is based on the so-called Canadian model-concept. The assessment of ecological risk includes risk arising from natural events (floods, extreme weather conditions, etc.), technological processes and products, and agents (chemical, biological, radiological, etc

  8. Monitoring and diagnosis for sensor fault detection using GMDH methodology

    International Nuclear Information System (INIS)

    Goncalves, Iraci Martinez Pereira

    2006-01-01

    The fault detection and diagnosis system is an Operator Support System dedicated to specific functions that alerts operators to sensor and actuator fault problems and guides them in the diagnosis before the normal alarm limits are reached. Operator Support Systems emerged to reduce the panel complexity caused by the increase of available information in nuclear power plant control rooms. In this work a Monitoring and Diagnosis System was developed based on the GMDH (Group Method of Data Handling) methodology. The methodology was applied to the IPEN research reactor IEA-R1. The system performs the monitoring by comparing GMDH model calculated values with measured values. The methodology was first applied to theoretical models: a heat exchanger model and an IPEN reactor theoretical model. The results obtained with the theoretical models provided a basis for applying the methodology to actual reactor operation data. Three GMDH models were developed for monitoring actual operation data: the first used just the thermal process variables, the second also considered some nuclear variables, and the third considered all the reactor variables. The three models presented excellent results, showing the viability of using the methodology to monitor the operation data. The comparison between the results of the three models also shows the capacity of the methodology to choose by itself the best set of input variables for model optimization. For the system diagnosis implementation, faults were simulated in the actual temperature variable values by adding a step change. The fault values correspond to a typical temperature decalibration, and the result of monitoring faulty data was then used to build a simple diagnosis system based on fuzzy logic. (author)
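
    As a hedged illustration of the GMDH idea summarized above (quadratic polynomial nodes built from pairs of inputs, retained according to validation error, with the resulting model used to flag deviations between estimated and measured values), the Python sketch below uses synthetic data and a single layer; the node form, selection criterion and fault threshold are generic simplifications, not the IEA-R1 implementation.

```python
# Minimal GMDH-style monitoring sketch (assumptions: synthetic data, quadratic
# Ivakhnenko polynomials per input pair, node selection by validation RMSE).
import itertools
import numpy as np

def fit_node(x1, x2, y):
    """Least-squares fit of y ~ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def eval_node(coef, x1, x2):
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    return A @ coef

def gmdh_layer(X_train, y_train, X_val, y_val, keep=4):
    """One GMDH layer: every input pair becomes a candidate node; the 'keep'
    nodes with the lowest validation RMSE survive."""
    candidates = []
    for i, j in itertools.combinations(range(X_train.shape[1]), 2):
        coef = fit_node(X_train[:, i], X_train[:, j], y_train)
        rmse = np.sqrt(np.mean((eval_node(coef, X_val[:, i], X_val[:, j]) - y_val) ** 2))
        candidates.append((rmse, i, j, coef))
    candidates.sort(key=lambda c: c[0])
    return candidates[:keep]

# Synthetic "process variables" and a monitored temperature-like signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
y = 2.0 + 1.5 * X[:, 0] - 0.7 * X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=400)
X_tr, X_va, y_tr, y_va = X[:300], X[300:], y[:300], y[300:]

best = gmdh_layer(X_tr, y_tr, X_va, y_va)
rmse, i, j, coef = best[0]
residual = y_va - eval_node(coef, X_va[:, i], X_va[:, j])
print(f"best node uses variables ({i},{j}), validation RMSE={rmse:.3f}")
# Monitoring: flag a fault when |measured - estimate| exceeds a (simplistic) threshold.
print("fault flags:", int(np.sum(np.abs(residual) > 3 * rmse)))
```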

  9. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criteria of the CDM, namely that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, the legal framework and the institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are expected to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook', developed under the CD4CDM project. (BA)

  10. Baseline methodologies for clean development mechanism projects

    International Nuclear Information System (INIS)

    Lee, M.K.; Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-01

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criteria of the CDM, namely that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, the legal framework and the institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are expected to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook', developed under the CD4CDM project. (BA)

  11. A Methodology To Incorporate The Safety Culture Into Probabilistic Safety Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sunghyun; Kim, Namyeong; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2015-10-15

    In order to incorporate organizational factors into PSA, a methodology needs to be developed. A methodology is introduced in this study that uses the AHP to weight organizational factors and the SLIM to rate those factors. Safety issues related to nuclear safety culture have been occurring increasingly often. A quantification tool has to be developed in order to include organizational factors in Probabilistic Safety Assessments. In this study, the state of the art of organizational evaluation methodologies has been surveyed. The study covers organizational factors, the maintenance process, maintenance process analysis models, and a quantitative methodology using the Analytic Hierarchy Process and the Success Likelihood Index Methodology. The purpose of this study is to develop a methodology to incorporate the safety culture into PSA in order to obtain a more objective risk estimate than before. The organizational factors considered in nuclear safety culture might affect the potential risk of human error and hardware failure. A safety culture impact index to monitor plant safety culture can be assessed by applying the developed methodology to a nuclear power plant.

  12. A Methodology To Incorporate The Safety Culture Into Probabilistic Safety Assessments

    International Nuclear Information System (INIS)

    Park, Sunghyun; Kim, Namyeong; Jae, Moosung

    2015-01-01

    In order to incorporate organizational factors into PSA, a methodology needs to be developed. A methodology is introduced in this study that uses the AHP to weight organizational factors and the SLIM to rate those factors. Safety issues related to nuclear safety culture have been occurring increasingly often. A quantification tool has to be developed in order to include organizational factors in Probabilistic Safety Assessments. In this study, the state of the art of organizational evaluation methodologies has been surveyed. The study covers organizational factors, the maintenance process, maintenance process analysis models, and a quantitative methodology using the Analytic Hierarchy Process and the Success Likelihood Index Methodology. The purpose of this study is to develop a methodology to incorporate the safety culture into PSA in order to obtain a more objective risk estimate than before. The organizational factors considered in nuclear safety culture might affect the potential risk of human error and hardware failure. A safety culture impact index to monitor plant safety culture can be assessed by applying the developed methodology to a nuclear power plant.
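
    As a hedged sketch of the AHP weighting step mentioned in the two records above, the snippet below derives factor weights from a pairwise comparison matrix via the principal eigenvector and checks Saaty's consistency ratio; the three organizational factors and the comparison values are illustrative assumptions, not data from the study.

```python
# AHP weighting sketch: weights from the principal eigenvector of a Saaty-style
# pairwise comparison matrix, plus the consistency ratio. Factors and values are
# invented for illustration.
import numpy as np

# Entry [i, j] = relative importance of factor i over factor j (illustrative only).
A = np.array([
    [1.0, 3.0, 5.0],   # e.g. "management commitment" vs the other two factors
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Weights = normalized principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio (CR) using Saaty's random index for n = 3.
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
CR = CI / RI if RI > 0 else 0.0
print("factor weights:", np.round(w, 3), " consistency ratio:", round(CR, 3))
```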

  13. Methodology for assessing laser-based equipment

    Science.gov (United States)

    Pelegrina-Bonilla, Gabriel; Hermsdorf, Jörg; Thombansen, Ulrich; Abels, Peter; Kaierle, Stefan; Neumann, Jörg

    2017-10-01

    Methodologies for assessing a technology's maturity are widely used in industry and research. Probably the best known are technology readiness levels (TRLs), initially pioneered by the National Aeronautics and Space Administration (NASA). At the beginning, only descriptively defined TRLs existed, but over time automated assessment techniques in the form of questionnaires emerged in order to determine TRLs. Originally TRLs targeted equipment for space applications, but the demands on industrially relevant equipment are partly different in terms of, for example, overall costs, product quantities, or the presence of competitors. Therefore, we present a generally valid assessment methodology aimed at assessing laser-based equipment for industrial use. The assessment is carried out with the help of a questionnaire, which allows for a user-friendly and easily accessible way to monitor the progress from the lab-proven state to the application-ready product throughout the complete development period. The assessment result is presented in a multidimensional metric in order to reveal the current specific strengths and weaknesses of the equipment development process, which can be used to steer the remaining development of the equipment in the right direction.

  14. Energy index decomposition methodology at the plant level

    Science.gov (United States)

    Kumphai, Wisit

    Scope and method of study. The dissertation explores the use of a high-level energy intensity index as a facility-level energy performance monitoring indicator, with the goal of developing a methodology for an economically based energy performance monitoring system that incorporates production information. The performance measure closely monitors energy usage, production quantity, and product mix and determines production efficiency as part of an ongoing process that would enable facility managers to keep track of, and in the future predict, when to perform a recommissioning process. The study focuses on the use of index decomposition methodology and explores several high-level (industry, sector, and country level) energy utilization indexes, namely Additive Log Mean Divisia, Multiplicative Log Mean Divisia, and Additive Refined Laspeyres. One level of index decomposition is performed: the indexes are decomposed into intensity and product-mix effects. These indexes are tested on a flow shop brick manufacturing plant model in three different climates in the United States. The indexes obtained are analyzed by fitting an ARIMA model and testing for dependency between the two decomposed indexes. Findings and conclusions. The results indicate that the Additive Refined Laspeyres index decomposition methodology is suitable for use in a flow shop, non-air-conditioned production environment as an energy performance monitoring indicator. It is likely that this research can be further expanded into predicting when to perform a recommissioning process.
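
    For readers unfamiliar with index decomposition, the following sketch shows an additive Log Mean Divisia (LMDI-I) decomposition of a change in plant energy use into activity, product-mix and intensity effects; the two-product brick-plant numbers are invented, and the dissertation's preferred Additive Refined Laspeyres form is not reproduced here.

```python
# Additive LMDI (Log Mean Divisia) decomposition sketch: plant energy change split
# into activity, product-mix and intensity effects. Numbers are illustrative only.
import numpy as np

def logmean(a, b):
    """Logarithmic mean L(a, b); equals a when a == b."""
    return np.where(np.isclose(a, b), a, (a - b) / np.log(a / b))

# Period 0 and period T: production Q_i (units) and energy use E_i (GJ) per product.
Q0, QT = np.array([100.0, 50.0]), np.array([120.0, 40.0])
E0, ET = np.array([400.0, 300.0]), np.array([430.0, 260.0])

Qtot0, QtotT = Q0.sum(), QT.sum()
S0, ST = Q0 / Qtot0, QT / QtotT          # product-mix shares
I0, IT = E0 / Q0, ET / QT                # energy intensities (GJ/unit)

w = logmean(ET, E0)                      # LMDI weights per product
d_act = np.sum(w * np.log(QtotT / Qtot0))   # activity effect
d_mix = np.sum(w * np.log(ST / S0))         # product-mix effect
d_int = np.sum(w * np.log(IT / I0))         # intensity effect

print(f"dE total = {ET.sum() - E0.sum():.1f} GJ")
print(f"activity {d_act:+.1f}, product mix {d_mix:+.1f}, intensity {d_int:+.1f}")
# The three terms sum (to rounding) to the observed change in total energy use.
```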

  15. Methodologies for Reservoir Characterization Using Fluid Inclusion Gas Chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Dilley, Lorie M. [Hattenburg Dilley & Linnell, LLC, Anchorage, AK (United States)

    2015-04-13

    The purpose of this project was to: 1) evaluate the relationship between geothermal fluid processes and the compositions of the fluid inclusion gases trapped in the reservoir rocks; and 2) develop methodologies for interpreting fluid inclusion gas data in terms of the chemical, thermal and hydrological properties of geothermal reservoirs. Phase 1 of this project was designed to: 1) model the effects of boiling, condensation, conductive cooling and mixing on selected gaseous species, using fluid compositions obtained from geothermal wells; 2) evaluate, using quantitative analyses provided by New Mexico Tech (NMT), how these processes are recorded by fluid inclusions trapped in individual crystals; and 3) determine whether the results obtained on individual crystals can be applied to the bulk fluid inclusion analyses determined by Fluid Inclusion Technology (FIT). Our initial studies, however, suggested that numerical modeling of the data would be premature. We observed that the gas compositions determined on bulk and individual samples were not the same as those discharged by the geothermal wells. Gases discharged from geothermal wells are CO2-rich and contain low concentrations of light gases (i.e. H2, He, N, Ar, CH4). In contrast, many of our samples displayed enrichments in these light gases. Efforts were initiated to evaluate the reasons for the observed gas distributions. As a first step, we examined the potential importance of different reservoir processes using a variety of commonly employed gas ratios (e.g. Giggenbach plots). The second technical target was the development of interpretational methodologies. We have developed methodologies for the interpretation of fluid inclusion gas data, based on the results of Phase 1, geologic interpretation of fluid inclusion data, and integration of the data. These methodologies can be used in conjunction with the relevant geological and hydrological information on the system to

  16. REDUCING REJECTION/REWORK IN PRESSURE DIE CASTING PROCESS BY APPLICATION OF DMAIC METHODOLOGY OF SIX SIGMA

    Directory of Open Access Journals (Sweden)

    Javedhusen Malek

    2015-12-01

    Full Text Available In today's ever-changing, customer-driven market, industries need to improve their products and processes to satisfy customer requirements. The Six Sigma approach has set a new paradigm of business excellence. Six Sigma, as a process-driven improvement methodology, has been adopted successfully by many industries. A review of the literature reveals that Six Sigma is well adopted in large-scale enterprises but that there is less evidence of adoption in Indian SMEs. This paper is focused on providing a path for Indian SMEs to initiate the Six Sigma approach in their industries. The paper discusses a real-life case where Six Sigma has been successfully applied at an Indian small-scale unit to improve the rejection/rework rate in manufacturing products by the pressure die casting process. The paper describes the phase-wise application of define-measure-analyse-improve-control (DMAIC) and shows the impact of Six Sigma on quality improvement.

  17. Methodology to evaluate the impact of erosion in cultivated soils applying the 137Cs technique

    International Nuclear Information System (INIS)

    Gil Castillo, R.; Peralta Vital, J.L.; Carrazana, J.; Riverol, M.; Penn, F.; Cabrera, E.

    2004-01-01

    The present paper shows the results obtained in the framework of two nuclear projects on the application of nuclear techniques to evaluate erosion rates in cultivated soils. Based on the investigations with the 137Cs technique carried out in the Province of Pinar del Rio, a methodology to evaluate the erosion impact on cropland was obtained and validated for the first time. The methodology includes all relevant stages for the adequate application of the 137Cs technique, from the initial step of area selection, through the soil sampling process and selection of the models, to the final results evaluation step. During the methodology validation process in soils of the Municipality of San Juan y Martinez, the erosion rates estimated by the methodology were successfully compared with the values obtained by watershed segment measurements (the traditional technique). The methodology is a technical guide for the adequate application of the 137Cs technique to estimate soil redistribution rates in cultivated soils

  18. A design methodology to reduce waste in the construction process

    Institute of Scientific and Technical Information of China (English)

    AndrewN.BALDWIN; SimonA.AUSTIN; AndrewKEYS

    2003-01-01

    This paper describes a conceptual tool to enable construction professionals to identify where waste is generated during the construction of buildings and to address how it can be reduced. It allows an improvement in waste management practices on site by forecasting future waste types and volumes, and reduces waste volumes on site through the identification of wasteful design practices. The tool contributes to all stages of design and construction. At the Concept Stage of Design the proposed methodology provides a framework for reducing waste through better-informed decisions. At the Detailed Design Stage it gives a methodology to address the areas of concern and provides focused information to aid the reduction of waste through informed design decisions. During construction it provides a tool to predict waste types arising on site, thus allowing a system of proactive waste management that will aid skip segregation strategies, leading to improved waste recycling and reuse.

  19. A methodology for string resolution

    International Nuclear Information System (INIS)

    Karonis, N.T.

    1992-11-01

    In this paper we present a methodology, not a tool. We present this methodology with the intent that it be adopted, on a case by case basis, by each of the existing tools in EPICS. In presenting this methodology, we describe each of its two components in detail and conclude with an example depicting how the methodology can be used across a pair of tools. The task of any control system is to provide access to the various components of the machine being controlled, for example, the Advanced Photon Source (APS). By access, we mean the ability to monitor the machine's status (reading) as well as the ability to explicitly change its status (writing). The Experimental Physics and Industrial Control System (EPICS) is a set of tools, designed to act in concert, that allows one to construct a control system. EPICS provides the ability to construct a control system that allows reading and writing access to the machine. It does this through the notion of databases. Each of the components of the APS that is accessed by the control system is represented in EPICS by a set of named database records. Once this abstraction is made, from physical device to named database records, the process of monitoring and changing the state of that device becomes the simple process of reading and writing information from and to its associated named records

  20. An intelligent design methodology for nuclear power systems

    International Nuclear Information System (INIS)

    Nassersharif, B.; Martin, R.P.; Portal, M.G.; Gaeta, M.J.

    1989-01-01

    The goal of this investigation is to research possible methodologies for automating the design of nuclear power facilities specifically; however, it is relevant to all thermal power systems. The strategy of this research has been to concentrate on individual areas of the thermal design process, investigate the procedures performed, develop a methodology to emulate that behavior, and prototype it in the form of a computer program. The design process has been generalized as follows: problem definition, design definition, component selection procedure, optimization and engineering analysis, and testing and final design, with the problem definition establishing the constraints that are applied to the selection procedure as well as to the design definition. The result of this research is a prototype computer program applying an original procedure for the selection of the best set of real components that would be used in constructing a system with the desired performance characteristics. The mathematical model used for the selection procedure is based on possibility theory.

  1. Methodologies for certification of transuranic waste packages

    International Nuclear Information System (INIS)

    Christensen, R.N.; Kok, K.D.

    1980-10-01

    The objective of this study was to postulate methodologies for certifying that a waste package is acceptable for disposal in a licensed geologic repository. Within the context of this report, certification means the overall process which verifies that a waste package meets the criteria or specifications established for acceptance for disposal in a repository. The overall methodology for certification will include (1) certifying authorities, (2) tests and procedures, and (3) documentation and quality assurance programs. Each criterion will require a methodology that is specific to that criterion. In some cases, different waste forms will require different methodologies. The purpose of predicting certification methodologies is to provide additional information as to what changes, if any, are needed for the TRU waste in storage.

  2. Methodology and results of risk assessment of interconnections within the JET active gas handling system

    International Nuclear Information System (INIS)

    Ballantyne, P.R.; Bell, A.C.; Konstantellos, A.; Hemmerich, J.L.

    1992-01-01

    The Joint European Torus (JET) Active Gas Handling System (AGHS) is a complex interconnection of numerous subsystems. While individual subsystems were assessed for their operational risk, an assessment of the effects of inadvertent interconnections was needed. A systematic method to document the assessment was devised to ease the assessment of complex plants and was applied to the AGHS. The methodology, its application to the AGHS, the four critical issues, and the plant modifications required as a result of this assessment are briefly discussed in this paper.

  3. The impact of methodology in innovation measurement

    Energy Technology Data Exchange (ETDEWEB)

    Wilhelmsen, L.; Bugge, M.; Solberg, E.

    2016-07-01

    Innovation surveys and rankings such as the Community Innovation Survey (CIS) and Innovation Union Scoreboard (IUS) have developed into influential diagnostic tools that are often used to categorize countries according to their innovation performance and to legitimise innovation policies. Although a number of ongoing processes are seeking to improve existing frameworks for measuring innovation, there are large methodological differences across countries in the way innovation is measured. This causes great uncertainty regarding a) the coherence between data from innovation surveys, b) the actual innovativeness of the economy, and c) the validity of research based on innovation data. Against this background we explore empirically how different survey methods for measuring innovation affect reported innovation performance. The analysis is based on a statistical exercise comparing the results from three different methodological versions of the same survey for measuring innovation in the business enterprise sector in Norway. We find striking differences in reported innovation performance depending on how the surveys are carried out methodologically. The paper concludes that reported innovation performance is highly sensitive to and strongly conditioned by methodological context. This points to a need for increased caution and awareness around data collection and research based on innovation data, not least in terms of aggregation of data and cross-country comparison. (Author)

  4. Removal of Penicillin G by combination of sonolysis and Photocatalytic (sonophotocatalytic) process from aqueous solution: process optimization using RSM (Response Surface Methodology).

    Science.gov (United States)

    Almasi, Ali; Dargahi, Abdollah; Mohamadi, Mitra; Biglari, Hamed; Amirian, Farhad; Raei, Mehdi

    2016-09-01

    Penicillin G (PG) is used extensively for a variety of infectious diseases. Generally, when antibiotics are introduced into the food chain, they pose a threat to the environment and can put health outcomes at risk. The aim of the present study was the removal of Penicillin G from an aqueous solution through an integrated system of UV/ZnO and UV/WO3 with ultrasound pretreatment. In this descriptive-analytical work dealing with the removal of Penicillin G from an aqueous solution, four significant variables were investigated: contact time (60-120 min), Penicillin G concentration (50-150 mg/L), ZnO dose (200-400 mg/L), and WO3 dose (100-200 mg/L). Experiments were performed in a Pyrex batch reactor (1 L) with an artificial UV source (100 W medium-pressure mercury lamp), coupled with ultrasound (100 W, 40 kHz) for PG pre-treatment. Chemical Oxygen Demand (COD) was selected to follow the performance of the photocatalytic process and sonolysis. The experiments were based on a Central Composite Design (CCD) and analyzed by Response Surface Methodology (RSM). A mathematical model of the process was designed according to the proposed degradation scheme. The results showed that the maximum removal of PG occurred in the ultrasonic/UV/WO3 system in the presence of 50 mg/L WO3 and a contact time of 120 minutes. In addition, an increase in the PG concentration caused a decrease in COD removal, while increasing the initial catalyst concentration increased COD removal. The maximum COD removal (91.3%) was achieved with 200 mg/L WO3 and 400 mg/L ZnO, a contact time of 120 minutes, and an antibiotic concentration of 50 mg/L. All of the variables were found to have a significant effect on process efficiency. The data supported the conclusion that the combined advanced oxidation process of sonolysis and photocatalysis (sonophotocatalysis) is an applicable and environmentally friendly process which can preferably be applied extensively.
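
    As a hedged sketch of the RSM step described above, the snippet below fits a second-order (quadratic) response surface to central-composite-design runs by least squares; the two coded factors, the design points and the COD-removal responses are invented for illustration, and only the fitting procedure is generic.

```python
# Quadratic response-surface fit over a small central composite design.
# Design points and responses are placeholders, not the study's data.
import itertools
import numpy as np

# Two coded factors for brevity (e.g. contact time and catalyst dose, coded units).
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],                 # factorial points
              [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414],   # axial points
              [0, 0], [0, 0], [0, 0]])                            # center points
y = np.array([55., 70., 62., 84., 50., 78., 58., 75., 80., 81., 79.])  # % COD removal

def quad_terms(X):
    """Design matrix for y ~ b0 + sum(bi xi) + sum(bij xi xj) + sum(bii xi^2)."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in itertools.combinations(range(X.shape[1]), 2)]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    return np.column_stack(cols)

A = quad_terms(X)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ beta
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("model coefficients:", np.round(beta, 2), " R^2 =", round(r2, 3))
```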

  5. Results from CrIS-ATMS Obtained Using the AIRS Science Team Retrieval Methodology

    Science.gov (United States)

    Susskind, Joel; Kouvaris, Louis C.; Iredell, Lena

    2013-01-01

    AIRS was launched on EOS Aqua in May 2002, together with AMSU-A and HSB (which subsequently failed early in the mission), to form a next generation polar orbiting infrared and microwave atmospheric sounding system. AIRS/AMSU had two primary objectives. The first objective was to provide real-time data products available for use by the operational Numerical Weather Prediction Centers in a data assimilation mode to improve the skill of their subsequent forecasts. The second objective was to provide accurate unbiased sounding products with good spatial coverage that are used to generate stable multi-year climate data sets to study the earth's interannual variability, climate processes, and possibly long-term trends. AIRS/AMSU data for all time periods are now being processed using the state of the art AIRS Science Team Version-6 retrieval methodology. The Suomi-NPP mission was launched in October 2011 as part of a sequence of Low Earth Orbiting satellite missions under the "Joint Polar Satellite System" (JPSS). NPP carries CrIS and ATMS, which are advanced infra-red and microwave atmospheric sounders that were designed as follow-ons to the AIRS and AMSU instruments. The main objective of this work is to assess whether CrIS/ATMS will be an adequate replacement for AIRS/AMSU from the perspective of the generation of accurate and consistent long term climate data records, or if improved instruments should be developed for future flight. It is critical for CrIS/ATMS to be processed using an algorithm similar to, or at least comparable to, AIRS Version-6 before such an assessment can be made. We have been conducting research to optimize products derived from CrIS/ATMS observations using a scientific approach analogous to the AIRS Version-6 retrieval algorithm. Our latest research uses Version-5.70 of the CrIS/ATMS retrieval algorithm, which is otherwise analogous to AIRS Version-6, but does not yet contain the benefit of use of a Neural-Net first guess start-up system

  6. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Horton, D.G.

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda.

  7. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    D.G. Horton

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  8. Methodologies used in Project Management

    OpenAIRE

    UNGUREANU, Adrian; UNGUREANU, Anca

    2014-01-01

    Undoubtedly, a properly defined and strictly followed project management methodology provides a firm guarantee that the work will be done on time, within budget and according to specifications. A project management methodology, in simple terms, is a “must-have” to avoid failure and reduce risks, because it is one of the critical success factors, alongside the basic skills of the management team. It is a simple way to guide the team through the design and execution phases, processes and tasks throughout...

  9. Application of a methodology for the development and validation of reliable process control software

    International Nuclear Information System (INIS)

    Ramamoorthy, C.V.; Mok, Y.R.; Bastani, F.B.; Chin, G.

    1980-01-01

    The necessity of a good methodology for the development of reliable software, especially with respect to the final software validation and testing activities, is discussed. A formal specification development and validation methodology is proposed. This methodology has been applied to the development and validation of pilot software incorporating typical features of critical software for nuclear power plant safety protection. The main features of the approach include the use of a formal specification language and the independent development of two sets of specifications. 1 ref.

  10. Gamification: Methodology to Engage and Motivate Students in the Learning Process

    Directory of Open Access Journals (Sweden)

    Inés ARAÚJO

    2016-05-01

    Full Text Available Gamification is a recent concept and is projected as a technological trend to be implemented in schools by 2017 (Johnson, Adams Becker Road & Freeman, 2014a, 2014b, 2014c). Currently the majority of application examples of Gamification, including in the educational context, use Buttons/Badges, Leaderboards and Scores. Several authors (Burke, 2014; Deterding, 2014; Kapp, 2012; Zichermann, 2013) emphasize that Gamification cannot be restricted to the mere application of these game mechanics to whatever context one wants to gamify. It is necessary to know the interests of the audience, their needs and what can motivate them, and to plan a gamified activity that meets these expectations. For all these reasons it is important to develop studies to understand how this process could be implemented more effectively in educational contexts, enabling the development of appropriate tools and creating guidelines that can guide those who want to include Gamification in their teaching practice. This article presents a literature review on the concept of Gamification, describing some relevant examples that make it easier to understand how it can be implemented, and proposing questions to ponder when applying this new methodology to educational contexts.

  11. Development of a numerical methodology for flowforming process simulation of complex geometry tubes

    Science.gov (United States)

    Varela, Sonia; Santos, Maite; Arroyo, Amaia; Pérez, Iñaki; Puigjaner, Joan Francesc; Puigjaner, Blanca

    2017-10-01

    Nowadays, the incremental flowforming process is widely explored because the usage of complex tubular products is increasing due to the light-weighting trend and the use of expensive materials. The enhanced mechanical properties of finished parts, combined with the process efficiency in terms of raw material and energy consumption, are the key factors for its competitiveness and sustainability, which is consistent with EU industry policy. As a promising technology, additional steps for extending the existing flowforming limits in the production of tubular products are required. The objective of the present research is to further expand the current state of the art regarding limitations on tube thickness and diameter, exploring the feasibility of flowforming complex geometries such as tubes with a thickness of up to 60 mm. In this study, the analysis of the backward flowforming process of a 7075 aluminum tubular preform is carried out to define the optimum process parameters, machine requirements and tooling geometry as a demonstration case. Numerical simulation studies on flowforming of thin-walled tubular components have been considered to increase knowledge of the technology. The calculation of the rotational movement of the preform mesh, the high thickness-to-length ratio and the thermomechanical conditions significantly increase the computation time of the numerical simulation model. This means that efficient and reliable tools able to predict the forming loads and the quality of flowformed thick tubes are not available. This paper aims to overcome this situation by developing a simulation methodology based on an FEM simulation code and including new strategies. Material characterization has also been performed through tensile tests in order to design the process. Finally, to check the reliability of the model, flowforming tests in an industrial environment have been performed.

  12. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetime, accelerated degradation testing (ADT) may be adopted during the product development phase to verify whether the product's reliability satisfies the predetermined level within a feasible test duration. Actual degradation in engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, a method for reliability demonstration by ADT with a monotonic degradation process has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply a Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method by converting the required reliability level into an allowable cumulative degradation in ADT and comparing the actual accumulated degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration by minimizing the asymptotic variance of the decision variable in reliability demonstration under the constraints of sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with an example on the reliability demonstration of an alloy product, and is finally applied to demonstrate the wear reliability of a spherical plain bearing over a long service duration. - Highlights: • We present a reliability demonstration method by ADT for products with a monotonic degradation process, which may be applied to verify the reliability of products with long service life within a feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from existing optimal ADT designs aimed at more accurate reliability estimation by its different objective function and different constraints. • The methods are applied to demonstrate the wear reliability within long service duration of
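
    The following sketch illustrates the demonstration idea above under simplifying assumptions: monotonic degradation is modeled as a stationary Gamma process, its parameters are estimated from test increments by the method of moments, and the implied reliability at the design life is compared with the target; the paper's allowance conversion, acceleration model and optimal test design are not reproduced, and all numbers are illustrative.

```python
# Gamma-process degradation sketch: estimate process parameters from inspection
# increments, then check reliability at the design life against a target.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated degradation increments (e.g. wear depth, mm) at equal inspection steps.
dt = 10.0                              # hours between inspections
true_alpha, true_beta = 0.02, 0.05     # Gamma process shape rate and scale (assumed)
increments = rng.gamma(shape=true_alpha * dt, scale=true_beta, size=20)

# Method-of-moments estimates for a stationary Gamma process:
# E[dX] = alpha*dt*beta, Var[dX] = alpha*dt*beta^2.
m, v = increments.mean(), increments.var(ddof=1)
beta_hat = v / m
alpha_hat = m / (beta_hat * dt)

# Reliability at the design life = P(cumulative degradation < failure threshold).
failure_threshold = 1.5                # mm
design_life = 1000.0                   # hours
R_hat = stats.gamma.cdf(failure_threshold, a=alpha_hat * design_life, scale=beta_hat)

R_target = 0.90
print(f"estimated reliability at design life: {R_hat:.3f}")
print("requirement demonstrated" if R_hat >= R_target else "requirement not demonstrated")
```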

  13. Definition of a short-cut methodology for assessing earthquake-related Na-Tech risk

    International Nuclear Information System (INIS)

    Busini, Valentina; Marzo, Enrico; Callioni, Andrea; Rota, Renato

    2011-01-01

    Highlights: → In industrial sites located in natural hazard-prone areas technological accidents may be triggered by natural events, generating the so-called Na-Tech accidents. → In this paper, a qualitative screening methodology for assessing the earthquake Na-Tech risk has been developed with the aim of identifying which situations deserve a much more expensive Quantitative Risk Analysis. → The simple procedure developed, which summarizes in a suitable Key Hazard Indicator the Na-Tech risk level, has been validated by comparing its results with those of some Quantitative Risk Analyses involving also Na-Tech events and previously presented in the literature. - Abstract: Na-Tech (Natural and Technological) refers to industrial accidents triggered by natural events such as storms, earthquakes, flooding, and lightning. Herein, a qualitative methodology for the initial assessment of earthquake Na-Tech risk has been developed as a screening tool to identify which situations require a much more expensive Quantitative Risk Analysis (QRA). The proposed methodology, through suitable Key Hazard Indicators (KHIs), identifies the Na-Tech risk level associated with a given situation (i.e., a process plant located in a given territory), using the Analytical Hierarchy Process as a multi-criteria decision tool for the evaluation of such KHIs. The developed methodology was validated by comparing its computational results with QRA results that involved Na-Tech events previously presented in literature.

  14. Systems engineering agile design methodologies

    CERN Document Server

    Crowder, James A

    2013-01-01

    This book examines the paradigm of the engineering design process. The authors discuss agile systems and engineering design. The book captures the entire design process (function bases), context, and requirements to effect real reuse. It provides a methodological foundation of the engineering design process for modern and future systems design. The book captures design patterns with context for actual Systems Engineering Design Reuse and contains a new paradigm in Design Knowledge Management.

  15. Pedagogical support of competence formation: methodological bases and experimental context

    OpenAIRE

    NABIEV VALERY SHARIFYANOVICH

    2016-01-01

    The article considers the problem of the methodological basis of the competence approach. It discusses topical issues of organizing a holistic educational process. The article presents original solutions created by the author and the results of experimental verification of the specified conditions of pedagogical support for educational and training activities.

  16. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies for the analysis of uncertainties employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). Uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e., to the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and their implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and at an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) selection of phenomenological branch events for uncertainty analysis in the APET, a methodology for quantification of APET uncertainty inputs and its implementation procedure, (3) statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes
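
    As a minimal, hedged sketch of item (3), statistical propagation of uncertain inputs through an APET, the snippet below samples two phenomenological branch probabilities from assumed Beta distributions and propagates them through a toy two-branch tree to a release-frequency distribution; the tree structure and all numbers are invented for illustration and do not come from the report.

```python
# Monte Carlo propagation of uncertain branch probabilities through a toy
# accident progression event tree. Distributions and values are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 10_000

# Expert-elicited uncertainty on two phenomenological branch probabilities,
# expressed here as Beta distributions (an assumption made for this sketch).
p_vessel_failure = rng.beta(a=8, b=2, size=n_samples)        # core melt -> vessel failure
p_containment_failure = rng.beta(a=2, b=8, size=n_samples)   # vessel failure -> containment failure

# Propagate through the tree: frequency of a "large release" source term category.
core_damage_frequency = 1e-5  # per reactor-year, treated as a point value here
large_release_frequency = core_damage_frequency * p_vessel_failure * p_containment_failure

mean = large_release_frequency.mean()
p5, p95 = np.percentile(large_release_frequency, [5, 95])
print(f"large-release frequency: mean={mean:.2e}, 5th={p5:.2e}, 95th={p95:.2e} per ry")
```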

  17. Utilizing Lean Six Sigma Methodology to Improve the Authored Works Command Approval Process at Naval Medical Center San Diego.

    Science.gov (United States)

    Valdez, Michelle M; Liwanag, Maureen; Mount, Charles; Rodriguez, Rechell; Avalos-Reyes, Elisea; Smith, Andrew; Collette, David; Starsiak, Michael; Green, Richard

    2018-03-14

    Inefficiencies in the command approval process for publications and/or presentations negatively impact DoD Graduate Medical Education (GME) residency programs' ability to meet ACGME scholarly activity requirements. A preliminary review of the authored works approval process at Naval Medical Center San Diego (NMCSD) disclosed significant inefficiency, variation in process, and a low level of customer satisfaction. In order to facilitate and encourage scholarly activity at NMCSD, and meet ACGME requirements, the Executive Steering Council (ESC) chartered an interprofessional team to lead a Lean Six Sigma (LSS) Rapid Improvement Event (RIE) project. Two major outcome metrics were identified: (1) the number of authored works submissions containing all required signatures and (2) customer satisfaction with the authored works process. Primary metric baseline data were gathered utilizing a Clinical Investigations database tracking publications and presentations. Secondary metric baseline data were collected via a customer satisfaction survey to GME faculty and residents. The project team analyzed pre-survey data and utilized LSS tools and methodology including a "gemba" (environment) walk, cause and effect diagram, critical to quality tree, voice of the customer, "muda" (waste) chart, and a pre- and post-event value stream map. The team selected an electronic submission system as the intervention most likely to positively impact the RIE project outcome measures. The number of authored works compliant with all required signatures improved from 52% to 100%. Customer satisfaction rated as "completely or mostly satisfied" improved from 24% to 97%. For both outcomes, signature compliance and customer satisfaction, statistical significance was achieved. The team applied LSS methodology and tools to improve signature compliance and increase customer satisfaction with the authored works approval process, leading to 100% signature compliance, a comprehensive longitudinal repository of all

  18. Optimization and Modeling of Process Variables of Biodiesel Production from Marula Oil using Response Surface Methodology

    International Nuclear Information System (INIS)

    Enweremadu, C. C.; Rutto, H. L.

    2015-01-01

    This paper presents an optimization study of biodiesel production from Marula oil. The study was carried out using a central composite design of experiments under response surface methodology. A mathematical model was developed to correlate the transesterification process variables to biodiesel yield. The transesterification reaction variables were methanol-to-oil ratio, x1 (10-50 wt%), reaction time, x2 (30-90 min), reaction temperature, x3 (30-90 °C), stirring speed, x4 (100-400 rpm) and amount of catalyst, x5 (0.5-1.5 g). The optimum conditions for the production of the biodiesel were found to be a methanol-to-oil ratio of 29.43 wt%, a reaction time of 59.17 minutes, a reaction temperature of 58.80 °C, a stirring speed of 325 rpm and a catalyst amount of 1.02 g. The optimum yield of biodiesel that can be produced was 95%. The results revealed that the crucial fuel properties of the biodiesel produced at the optimum conditions met the ASTM biodiesel specifications. (author)
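
    As a hedged illustration of the optimization step, the sketch below maximizes a fitted second-order response surface for yield over coded variable bounds; the model coefficients are placeholders rather than the paper's fitted values, and only two of the five variables are shown for brevity.

```python
# Maximizing a fitted quadratic response surface within coded bounds.
# Coefficients are placeholders, not the study's fitted model.
import numpy as np
from scipy.optimize import minimize

# yield = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2 (coded variables)
b = dict(b0=90.0, b1=2.0, b2=1.5, b12=-0.8, b11=-3.0, b22=-2.5)

def predicted_yield(x):
    x1, x2 = x
    return (b["b0"] + b["b1"] * x1 + b["b2"] * x2
            + b["b12"] * x1 * x2 + b["b11"] * x1**2 + b["b22"] * x2**2)

# Maximize yield = minimize its negative, within coded bounds [-1, 1].
res = minimize(lambda x: -predicted_yield(x), x0=np.zeros(2),
               bounds=[(-1, 1), (-1, 1)])
print("optimum (coded units):", np.round(res.x, 3),
      " predicted yield: %.1f %%" % predicted_yield(res.x))
```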

  19. Interfacing systems loss of coolant accident (ISLOCA) pressure capacity methodology and Davis-Besse results

    International Nuclear Information System (INIS)

    Wesley, D.A.

    1991-01-01

    A loss of coolant accident resulting from the overpressurization by reactor coolant fluid of a system designed for low-pressure, low-temperature service has been identified as a potential contributor to nuclear power plant risk. In this paper, the methodology developed to assess the probability of failure as a function of internal pressure is presented, and sample results developed for the controlling failure modes and locations of four fluid systems at the Davis-Besse Plant are shown. Included in this evaluation are the tanks, heat exchangers, filters, pumps, valves, and flanged connections for each system. The variability in the probability of failure is included, and the estimated leak rates or leak areas are given for the controlling modes of failure. For this evaluation, all failures are based on quasistatic pressures since the probability of dynamic effects resulting from such causes as water hammer has been initially judged to be negligible for the Davis-Besse plant ISLOCA.

  20. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana

    2016-06-01

    Full Text Available This paper presents a software engineering approach to a research proposal to build an Expert System for scheduling in service systems using methodologies and processes of software development. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We create UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology in order to identify the actors, elements and interactions in the research process.

  1. Variable identification in group method of data handling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Iraci Martinez, E-mail: martinez@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Bueno, Elaine Inacio [Instituto Federal de Educacao, Ciencia e Tecnologia, Guarulhos, SP (Brazil)

    2011-07-01

    The Group Method of Data Handling - GMDH is a combinatorial multi-layer algorithm in which a network of layers and nodes is generated using a number of inputs from the data stream being evaluated. The GMDH network topology has traditionally been determined using a layer-by-layer pruning process based on a preselected criterion of what constitutes the best nodes at each level. The traditional GMDH method is based on the underlying assumption that the data can be modeled using an approximation of the Volterra Series or Kolmogorov-Gabor polynomial. A Monitoring and Diagnosis System was developed based on GMDH and Artificial Neural Network - ANN methodologies, and applied to the IPEN research reactor IEA-R1. The GMDH was used to study the best set of variables to be used to train an ANN, resulting in the best estimate of the monitored variable. The system performs the monitoring by comparing these calculated estimates with measured values. The IPEN Reactor Data Acquisition System is composed of 58 variables (process and nuclear variables). As the GMDH is a self-organizing methodology, the choice of input variables is made automatically, and the actual input variables used in the Monitoring and Diagnosis System were not shown in the final result. This work presents a study of variable identification in the GMDH methodology by means of an algorithm that works in parallel with the GMDH algorithm and traces the paths of the initial variables, resulting in an identification of the variables that compose the best Monitoring and Diagnosis Model. (author)

  2. Variable identification in group method of data handling methodology

    International Nuclear Information System (INIS)

    Pereira, Iraci Martinez; Bueno, Elaine Inacio

    2011-01-01

    The Group Method of Data Handling - GMDH is a combinatorial multi-layer algorithm in which a network of layers and nodes is generated using a number of inputs from the data stream being evaluated. The GMDH network topology has traditionally been determined using a layer-by-layer pruning process based on a preselected criterion of what constitutes the best nodes at each level. The traditional GMDH method is based on the underlying assumption that the data can be modeled using an approximation of the Volterra Series or Kolmogorov-Gabor polynomial. A Monitoring and Diagnosis System was developed based on GMDH and Artificial Neural Network - ANN methodologies, and applied to the IPEN research reactor IEA-R1. The GMDH was used to study the best set of variables to be used to train an ANN, resulting in the best estimate of the monitored variable. The system performs the monitoring by comparing these calculated estimates with measured values. The IPEN Reactor Data Acquisition System is composed of 58 variables (process and nuclear variables). As the GMDH is a self-organizing methodology, the choice of input variables is made automatically, and the actual input variables used in the Monitoring and Diagnosis System were not shown in the final result. This work presents a study of variable identification in the GMDH methodology by means of an algorithm that works in parallel with the GMDH algorithm and traces the paths of the initial variables, resulting in an identification of the variables that compose the best Monitoring and Diagnosis Model. (author)

  3. Design and application of process control charting methodologies to gamma irradiation practices

    Science.gov (United States)

    Saylor, M. C.; Connaghan, J. P.; Yeadon, S. C.; Herring, C. M.; Jordan, T. M.

    2002-12-01

    The relationship between the contract irradiation facility and the customer has historically been based upon a "PASS/FAIL" approach with little or no quality metrics used to gage the control of the irradiation process. Application of process control charts, designed in coordination with mathematical simulation of routine radiation processing, can provide a basis for understanding irradiation events. By using tools that simulate the physical rules associated with the irradiation process, end-users can explore process-related boundaries and the effects of process changes. Consequently, the relationship between contractor and customer can evolve based on the derived knowledge. The resulting level of mutual understanding of the irradiation process and its resultant control benefits both the customer and contract operation, and provides necessary assurances to regulators. In this article we examine the complementary nature of theoretical (point kernel) and experimental (dosimetric) process evaluation, and the resulting by-product of improved understanding, communication and control generated through the implementation of effective process control charting strategies.
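
    As a hedged sketch of the charting idea discussed above, the snippet below builds a simple Shewhart individuals control chart for routine dosimetry readings; the dose values are invented, and the 2.66 x moving-range limits follow the standard individuals/moving-range convention rather than any limits from the article.

```python
# Individuals (X/MR) control chart sketch for routine dose measurements.
# Dose values are placeholders; limits use the standard 2.66 * MR-bar convention.
import numpy as np

doses = np.array([25.1, 24.8, 25.4, 25.0, 24.6, 25.3, 25.9, 24.7, 25.2, 24.9])  # kGy

center = doses.mean()
moving_range = np.abs(np.diff(doses))
mr_bar = moving_range.mean()
ucl = center + 2.66 * mr_bar   # upper control limit
lcl = center - 2.66 * mr_bar   # lower control limit

out_of_control = np.where((doses > ucl) | (doses < lcl))[0]
print(f"center={center:.2f} kGy, LCL={lcl:.2f}, UCL={ucl:.2f}")
print("out-of-control points:", out_of_control.tolist() if out_of_control.size else "none")
```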

  4. Comparison Of Irms Delhi Methodology With Who Methodology On Immunization Coverage

    Directory of Open Access Journals (Sweden)

    Singh Padam

    1996-01-01

    Full Text Available Research question: What are the merits of IRMS Model over WHO Model for Coverage Evaluation Survey? Which method is superior and appropriate for coverage evolution survey of immunization in our setting? Objective: To compare IRMS Delhi methodology with WHO methodology on Immunization Coverage. Study Design: Cross-Sectional Setting: Urban and Rural both. Participants: Mothers& Children Sample Size: 300 children between 1-2 years and 300 mothers in rural areas and 75 children and 75 mothers in urban areas. Study Variables: Rural, Urban, Cast-Group, Size of the stratum, Literacy, Sex and Cost effectiveness. Outcome Variables: Coverage level of immunization. Analysis: Routine Statistical Analysis. Results: IRMS developed methodology scores better rating over WHO methodology, especially when coverage evolution is attempted in medium size villages with existence of socio-economic seggregation-which remains the main characteristic of the Indian villages.

  5. A methodology to generate statistically dependent wind speed scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Morales, J.M.; Conejo, A.J. [Department of Electrical Engineering, Univ. Castilla - La Mancha, Campus Universitario s/n, 13071 Ciudad Real (Spain); Minguez, R. [Environmental Hydraulics Institute 'IH Cantabria', Univ. Cantabria, Avenida de los Castros s/n, 39005 Santander (Spain)

    2010-03-15

    Wind power - a renewable energy source increasingly attractive from an economic viewpoint - constitutes an electricity production alternative of growing relevance in current electric energy systems. However, wind power is an intermittent source that cannot be dispatched at the will of the producer. Modeling wind power production requires characterizing wind speed at the sites where the wind farms are located. The wind speed at a particular location can be described through a stochastic process that is spatially correlated with the stochastic processes describing wind speeds at other locations. This paper provides a methodology to characterize the stochastic processes pertaining to wind speed at different geographical locations via scenarios. Each one of these scenarios embodies time dependencies and is spatially dependent of the scenarios describing other wind stochastic processes. The scenarios generated by the proposed methodology are intended to be used within stochastic programming decision models to make informed decisions pertaining to wind power production. The methodology proposed is accurate in reproducing wind speed historical series as well as computationally efficient. A comprehensive case study is used to illustrate the capabilities of the proposed methodology. Appropriate conclusions are finally drawn. (author)

  6. A methodology to generate statistically dependent wind speed scenarios

    International Nuclear Information System (INIS)

    Morales, J.M.; Minguez, R.; Conejo, A.J.

    2010-01-01

    Wind power - a renewable energy source increasingly attractive from an economic viewpoint - constitutes an electricity production alternative of growing relevance in current electric energy systems. However, wind power is an intermittent source that cannot be dispatched at the will of the producer. Modeling wind power production requires characterizing wind speed at the sites where the wind farms are located. The wind speed at a particular location can be described through a stochastic process that is spatially correlated with the stochastic processes describing wind speeds at other locations. This paper provides a methodology to characterize the stochastic processes pertaining to wind speed at different geographical locations via scenarios. Each of these scenarios embodies time dependencies and is spatially dependent on the scenarios describing other wind stochastic processes. The scenarios generated by the proposed methodology are intended to be used within stochastic programming decision models to make informed decisions pertaining to wind power production. The proposed methodology reproduces historical wind speed series accurately and is computationally efficient. A comprehensive case study is used to illustrate the capabilities of the proposed methodology. Appropriate conclusions are finally drawn.

  7. Study of methodology diversification in diagnostics

    International Nuclear Information System (INIS)

    Suda, Kazunori; Yonekawa, Tsuyoshi; Yoshikawa, Shinji; Hasegawa, Makoto

    1999-03-01

    There are several research activities aimed at enhancing the safety and reliability of nuclear power plant operation and maintenance. We are developing a concept of an autonomous operation system in which the role of operators is replaced by artificial intelligence. The purpose of the study described in this report is to develop an operator support system for abnormal plant situations. Conventionally, diagnostic modules based on individual methodologies such as expert systems have been developed and verified. In this report, methodology diversification is considered in order to integrate diagnostic modules whose performance has been confirmed using information processing techniques. Technical issues to be considered in diagnostic methodology diversification are: 1) reliability of input data, 2) diversification of knowledge models, algorithms and reasoning schemes, 3) mutual complementarity and robustness. The diagnostic modules utilizing the different approaches defined by the diversification strategy were evaluated using a fast breeder plant simulator. As a result, we confirmed that no single diagnostic module can meet the accuracy criteria for the entire set of anomaly events. In contrast, every abnormality could be precisely diagnosed by a mutual combination of modules. In other words, the legitimacy of the approach selected by the diversification strategy was shown, and methodology diversification attained clear efficiency for abnormality diagnosis. It has also been confirmed that the diversified diagnostic system implemented in this study is able to maintain its accuracy even when the scale of the encountered abnormality differs from the reference cases embedded in the knowledge base. (author)

  8. Properties of ABNT 41xx and 86xx cast steel modified with niobium; evaluation methodology and experimental preliminary results

    International Nuclear Information System (INIS)

    Suzczynski, E.F.; Chatterjee, S.; Mueller, Arno

    1982-01-01

    The experimental methodology to evaluate the mechanical properties of ABNT 41xx and 86xx steels modified with Nb in the as-cast and heat-treated conditions, and the first preliminary results obtained on a laboratory scale, are presented. (Author) [pt

  9. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal-oriented decomposition of the plant purpose into the means of achieving that purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follows directly...

  10. CHARACTERIZATION OF SMALL AND MEDIUM ENTERPRISES (SMEs) OF POMERANIAN REGION IN SIX SIGMA METHODOLOGY APPLICATION

    Directory of Open Access Journals (Sweden)

    2011-12-01

    Full Text Available Background: Six Sigma is related to product characteristics and to the parameters of the actions needed to obtain these products. It is also a multi-step, cyclic process aimed at improvements leading to a global standard close to perfection. There is a growing interest in the Six Sigma methodology among smaller organizations, but there are still too few publications presenting its use in the sector of small and medium enterprises, especially ones based on sound empirical results. It was already noticed during the preliminary research that only a small part of the companies from this sector in the Pomeranian region use elements of this methodology. Methods: The companies were divided into groups by the type of their activities as well as by employment size. The questionnaires were sent to 150 randomly selected organizations in two steps and were addressed to senior managers. The questionnaire contained questions about basic information on the company, the level of knowledge and practical application of the Six Sigma methodology, opinions about improvements of processes occurring in the company, and opinions about training in the Six Sigma methodology. Results: The following hypotheses were proposed, statistically verified and answered: The lack of adequate knowledge of the Six Sigma methodology in SMEs limits the possibility to effectively monitor and improve processes - accepted. The use of statistical tools of the Six Sigma methodology requires broad action to popularize this knowledge among national SMEs - accepted. The level of awareness of the importance as well as practical use of the Six Sigma methodology in manufacturing SMEs is higher than in SMEs providing services - rejected, the levels are equal. The level of knowledge and use of the Six Sigma methodology in medium manufacturing companies is significantly higher than in small manufacturing companies - accepted. The level of the knowledge and the application

  11. Guidance on the Technology Performance Level (TPL) Assessment Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Jochem [National Renewable Energy Lab. (NREL), Golden, CO (United States); Roberts, Jesse D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Babarit, Aurelien [Ecole Centrale de Nantes (France). Lab. of Research in Hydrodynamics, Energetics and Atmospheric Environment (LHEEA); Costello, Ronan [Wave Venture, Penstraze (United Kingdom); Bull, Diana L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neilson, Kim [Ramboll, Copenhagen (Denmark); Bittencourt, Claudio [DNV GL, London (United Kingdom); Kennedy, Ben [Wave Venture, Penstraze (United Kingdom); Malins, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dykes, Katherine [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    This document presents the revised Technology Performance Level (TPL) assessment methodology. There are three parts to this revised methodology: 1) the Stakeholder Needs and Assessment Guidance (this document), 2) the Technical Submission form, and 3) the TPL scoring spreadsheet. The TPL assessment is designed to give a technology-neutral, or agnostic, assessment of any wave energy converter technology. The focus of the TPL is on the performance of the technology in meeting the customer's needs. The original TPL is described in [1, 2], and those references also detail the critical differences in the nature of the TPL when compared to the more widely used technology readiness level (TRL). (Wave energy TRL is described in [3].) The revised TPL is particularly intended to be useful to investors and also to assist technology developers in conducting comprehensive assessments in a way that is meaningful and attractive to investors. The revised TPL assessment methodology has been derived through a structured Systems Engineering approach. This was a formal process which involved analyzing customer and stakeholder needs through the discipline of Systems Engineering. The results of the process confirmed the high level of completeness of the original methodology presented in [1] (as used in the Wave Energy Prize judging) and now add a significantly increased level of detail in the assessment and an improved, more investment-focused structure. The revised TPL also incorporates the feedback of the Wave Energy Prize judges.

  12. Design methodology and results evaluation of a heating functionality in modular lab-on-chip systems

    Science.gov (United States)

    Streit, Petra; Nestler, Joerg; Shaporin, Alexey; Graunitz, Jenny; Otto, Thomas

    2018-06-01

    Lab-on-a-chip (LoC) systems offer the opportunity of fast and customized biological analyses executed at the 'point of need' without expensive lab equipment. Some biological processes need a temperature treatment. Therefore, it is important to ensure a defined and stable temperature distribution in the biosensor area. An integrated heating functionality is realized with discrete resistive heating elements including temperature measurement. The focus of this contribution is a design methodology and evaluation technique for the temperature distribution in the biosensor area with regard to the thermal-electrical behaviour of the heat sources. Furthermore, a sophisticated control of the biosensor temperature is proposed. A finite element (FE) model with one or more integrated heat sources in a polymer-based LoC system is used to investigate the impact of the number and arrangement of heating elements on the temperature distribution around the heating elements and in the biosensor area. Based on this model, various LoC systems are designed and fabricated. Electrical characterization of the heat sources and independent temperature measurements with infrared techniques are performed to verify the model parameters and confirm the simulation approach. The FE model and the proposed methodology are the foundation for optimization and evaluation of new designs with regard to temperature requirements of the biosensor. Furthermore, a linear dependency of the heater temperature on the electric current is demonstrated in the targeted temperature range of 20 °C to 70 °C, enabling the usage of the heating functionality for biological reactions requiring a steady-state temperature up to 70 °C. The correlation between heater and biosensor area temperature is derived for direct control through the heating current.
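
    The abstract reports a linear dependency of heater temperature on heating current within 20 °C to 70 °C but does not give the coefficients; the sketch below shows, with hypothetical calibration data, how such a linear relation could be fitted and inverted for direct control through the heating current.

```python
import numpy as np

# Hypothetical calibration data: heating current (mA) vs. measured heater temperature (deg C)
current_mA = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
temp_C = np.array([20.1, 30.2, 39.8, 50.3, 60.1, 69.7])

# Linear model T = a * I + b, fitted by least squares
a, b = np.polyfit(current_mA, temp_C, deg=1)

def current_for_temperature(target_C):
    """Invert the linear calibration to get the heating current for a target temperature."""
    return (target_C - b) / a

print(f"T = {a:.3f} * I + {b:.2f}")
print(f"Current for 65 deg C: {current_for_temperature(65.0):.1f} mA")
```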

  13. Evaluation methodology for fixed-site physical protection systems

    International Nuclear Information System (INIS)

    Bennett, H.A.; Olascoaga, M.T.

    1980-01-01

    A system performance evaluation methodology has been developed to aid the Nuclear Regulatory Commission (NRC) in the implementation of new regulations designed to upgrade the physical protection of nuclear fuel cycle facilities. The evaluation methodology, called Safeguards Upgrade Rule Evaluation (SURE), provides a means of explicitly incorporating measures for highly important and often difficult-to-quantify performance factors, e.g., installation, maintenance, training and proficiency levels, compatibility of components in subsystems, etc. This is achieved by aggregating responses to component and system questionnaires through successive levels of a functional hierarchy developed for each primary performance capability specified in the regulations, 10 CFR 73.45. An overall measure of performance for each capability is the result of this aggregation process. This paper provides a description of SURE.

  14. A methodology for fault diagnosis in large chemical processes and an application to a multistage flash desalination process: Part I

    International Nuclear Information System (INIS)

    Tarifa, Enrique E.; Scenna, Nicolas J.

    1998-01-01

    This work presents a new strategy for fault diagnosis in large chemical processes (E.E. Tarifa, Fault diagnosis in complex chemical plants: plants of large dimensions and batch processes. Ph.D. thesis, Universidad Nacional del Litoral, Santa Fe, 1995). A special decomposition of the plant into sectors is made. Afterwards, each sector is studied independently. These steps are carried out in the off-line mode. They produce vital information for the diagnosis system. This system works in the on-line mode and is based on a two-tier strategy. When a fault occurs, the upper level identifies the faulty sector. Then, the lower level carries out an in-depth study that focuses only on the critical sectors to identify the fault. The loss of information produced by the process partition may cause spurious diagnoses. This problem is overcome at the second level using qualitative simulation and fuzzy logic. In the second part of this work, the new methodology is tested to evaluate its performance in practical cases. A multistage flash (MSF) desalination system is chosen because it is a complex system, with many recycles and variables to be supervised. The steps for the knowledge base generation and all the blocks included in the diagnosis system are analyzed. Evaluation of the diagnosis performance is carried out using a rigorous dynamic simulator.

  15. Improving timeliness and efficiency in the referral process for safety net providers: application of the Lean Six Sigma methodology.

    Science.gov (United States)

    Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A

    2010-01-01

    Designated primary care clinics largely serve low-income and uninsured patients who present a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.

  16. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines some of the main characteristics of these paradigms and methodologies, such as where they may be used for best results. Due to their popularity, power-flow-based MW-mile and short-run marginal cost pricing methodologies are covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies for pricing transmission services in Brazil. (author) 25 refs., 2 tabs.
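
    The pricing formulas themselves are not reproduced in the abstract; in the commonly cited form of the MW-mile method, each transmission user is charged a share of each line's cost proportional to the MW flow it causes on that line times the line length. The sketch below illustrates that allocation with purely hypothetical line data and flows.

```python
# Minimal MW-mile allocation sketch: each transaction pays, per line, a share of the
# line cost proportional to (MW flow it causes on the line) x (line length).
lines = {                      # line -> (length_km, annual_cost_usd)
    "L1": (120.0, 1_200_000.0),
    "L2": (80.0, 900_000.0),
}
flows = {                      # MW flow each transaction causes on each line (e.g. from a DC power flow)
    "transaction_A": {"L1": 50.0, "L2": 10.0},
    "transaction_B": {"L1": 30.0, "L2": 40.0},
}

def mw_mile_charges(lines, flows):
    charges = {t: 0.0 for t in flows}
    for line, (length_km, cost) in lines.items():
        usage = {t: abs(f.get(line, 0.0)) * length_km for t, f in flows.items()}
        total = sum(usage.values())
        if total > 0.0:
            for t in flows:
                charges[t] += cost * usage[t] / total   # pro rata to MW-miles on this line
    return charges

print(mw_mile_charges(lines, flows))
```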

  17. Methodology of shell structure reinforcement layout optimization

    Science.gov (United States)

    Szafrański, Tomasz; Małachowski, Jerzy; Damaziak, Krzysztof

    2018-01-01

    This paper presents an optimization process for a reinforced shell diffuser intended for a small wind turbine (rated power of 3 kW). The diffuser structure consists of multiple reinforcements and a metal skin. This kind of structure is suitable for optimization in terms of the selection of reinforcement density, stringer cross-sections, sheet thickness, etc. The optimization approach aims to reduce the amount of work to be done between the optimization process and the final product design. The proposed optimization methodology is based on the application of a genetic algorithm to generate the optimal reinforcement layout. The obtained results are the basis for modifying the existing Small Wind Turbine (SWT) design.

  18. Abstracts, Third Space Processing Symposium, Skylab results

    Science.gov (United States)

    1974-01-01

    Skylab experiments results are reported in abstracts of papers presented at the Third Space Processing Symposium. Specific areas of interest include: exothermic brazing, metals melting, crystals, reinforced composites, glasses, eutectics; physics of the low-g processes; electrophoresis, heat flow, and convection demonstrations flown on Apollo missions; and apparatus for containerless processing, heating, cooling, and containing materials.

  19. A multicriteria-based methodology for site prioritisation in sediment management.

    Science.gov (United States)

    Alvarez-Guerra, Manuel; Viguri, Javier R; Voulvoulis, Nikolaos

    2009-08-01

    Decision-making for sediment management is a complex task that incorporates the selection of areas for remediation and the assessment of options for any mitigation required. The application of Multicriteria Analysis (MCA) to rank different areas according to their need for sediment management provides a great opportunity for prioritisation, a first step in an integrated methodology that ultimately aims to assess and select suitable alternatives for managing the identified priority sites. This paper develops a methodology that starts with the delimitation of management units within areas of study, followed by the application of MCA methods that allow ranking of these management units according to their need for remediation. The proposed process considers not only scientific evidence on sediment quality, but also other relevant aspects such as social and economic criteria associated with such decisions. The methodology is illustrated with its application to the case study area of the Bay of Santander, in northern Spain, highlighting some of the implications of utilising different MCA methods in the process. It also uses site-specific data to assess the subjectivity in the decision-making process, mainly reflected through the assignment of the criteria weights and uncertainties in the criteria scores. Analysis of the sensitivity of the results to these factors is used as a way to assess the stability and robustness of the ranking as a first step of the sediment management decision-making process.
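
    The specific MCA methods applied in the paper are not detailed in the abstract; the simplest illustration of the underlying idea is a weighted-sum ranking of management units over normalized criteria, as sketched below. The units, criteria, weights and scores are hypothetical.

```python
import numpy as np

# Hypothetical management units scored against three criteria, all oriented so that
# higher values indicate a greater need for remediation.
units = ["Unit A", "Unit B", "Unit C", "Unit D"]
criteria = ["sediment contamination", "ecological sensitivity", "socio-economic pressure"]
scores = np.array([
    [0.9, 0.4, 0.8],
    [0.5, 0.7, 0.3],
    [0.7, 0.2, 0.6],
    [0.3, 0.9, 0.4],
])
weights = np.array([0.5, 0.2, 0.3])      # sum to 1; encode the decision-makers' priorities

# Min-max normalisation per criterion so all scores lie in [0, 1]
norm = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
need_index = norm @ weights              # weighted-sum "need for remediation" index

for unit, value in sorted(zip(units, need_index), key=lambda pair: -pair[1]):
    print(f"{unit}: {value:.2f}")
```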

  20. Features of the Manufacturing Vision Development Process

    DEFF Research Database (Denmark)

    Dukovska-Popovska, Iskra; Riis, Jens Ove; Boer, Harry

    2005-01-01

    This paper discusses the key features of the process of Manufacturing Vision Development, a process that enables companies to develop their future manufacturing concept. The basis for the process is a generic five-phase methodology (Riis and Johansen, 2003) developed as a result of ten years of action research. The methodology recommends wide participation of people from different hierarchical and functional positions, who engage in a relatively short, playful and creative process and come up with a vision (concept) for the future manufacturing system in the company. Based on three case studies of companies going through the initial phases of the methodology, this research identified the key features of the Manufacturing Vision Development process. The paper elaborates the key features by defining them, discussing how and when they can appear, and how they influence the process.

  1. A methodology for development of biocatalytic processes

    OpenAIRE

    Lima Ramos, Joana; Woodley, John; Tufvesson, Pär

    2013-01-01

    The potential advantages displayed by biocatalytic processes for organic synthesis (such as exquisite selectivity under mild operating conditions) have prompted an increasing number of processes running on a commercial scale. However, biocatalysis is still a fairly underutilised technology. As a relatively new technology, biocatalytic processes often do not immediately fulfil the required process metrics that are key for an economically and/or environmentally competitive process at an indust...

  2. ALTERNATIVE METHODOLOGIES FOR THE ESTIMATION OF LOCAL POINT DENSITY INDEX: MOVING TOWARDS ADAPTIVE LIDAR DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Z. Lari

    2012-07-01

    Full Text Available Over the past few years, LiDAR systems have been established as a leading technology for the acquisition of high-density point clouds over physical surfaces. These point clouds are processed for the extraction of geo-spatial information. Local point density is one of the most important properties of the point cloud; it highly affects the performance of data processing techniques and the quality of the information extracted from these data. Therefore, it is necessary to define a standard methodology for the estimation of local point density indices to be considered for the precise processing of LiDAR data. Current definitions of local point density indices, which only consider the 2D neighbourhood of individual points, are not appropriate for 3D LiDAR data and cannot be applied to laser scans from different platforms. In order to resolve the drawbacks of these methods, this paper proposes several approaches for the estimation of the local point density index which take into account the 3D relationship among the points and the physical properties of the surfaces they belong to. In the simplest approach, an approximate value of the local point density for each point is defined while considering the 3D relationship among the points. In the other approaches, the local point density is estimated by considering the 3D neighbourhood of the point in question and the physical properties of the surface which encloses this point. The physical properties of the surfaces enclosing the LiDAR points are assessed through eigen-value analysis of the 3D neighbourhood of individual points and adaptive cylinder methods. This paper discusses these approaches and highlights their impact on various LiDAR data processing activities (i.e., neighbourhood definition, region growing, segmentation, boundary detection, and classification). Experimental results from airborne and terrestrial LiDAR data verify the efficacy of considering local point density variation for
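
    The adaptive-cylinder and eigen-analysis variants are not detailed in the abstract; the most basic 3D local point density index simply counts the neighbours inside a sphere around each point and divides by the sphere volume, as in the brute-force sketch below over a synthetic point cloud.

```python
import numpy as np

rng = np.random.default_rng(1)
points = rng.uniform(0.0, 10.0, size=(2000, 3))   # synthetic point cloud (metres)

def local_point_density(points, radius=0.5):
    """Points per cubic metre inside a sphere of `radius` around each point.

    Brute force (O(n^2)); a real LiDAR tile would use a spatial index such as a k-d tree.
    """
    volume = 4.0 / 3.0 * np.pi * radius**3
    densities = np.empty(len(points))
    for i, p in enumerate(points):
        dist = np.linalg.norm(points - p, axis=1)
        neighbours = np.count_nonzero(dist <= radius) - 1   # exclude the point itself
        densities[i] = neighbours / volume
    return densities

d = local_point_density(points)
print(f"median density: {np.median(d):.1f} pts/m^3, min: {d.min():.1f}, max: {d.max():.1f}")
```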

  3. TEACHING AND LEARNING METHODOLOGIES SUPPORTED BY ICT APPLIED IN COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Jose CAPACHO

    2016-04-01

    Full Text Available The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory, Genetic-Cognitive Psychology Theory and Dialectics Psychology. Based on the theoretical framework, the following methodologies were developed: Game Theory, Constructivist Approach, Personalized Teaching, Problem Solving, Cooperative-Collaborative Learning, and Learning Projects using ICT. These methodologies were applied to the teaching-learning process during the Algorithms and Complexity - A&C course, which belongs to the area of Computer Science. The course develops the concepts of Computers, Complexity and Intractability, Recurrence Equations, Divide and Conquer, Greedy Algorithms, Dynamic Programming, Shortest Path Problem and Graph Theory. The main value of the research is the theoretical support of the methodologies and their application supported by ICT using learning objects. The aforementioned course was built on the Blackboard platform, evaluating the operation of the methodologies. The results of the evaluation are presented for each of them, showing the learning outcomes achieved by the students, which verifies that the methodologies are functional.

  4. Core design methodology and software for Temelin NPP

    International Nuclear Information System (INIS)

    Havluj, F; Hejzlar, J.; Klouzal, J.; Stary, V.; Vocka, R.

    2011-01-01

    In the frame of the fuel vendor change process at Temelin NPP in the Czech Republic, where, starting in 2010, TVEL TVSA-T fuel is loaded instead of Westinghouse VVANTAGE-6 fuel, new methodologies for core design and core reload safety evaluation have been developed. These documents are based on the methodologies delivered by TVEL within the fuel contract, and they were further adapted according to Temelin NPP operational needs and current practice at the NPP. Along with the methodology development, the 3D core analysis code ANDREA, licensed for core reload safety evaluation in 2010, has been upgraded in order to optimize the safety evaluation process. New sequences of calculations were implemented in order to simplify the evaluation of different limiting parameters, and output visualization tools were developed to make the verification process user-friendly. Interfaces to the fuel performance code TRANSURANUS and the sub-channel analysis code SUBCAL were developed as well. (authors)

  5. Using Modern Methodologies with Maintenance Software

    Science.gov (United States)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

    Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). The scheduling of tasks is difficult because mission needs must be addressed prior to performing any other tasks, and those needs often spring up unexpectedly. Keeping track of the tasks that everyone is working on is also difficult because each person is working on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks that are to be completed within a sprint based on priority. The team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. In the Scrum methodology there is a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology. MPS has many software applications in maintenance, team members who are working on disparate applications, and many users, and it is interruptible based on mission needs, issues, and requirements. In order to use Scrum, the methodology needed adaptation to MPS. Scrum was chosen because it is adaptable. This paper is about the development of the process for using Scrum, a new development methodology, with a team that works on disparate interruptible tasks on multiple software applications.

  6. NUSAM Methodology for Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Snell, Mark K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-07-01

    This document provides a methodology for the performance-based assessment of security systems designed for the protection of nuclear and radiological materials and the processes that produce and/or involve them. It is intended for use with both relatively simple installations and with highly regulated complex sites with demanding security requirements.

  7. Some Findings Concerning Requirements in Agile Methodologies

    Science.gov (United States)

    Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases in the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; in some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in agile approaches, and better understanding of this is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and management of requirements dependencies.

  8. Anthropogenic microfibres pollution in marine biota. A new and simple methodology to minimize airborne contamination.

    Science.gov (United States)

    Torre, Michele; Digka, Nikoletta; Anastasopoulou, Aikaterini; Tsangaris, Catherine; Mytilineou, Chryssi

    2016-12-15

    Research studies on the effects of microlitter on marine biota have become more and more frequent over the last few years. However, there is strong evidence that scientific results based on microlitter analyses can be biased by contamination from airborne fibres. This study demonstrates a low-cost and easy-to-apply methodology to minimize background contamination and thus increase the validity of results. Contamination during the gastrointestinal content analysis of 400 fishes was tested for several sample processing steps with a high risk of airborne contamination (e.g. dissection, stereomicroscopic analysis, and chemical digestion treatment for microlitter extraction). It was demonstrated that, using our methodology based on hermetic enclosure devices isolating the working areas during the various processing steps, airborne contamination was reduced by 95.3%. The simplicity and low cost of this methodology provide the benefit that it can be applied not only to laboratory work but also to field or on-board work. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Exploring Participatory Methodologies in Organizational Discourse Analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches such as single-level vs. multi-level, critical vs. participatory, and discursive vs. material methods. They raise methodological issues of combining such approaches to embrace multimodality in order to enable new contributions. Conceptual efforts have been made, but further exploration of methodological combinations and their practical implications is called for. This paper argues 1) to combine methodologies by approaching this as scholarly subjectification processes, and 2) to perform combinations in both...

  10. Establishment of Assessment Methodology Improvement of IAEA INPRO Proliferation Resistance

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. H.; Yang, M. S.; Song, K. C.; Ko, W. I.; Kim, H. D.; Kim, Y. I.; Rhee, B. W.; Kim, H. T.

    2008-03-15

    For the development of an assessment methodology for acquisition and diversion pathways of nuclear material, the PR assessment methodology developed by the GEN-IV PR and PP group was reviewed with regard to acquisition and diversion pathways of nuclear material, and we proposed research areas for developing a model of acquisition and diversion pathways of nuclear material, including misuse of fuel cycle facilities, together with one of the IAEA INPRO CRPs which aims to develop such a model. From the present study, a preliminary model for acquisition and diversion pathways of nuclear material was obtained. For a preliminary evaluation of the DUPIC system using the acquisition/diversion pathway methodology for nuclear material and a review of pyro-processing system characteristics, the research direction and work procedure were established to develop the assessment methodology for User Requirement 4 of INPRO PR by: 1) selection of the possible pathways to acquire and divert the nuclear material of the DUPIC system, 2) analysis of the selected pathways, 3) development of the assessment methodology for the robustness and multiplicity of an INS. In addition, the PR characteristics and process/material flow of the pyro-processing system were studied on a preliminary basis. To establish the R and D direction for an INS and to support international cooperative research, the collaborative research project titled 'Acquisition and Diversion Pathway Analysis of Proliferation Resistance', one of the activities of the IAEA INPRO, was proposed, since the Korean Government decided to actively support the IAEA INPRO. In order to review and clarify the Terms of Reference (TOR) of the Korean Proposed Collaborative Project (ROK1), two INPRO Consultancy Meetings were held. Their results were presented at two INPRO Steering Committees, and the finalized TOR of the Korean Proposal was submitted to the 12th INPRO Steering Committee Meeting, held Dec. 3-5, 2007. Four participants including the USA, Canada, China and the European Community (EC

  11. Establishment of Assessment Methodology Improvement of IAEA INPRO Proliferation Resistance

    International Nuclear Information System (INIS)

    Park, J. H.; Yang, M. S.; Song, K. C.; Ko, W. I.; Kim, H. D.; Kim, Y. I.; Rhee, B. W.; Kim, H. T.

    2008-03-01

    For the development of an assessment methodology for acquisition and diversion pathways of nuclear material, the PR assessment methodology developed by the GEN-IV PR and PP group was reviewed with regard to acquisition and diversion pathways of nuclear material, and we proposed research areas for developing a model of acquisition and diversion pathways of nuclear material, including misuse of fuel cycle facilities, together with one of the IAEA INPRO CRPs which aims to develop such a model. From the present study, a preliminary model for acquisition and diversion pathways of nuclear material was obtained. For a preliminary evaluation of the DUPIC system using the acquisition/diversion pathway methodology for nuclear material and a review of pyro-processing system characteristics, the research direction and work procedure were established to develop the assessment methodology for User Requirement 4 of INPRO PR by: 1) selection of the possible pathways to acquire and divert the nuclear material of the DUPIC system, 2) analysis of the selected pathways, 3) development of the assessment methodology for the robustness and multiplicity of an INS. In addition, the PR characteristics and process/material flow of the pyro-processing system were studied on a preliminary basis. To establish the R and D direction for an INS and to support international cooperative research, the collaborative research project titled 'Acquisition and Diversion Pathway Analysis of Proliferation Resistance', one of the activities of the IAEA INPRO, was proposed, since the Korean Government decided to actively support the IAEA INPRO. In order to review and clarify the Terms of Reference (TOR) of the Korean Proposed Collaborative Project (ROK1), two INPRO Consultancy Meetings were held. Their results were presented at two INPRO Steering Committees, and the finalized TOR of the Korean Proposal was submitted to the 12th INPRO Steering Committee Meeting, held Dec. 3-5, 2007. Four participants including the USA, Canada, China and the European Community (EC) have decided

  12. Adding value to the learning process by online peer review activities: towards the elaboration of a methodology to promote critical thinking in future engineers

    Science.gov (United States)

    Dominguez, Caroline; Nascimento, Maria M.; Payan-Carreira, Rita; Cruz, Gonçalo; Silva, Helena; Lopes, José; Morais, Maria da Felicidade A.; Morais, Eva

    2015-09-01

    Considering the results of research on the benefits and difficulties of peer review, this paper describes how teaching faculty, interested in endorsing the acquisition of communication and critical thinking (CT) skills among engineering students, have been implementing a learning methodology through online peer review activities. While introducing a new methodology, it is important to weigh the advantages found against the conditions that might have restrained the activity outcomes, thereby modulating its overall efficiency. Our results show that several factors are decisive for the success of the methodology: the use of specific and detailed orientation guidelines for CT skills, the students' training on how to deliver meaningful feedback, the opportunity to counter-argue, the selection of good examples of assignments, and the teacher's constant monitoring of the activity. Results also tackle other aspects of the methodology, such as the thinking-skills evaluation tools (grades and tests) that best suit our reality. An improved methodology is proposed taking into account the limitations encountered, thus offering other interested institutions the possibility to use, test and/or improve it.

  13. A new methodology for strategic planning using technological maps and detection of emerging research fronts applied to radiopharmacy

    International Nuclear Information System (INIS)

    Didio, Robert Joseph

    2011-01-01

    This research aims at the development of a new methodology to support strategic planning, using the process of elaboration of technological maps (TRM - Technological Roadmaps) associated with the detection of emerging research fronts in databases of scientific publications and patents. The innovation introduced in this research is the customization of the TRM process to radiopharmacy and, specifically, its association with the technique of detecting emerging research fronts, in order to validate results and to establish a new and very useful methodology for the strategic planning of this business area. The business unit DIRF - Diretoria de Radiofarmacia - of IPEN CNEN/SP was used as the basis for the study and implementation of the methodology presented in this work. (author)

  14. Review on Suitability of Available LCIA Methodologies for Assessing Environmental Impact of the Food Sector

    Directory of Open Access Journals (Sweden)

    Pegah Amani

    2011-12-01

    Full Text Available Production, processing, distribution, and consumption of a wide variety of products in the food sector have different ranges of environmental impacts. Methodologies used in environmental impact assessment differ in which set of impact categories is covered and which models are used to assess them. In the food sector, life cycle assessment results are mostly presented without any clear distinction of the principles applied in selecting the relevant methodology. In this paper, the most relevant life cycle impact assessment methodologies are determined from the list of recommended methodologies published recently in the international reference life cycle data system (ILCD) handbook. The range of the relevant impacts covered is considered the main decisive indicator in selecting a methodology. The selection of the relevant set of impact categories is performed through an overview of more than 50 recent LCA case studies of different products in the sector. The result of the research is a short list of three LCIA methodologies recommended for environmental impact assessment of products in the food sector.

  15. Optimisation on processing parameters for minimising warpage on side arm using response surface methodology (RSM) and particle swarm optimisation (PSO)

    Science.gov (United States)

    Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.; Sazli, M.; Yahya, Z. R.

    2017-09-01

    This study presents the application of optimisation methods to reduce the warpage of a side arm part. Autodesk Moldflow Insight software was integrated into this study to analyse the warpage. The Design of Experiments (DOE) for Response Surface Methodology (RSM) was constructed, and using the equation from RSM, Particle Swarm Optimisation (PSO) was applied. The optimisation method results in optimised processing parameters with minimum warpage. Mould temperature, melt temperature, packing pressure, packing time and cooling time were selected as the variable parameters. Parameter selection was based on the most significant factors affecting warpage as stated by previous researchers. The results show that warpage was improved by 28.16% for RSM and 28.17% for PSO. The warpage improvement of PSO over RSM is only 0.01%. Thus, the optimisation using RSM is already efficient enough to give the best combination of parameters and optimum warpage value for the side arm part. The most significant parameter affecting warpage is packing pressure.
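
    Neither the fitted RSM equation nor the PSO settings are given in the abstract; the sketch below shows how a particle swarm could minimize a generic quadratic response surface in two of the listed parameters. The surface coefficients, parameter bounds and swarm constants are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(2)

def warpage(x):
    """Hypothetical quadratic response surface in (melt temperature, packing pressure)."""
    t, p = x[..., 0], x[..., 1]
    return 0.9 + 0.002 * (t - 230) ** 2 + 0.004 * (p - 85) ** 2 - 0.0008 * (t - 230) * (p - 85)

lo, hi = np.array([200.0, 60.0]), np.array([260.0, 110.0])    # parameter bounds
n, iters, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5                   # swarm size and PSO constants

pos = rng.uniform(lo, hi, size=(n, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), warpage(pos)
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    val = warpage(pos)
    better = val < pbest_val
    pbest[better], pbest_val[better] = pos[better], val[better]
    gbest = pbest[pbest_val.argmin()].copy()

print(f"optimum ~ melt T = {gbest[0]:.1f} C, packing P = {gbest[1]:.1f} MPa, warpage = {warpage(gbest):.3f} mm")
```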

  16. Action research methodology in clinical pharmacy

    DEFF Research Database (Denmark)

    Nørgaard, Lotte Stig; Sørensen, Ellen Westh

    2016-01-01

    Introduction: The focus in clinical pharmacy practice is, and has for the last 30-35 years been, on changing the role of pharmacy staff towards service orientation and patient counselling. One way of doing this is by involving staff in the change process and, as a researcher, taking part in the change process by establishing partnerships with staff. On the background of the authors' widespread action research (AR)-based experience, recommendations and comments on how to conduct an AR study are described, and one of their AR-based studies illustrates the methodology and the research methods used. Methodology: AR is defined as an approach to research which is based on a problem-solving relationship between researchers and clients, and which aims both at solving a problem and at collaboratively generating new knowledge. Research questions relevant in AR studies are: what was the working process in this change-oriented...

  17. Methodology for assessment of safety risk due to potential accidents in US gaseous diffusion plants

    International Nuclear Information System (INIS)

    Turner, J.H.; O'Kain, D.U.

    1991-01-01

    Gaseous diffusion plants that operate in the United States represent a unique combination of nuclear and chemical hazards. Assessing and controlling the health, safety, and environmental risks that can result from natural phenomena events, process upset conditions, and operator errors require a unique methodology. Such a methodology has been developed for the diffusion plants and is being utilized to assess and control the risk of operating the plants. A summary of the methodology developed to assess the unique safety risks at the US gaseous diffusion plants is presented in this paper

  18. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of research presented in this paper refers to the definition of a methodology for the development of credit analysis in companies and its application in lending operations in the Republic of Serbia. With the developing credit market, there is a growing need for a well-developed risk and loss prevention system. In the introduction, the process of bank analysis of the loan applicant is presented in order to minimize and manage credit risk. By examining the subject matter, the process of handling the credit application is described, as is the procedure for analyzing the financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework is presented as applied in a concrete company. In the third part, models are presented which banks should use to protect against risk exposure, i.e. their goal is to reduce losses on loan operations in our country, as well as to adjust to market conditions in an optimal way.

  19. A Human-Centered Design Methodology to Enhance the Usability, Human Factors, and User Experience of Connected Health Systems: A Three-Phase Methodology

    Science.gov (United States)

    Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid

    2017-01-01

    Background Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. Results We report a successful implementation of the

  20. Methodology for the insecticide potential evaluation of forest species

    International Nuclear Information System (INIS)

    Morales Soto, Leon; Garcia P, Carlos Mario

    2000-01-01

    The flora diversity of Colombia has enormous potential for the rational use of its forest resources. Trees with biocidal effects for the control of pests and diseases need to be investigated. The objective of this research was to develop a methodology with low cost, easy application and quick results. The methodology employed was as follows: selection of tree species based on the bibliography, ancestral reports and personal observations; field collection of plants; preparation of plant extracts; tests with Artemia salina Leach to detect biological activity of the extracts using the LC50; and bioassays with the most promising extracts (LC50 less than 1000 ppm). The methodology was employed with 5 forest tree species: Guarea guidonia (L.) Sleumer and Trichilia hirta L. (Meliaceae), Machaerium moritzianum Benth. (Fabaceae), Swinglea glutinosa Merrill (Rutaceae) and Mammea americana L. (Clusiaceae). Using Artemia salina Leach as an indicator of biocidal potential, two species were selected as the most promising: Swinglea glutinosa Merrill and Machaerium moritzianum Benth. In addition, bioassays were made to evaluate phagoinhibition on Atta cephalotes (L.) and control of Alconeura. This methodology is recommended for this kind of research.

  1. A framework for using simulation methodology in ergonomics interventions in design projects

    DEFF Research Database (Denmark)

    Broberg, Ole; Duarte, Francisco; Andersen, Simone Nyholm

    2014-01-01

    The aim of this paper is to outline a framework of simulation methodology in design processes from an ergonomics perspective...

  2. Methodology for stereoscopic motion-picture quality assessment

    Science.gov (United States)

    Voronov, Alexander; Vatolin, Dmitriy; Sumin, Denis; Napadovsky, Vyacheslav; Borisov, Alexey

    2013-03-01

    Creating and processing stereoscopic video imposes additional quality requirements related to view synchronization. In this work we propose a set of algorithms for detecting typical stereoscopic-video problems, which appear owing to imprecise setup of capture equipment or incorrect postprocessing. We developed a methodology for analyzing the quality of S3D motion pictures and for revealing their most problematic scenes. We then processed 10 modern stereo films, including Avatar, Resident Evil: Afterlife and Hugo, and analyzed changes in S3D-film quality over the years. This work presents real examples of common artifacts (color and sharpness mismatch, vertical disparity and excessive horizontal disparity) in the motion pictures we processed, as well as possible solutions for each problem. Our results enable improved quality assessment during the filming and postproduction stages.

  3. Applicability of product-driven process synthesis to separation processes in food

    NARCIS (Netherlands)

    Jankowiak, L.; Goot, van der A.J.; Trifunovic, O.; Bongers, P.; Boom, R.M.

    2012-01-01

    The demand for more sustainable processing in the food industry is rising but requires structured methodologies to support the fast implementation of new economic and sustainable processes. Product-driven process synthesis (PDPS) is a recently established methodology facilitating the rapid

  4. Measuring Instruments Control Methodology Performance for Analog Electronics Remote Labs

    Directory of Open Access Journals (Sweden)

    Unai Hernandez-Jayo

    2012-12-01

    Full Text Available This paper presents the work that has been developed in parallel to the VISIR project. The objective of this paper is to present the results of the validation processes that have been carried out to check the control methodology. This method has been developed with the aim of being independent of the instruments of the labs.

  5. Implementation impacts of PRL methodology

    International Nuclear Information System (INIS)

    Caudill, J.A.; Krupa, J.F.; Meadors, R.E.; Odum, J.V.; Rodrigues, G.C.

    1993-02-01

    This report responds to a DOE-SR request to evaluate the impacts from implementation of the proposed Plutonium Recovery Limit (PRL) methodology. The PRL Methodology is based on cost minimization for decisions to discard or recover plutonium contained in scrap, residues, and other plutonium bearing materials. Implementation of the PRL methodology may result in decisions to declare as waste certain plutonium bearing materials originally considered to be a recoverable plutonium product. Such decisions may have regulatory impacts, because any material declared to be waste would immediately be subject to provisions of the Resource Conservation and Recovery Act (RCRA). The decision to discard these materials will have impacts on waste storage, treatment, and disposal facilities. Current plans for the de-inventory of plutonium processing facilities have identified certain materials as candidates for discard based upon the economic considerations associated with extending the operating schedules for recovery of the contained plutonium versus potential waste disposal costs. This report evaluates the impacts of discarding those materials as proposed by the F Area De-Inventory Plan and compares the De-Inventory Plan assessments with conclusions from application of the PRL. The impact analysis was performed for those materials proposed as potential candidates for discard by the De-Inventory Plan. The De-Inventory Plan identified 433 items, containing approximately 1% of the current SRS Pu-239 inventory, as not appropriate for recovery as the site moves to complete the mission of F-Canyon and FB-Line. The materials were entered into storage awaiting recovery as product under the Department's previous Economic Discard Limit (EDL) methodology which valued plutonium at its incremental cost of production in reactors. An application of Departmental PRLs to the subject 433 items revealed that approximately 40% of them would continue to be potentially recoverable as product plutonium

  6. A methodology for Electric Power Load Forecasting

    Directory of Open Access Journals (Sweden)

    Eisa Almeshaiei

    2011-06-01

    Full Text Available Electricity demand forecasting is a central and integral process for planning periodical operations and facility expansion in the electricity sector. The demand pattern is highly complex due to the deregulation of energy markets. Therefore, finding an appropriate forecasting model for a specific electricity network is not an easy task. Although many forecasting methods have been developed, none can be generalized for all demand patterns. Therefore, this paper presents a pragmatic methodology that can be used as a guide to construct Electric Power Load Forecasting models. This methodology is mainly based on decomposition and segmentation of the load time series. Several statistical analyses are involved to study the load features and forecasting precision, such as moving averages and probability plots of load noise. Real daily load data from the Kuwaiti electric network are used as a case study. Some results are reported to guide forecasting of the future needs of this network.
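
    The decomposition itself is not spelled out in the abstract; a minimal illustration of splitting a load series into a moving-average trend and residual noise, on synthetic data standing in for real network loads, could look like this.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic daily peak-load series (MW): linear trend + weekly cycle + noise
days = np.arange(365)
load = 6000 + 2.0 * days + 300 * np.sin(2 * np.pi * days / 7) + rng.normal(0, 80, days.size)

def moving_average(x, window=7):
    """Centred moving average; the output is shorter by window - 1 samples."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

trend = moving_average(load, window=7)
residual = load[3:-3] - trend          # a 7-point centred window trims 3 samples on each side

print(f"trend range: {trend.min():.0f}-{trend.max():.0f} MW")
print(f"residual std (load 'noise'): {residual.std():.1f} MW")
```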

  7. Methodology and analysis of production safety during Pu recycling at SSC RF RIAR

    International Nuclear Information System (INIS)

    Kirillovich, A.P.

    2000-01-01

    The methodology and criteria for estimating safety in technological processes of the nuclear fuel cycle (NFC) are proposed, substantiated and verified during large-scale Pu recycling (500 kg). Comprehensive investigation results on the radiation-ecological situation during pilot production of mixed uranium-plutonium fuel and fuel assemblies at SSC RF RIAR are presented. The methodology and experimental data bank can be used for estimating safety in the industrial recycling of Pu and minor actinides (Np, Am, Cm) in the NFC. (author)

  8. The 5S methodology as a tool for improving the organisation

    OpenAIRE

    J. Michalska; D. Szewieczek

    2007-01-01

    Purpose: The aim of this paper is to present the 5S methodology and the way of implementing it in a company. Design/methodology/approach: In the frame of our own research, the 5S rules were analysed and implemented in the production process. Findings: On the basis of our own research it can be stated that introducing the 5S rules brings great changes in the company, for example: process improvement through cost reduction, increasing of effectivene...

  9. Key Features of the Manufacturing Vision Development Process

    DEFF Research Database (Denmark)

    Dukovska-Popovska, Iskra; Riis, Jens Ove; Boer, Harry

    2005-01-01

    This paper discusses the key features of the process of Manufacturing Vision Development, a process that enables companies to develop their future manufacturing concept. The basis for the process is a generic five-phase methodology (Riis and Johansen 2003) developed as a result of ten years of action research. The methodology recommends wide participation of people from different hierarchical and functional positions, who engage in a relatively short, playful and creative process and come up with a vision (concept) for the future manufacturing system in the company. Based on three case studies of companies going through the initial phases of the methodology, this research identified the key features of the Manufacturing Vision Development process. The paper elaborates the key features by defining them, discussing how and when they can appear, and how they influence the process.

  10. Production of Heat Sensitive Monoacylglycerols by Enzymatic Glycerolysis in Tert-pentanol: Process Optimization by Response Surface Methodology

    DEFF Research Database (Denmark)

    Damstrup, Marianne L.; Jensen, Tine; Sparsø, Flemming V.

    2006-01-01

    The aim of this study was to optimize production of MAG by lipase-catalyzed glycerolysis in a tert-pentanol system. Twenty-nine batch reactions consisting of glycerol, sunflower oil, tert-pentanol, and commercially available lipase (Novozym®435) were carried out, with four process parameters being varied: enzyme load, reaction time, substrate ratio of glycerol to oil, and solvent amount. Response surface methodology was applied to optimize the reaction system based on the experimental data achieved. MAG, DAG, and TAG contents, measured after a selected reaction time, were used as model responses. Well-fitting quadratic models were obtained for MAG, DAG, and TAG contents as a function of the process parameters, with determination coefficients (R2) of 0.89, 0.88, and 0.92, respectively. Of the main effects examined, only enzyme load and reaction time significantly influenced MAG, DAG, and TAG...
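
    As an illustration of the response-surface step described above (not the study's actual data or model), the sketch below fits a second-order polynomial to synthetic batch results for two coded factors and reports the determination coefficient.

    ```python
    # Illustrative sketch: fit a quadratic response-surface model
    # y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 by least squares
    # and report R^2. The 29 "batch" points below are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(0)
    x1 = rng.uniform(-1, 1, 29)          # coded factor 1 (e.g., enzyme load)
    x2 = rng.uniform(-1, 1, 29)          # coded factor 2 (e.g., reaction time)
    y = 40 + 8*x1 + 5*x2 - 6*x1**2 - 3*x2**2 + 2*x1*x2 + rng.normal(0, 1.5, 29)  # synthetic MAG %

    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ beta
    r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
    print("coefficients:", np.round(beta, 2), " R^2 =", round(r2, 3))
    ```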

  11. METHODOLOGY OF DEVELOPMENT OF LINGUOSOCIOCULTURAL COMPETENCE OF THE FUTURE ENGLISH LANGUAGE TEACHERS IN THE PROCESS OF READING FICTION

    Directory of Open Access Journals (Sweden)

    Мар’яна Нацюк

    2014-10-01

    Full Text Available The article specifies a methodology for developing the linguosociocultural competence of future English language teachers in the process of reading fiction. The stages, and the aim of each stage, of linguosociocultural competence development are distinguished, and the requirements for the exercises and their typology are defined. A system of exercises for linguosociocultural competence development is suggested, consisting of subsystems for developing sociocultural, sociolinguistic and social competences, together with groups of exercises for developing sociocultural, sociolinguistic and social knowledge and skills.

  12. Community dialogues for child health: results from a qualitative process evaluation in three countries.

    Science.gov (United States)

    Martin, Sandrine; Leitão, Jordana; Muhangi, Denis; Nuwa, Anthony; Magul, Dieterio; Counihan, Helen

    2017-06-05

    Across the developing world, countries are increasingly adopting the integrated community case management of childhood illnesses (iCCM) strategy in efforts to reduce child mortality. This intervention's effectiveness is dependent on community adoption and changes in care-seeking practices. We assessed the implementation process of a theory-driven community dialogue (CD) intervention specifically designed to strengthen the support and uptake of the newly introduced iCCM services and related behaviours in three African countries. A qualitative process evaluation methodology was chosen and used secondary project data and primary data collected in two districts of each of the three countries, in purposefully sampled communities. The final data set included 67 focus group discussions and 57 key informant interviews, totalling 642 respondents, including caregivers, CD facilitators, community leaders, and trainers. Thematic analysis of the data followed the 'Framework Approach', utilising both deductive and inductive processes. Results show that CDs contribute to triggering community uptake of and support for iCCM services through filling health information gaps and building cooperation within communities. We found it to be an effective approach for addressing social norms around child care practices. This approach was embraced by communities for its flexibility and value in planning individual and collective change. Regular CDs can contribute to the formation of new habits, particularly in relation to seeking timely care in case of child sickness. This study also confirms the value of process evaluation to unwrap the mechanisms of community mobilisation approaches in context and provides key insights for improving the CD approach.

  13. Problem solving using soft systems methodology.

    Science.gov (United States)

    Land, L

    This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.

  14. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    International Nuclear Information System (INIS)

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    The Framatome ANP Realistic Large-Break LOCA methodology (FANP RLBLOCA) is an analysis approach approved by the US NRC for supporting the licensing basis of 3- and 4-loop Westinghouse PWRs and CE 2x4 PWRs. It was developed consistent with the NRC's Code Scaling, Applicability, and Uncertainty (CSAU) methodology for performing best-estimate large-break LOCAs. The CSAU methodology consists of three key elements with the second and third element addressing uncertainty identification and application. Unique to the CSAU methodology is the use of engineering judgment and the Process Identification and Ranking Table (PIRT) defined in the first element to lay the groundwork for achieving the ultimate goal of quantifying the total uncertainty in predicted measures of interest associated with the large-break LOCA. It is the PIRT that not only directs the methodology development, but also directs the methodology review. While the FANP RLBLOCA methodology was generically approved, a plant-specific application is customized in two ways addressing how the unique plant characterization 1) is translated to code input and 2) relates to the unique methodology licensing requirements. Related to the former, plants are required by 10 CFR 50.36 to define a technical specification limiting condition for operation based on the following criteria: 1. Installed instrumentation that is used in the control room to detect, and indicate, a significant abnormal degradation of the reactor coolant pressure boundary. 2. A process variable, design feature, or operating restriction that is an initial condition of a design basis accident or transient analysis that either assumes the failure of or presents a challenge to the integrity of a fission product barrier. 3. A structure, system, or component that is part of the primary success path and which functions or actuates to mitigate a design basis accident or transient that either assumes the failure of or presents a challenge to the integrity of a

  15. A methodology to calibrate water saturation estimated from 4D seismic data

    International Nuclear Information System (INIS)

    Davolio, Alessandra; Maschio, Célio; José Schiozer, Denis

    2014-01-01

    Time-lapse seismic data can be used to estimate saturation changes within a reservoir, which is valuable information for reservoir management as it plays an important role in updating reservoir simulation models. The process of updating reservoir properties, history matching, can incorporate estimated saturation changes qualitatively or quantitatively. For quantitative approaches, reliable information from 4D seismic data is important. This work proposes a methodology to calibrate the volume of water in the estimated saturation maps, as these maps can be wrongly estimated due to problems with seismic signals (such as noise, errors associated with data processing and resolution issues). The idea is to condition the 4D seismic data to known information provided by engineering, in this case the known amount of injected and produced water in the field. The application of the proposed methodology in an inversion process (previously published) that estimates saturation from 4D seismic data is presented, followed by a discussion concerning the use of such data in a history matching process. The methodology is applied to a synthetic dataset to validate the results, the main ones being: (1) reduction of the effects of noise and errors in the estimated saturation, yielding more reliable data to be used quantitatively or qualitatively and (2) an improvement in the properties update after using this data in a history matching procedure. (paper)
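
    The paper's inversion scheme is not reproduced here; the sketch below only illustrates the conditioning idea in its simplest form: rescaling an estimated saturation-change map so that the implied water volume honours the injected/produced volume known from engineering data. Grid values, pore volumes and the known volume are hypothetical.

    ```python
    # Illustrative sketch: calibrate a 4D-seismic water saturation-change map so that
    # the implied water volume matches the volume known from injection/production records.
    import numpy as np

    dsw_seismic = np.array([[0.00, 0.05, 0.12],
                            [0.03, 0.20, 0.08],
                            [0.00, 0.06, 0.02]])       # estimated saturation change per cell
    pore_volume = np.full_like(dsw_seismic, 1.5e5)     # cell pore volume, m^3 (hypothetical)

    known_water_volume = 6.0e4                         # injected minus produced water, m^3
    implied_volume = np.sum(dsw_seismic * pore_volume)

    dsw_calibrated = dsw_seismic * (known_water_volume / implied_volume)
    print("scale factor:", round(known_water_volume / implied_volume, 3))
    print("calibrated volume check:", round(np.sum(dsw_calibrated * pore_volume), 1), "m^3")
    ```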

  16. Knowledge Management Audit - a methodology and case study

    Directory of Open Access Journals (Sweden)

    Thomas Lauer

    2001-11-01

    Full Text Available The strategic importance of knowledge in today’s organisation has been discussed extensively and research has looked at various issues in developing knowledge management systems. Both the characterisation of knowledge and alternate models for understanding the acquisition and use of such knowledge have taken on significant prominence. This is due to the complexities associated with acquiring and representing knowledge, and the varied nature of its use in knowledge work. However, the role of the knowledge workers and the processes that guide their knowledge work as they meet the knowledge goals of an organisation have received little attention. This paper proposes a knowledge audit (an assessment of the way knowledge processes meet an organisation’s knowledge goals) methodology to understand the “gaps” in the needs of a knowledge worker before one develops KM systems. The methodology also uses “process change” research to help build a socio-technical environment critical for knowledge work. The audit methodology is applied to a particular case and the implementation of the audit recommendations is discussed. Future implications of such an audit are also discussed.

  17. A framework for assessing the adequacy and effectiveness of software development methodologies

    Science.gov (United States)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.

  18. A case on vendor selection methodology: An integrated approach

    Directory of Open Access Journals (Sweden)

    Nikhil C. Shil

    2009-11-01

    Full Text Available Vendor selection methodology is a highly researched area in the supply chain management literature and a very significant decision taken by supply chain managers due to technological advances in the manufacturing process. Such research has two basic dimensions: one is related to the identification of variables affecting the performance of the vendors and the other deals with the methodology to be applied. Most of the research conducted in this area deals with the upfront selection of vendors. However, it is very common to have a list of dedicated vendors due to the development of sophisticated production technologies like just in time (JIT), a lean or agile manufacturing process where continuous flow of materials is a requirement. This paper addresses the issue of selecting the optimal vendor from the internal database of a company. Factor analysis, the analytical hierarchy process and regression analysis are used in an integrated way to supplement the vendor selection process. The methodology presented here is simply a proposal where every possible room for adjustment is available.
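
    As a hedged illustration of the analytic hierarchy process component mentioned above (the factor-analysis and regression steps are omitted), the sketch below derives criterion weights from a hypothetical pairwise-comparison matrix using the common geometric-mean approximation and checks consistency.

    ```python
    # Illustrative sketch: AHP criterion weights from a pairwise-comparison matrix,
    # using the geometric-mean approximation plus a consistency-ratio check.
    # The comparison values and criteria are hypothetical.
    import numpy as np

    # pairwise comparisons among 3 criteria: price, quality, delivery reliability
    A = np.array([[1.0, 1/3, 3.0],
                  [3.0, 1.0, 5.0],
                  [1/3, 1/5, 1.0]])

    weights = np.prod(A, axis=1) ** (1 / A.shape[0])
    weights /= weights.sum()                       # normalised priority vector

    lam_max = ((A @ weights) / weights).mean()     # principal eigenvalue estimate
    ci = (lam_max - A.shape[0]) / (A.shape[0] - 1) # consistency index
    cr = ci / 0.58                                 # Saaty random index for n = 3
    print("weights:", np.round(weights, 3), " CR =", round(cr, 3))
    ```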

  19. Development of a Novel Gas Pressurized Process-Based Technology for CO2 Capture from Post-Combustion Flue Gases Preliminary Year 1 Techno-Economic Study Results and Methodology for Gas Pressurized Stripping Process

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Shiaoguo

    2013-03-01

    Under the DOE’s Innovations for Existing Plants (IEP) Program, Carbon Capture Scientific, LLC (CCS) is developing a novel gas pressurized stripping (GPS) process to enable efficient post-combustion carbon capture (PCC) from coal-fired power plants. A technology and economic feasibility study is required as a deliverable in the project Statement of Project Objectives. This study analyzes a fully integrated pulverized coal power plant equipped with GPS technology for PCC, and is carried out, to the maximum extent possible, in accordance with the methodology and data provided in ATTACHMENT 3 – Basis for Technology Feasibility Study of DOE Funding Opportunity Number: DE-FOA-0000403. The DOE/NETL report on “Cost and Performance Baseline for Fossil Energy Plants, Volume 1: Bituminous Coal and Natural Gas to Electricity (Original Issue Date, May 2007), NETL Report No. DOE/NETL-2007/1281, Revision 1, August 2007” was used as the main source of reference to be followed, as per the guidelines of ATTACHMENT 3 of DE-FOA-0000403. The DOE/NETL-2007/1281 study compared the feasibility of various combinations of power plant/CO2 capture process arrangements. The report contained a comprehensive set of design basis and economic evaluation assumptions and criteria, which are used as the main reference points for the purpose of this study. Specifically, Nexant adopted the design and economic evaluation basis from Case 12 of the above-mentioned DOE/NETL report. This case corresponds to a nominal 550 MWe (net), supercritical greenfield PC plant that utilizes an advanced MEA-based absorption system for CO2 capture and compression. For this techno-economic study, CCS’ GPS process replaces the MEA-based CO2 absorption system used in the original case. The objective of this study is to assess the performance of a full-scale GPS-based PCC design that is integrated with a supercritical PC plant similar to Case 12 of the DOE/NETL report, such that it corresponds to a nominal 550 MWe

  20. Covariance Evaluation Methodology for Neutron Cross Sections

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.; Arcilla, R.; Mattoon, C.M.; Mughabghab, S.F.; Oblozinsky, P.; Pigni, M.; Pritychenko, B.; Sonzogni, A.A.

    2008-09-01

    We present the NNDC-BNL methodology for estimating neutron cross section covariances in the thermal, resolved resonance, unresolved resonance and fast neutron regions. The three key elements of the methodology are the Atlas of Neutron Resonances, the nuclear reaction code EMPIRE, and the Bayesian code implementing the Kalman filter concept. The covariance data processing, visualization and distribution capabilities are integral components of the NNDC methodology. We illustrate its application with examples, including a relatively detailed evaluation of covariances for two individual nuclei and the massive production of simple covariance estimates for 307 materials. Certain peculiarities regarding the evaluation of covariances for resolved resonances and the consistency between resonance parameter uncertainties and thermal cross section uncertainties are also discussed.
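
    The EMPIRE and Kalman codes themselves are not shown here; the following is a generic single-step Kalman (generalised least-squares) update of two cross-section parameters by one measurement, intended only to illustrate the Bayesian element of the methodology. All numbers are hypothetical, not evaluated data.

    ```python
    # Illustrative sketch: one Kalman-filter update of prior cross sections and their
    # covariance using a single experimental data point with its uncertainty.
    import numpy as np

    x = np.array([2.10, 0.85])                    # prior cross sections (barns) at two energies
    P = np.array([[0.040, 0.010],
                  [0.010, 0.025]])                # prior covariance (barns^2)

    H = np.array([[0.6, 0.4]])                    # sensitivity of the measurement to the parameters
    y = np.array([1.55])                          # measured value (barns)
    R = np.array([[0.01]])                        # measurement variance (barns^2)

    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    x_post = x + (K @ (y - H @ x))
    P_post = (np.eye(2) - K @ H) @ P

    print("posterior x:", np.round(x_post, 3))
    print("posterior P:\n", np.round(P_post, 4))
    ```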

  1. Coupling Computer-Aided Process Simulation and ...

    Science.gov (United States)

    A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa tm process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies. The modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable

  2. Methodology for the analysis of pollutant emissions from a city bus

    International Nuclear Information System (INIS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-01-01

    In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As a test circuit, a passenger transportation line in a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determine the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel–air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least. (paper)
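
    The sketch below is not the authors' processing chain; it only illustrates the category-based analysis in miniature: 1 Hz records are split into acceleration, cruise and deceleration sequences from the velocity signal, and mean emission rates are compared per category. The signals, thresholds and NOx values are hypothetical.

    ```python
    # Illustrative sketch: classify 1 Hz on-board records into acceleration,
    # deceleration and cruise categories and compare mean emission rates per category.
    import numpy as np

    time = np.arange(10)                                    # s
    speed = np.array([0, 5, 12, 18, 20, 20, 19, 14, 8, 2])  # km/h (synthetic)
    nox = np.array([0.2, 1.4, 1.8, 1.6, 0.9, 0.8, 0.7, 0.3, 0.2, 0.1])  # g/s (synthetic)

    accel = np.diff(speed, prepend=speed[0]) / 3.6          # m/s^2 at 1 Hz sampling
    category = np.where(accel > 0.1, "acceleration",
                np.where(accel < -0.1, "deceleration", "cruise"))

    for cat in ("acceleration", "cruise", "deceleration"):
        mask = category == cat
        print(f"{cat:12s} mean NOx: {nox[mask].mean():.2f} g/s "
              f"({mask.sum()} s of {len(time)} s)")
    ```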

  3. Methodology for the analysis of pollutant emissions from a city bus

    Science.gov (United States)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As a test circuit, a passenger transportation line in a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determine the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least.

  4. Risk assessment under deep uncertainty: A methodological comparison

    International Nuclear Information System (INIS)

    Shortridge, Julie; Aven, Terje; Guikema, Seth

    2017-01-01

    Probabilistic Risk Assessment (PRA) has proven to be an invaluable tool for evaluating risks in complex engineered systems. However, there is increasing concern that PRA may not be adequate in situations with little underlying knowledge to support probabilistic representation of uncertainties. As analysts and policy makers turn their attention to deeply uncertain hazards such as climate change, a number of alternatives to traditional PRA have been proposed. This paper systematically compares three diverse approaches for risk analysis under deep uncertainty (qualitative uncertainty factors, probability bounds, and robust decision making) in terms of their representation of uncertain quantities, analytical output, and implications for risk management. A simple example problem is used to highlight differences in the way that each method relates to the traditional risk assessment process and fundamental issues associated with risk assessment and description. We find that the implications for decision making are not necessarily consistent between approaches, and that differences in the representation of uncertain quantities and analytical output suggest contexts in which each method may be most appropriate. Finally, each methodology demonstrates how risk assessment can inform decision making in deeply uncertain contexts, informing more effective responses to risk problems characterized by deep uncertainty. - Highlights: • We compare three diverse approaches to risk assessment under deep uncertainty. • A simple example problem highlights differences in analytical process and results. • Results demonstrate how methodological choices can impact risk assessment results.

  5. How to treat a patient with chronic low back pain - methodology and results of the first international case conference of integrative medicine.

    Science.gov (United States)

    Brinkhaus, Benno; Lewith, George; Rehberg, Benno; Heusser, Peter; Cummings, Mike; Michalsen, Andreas; Teut, Michael; Willich, Stefan N; Irnich, Dominik

    2011-02-01

    Complementary and alternative medicine (CAM) is frequently used in patients in industrialised countries. Despite this popularity, there remains a considerable deficit of discourse and cooperation between physicians practicing CAM and conventional medicine. The aim is to present the methodology and results of the first international case conference on integrative medicine (IM) dealing with a patient with low back pain. In this paper the methodological tool "case conference on IM" is also described. The interactive case conference took place on November 20th, 2009 as part of the "2nd European Congress of IM" in Berlin, Germany. An experienced expert panel from both conventional medicine and CAM developed integrative medical diagnoses and therapeutic strategies using as their starting point an individual patient case on chronic low back pain (LBP). The case was selected because LBP is a common diagnosis with considerable economic impact and a problem which is often treated with CAM. In this case conference, the expert panel agreed on a diagnosis of "chronic non-specific LBP with somatic and psychological factors" and proposed multi-modal short- and long-term treatment including CAM. The importance of the patient-physician relationship and the consultation process with appropriate consultation time for treatment success was highlighted. There was consensus that the diagnostic process and resulting treatment plan should be individualised and focussed on the patient as a complete person, identifying the significance the disease has for the patient and not just on the disease itself. Considerable differences were found amongst the experts regarding the first steps of treatment and each expert saw possibilities of "effective and adequate treatment" being met by their own individual treatment method. The case conference on integrative medicine stimulated an intensive exchange between the approaches used by conventional medicine and CAM clarifying different treatment

  6. Strategic knowledge management: a methodology for structuring and analysing knowledge resources

    International Nuclear Information System (INIS)

    Ricciardi, Rita Izabel

    2009-01-01

    This work presents a methodology to organize, classify and assess the knowledge resources of an organization. The methodology presents an innovative integration of the following elements: (a) a systemic vision of the organization; (b) representation maps of the organization's strategy; (c) the identification of relevant knowledge through process analysis; (d) the reconfiguration and representation of the identified knowledge in maps; (e) a combination of critical analysis (importance and vulnerability) and strategic analysis to assess knowledge. The methodology was applied to the Radiopharmaceutical Center of the Nuclear and Energetic Research Institute, resulting in a very rich vision and understanding of the knowledge domains that are crucial to the Center. This kind of analysis has allowed a sharp perception of the knowledge problems of the Center and has also made visible the needed connections between Strategic Management and Knowledge Management. (author)

  7. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
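
    As an illustration of the parameter-screening step (step 2), the sketch below runs a one-at-a-time sweep over assumed credible input ranges of a toy surrogate model and ranks the parameters by the output spread each one induces; PSUADE and the report's actual screening designs are not reproduced. Parameter names, ranges and the model are hypothetical.

    ```python
    # Illustrative sketch of parameter screening: a one-at-a-time sweep over credible
    # input ranges, ranking parameters by the output range each sweep produces.
    import numpy as np

    def model(x):                       # stand-in for an expensive simulation
        return 3.0 * x[0] ** 2 + 0.5 * x[1] + 0.01 * x[2]

    ranges = {"opacity": (0.5, 2.0), "flux_limit": (0.1, 1.0), "mesh_bias": (-1.0, 1.0)}
    nominal = np.array([(lo + hi) / 2 for lo, hi in ranges.values()])

    spread = {}
    for i, (name, (lo, hi)) in enumerate(ranges.items()):
        outputs = []
        for v in np.linspace(lo, hi, 11):        # sweep one parameter, others at nominal
            x = nominal.copy(); x[i] = v
            outputs.append(model(x))
        spread[name] = max(outputs) - min(outputs)

    for name, s in sorted(spread.items(), key=lambda kv: -kv[1]):
        print(f"{name:12s} output range {s:.3f}")
    ```

    The most sensitive parameters identified this way would then carry forward to the quantitative stage of the methodology, while insensitive ones can be fixed at nominal values.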

  8. User-inspired design methodology using Affordance Structure Matrix (ASM) for construction projects

    Directory of Open Access Journals (Sweden)

    Maheswari J. Uma

    2017-01-01

    Full Text Available Traditionally, the design phase of construction projects is often performed with incomplete and inaccurate user preferences. This is due to inefficiencies in the methodologies used for capturing user requirements, which can subsequently lead to inconsistencies and result in a non-optimised end result. Iterations and subsequent rework due to such design inefficiencies are among the major reasons for unsuccessful project delivery, as they impact project performance measures such as time and cost, among others. The existing design theories and practice are primarily based on functional requirements. Function-based design deals with the design of the artifact alone, which may yield favourable or unfavourable consequences with the design artifact. However, incorporating other interactions, such as interactions between user and designer, is necessary for an optimised end result. Hence, the objective of this research work is to devise a systematic design methodology considering all three interactions among users, designers and artefacts for improved design efficiency. In this study, the theory of affordances has been applied in a case project that involves the design of an offshore facility. A step-by-step methodology for developing the Affordance Structure Matrix (ASM), which integrates House of Quality (HOQ) and Design Structure Matrix (DSM), is proposed that can effectively capture user requirements. HOQ is a popular quality management tool for capturing client requirements, and DSM is a matrix-based tool that can capture the interdependency among design entities. The proposed methodology utilises the strengths of both tools, as DSM complements HOQ in the process. In this methodology, different affordances such as AUA (Artifact-User-Affordance), AAA (Artifact-Artifact-Affordance) and DDA (Designer-Designer-Affordance) are captured systematically. Affordance is considered to be user-driven in this context, in contrast to prevailing design

  9. CIM5 Phase III base process development results

    International Nuclear Information System (INIS)

    Witt, D.C.

    2000-01-01

    Integrated Demonstration Runs for the Am/Cm vitrification process were initiated in the Coupled 5-inch Cylindrical Induction Melter (CIM5) on 11/30/98 and completed on 12/9/98. Four successful runs at 60 wt% lanthanide loading were completed which met or exceeded all established criteria. The operating parameters used in these runs established the base conditions for the 5-inch Cylindrical Induction Melter (CIM5) process and were summarized in the 5-inch CIM design basis, SRT-AMC-99-OO01 (1). In subsequent tests, a total of fourteen CIM5 runs were performed using various power inputs, ramp rates and target temperatures to define the preferred processing conditions (2). Process stability and process flexibility were the key criteria used in assessing the results for each run. A preferred set of operating parameters was defined for the CIM5 batch process and these conditions were used to generate a pre-programmed, automatic processing cycle that was used for the last six CIM5 runs (3). These operational tests were successfully completed in the January-February time frame and were summarized in SRT-AMC-99-00584. The recommended set of operating conditions defined in Runs No.1 through No.14 was used as the starting point for further pilot system runs to determine the robustness of the process, evaluate a bubbler, and investigate off-normal conditions. CIM5 Phase III Runs No.15 through No.60 were conducted utilizing the pre-programmed, automatic processing cycle to investigate system performance. This report summarizes the results of these tests and provides a recommendation for the base process as well as a processing modification for minimizing volume expansions if americium and/or curium are subject to a thermal reduction reaction as cerium is. This document summarizes the results of the base process development tests conducted in the Am/Cm Pilot Facility located in Building 672-T

  10. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

    means of their computer information systems. Disrupt - This type of attack focuses on disrupting as “attackers might surreptitiously reprogram enemy...by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective...between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that

  11. New design methods for computer aided architectural design methodology teaching

    NARCIS (Netherlands)

    Achten, H.H.

    2003-01-01

    Architects and architectural students are exploring new ways of design using Computer Aided Architectural Design software. This exploration is seldom backed up from a design methodological viewpoint. In this paper, a design methodological framework for reflection on innovative design processes by

  12. BAT methodology applied to the construction of new CCNN

    International Nuclear Information System (INIS)

    Vilches Rodriguez, E.; Campos Feito, O.; Gonzalez Delgado, J.

    2012-01-01

    The BAT methodology should be used in all phases of the project, from preliminary studies and design to decommissioning, gaining special importance in radioactive waste management and environmental impact studies. Adequate knowledge of this methodology will streamline the decision-making process and facilitate the relationship with regulators and stakeholders.

  13. Methodology for qualitative content analysis with the technique of mind maps using Nvivo and FreeMind softwares

    Directory of Open Access Journals (Sweden)

    José Leonardo Oliveira Lima

    2016-12-01

    Full Text Available Introduction: In a survey it is not enough to choose tools, resources and procedures; it is important to understand the method beyond the techniques and their relationship with philosophy, epistemology and methodology. Objective: To discuss theoretical and methodological concerns about Qualitative Research in Information Science and the process of Qualitative Content Analysis (QCA) in the User Studies field, and to show a path of QCA integrated with the Mind Map technique for developing categories and indicators, using Qualitative Data Analysis Software (QDAS) and Mind Map design tools. Methodology: The research was descriptive, methodological, bibliographical and based on fieldwork conducted with open interviews that were processed using the QCA method with the support of the QDAS Nvivo and the FreeMind software for Mind Map design. Results: The theory of qualitative research and QCA is presented, together with a methodological path of QCA using the techniques and software mentioned above. Conclusions: When it comes to qualitative research, the theoretical framework suggests the need for more dialogue between Information Science and other disciplines. The process of QCA evidenced a viable path that might help further related investigations using QDAS, and the contribution of Mind Maps and their design software to the development of the indicators and categories of QCA.

  14. A methodology for extending domain coverage in SemRep.

    Science.gov (United States)

    Rosemblat, Graciela; Shin, Dongwook; Kilicoglu, Halil; Sneiderman, Charles; Rindflesch, Thomas C

    2013-12-01

    We describe a domain-independent methodology to extend SemRep coverage beyond the biomedical domain. SemRep, a natural language processing application originally designed for biomedical texts, uses the knowledge sources provided by the Unified Medical Language System (UMLS©). Ontological and terminological extensions to the system are needed in order to support other areas of knowledge. We extended SemRep's application by developing a semantic representation of a previously unsupported domain. This was achieved by adapting well-known ontology engineering phases and integrating them with the UMLS knowledge sources on which SemRep crucially depends. While the process to extend SemRep coverage has been successfully applied in earlier projects, this paper presents in detail the step-wise approach we followed and the mechanisms implemented. A case study in the field of medical informatics illustrates how the ontology engineering phases have been adapted for optimal integration with the UMLS. We provide qualitative and quantitative results, which indicate the validity and usefulness of our methodology. Published by Elsevier Inc.

  15. Introducing an ILS methodology into research reactors

    International Nuclear Information System (INIS)

    Lorenzo, N. de; Borsani, R.C.

    2003-01-01

    Integrated Logistics Support (ILS) is the managerial organisation that co-ordinates the activities of many disciplines to develop the supporting resources (training, staffing, designing aids, equipment removal routes, etc.) required by technologically complex systems. The application of an ILS methodology in defence projects is described in several places, but it is infrequently illustrated for other areas; therefore the present paper deals with applying this approach to research reactors under design or already in operation. Although better results are obtained when it is applied from the very beginning of a project, it can also be applied successfully in facilities already in operation to improve their capability in a cost-effective way. In applying this methodology, the key objectives shall be identified beforehand in order to tailor the whole approach. Generally, in high power multipurpose reactors, obtaining maximum profit at the lowest possible cost without reducing safety levels is the key issue, while in others the goal is to minimise drawbacks like spurious shutdowns and low quality experimental results, or even to reduce staff dose to ALARA values. These items need to be quantified to establish a system status baseline in order to trace the process evolution. Thereafter, specific logistics analyses should be performed in the different areas composing the system. RAMS (Reliability, Availability, Maintainability and Supportability), Manning, Training Needs and Supplying Needs are some examples of these special logistic assessments. The following paragraphs summarise the different areas encompassed by this ILS methodology. Plant design is influenced by focussing the designers' attention on the objectives already identified. Careful design reviews are performed only at an early design stage, as later application is useless. This paper presents a methodology including appropriate tools for ensuring the designers abide by ILS issues and key objectives through the

  16. Supplement to the Disposal Criticality Analysis Methodology

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The methodology for evaluating criticality potential for high-level radioactive waste and spent nuclear fuel after the repository is sealed and permanently closed is described in the Disposal Criticality Analysis Methodology Topical Report (DOE 1998b). The topical report provides a process for validating various models that are contained in the methodology and states that validation will be performed to support License Application. The Supplement to the Disposal Criticality Analysis Methodology provides a summary of data and analyses that will be used for validating these models and will be included in the model validation reports. The supplement also summarizes the process that will be followed in developing the model validation reports. These reports will satisfy commitments made in the topical report, and thus support the use of the methodology for Site Recommendation and License Application. It is concluded that this report meets the objective of presenting additional information, along with references, that supports the methodology presented in the topical report and can be used both in validation reports and in answering requests for additional information received from the Nuclear Regulatory Commission concerning the topical report. The data and analyses summarized in this report and presented in the references are not sufficient to complete a validation report. However, this information will provide a basis for several of the validation reports. Data from several references in this report have been identified with TBV-1349. Release of the TBV governing this data is required prior to its use in quality affecting activities and for use in analyses affecting procurement, construction, or fabrication. Subsequent to the initiation of TBV-1349, DOE issued a concurrence letter (Mellington 1999) approving the request to identify information taken from the references specified in Section 1.4 as accepted data

  17. Application of the methodology for improving the business processes for the company for Airport services TAV Airports Holding, Macedonia

    OpenAIRE

    Mitreva, Elizabeta; Taskov, Nako; Lazarovski, Zlatko

    2015-01-01

    In this paper we make a full diagnosis of some business processes in the company for airport services TAV Airports Holding, Macedonia. Based on the analysis of the existing quality system, an appropriate methodology is designed for each feature of the TQM (Total Quality Management) system in order to find the optimal solution for smooth operation of airport traffic, meeting the wishes and needs of the customer while the company makes a profit. The methodol...

  18. Definition of a shortcut methodology for assessing flood-related Na-Tech risk

    Directory of Open Access Journals (Sweden)

    E. Marzo

    2012-11-01

    Full Text Available In this paper a qualitative methodology for the initial assessment of flood-related Na-Tech risk was developed as a screening tool to identify which situations require a much more expensive quantitative risk analysis (QRA). Through the definition of suitable key hazard indicators (KHIs), the proposed methodology allows the identification of the Na-Tech risk level associated with a given situation; the analytical hierarchy process (AHP) was used as a multi-criteria decision tool for the evaluation of such qualitative KHIs. The developed methodology was validated through two case studies by comparing the predicted risk levels with the results of much more detailed QRAs previously presented in the literature, and it was then applied to the real flood that occurred at Spolana a.s., Neratovice, Czech Republic in August 2002.
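
    The paper's KHIs and AHP-derived weights are not reproduced; the sketch below only shows how weighted indicator scores could be aggregated into a qualitative risk level used to decide whether a full QRA is warranted. Indicator names, weights, scores and thresholds are all hypothetical.

    ```python
    # Illustrative sketch: aggregate key hazard indicator (KHI) scores into a
    # qualitative Na-Tech risk level, using weights that would come from AHP.
    khi_weights = {"flood_severity": 0.40, "substance_hazard": 0.35, "tank_vulnerability": 0.25}
    site_scores = {"flood_severity": 0.8, "substance_hazard": 0.6, "tank_vulnerability": 0.4}  # scaled 0-1

    index = sum(khi_weights[k] * site_scores[k] for k in khi_weights)
    risk_level = "high" if index >= 0.66 else "medium" if index >= 0.33 else "low"
    print(f"composite index = {index:.2f} -> risk level: {risk_level}")
    ```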

  19. Assessment methodology applicable to safe decommissioning of Romanian VVR-S research reactor

    International Nuclear Information System (INIS)

    Baniu, O.; Vladescu, G.; Vidican, D.; Penescu, M.

    2002-01-01

    The paper contains the results of research activity performed by CITON specialists regarding an assessment methodology intended to be applied to the safe decommissioning of research reactors, developed taking into account the specific conditions of the Romanian VVR-S Research Reactor. The Romanian VVR-S Research Reactor is an old reactor (1957) and its Decommissioning Plan is under study. The main topics of the paper are as follows: Safety approach of nuclear facilities decommissioning. Applicable safety principles; Main steps of the proposed assessment methodology; Generic content of the Decommissioning Plan. Main decommissioning activities. Discussion about the proposed Decommissioning Plan for the Romanian Research Reactor; Safety risks which may occur during decommissioning activities. Normal decommissioning operations. Fault conditions. Internal and external hazards; Typical development of a scenario. Features, Events and Processes List. Exposure pathways. Calculation methodology. (author)

  20. Language barriers and qualitative nursing research: methodological considerations.

    Science.gov (United States)

    Squires, A

    2008-09-01

    This review of the literature synthesizes methodological recommendations for the use of translators and interpreters in cross-language qualitative research. Cross-language qualitative research involves the use of interpreters and translators to mediate a language barrier between researchers and participants. Qualitative nurse researchers successfully address language barriers between themselves and their participants when they systematically plan for how they will use interpreters and translators throughout the research process. Experienced qualitative researchers recognize that translators can generate qualitative data through translation processes and by participating in data analysis. Failure to address language barriers and the methodological challenges they present threatens the credibility, transferability, dependability and confirmability of cross-language qualitative nursing research. Through a synthesis of the cross-language qualitative methods literature, this article reviews the basics of language competence, translator and interpreter qualifications, and roles for each kind of qualitative research approach. Methodological and ethical considerations are also provided. By systematically addressing the methodological challenges cross-language research presents, nurse researchers can produce better evidence for nursing practice and policy making when working across different language groups. Findings from qualitative studies will also accurately represent the experiences of the participants without concern that the meaning was lost in translation.

  1. Methodology for Analysing Energy Demand in Biogas Production Plants—A Comparative Study of Two Biogas Plants

    Directory of Open Access Journals (Sweden)

    Emma Lindkvist

    2017-11-01

    Full Text Available Biogas production through anaerobic digestion may play an important role in a circular economy because of the opportunity to produce a renewable fuel from organic waste. However, the production of biogas may require energy in the form of heat and electricity. Therefore, resource-effective biogas production must consider both biological and energy performance. For the individual biogas plant to improve its energy performance, a robust methodology to analyse and evaluate the energy demand on a detailed level is needed. Moreover, to compare the energy performance of different biogas plants, a methodology with a consistent terminology, system boundary and procedure is vital. The aim of this study was to develop a methodology for analysing the energy demand in biogas plants on a detailed level. In the methodology, the energy carriers are allocated to: (1) sub-processes (e.g., pretreatment, anaerobic digestion, gas cleaning), (2) unit processes (e.g., heating, mixing, pumping, lighting) and (3) a combination of these. For a thorough energy analysis, a combination of allocations is recommended. The methodology was validated by applying it to two different biogas plants. The results show that the methodology is applicable to biogas plants with different configurations of their production system.
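
    The sketch below is a minimal illustration of the allocation idea, assuming metered electricity figures are available per (sub-process, unit process) pair; the plant data are hypothetical and heat carriers are omitted for brevity.

    ```python
    # Illustrative sketch: allocate metered electricity use in a biogas plant to
    # (sub-process, unit process) pairs and total it along either dimension.
    from collections import defaultdict

    # (sub-process, unit process) -> electricity use, MWh/year (hypothetical)
    use = {("pretreatment", "mixing"): 120.0,
           ("pretreatment", "pumping"): 45.0,
           ("anaerobic digestion", "mixing"): 210.0,
           ("anaerobic digestion", "heating"): 380.0,
           ("gas cleaning", "compression"): 95.0}

    by_sub, by_unit = defaultdict(float), defaultdict(float)
    for (sub, unit), mwh in use.items():
        by_sub[sub] += mwh
        by_unit[unit] += mwh

    print("per sub-process :", dict(by_sub))
    print("per unit process:", dict(by_unit))
    print("total MWh/year  :", sum(use.values()))
    ```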

  2. A methodology to evaluate the fatigue life of flexible pipes

    Energy Technology Data Exchange (ETDEWEB)

    Sousa, Fernando J.M. de; Sousa, Jose Renato M. de; Siqueira, Marcos Q. de; Sagrilo, Luis V.S. [Coordenacao dos Programas de Pos-graduacao em Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil); Lemos, Carlos Alberto D. de [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper focuses on a methodology to perform the fatigue analysis of flexible pipes. This methodology employs functions that convert forces and moments obtained in global analyses into stresses. The stresses are then processed by well-known cycle counting methods, and S-N curves evaluate the damage at several points in the pipe cross-section. The Palmgren-Miner linear damage hypothesis is assumed in order to calculate the accumulated fatigue damage. A parametric study on the fatigue life of a flexible pipe employing this methodology is presented. The main points addressed in the study are the influence of friction between layers on the results, the importance of evaluating the fatigue life at various points of the pipe cross-section and the effect of different mean stress levels. The obtained results suggest that the consideration of friction effects strongly influences the fatigue life of flexible risers and these effects have to be accounted for both in the global and local analyses of the riser. Moreover, mean stress effects are also significant and at least 8 equally spaced wires in each analyzed section of the riser must be considered in fatigue analyses. (author)
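
    The conversion functions and global/local analyses of the paper are not shown; the sketch below only illustrates the final damage-accumulation step, combining counted stress cycles with an S-N curve under the Palmgren-Miner hypothesis. The S-N constants and cycle counts are hypothetical.

    ```python
    # Illustrative sketch: accumulate fatigue damage for one armour wire from counted
    # stress cycles (e.g., rainflow output), an S-N curve N = a * S^(-m) and
    # the Palmgren-Miner linear hypothesis.
    stress_ranges_mpa = [50.0, 120.0, 200.0]     # stress range per cycle block
    cycles_per_year = [2.0e6, 1.5e5, 4.0e3]      # counted occurrences per year

    a, m = 1.0e12, 3.0                           # hypothetical S-N curve constants

    damage_per_year = sum(n / (a * s ** (-m)) for s, n in zip(stress_ranges_mpa, cycles_per_year))
    fatigue_life_years = 1.0 / damage_per_year   # life when cumulative damage reaches 1

    print(f"annual damage = {damage_per_year:.3e}, predicted life = {fatigue_life_years:.1f} years")
    ```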

  3. A Methodology to Institutionalise User Experience in Provincial Government

    Directory of Open Access Journals (Sweden)

    Marco Cobus Pretorius

    2014-12-01

    Full Text Available Problems experienced with website usability can prevent users from accessing and adopting technology, such as e-Government. At present, a number of guidelines exist for e-Government website user experience (UX) design; however, the effectiveness of the implementation of these guidelines depends on the expertise of the website development team and on an organisation’s understanding of UX. Despite the highlighted importance of UX, guidelines are rarely applied in South African e-Government website designs. UX guidelines cannot be implemented if there is a lack of executive support, trained staff, budget and user-centred design processes. The goal of this research is to propose and evaluate a methodology (called the “Institutionalise UX in Government (IUXG) methodology”) to institutionalise UX in South African Provincial Governments (SAPGs). The Western Cape Government in South Africa was used as a case study to evaluate the proposed IUXG methodology. The results show that the IUXG methodology can assist SAPGs to establish UX as standard practice and improve the UX maturity levels.

  4. Methodology for evaluating pattern transfer completeness in inkjet printing with irregular edges

    Science.gov (United States)

    Huang, Bo-Cin; Chan, Hui-Ju; Hong, Jian-Wei; Lo, Cheng-Yao

    2016-06-01

    A methodology for quantifying and qualifying pattern transfer completeness in inkjet printing through examining both pattern dimensions and pattern contour deviations from reference design is proposed, which enables scientifically identifying and evaluating inkjet-printed lines, corners, circles, ellipses, and spirals with irregular edges of bulging, necking, and unpredictable distortions resulting from different process conditions. This methodology not only avoids differences in individual perceptions of ambiguous pattern distortions but also indicates the systematic effects of mechanical stresses applied in different directions to a polymer substrate, and is effective for both optical and electrical microscopy in direct and indirect lithography or lithography-free patterning.

  5. Methodology for evaluating pattern transfer completeness in inkjet printing with irregular edges

    International Nuclear Information System (INIS)

    Huang, Bo-Cin; Chan, Hui-Ju; Lo, Cheng-Yao; Hong, Jian-Wei

    2016-01-01

    A methodology for quantifying and qualifying pattern transfer completeness in inkjet printing through examining both pattern dimensions and pattern contour deviations from reference design is proposed, which enables scientifically identifying and evaluating inkjet-printed lines, corners, circles, ellipses, and spirals with irregular edges of bulging, necking, and unpredictable distortions resulting from different process conditions. This methodology not only avoids differences in individual perceptions of ambiguous pattern distortions but also indicates the systematic effects of mechanical stresses applied in different directions to a polymer substrate, and is effective for both optical and electrical microscopy in direct and indirect lithography or lithography-free patterning. (paper)

  6. Methodology to evaluate the insecticide potential of forest tree species

    International Nuclear Information System (INIS)

    Morales Soto, Leon; Garcia P, Carlos Mario

    2000-01-01

    The flora diversity of Colombia offers enormous potential for the rational use of its forest resources. Trees with biocidal effects that could control pests and diseases need to be investigated. The objective of this research was to develop a methodology with low cost, easy application and quick results. The methodology employed was as follows: selection of tree species based on the bibliography, ancestral reports and personal observations; field collection of plants; preparation of plant extracts and testing with Artemia salina Leach to detect biological activity of the extracts using the LC50; bioassays with the most promising extracts (LC50 less than 1000 ppm); and determination of active compounds. The methodology was applied to five forest tree species: Guarea guidonia (L.) Sleumer and Trichilia hirta L. (Meliaceae), Machaerium moritzianum Benth. (Fabaceae), Swinglea glutinosa Merrill (Rutaceae) and Mammea americana L. (Clusiaceae). Using Artemia salina Leach as an indicator of biocidal potential, two species were selected as the most promising: Swinglea glutinosa Merrill and Machaerium moritzianum Benth. In addition, bioassays were made to evaluate phagoinhibition on Atta cephalotes (L.) (Hym.: Formicidae) and control of Alconeura. This methodology is recommended for this kind of research

  7. Harmonization of interests as a methodological basis of logistics

    Directory of Open Access Journals (Sweden)

    V. V. Baginova

    2015-01-01

    Full Text Available The article is devoted to the methodology of logistics. The basis of this methodology is the harmonization of the interests of all participants in the distribution process. The methodology consists in rejecting a fragmented approach to the management of merchandise, using the categories of economic trade-offs, the open exchange of information and the joint determination of the final price of goods, based on the convergence of cost and value pricing methods and tariff setting.

  8. Towards methodological improvement in the Spanish studies

    Directory of Open Access Journals (Sweden)

    Beatriz Amante García

    2012-09-01

    Full Text Available The European Higher Education Area (EHEA) has triggered many changes in the new degrees at Spanish universities, mainly in terms of methodology and assessment. However, in order to make such changes a success, it is essential to have coordination within the teaching staff as well as active methodologies in use, which enhance and encourage students’ participation in all the activities carried out in the classroom. This is especially the case when dealing with formative and summative evaluation, in which students become the ones responsible for their own learning process (López-Pastor, 2009; Torre, 2008). In this second issue of JOTSE we have included several teaching innovation experiences related to the above mentioned methodological and assessment changes.

  9. Methodology for Selecting Best Management Practices Integrating Multiple Stakeholders and Criteria. Part 1: Methodology

    Directory of Open Access Journals (Sweden)

    Mauricio Carvallo Aceves

    2016-02-01

    Full Text Available The implementation of stormwater Best Management Practices (BMPs) could help re-establish the natural hydrological cycle of watersheds after urbanization, with each BMP presenting a different performance across a range of criteria (flood prevention, pollutant removal, etc.). Additionally, conflicting views from the relevant stakeholders may arise, resulting in a complex selection process. This paper proposes a methodology for BMP selection based on the application of multi-criteria decision aid (MCDA) methods, integrating multiple stakeholder priorities and BMP combinations. First, in the problem definition, the MCDA methods, relevant criteria and design guidelines are selected. Next, information from the preliminary analysis of the watershed is used to obtain a list of relevant BMPs. The third step comprises the watershed modeling and analysis of the BMP alternatives to obtain performance values across purely objective criteria. Afterwards, a stakeholder analysis based on survey applications is carried out to obtain social performance values and criteria priorities. Then, the MCDA methods are applied to obtain the final BMP rankings. The last step considers the sensitivity analysis and rank comparisons in order to draw the final conclusions and recommendations. Future improvements to the methodology could explore the inclusion of multiple objective analysis and alternative means for obtaining social performance values.
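
    As a minimal illustration of the final ranking step (not the specific MCDA methods the authors select during problem definition), the sketch below applies a weighted-sum aggregation to hypothetical BMP performance scores and stakeholder weights after min-max normalisation.

    ```python
    # Illustrative sketch: rank BMP alternatives with a weighted-sum MCDA model after
    # min-max normalising each criterion so that higher is always better.
    import numpy as np

    alternatives = ["bioretention", "green roof", "permeable pavement"]
    criteria = ["flood reduction", "pollutant removal", "cost (inverted)"]
    scores = np.array([[0.7, 0.8, 0.5],
                       [0.4, 0.6, 0.3],
                       [0.6, 0.5, 0.7]])          # rows: alternatives, cols: criteria (hypothetical)
    weights = np.array([0.5, 0.3, 0.2])           # aggregated stakeholder priorities (hypothetical)

    norm = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
    ranking = norm @ weights
    for alt, r in sorted(zip(alternatives, ranking), key=lambda kv: -kv[1]):
        print(f"{alt:20s} {r:.2f}")
    ```

    A sensitivity analysis, as the last step of the methodology suggests, would repeat this aggregation under perturbed weights to check how stable the resulting ranking is.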

  10. Decision-making methodology for management of hazardous waste

    International Nuclear Information System (INIS)

    Philbin, J.S.; Cranwell, R.M.

    1988-01-01

    A decision-making methodology is presented that combines systems and risk analysis techniques to evaluate hazardous waste management practices associated with DOE weapon production operations. The methodology provides a systematic approach to examining waste generation and waste handling practices in addition to the more visible disposal practices. Release-exposure scenarios for hazardous waste operations are identified and operational risk is determined. Comparisons may be made between existing and alternative waste management practices (and processes) on the basis of overall risk, cost and compliance with regulations. Managers can use this methodology to make and defend resource allocation decisions and to prioritize research needs

  11. Mechatronics methodology: 15 years of experience

    Directory of Open Access Journals (Sweden)

    Efren Gorrostieta

    2015-09-01

    Full Text Available This article presents a methodology for teaching students to develop mechatronic projects. It has been taught in higher education schools at different universities in Mexico, in courses such as Robotics, Control Systems, Mechatronic Systems and Artificial Intelligence. The intention of this methodology is not only to achieve the integration of different subjects but also to accomplish synergy between them so that the final result may be the best possible in quality, time and robustness. Since its introduction into the educational area, this methodology was evaluated and modified for approximately five years, during which its substantial characteristics were adopted. For the next ten years, only minor alterations were carried out. Fifteen years of experience have proven that the methodology is useful not only for training but also for real projects. In this article, we first explain the methodology and its main characteristics, as well as a brief history of its teaching in different educational programs. Then, we present two cases where the methodology was successfully applied. The first project consisted in the design, construction and evaluation of a mobile robotic manipulator intended for use in explosive ordnance disposal. In the second case, we document the results of a project assignment on robotics tasks carried out by students who had previously been taught with the methodology.

  12. Design of Sustainable Blended Products using an Integrated Methodology

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza Binti; Gernaey, Krist; Woodley, John

    2013-01-01

    This paper presents a systematic methodology for designing blended products consisting of three stages: product design, process identification and experimental verification. The product design stage is considered in this paper. The objective of this stage is to screen and select suitable chemicals to be used as building blocks in the mixture design, and then to propose the blend formulations that fulfill the desired product attributes. The result is a set of blends that match the constraints, the compositions, values of the target properties and information about their miscibility. The methodology has been applied to design several blended products. A case study on the design of blended lubricants is highlighted. The objective is to identify blended products that satisfy the product attributes with at least similar or better performance compared to conventional products.
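    A minimal sketch of the kind of screening performed in the product design stage is given below, assuming ideal linear mixing rules and invented component properties and target bounds; the actual methodology uses rigorous property models and miscibility checks that are not reproduced here.

```python
# Hypothetical lubricant blending check using ideal linear mixing rules.
# Component property values and target windows are illustrative assumptions.
components = {                       # (viscosity index, flash point [degC])
    "Base oil A": (95.0, 210.0),
    "Ester additive B": (140.0, 250.0),
}
targets = {"viscosity index": 110.0, "flash point": 220.0}   # required minima

def blend_properties(fractions):
    """Ideal (linear) mixing rule: property = sum(x_i * p_i)."""
    vi = sum(x * p[0] for x, p in zip(fractions, components.values()))
    fp = sum(x * p[1] for x, p in zip(fractions, components.values()))
    return vi, fp

for x_b in (0.2, 0.4, 0.6):                  # screen the additive fraction
    vi, fp = blend_properties((1.0 - x_b, x_b))
    feasible = vi >= targets["viscosity index"] and fp >= targets["flash point"]
    print(f"x_B = {x_b:.1f}: VI = {vi:.0f}, FP = {fp:.0f} degC -> "
          f"{'candidate blend' if feasible else 'rejected'}")
```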

  13. Methodological development for selection of significant predictors explaining fatal road accidents.

    Science.gov (United States)

    Dadashova, Bahar; Arenas-Ramírez, Blanca; Mira-McWilliams, José; Aparicio-Izquierdo, Francisco

    2016-05-01

    Identification of the most relevant factors for explaining road accident occurrence is an important issue in road safety research, particularly for future decision-making processes in transport policy. However, model selection for this particular purpose is still an area of ongoing research. In this paper we propose a methodological development for model selection which addresses both explanatory variable selection and adequate model selection. A variable selection procedure, the TIM (two-input model) method, is carried out by combining neural network design and statistical approaches. The error structure of the fitted model is assumed to follow an autoregressive process. All models are estimated using the Markov chain Monte Carlo method, where the model parameters are assigned non-informative prior distributions. The final model is built using the results of the variable selection. For the application of the proposed methodology, the number of fatal accidents in Spain during 2000-2011 was used. This indicator experienced the largest reduction internationally during those years, making it an interesting time series from a road safety policy perspective. Hence the identification of the variables that have affected this reduction is of particular interest for future decision making. The results of the variable selection process show that the selected variables are the main subjects of road safety policy measures. Published by Elsevier Ltd.
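    The abstract does not detail the TIM procedure, so the sketch below only illustrates the underlying idea under stated assumptions: every pair of candidate explanatory variables is scored with a small neural network and the variables appearing in the best-performing pairs are retained. Variable names and data are invented; the Bayesian estimation with autoregressive errors is not reproduced.

```python
import numpy as np
from itertools import combinations
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Invented yearly road-safety data: the response and candidate explanatory series.
names = ["enforcement", "fleet_age", "fuel_price", "seatbelt_use"]
X = rng.normal(size=(60, len(names)))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=60)

# TIM-style screening (simplified): score every two-input neural network and
# keep the variables that appear in the best-performing pairs.
results = []
for i, j in combinations(range(len(names)), 2):
    model = MLPRegressor(hidden_layer_sizes=(4,), max_iter=5000, random_state=0)
    score = cross_val_score(model, X[:, [i, j]], y, cv=5).mean()
    results.append((score, names[i], names[j]))

for score, a, b in sorted(results, reverse=True)[:3]:
    print(f"mean CV R^2 = {score:.2f}: {a} + {b}")
```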

  14. Evaluating the influence of process parameters on soluble microbial products formation using response surface methodology coupled with grey relational analysis.

    Science.gov (United States)

    Xu, Juan; Sheng, Guo-Ping; Luo, Hong-Wei; Fang, Fang; Li, Wen-Wei; Zeng, Raymond J; Tong, Zhong-Hua; Yu, Han-Qing

    2011-01-01

    Soluble microbial products (SMPs) represent a major part of the residual chemical oxygen demand (COD) in the effluents from biological wastewater treatment systems, and SMP formation is greatly influenced by a variety of process parameters. In this study, response surface methodology (RSM) coupled with the grey relational analysis (GRA) method was used to evaluate the effects of substrate concentration, temperature, NH4+-N concentration and aeration rate on SMP production in batch activated sludge reactors. Carbohydrates were found to be the major component of SMP, and the influential priorities of these factors were: temperature > substrate concentration > aeration rate > NH4+-N concentration. On the basis of the RSM results, the interactive effects of these factors on SMP formation were evaluated, and the optimal operating conditions for minimum SMP production in such a batch activated sludge system were also identified. These results provide useful information on how to control SMP formation in activated sludge and ensure a high-quality effluent from the bioreactor. Copyright © 2010 Elsevier Ltd. All rights reserved.
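    A minimal sketch of the grey relational analysis step is given below, simplified to per-factor normalisation with the usual distinguishing coefficient of 0.5; the run data are invented and do not come from the study.

```python
import numpy as np

# Invented data from five batch runs: the response (SMP measured as COD) and
# the four factors whose influence is being ranked.
response = np.array([42.0, 55.0, 61.0, 48.0, 70.0])          # SMP (mg COD/L)
factors = {
    "temperature":   np.array([20.0, 25.0, 30.0, 22.0, 35.0]),
    "substrate":     np.array([300.0, 420.0, 480.0, 340.0, 600.0]),
    "aeration rate": np.array([0.4, 0.6, 0.8, 0.5, 1.0]),
    "NH4+-N":        np.array([15.0, 30.0, 25.0, 40.0, 20.0]),
}

def normalise(x):
    return (x - x.min()) / (x.max() - x.min())

def grey_relational_grade(reference, comparison, zeta=0.5):
    """Simplified GRA: per-series normalisation, distinguishing coefficient zeta."""
    delta = np.abs(normalise(reference) - normalise(comparison))
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean()

grades = {name: grey_relational_grade(response, series)
          for name, series in factors.items()}
for name, grade in sorted(grades.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: grey relational grade = {grade:.3f}")
```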

  15. Methodology for evaluation of alternative technologies applied to nuclear fuel reprocessing

    International Nuclear Information System (INIS)

    Selvaduray, G.S.; Goldstein, M.K.; Anderson, R.N.

    1977-07-01

    An analytic methodology has been developed to compare the performance of various nuclear fuel reprocessing techniques for advanced fuel cycle applications, including low proliferation risk systems. The need to identify and compare those processes which have the versatility to handle the variety of fuel types expected to be in use in the next century is becoming increasingly imperative. This methodology allows processes at any stage of development to be compared and the effect of changing external conditions on a process to be assessed

  16. Selection of low-level radioactive waste disposal sites using screening models versus more complex methodologies

    International Nuclear Information System (INIS)

    Uslu, I.; Fields, D.E.

    1993-01-01

    The task of choosing a waste-disposal site from a set of candidate sites requires an approach capable of objectively handling many environmental variables for each site. Several computer methodologies have been developed to assist in the process of choosing a site for the disposal of low-level radioactive waste; however, most of these models are costly to apply, in terms of computer resources and the time and effort required by professional modelers, geologists, and waste-disposal experts. The authors describe how the relatively simple DRASTIC methodology (a standardized system for evaluating groundwater pollution potential using hydrogeologic settings) may be used for "pre-screening" of sites to determine which subset of candidate sites is worthy of more detailed screening. Results of site comparisons made with DRASTIC are compared with results obtained using the PRESTO-II methodology, which is representative of the more complex release-transport-human exposure methodologies. 6 refs., 1 fig., 1 tab
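    A minimal sketch of a DRASTIC-style pre-screening calculation is shown below. The factor weights follow the commonly cited DRASTIC scheme, but both the weights and the site ratings should be treated as illustrative assumptions rather than values from the study.

```python
# Weighted-sum screening index of ground-water pollution potential.
weights = {"Depth to water": 5, "net Recharge": 4, "Aquifer media": 3,
           "Soil media": 2, "Topography": 1, "Impact of vadose zone": 5,
           "hydraulic Conductivity": 3}

candidate_sites = {      # ratings (1-10) per factor, invented for illustration
    "Site A": {"Depth to water": 9, "net Recharge": 6, "Aquifer media": 4,
               "Soil media": 5, "Topography": 9, "Impact of vadose zone": 4,
               "hydraulic Conductivity": 2},
    "Site B": {"Depth to water": 3, "net Recharge": 8, "Aquifer media": 8,
               "Soil media": 7, "Topography": 5, "Impact of vadose zone": 8,
               "hydraulic Conductivity": 6},
}

def drastic_index(ratings):
    """Higher index = higher ground-water pollution potential."""
    return sum(weights[f] * r for f, r in ratings.items())

# Pre-screen: list sites from lowest to highest pollution potential.
for site, ratings in sorted(candidate_sites.items(),
                            key=lambda s: drastic_index(s[1])):
    print(f"{site}: DRASTIC index = {drastic_index(ratings)}")
```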

  17. Analysis and design of the SI-simulator software system for the VHTR-SI process by using the object-oriented analysis and object-oriented design methodology

    International Nuclear Information System (INIS)

    Chang, Jiwoon; Shin, Youngjoon; Kim, Jihwan; Lee, Kiyoung; Lee, Wonjae; Chang, Jonghwa; Youn, Cheung

    2008-01-01

    The SI-simulator is an application software system that simulates the dynamic behavior of the VHTR-SI process by the use of mathematical models. Object-oriented analysis (OOA) and object-oriented design (OOD) methodologies were employed for the SI-simulator system development. OOA is concerned with developing software engineering requirements and specifications that are expressed as a system's object model (which is composed of a population of interacting objects), as opposed to the traditional data or functional views of systems. OOD techniques are useful for the development of large complex systems. Also, the OOA/OOD methodology is usually employed to maximize the reusability and extensibility of a software system. In this paper, we present the design features of the SI-simulator software system obtained by using the OOA and OOD methodologies

  18. A methodology for elemental and organic carbon emission inventory and results for Lombardy region, Italy

    Energy Technology Data Exchange (ETDEWEB)

    Caserini, Stefano [Politecnico di Milano, DICA Environmental Engineering Section, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Galante, Silvia, E-mail: silvia1.galante@polimi.it [Politecnico di Milano, DICA Environmental Engineering Section, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Ozgen, Senem; Cucco, Sara; Gregorio, Katia de [Politecnico di Milano, DICA Environmental Engineering Section, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Moretti, Marco [Environmental Protection Agency of Lombardia Region, ARPA, 20124 Milano (Italy)

    2013-04-15

    This paper presents a methodology and its application for the compilation of elemental carbon (EC) and organic carbon (OC) emission inventories. The methodology consists of the estimation of EC and OC emissions from available total suspended particulate matter (TSP) emission inventory data using EC and OC abundances in TSP derived from an extensive literature review, taking into account the local technological context. In particular, the method is applied to the 2008 emissions of the Lombardy region, Italy, considering 148 different activities and 30 types of fuels typical of Western Europe. The abundances estimated in this study may provide a useful basis to assess the emissions also in other emission contexts with similar prevailing sources and technologies. The dominant sources of EC and OC in Lombardy are diesel vehicles for EC and residential wood combustion (RWC) for OC, which together account for about 83% of the total emissions of both pollutants. The EC and OC emissions from industrial processes and other fuel (e.g., gasoline, kerosene and LPG) combustion are significantly lower, while non-combustion sources give an almost negligible contribution. The total EC + OC contribution to regional greenhouse gas emissions is positive for every sector, for any GWP100 value within the range proposed in the literature. An uncertainty assessment is performed through a Monte Carlo simulation for RWC, showing a large uncertainty range (280% of the mean value for EC and 70% for OC), whereas for road transport a qualitative analysis identified a narrower range of uncertainty. - Highlights: ► Diesel and wood combustion contribute more than 80% of total EC and OC. ► More than 50% of EC emissions come from road transport. ► The Monte Carlo method is used to assess the uncertainty of wood combustion emissions. ► Residential wood combustion is the main source of uncertainty of the EC and OC inventory. ► In terms of CO2eq, EC and OC correspond to 3% of CO2
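    The core inventory calculation described above (scaling TSP emissions by literature-derived EC and OC mass fractions) can be sketched as follows; the activities, TSP totals and abundance fractions are invented and are not the Lombardy figures.

```python
# Activity-level TSP emissions (t/year) and assumed EC/OC mass fractions of TSP.
tsp_emissions_t = {
    "diesel vehicles": 1200.0,
    "residential wood combustion": 2500.0,
    "industrial processes": 800.0,
}
abundances = {               # (EC fraction of TSP, OC fraction of TSP)
    "diesel vehicles": (0.55, 0.25),
    "residential wood combustion": (0.10, 0.45),
    "industrial processes": (0.03, 0.05),
}

totals = {"EC": 0.0, "OC": 0.0}
for activity, tsp in tsp_emissions_t.items():
    f_ec, f_oc = abundances[activity]
    totals["EC"] += tsp * f_ec
    totals["OC"] += tsp * f_oc
    print(f"{activity}: EC = {tsp * f_ec:.0f} t, OC = {tsp * f_oc:.0f} t")
print(f"Total EC = {totals['EC']:.0f} t, total OC = {totals['OC']:.0f} t")
```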

  19. A methodology for Manufacturing Execution Systems (MES) implementation

    Science.gov (United States)

    Govindaraju, Rajesri; Putra, Krisna

    2016-02-01

    A manufacturing execution system (MES) is an information system (IS) application that bridges the gap between IS at the top level, namely enterprise resource planning (ERP), and IS at the lower levels, namely the automation systems. MES provides a means for optimizing the manufacturing process as a whole on a real-time basis. Through the use of MES in combination with the implementation of ERP and other automation systems, a manufacturing company is expected to achieve high competitiveness. In implementing MES, functional integration, i.e. making all the components of the manufacturing system work well together, is the most difficult challenge. For this, there is an industry standard that specifies the sub-systems of a manufacturing execution system and defines the boundaries between ERP systems, MES, and other automation systems. The standard is known as ISA-95. Although the advantages of using MES have been stated in some studies, not much research has been done on how to implement MES effectively. The purpose of this study is to develop a methodology describing how an MES implementation project should be managed, utilising the support of the ISA-95 reference model in the system development process. A proposed methodology was developed based on a general IS development methodology. The developed methodology was then revisited based on an understanding of the specific characteristics of MES implementation projects found in an implementation case at an Indonesian steel manufacturing company. The case study highlighted the importance of applying an effective requirement elicitation method during the initial system assessment process, managing system interfaces and labor division in the design process, and performing a pilot deployment before putting the whole system into operation.

  20. A Methodology for Engineering Competencies Definition in the Aerospace Industry

    Directory of Open Access Journals (Sweden)

    Laura Fortunato

    2011-10-01

    Full Text Available The need to cut lead times, increase product innovation, respond to changing customer requirements and integrate new technologies into business processes pushes companies to increase collaboration. In particular, collaboration, knowledge sharing and information exchange in the Aerospace Value Network require a clear definition and identification of the competencies of the several actors involved. Main contractors, stakeholders, customers, suppliers and partners have different expertise and backgrounds, and in this collaborative working environment they are called to work together on projects, programs and processes. To improve collaboration and support knowledge sharing, a competency definition methodology and the related dictionary prove to be useful tools for actors within an extended supply chain. They can use the same terminology and be informed of the competencies available. It becomes easy to specify who knows how to perform the required activities, stimulating collaboration and improving communication. Based on action research developed in the context of the iDesign Foundation project, the paper outlines a competency definition methodology and presents examples from its implementation in the Alenia Aeronautica company. A new definition of competency is suggested, supported by a new method to specify the structural relationship between competencies and the activities of aeronautical processes.

  1. Computer Class Role Playing Games, an innovative teaching methodology based on STEM and ICT: first experimental results

    Science.gov (United States)

    Maraffi, S.

    2016-12-01

    Context/Purpose: We experienced a new teaching and learning technology: a Computer Class Role Playing Game (RPG) to perform educational activities in classrooms through an interactive game. This approach is new; there are some experiences with educational games, but mainly individual and not class-based. Gaming all together in a class, with a single scope for the whole class, enhances peer collaboration, cooperative problem solving and friendship. Methods: To perform the research we experimented with the games in several classes of different degrees, acquiring specific questionnaires from teachers and pupils. Results: Experimental results were outstanding: RPG, our interactive activity, exceeded by 50% the overall satisfaction compared to traditional lessons or PowerPoint-supported teaching. Interpretation: The appreciation of RPG was in agreement with the class-level outcome identified by the teacher after the experimentation. Our work received excellent feedback from teachers in terms of the efficacy of this new teaching methodology and of the achieved results. Using a new methodology closer to the students' point of view improves the innovation and creative capacities of learners, and it supports the new role of the teacher as the learners' "coach". Conclusion: This paper presents the first experimental results on the application of this new technology, based on a computer game which projects onto a wall in the classroom an adventure lived by the students. The plots of the actual adventures are designed for deeper learning of Science, Technology, Engineering, Mathematics (STEM) and Social Sciences & Humanities (SSH). The participation of the pupils is based on interaction with the game through their own tablets or smartphones. The game is based on a mixed-reality learning environment, giving the students the feeling of being IN the adventure.

  2. Establishing a standard calibration methodology for MOSFET detectors in computed tomography dosimetry

    International Nuclear Information System (INIS)

    Brady, S. L.; Kaufman, R. A.

    2012-01-01

    Purpose: The use of metal-oxide-semiconductor field-effect transistor (MOSFET) detectors for patient dosimetry has increased by ∼25% since 2005. Despite this increase, no standard calibration methodology has been identified nor calibration uncertainty quantified for the use of MOSFET dosimetry in CT. This work compares three MOSFET calibration methodologies proposed in the literature, and additionally investigates questions relating to the optimal time for signal equilibration and the exposure levels for maximum calibration precision. Methods: The calibration methodologies tested were (1) free in-air (FIA) with a radiographic x-ray tube, (2) FIA with a stationary CT x-ray tube, and (3) within a scatter phantom with a rotational CT x-ray tube. Each calibration was performed at absorbed dose levels of 10, 23, and 35 mGy. Times of 0 min or 5 min were investigated for signal equilibration before or after signal read out. Results: Calibration precision was measured to be better than 5%–7%, 3%–5%, and 2%–4% for the 10, 23, and 35 mGy dose levels, respectively, and was independent of calibration methodology. No correlation was demonstrated between precision and signal equilibration time when allowing 5 min before or after signal read out. Differences in average calibration coefficients were demonstrated between the FIA with CT calibration methodology (26.7 ± 1.1 mV/cGy) versus the CT scatter phantom (29.2 ± 1.0 mV/cGy) and FIA with x-ray tube (29.9 ± 1.1 mV/cGy) methodologies. A decrease in MOSFET sensitivity was seen at an average change in read out voltage of ∼3000 mV. Conclusions: The best measured calibration precision was obtained by exposing the MOSFET detectors to 23 mGy. No signal equilibration time is necessary to improve calibration precision. A significant difference between calibration outcomes was demonstrated for FIA with CT compared to the other two methodologies. If the FIA with a CT calibration methodology was used to create calibration coefficients for the
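    A minimal sketch of how a calibration coefficient of this kind is derived and applied is given below; the read-out voltage shifts are invented values chosen only to land near the range reported above.

```python
import numpy as np

# Calibration run: five MOSFETs exposed to a known absorbed dose.
delivered_dose_cGy = 2.3                     # 23 mGy per calibration exposure
readout_shift_mV = np.array([60.2, 62.1, 61.5, 63.0, 60.8])  # ΔV per detector

coefficients = readout_shift_mV / delivered_dose_cGy
cal_coeff = coefficients.mean()
precision = coefficients.std(ddof=1) / cal_coeff * 100.0
print(f"Calibration coefficient = {cal_coeff:.1f} mV/cGy (precision {precision:.1f}%)")

# Converting a subsequent in-phantom reading back to absorbed dose:
patient_shift_mV = 45.7
print(f"Measured dose = {patient_shift_mV / cal_coeff:.2f} cGy")
```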

  3. A Scoping Review of Clinical Practice Improvement Methodology Use in Rehabilitation

    Directory of Open Access Journals (Sweden)

    Marie-Eve Lamontagne

    2016-01-01

    Full Text Available Context The Clinical Practice Improvement (CPI approach is a methodological and quality improvement approach that has emerged and is gaining in popularity. However, there is no systematic description of its use or the determinants of its practice in rehabilitation settings. Method We performed a scoping review of the use of CPI methodology in rehabilitation settings. Results A total of 103 articles were reviewed. We found evidence of 13 initiatives involving CPI with six different populations. A total of 335 citations of determinants were found, with 68.7% related to CPI itself. Little information was found about what type of external and internal environment, individual characteristics and implementation process might facilitate or hinder the use of CPI. Conclusion Given the growing popularity of this methodological approach, CPI initiatives would gain from increasing knowledge of the determinants of its success and incorporating them in future implementation.

  4. Risk management methodology for RBMN project

    International Nuclear Information System (INIS)

    Borssatto, Maria F.B.; Tello, Cledola C.O.; Uemura, George

    2013-01-01

    RBMN Project has been developed to design, construct and commission a national repository to dispose of the low- and intermediate-level radioactive wastes from the operation of nuclear power plants and other industries that use radioactive sources and materials. Risk is a characteristic of all projects. The risks arise from uncertainties due to assumptions associated with the project and the environment in which it is executed. Risk management is the method by which these uncertainties are systematically monitored to ensure that the objectives of the project will be achieved. Considering the peculiarities of the Project, that is, its comprehensive scope, multidisciplinary team, and apparently polemic nature owing to the stakeholders', especially the community's, unfamiliarity with the subject, a specific risk management methodology is being developed for it. This methodology will be critical for the future generations who will be responsible for the final stages of the repository. It will provide a greater guarantee to the processes already implemented and will maintain a specific list of risks and solutions for this Project, ensuring the safety and security of the repository throughout its life cycle, which is planned to last at least three hundred years. This paper presents the tools and processes already defined, and the management actions aimed at developing a proactive risk culture in order to minimize threats to this Project and promote actions that bring opportunities for its success. The methodology is based on solid research on the subject, considering methodologies already established and globally recognized as best practices for project management. (author)

  5. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full Text Available The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis under modern conditions is demonstrated, and the notion of "situational analysis" is defined. We have concluded that situational analysis is a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions to eliminate the adverse effects of the system's exposure to the situation now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative and search functions in the process of situational analysis is demonstrated. The basic methodological elements of situational analysis are substantiated. The substantiation of the principal methodological elements of system analysis will enable the analyst to develop adaptive methods able to take into account the peculiar features of a unique object, that is, a situation that has emerged in a complex system, to diagnose such a situation and subject it to systematic and in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  6. Methodological Behaviorism from the Standpoint of a Radical Behaviorist.

    Science.gov (United States)

    Moore, J

    2013-01-01

    Methodological behaviorism is the name for a prescriptive orientation to psychological science. Its first and original feature is that the terms and concepts deployed in psychological theories and explanations should be based on observable stimuli and behavior. I argue that the interpretation of the phrase "based on" has changed over the years because of the influence of operationism. Its second feature, which developed after the first and is prominent in contemporary psychology, is that research should emphasize formal testing of a theory that involves mediating theoretical entities from a nonbehavioral dimension according to the hypothetico-deductive method. I argue that for contemporary methodological behaviorism, explanations of the behavior of both participants and scientists appeal to the mediating entities as mental causes, if only indirectly. In contrast to methodological behaviorism is the radical behaviorism of B. F. Skinner. Unlike methodological behaviorism, radical behaviorism conceives of verbal behavior in terms of an operant process that involves antecedent circumstances and reinforcing consequences, rather than in terms of a nonbehavioral process that involves reference and symbolism. In addition, radical behaviorism recognizes private behavioral events and subscribes to research and explanatory practices that do not include testing hypotheses about supposed mediating entities from another dimension. I conclude that methodological behaviorism is actually closer to mentalism than to Skinner's radical behaviorism.

  7. IMPROVING METHODOLOGY OF RISK IDENTIFICATION OF OCCUPATIONAL DANGEROUS

    Directory of Open Access Journals (Sweden)

    A. P. BOCHKOVSKYI

    2018-04-01

    Full Text Available In the paper, according to the analysis of statistical data, the correlation between the number of occupational injuries and occupational diseases in Ukraine within the last 5 years is defined. Also, using the methodology of the International Labor Organization, the correlation between the number of accident fatalities and the general number of accidents in Ukraine and EU countries (Austria, Great Britain, Germany, Denmark, Norway, Poland, Hungary, Finland, France) is defined. It is shown that in spite of the positive dynamics of a decreasing number of occupational injuries, the number of occupational diseases in Ukraine keeps increasing. The comparative analysis of the ratio of the number of accident fatalities to the total number of registered accidents showed that, on average, Ukraine exceeds the EU countries in this indicator by 100 times. It is noted that such negative indicators (in particular, the increasing number of occupational diseases) may occur because of an imperfect methodology for identifying the risks of occupational dangers. It is also ascertained that, based on the existing methodology, the identification of occupational dangers is quite subjective, which reduces the objectivity of quantitative assessment. In order to eliminate the identified drawbacks, it is proposed for the first time to use a corresponding integral criterion for the quantitative risk assessment. To solve this problem, the authors formulate and propose an algorithm for improving the methodology of analysing dangerous and harmful production effects (DHPE), which are the main causes of occupational dangers. The proposed algorithm comprises four successive steps: DHPE identification, indication of their maximum allowed thresholds of concentrations (levels), identification of the sources of the identified DHPE, and estimation of the consequences of their manifestation. The improved methodology makes it possible to identify the risks of occurrence of occupational dangers in systems

  8. A generic semi-implicit coupling methodology for use in RELAP5-3D©

    International Nuclear Information System (INIS)

    Aumiller, D.L.; Tomlinson, E.T.; Weaver, W.L.

    2000-01-01

    A generic semi-implicit coupling methodology has been developed and implemented in the RELAP5-3D© computer program. This methodology allows RELAP5-3D© to be used with other computer programs to perform integrated analyses of nuclear power reactor systems and related experimental facilities. The coupling methodology potentially allows different programs to be used to model different portions of the system. The programs are chosen based on their capability to model the phenomena that are important in the simulation in the various portions of the system being considered. The methodology was demonstrated using a test case in which the test geometry was divided into two parts, each of which was solved as a RELAP5-3D© simulation. This test problem exercised all of the semi-implicit coupling features which were installed in RELAP5-3D©. The results of this verification test case show that the semi-implicit coupling methodology produces the same answer as the simulation of the test system as a single process
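    The coupling scheme itself is not reproducible from the abstract, so the sketch below is only a toy illustration of the general idea: two subdomains are each advanced implicitly by their own solver while exchanging lagged interface values once per time step. It is not the RELAP5-3D© algorithm.

```python
import numpy as np

# Toy 1-D conduction problem split into two coupled subdomains.
n, dt, alpha, dx = 20, 0.01, 1.0, 1.0 / 40
left = np.full(n, 100.0)      # subdomain advanced by "code A"
right = np.full(n, 0.0)       # subdomain advanced by "code B"

def advance(u, bc_left, bc_right):
    """One backward-Euler step of u_t = alpha * u_xx with Dirichlet BCs."""
    r = alpha * dt / dx ** 2
    A = (np.eye(n) * (1 + 2 * r)
         - np.eye(n, k=1) * r - np.eye(n, k=-1) * r)
    b = u.copy()
    b[0] += r * bc_left
    b[-1] += r * bc_right
    return np.linalg.solve(A, b)

for _ in range(200):
    iface_a, iface_b = left[-1], right[0]      # lagged interface exchange
    left = advance(left, 100.0, iface_b)
    right = advance(right, iface_a, 0.0)

print(f"Interface temperatures after coupling: {left[-1]:.1f} / {right[0]:.1f}")
```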

  9. Optimization of GMAW process of AA 6063-T5 aluminum alloy butt joints based on the response surface methodology and on the bead geometry

    International Nuclear Information System (INIS)

    Miguel, V.; Martinez-Conesa, E. J.; Segura, F.; Manjabacas, M. C.; Abellan, E.

    2012-01-01

    The geometry of the weld beads is characterized by the overhead, the width and the penetration. These values are indices of the behavior of the welded joint and therefore they can be considered as factors that control the process. This work is performed to optimize the GMAW process of the aluminum alloy AA 6063-T5 by means of the response surface methodology (RSM). The variables herein considered are the arc voltage, the welding speed, the wire feed speed and the separation between surfaces in butt joints. The response functions that are herein studied are the overhead, the width, the penetration and the angle of the bead. The results obtained by RSM show a high degree of agreement with the experimental values. The procedure is validated experimentally by welding under the theoretically optimized technological conditions, and good agreement between theoretical and experimental values is found. (Author) 16 refs.
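    As a rough illustration of the RSM workflow (not the authors' model or data), the sketch below fits a second-order response surface to invented voltage/speed/penetration data and locates the optimum on a coarse grid; the real study also includes wire feed speed and joint gap as factors.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
# Invented experiments: columns are arc voltage (V) and welding speed (mm/min);
# the response is bead penetration (mm) with a quadratic trend plus noise.
X = rng.uniform([18, 300], [24, 700], size=(30, 2))
y = (3.0 + 0.4 * (X[:, 0] - 21) - 0.002 * (X[:, 1] - 500)
     - 0.15 * (X[:, 0] - 21) ** 2 + rng.normal(scale=0.05, size=30))

# Fit a full second-order response surface.
surface = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

# Locate the predicted optimum over a coarse grid of operating conditions.
grid_v, grid_s = np.meshgrid(np.linspace(18, 24, 61), np.linspace(300, 700, 81))
grid = np.column_stack([grid_v.ravel(), grid_s.ravel()])
pred = surface.predict(grid)
best = grid[pred.argmax()]
print(f"Predicted optimum: {best[0]:.1f} V, {best[1]:.0f} mm/min "
      f"(penetration ≈ {pred.max():.2f} mm)")
```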

  10. Methodological issues of genetic association studies.

    Science.gov (United States)

    Simundic, Ana-Maria

    2010-12-01

    Genetic association studies explore the association between genetic polymorphisms and a certain trait, disease or predisposition to disease. It has long been acknowledged that many genetic association studies fail to replicate their initial positive findings. This raises concern about the methodological quality of these reports. Case-control genetic association studies often suffer from various methodological flaws in study design and data analysis, and are often reported poorly. Flawed methodology and poor reporting leads to distorted results and incorrect conclusions. Many journals have adopted guidelines for reporting genetic association studies. In this review, some major methodological determinants of genetic association studies will be discussed.

  11. Application of six sigma DMAIC methodology to reduce service resolution time in a service organization

    Directory of Open Access Journals (Sweden)

    Virender Narula

    2017-11-01

    Full Text Available The popularity of Six Sigma, as a means for improving quality, has grown exponentially in recent years. It is a proven methodology for achieving breakthrough improvement in process performance that generates significant savings to the bottom line of an organization. This paper illustrates how the Six Sigma methodology may be used to improve service processes. The purpose of this paper is to develop Six Sigma DMAIC methodologies that would help service organizations look into their processes. In addition, it demonstrates the vital linkages between process improvement and process variation. The study identifies critical process parameters and suggests a team structure for a Six Sigma project in service operations.

  12. QUALITY IMPROVEMENT IN MULTIRESPONSE EXPERIMENTS THROUGH ROBUST DESIGN METHODOLOGY

    Directory of Open Access Journals (Sweden)

    M. Shilpa

    2012-06-01

    Full Text Available Robust design methodology aims at reducing the variability in product performance in the presence of noise factors. Experiments involving simultaneous optimization of more than one quality characteristic are known as multiresponse experiments and are used in the development and improvement of industrial processes and products. In this paper, robust design methodology is applied to optimize the process parameters during a particular operation of a rotary driving shaft manufacturing process. The three important quality characteristics of the shaft considered here are of type nominal-the-best, smaller-the-better and fraction defective. Simultaneous optimization of these responses is carried out by identifying the control parameters and conducting the experimentation using an L9 orthogonal array.
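    The commonly used Taguchi signal-to-noise (S/N) definitions for the three response types named above can be sketched as follows; the replicate data and the fraction-defective value are invented for illustration.

```python
import numpy as np

def sn_nominal_the_best(y):
    """S/N = 10 log10(mean^2 / variance)."""
    y = np.asarray(y, dtype=float)
    return 10 * np.log10(y.mean() ** 2 / y.var(ddof=1))

def sn_smaller_the_better(y):
    """S/N = -10 log10(mean of y^2)."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(y ** 2))

def sn_fraction_defective(p):
    """S/N for a fraction-defective response p (0 < p < 1)."""
    return -10 * np.log10(p / (1 - p))

print(f"Shaft diameter (nominal-the-best): {sn_nominal_the_best([24.98, 25.02, 25.01]):.2f} dB")
print(f"Runout (smaller-the-better):       {sn_smaller_the_better([0.05, 0.07, 0.06]):.2f} dB")
print(f"Fraction defective (p = 0.04):     {sn_fraction_defective(0.04):.2f} dB")
```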

  13. The GPT methodology. New fields of application

    International Nuclear Information System (INIS)

    Gandini, A.; Gomit, J.M.; Abramytchev, V.

    1996-01-01

    The GPT (Generalized Perturbation Theory) methodology is described, and a new application is discussed. The results obtained for a simple model (zero dimension, six parameters of interest) show that the expressions obtained using the GPT methodology lead to results close to those obtained through direct calculations. The GPT methodology is well suited for use in radioactive waste disposal problems. The potential of the method, demonstrated for a zero-dimension model, can be extended to radionuclide migration problems with a spatial description. (K.A.)

  14. Risk-based methodology for USNRC inspections

    International Nuclear Information System (INIS)

    Wong, S.M.; Holahan, G.M.; Chung, J.W.; Johnson, M.R.

    1995-01-01

    This paper describes the development and trial applications of a risk-based methodology to enhance the inspection processes for US nuclear power plants. The objectives of risk-based methods to complement prescriptive engineering approaches in US Nuclear Regulatory Commission (USNRC) inspection programs are presented. Insights from time-dependent risk profiles of plant configurations from Individual Plant Evaluation (IPE) studies were integrated to develop a framework for optimizing inspection efforts in NRC regulatory initiatives. Lessons learned from NRC pilot applications of the risk-based methodology for evaluation of the effectiveness of operational risk management programs at US nuclear power plant sites are also discussed

  15. Bioremediation of chlorpyrifos contaminated soil by two phase bioslurry reactor: Processes evaluation and optimization by Taguchi's design of experimental (DOE) methodology.

    Science.gov (United States)

    Pant, Apourv; Rai, J P N

    2018-04-15

    A two-phase bioreactor was constructed, designed and developed to evaluate chlorpyrifos remediation. Six biotic and abiotic factors (substrate loading rate, slurry phase pH, slurry phase dissolved oxygen (DO), soil-water ratio, temperature and soil microflora load) were evaluated by a design of experiments (DOE) methodology employing Taguchi's orthogonal array (OA). The six selected factors were considered at two levels in an L-8 array (2^7, 15 experiments) in the experimental design. The optimum operating conditions obtained from the methodology showed chlorpyrifos degradation enhanced from 283.86 µg/g to 955.364 µg/g, an overall enhancement of 70.34%. In the present study, with the help of a few well-defined experimental parameters, a mathematical model was constructed to understand the complex bioremediation process and optimize the parameters to good accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Analyzing parameters optimisation in minimising warpage on side arm using response surface methodology (RSM)

    Science.gov (United States)

    Rayhana, N.; Fathullah, M.; Shayfull, Z.; Nasir, S. M.; Hazwan, M. H. M.

    2017-09-01

    This paper presents a systematic methodology for analysing the warpage of the side arm part using Autodesk Moldflow Insight software. Response Surface Methodology (RSM) was proposed to optimise the processing parameters and efficiently minimise the warpage of the side arm part. The variable parameters considered in this study were based on the parameters reported by previous researchers as most significantly affecting warpage, namely melt temperature, mould temperature and packing pressure, with packing time and cooling time added as they are commonly used parameters. The results show that warpage was improved by 10.15% and that the most significant parameter affecting warpage is packing pressure.

  17. Process evaluation of community monitoring under national health mission at Chandigarh, union territory: Methodology and challenges

    Directory of Open Access Journals (Sweden)

    Jaya Prasad Tripathy

    2015-01-01

    Full Text Available Background: Community monitoring was introduced in a pilot mode in 36 selected districts of India in a phased manner. In Chandigarh, it was introduced in the year 2009-2010. A preliminary evaluation of the program was undertaken with special emphasis on the inputs and the processes. Methodology: Quantitative methods included verification against checklists and record reviews. Nonparticipant observation was used to evaluate the conduct of trainings, interviews, and group discussions. The health system had trained health system functionaries (nursing students and Village Health Sanitation Committee [VHSC] members) to generate village-based scorecards for assessing community needs. Community needs were assessed independently for two villages in the study area to validate the scores generated by the health system. Results: VHSCs were formed in all 22 villages but without a chairperson or convener. The involvement of VHSC members in the community monitoring process was minimal. The conduct of group discussions was below par due to poor moderation and unequal responses from the group. The community monitoring committees at the state level had limited representation from the non-health sector, lower committees, and nongovernmental organizations/civil societies. Agreement between the report cards generated by the investigator and the health system in the selected villages was found to be fair (0.369), whereas the weighted kappa (0.504) was moderate. Conclusion: In spite of all these limitations and challenges, the government has taken a valiant step by trying to involve the community in the monitoring of health services. The dynamic nature of the community warrants the incorporation of an evaluation framework into the planning of such programs.
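    A minimal sketch of the agreement statistics quoted above (Cohen's kappa and a linearly weighted kappa) is given below; the confusion matrix of investigator versus health-system ratings is invented for illustration.

```python
import numpy as np

# Invented 3-level ordinal ratings: rows = investigator, columns = health system.
confusion = np.array([
    [4, 2, 0],
    [1, 5, 2],
    [0, 1, 3],
], dtype=float)

def kappa(cm, weighted=False):
    """Cohen's kappa; with weighted=True, linear disagreement weights are used."""
    n = cm.sum()
    k = cm.shape[0]
    observed = cm / n
    expected = np.outer(cm.sum(axis=1), cm.sum(axis=0)) / n ** 2
    if weighted:
        i, j = np.indices((k, k))
        w = np.abs(i - j) / (k - 1)          # linear disagreement weights
    else:
        w = 1.0 - np.eye(k)                  # all off-diagonal disagreements equal
    return 1.0 - (w * observed).sum() / (w * expected).sum()

print(f"Cohen's kappa  = {kappa(confusion):.3f}")
print(f"Weighted kappa = {kappa(confusion, weighted=True):.3f}")
```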

  18. Psychosocial determinants of parental human papillomavirus (HPV) vaccine decision-making for sons: Methodological challenges and initial results of a pan-Canadian longitudinal study

    Directory of Open Access Journals (Sweden)

    Samara Perez

    2016-12-01

    Full Text Available Abstract Background HPV vaccination decision-making is a complex process that is influenced by multiple psychosocial determinants. Given the change in policy recommendation to include males in routine HPV vaccination, our goals were to assess HPV vaccination uptake in Canada, to understand where Canadian parents were situated in the HPV vaccine decision-making process for their son, how they changed over time and which psychosocial determinants were relevant for this process. Methods We used an online survey methodology and collected data from a nationally representative sample of Canadian parents of boys aged 9–16 at baseline (T1, February 2014) and at 9 months' follow-up (T2). Our analyses were guided by the Precaution Adoption Process Model (PAPM), a theoretical health behavior model that classifies parents into one of six stages: unaware, unengaged, undecided, decided not to vaccinate, decided to vaccinate and those who had already vaccinated their sons. Rigorous methods were used to filter out careless responders: response variance, bogus items, psychometric antonyms and psychometric synonyms. Results At T1 and T2, we received 3,784 and 1,608 completed questionnaires, respectively; after data cleaning, 3,117 (T1) and 1,427 (T2) were retained. Less than 3% of boys were vaccinated at both time points. At both T1 and T2, most parents (over 70%) belonged to the earlier vaccination adoption stages: 57% were unaware (T1) and 15.3% (T2); 20.9% were unengaged (T1) and 32.4% (T2); and 9.1% were undecided (T1) and 25.2% (T2). At follow-up, 37.7% of participants did not move from their initial PAPM decision-making stage. Most parents (55%) preferred to receive information from their healthcare provider (HCP) but only 6% (T1) and 12% (T2) had actually spoken with an HCP about the HPV vaccine for their son. Conclusions HPV vaccination uptake in Canadian boys was very low in the absence of a publicly funded HPV vaccination program for boys. Optimal HPV information

  19. Enzymatic Phorbol Esters Degradation using the Germinated Jatropha Curcas Seed Lipase as Biocatalyst: Optimization Process Conditions by Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Avita Kusuma Wardhani

    2016-10-01

    Full Text Available Utilization of Jatropha curcas seed cake is limited by the presence of phorbol esters (PE), which are the main toxic compounds and are heat stable. The objective of this research was to optimize the reaction conditions of the enzymatic PE degradation of the defatted Jatropha curcas seed cake (DJSC using t